Copilot is the AI assistant coming soon to Microsoft 365.* Hailed as a productivity gamechanger, Microsoft Copilot sets out to fundamentally change the way we interact with software.
Instead of making you learn a set of controls and adapt the way you work to suit the software, Copilot uses natural language processing for a more intuitive way of working.
So… does that mean Microsoft’s AI will learn about us? In this article, we’ll discuss how Copilot works, how Copilot will use your organisation’s data, and how well Microsoft’s data and privacy safeguards stack up against alternative AI tools, such as ChatGPT.
*Microsoft Copilot is an upcoming service. While Microsoft have shared many exciting details so far, some assumptions may change ahead of the public launch – watch this space for updates!
How does Microsoft Copilot work?
Right now, the average user of an app like PowerPoint uses only around 10% of the available functionality. Microsoft argue that Copilot will unlock the remaining 90%. Sounds impressive, but how does it work?
We asked Charles Burkinshaw, one of our expert Microsoft 365 trainers at The Inform Team, to break it down for us:
“On the surface, Copilot will look like ChatGPT. You’ll have a chat window, type in a prompt, and the AI will come back with a response, generating content to suit what you’ve asked for. ChatGPT uses OpenAI’s GPT-4 large language model (LLM) to generate content. With Copilot, there’s more going on behind the scenes.
“Copilot uses Azure OpenAI, which is unique to Microsoft. It also uses Microsoft Graph to search your organisation’s files for relevant information to answer your request. So rather than the bland responses we sometimes see from ChatGPT, Copilot can generate content that contains specific information about your company and services, delivered in a style that makes sense for your workplace.”
Privacy and security: Copilot respects user-level permissions
If Copilot searches everything on our company tenant, how do we control what’s shared with individual users? What stops Copilot from showing people files that aren’t relevant, or sensitive information they aren’t meant to see?
It’s natural to have reservations about letting AI loose on your files. But in many ways, Copilot is an extension of what’s already available in M365.
- Viva Insights already shows you documents for your upcoming meetings
- Delve lets you search across your organisation for subject matter experts
- SharePoint can help you track down what you need using a simple keyword search
According to Charles, it’s all down to what permissions are set. “This is an important security feature within Microsoft Graph,” says Charles. “Although Copilot is searching across your organisation’s files looking for data, Microsoft Graph validates this data at user level. You only get to see information that you’re allowed to access. None of the permissions are changed by Copilot in the back end.”
Who can see specific documents depends entirely on the user permissions you set as individuals and as an organisation.
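For the more technically minded, the idea of permission-trimmed search can be sketched in a few lines of code. This is a purely conceptual illustration, not Microsoft's actual API: the Document type, the permission sets, and the search function here are all hypothetical. The point it demonstrates is the one Charles makes above: the search runs across everything, but each result is checked against the requesting user's own permissions before anything is returned, and no permissions are changed along the way.

```python
# Conceptual sketch only -- hypothetical types, not a real Microsoft Graph API.
from dataclasses import dataclass

@dataclass
class Document:
    title: str
    allowed_users: set  # users permitted to read this document

def permission_trimmed_search(query, documents, user):
    """Return only the matching documents this user is allowed to read.
    Results are validated per user; permissions are never widened."""
    return [
        doc for doc in documents
        if query.lower() in doc.title.lower()
        and user in doc.allowed_users
    ]

docs = [
    Document("Q3 sales strategy", {"alice", "bob"}),
    Document("Q3 board salaries", {"alice"}),  # restricted document
]

# Bob's search matches both titles, but only returns what Bob may see.
print([d.title for d in permission_trimmed_search("Q3", docs, "bob")])
```

In this toy version, Bob's search for "Q3" matches both documents, but the restricted one is filtered out before he ever sees it, which is the behaviour Microsoft describes for Graph's user-level validation.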
AI and privacy: Does Copilot learn about your organisation over time?
It’s logical to think that as you and your colleagues add more information into Copilot, Microsoft might amass huge amounts of information about your organisation. But as Charles explains, that’s not really the case:
“Within a conversation, Copilot can use your prompts to provide the most relevant results, but this information isn’t fed back into the AI. It’s not learning more about you or your organisation and doesn’t grow with time. After every conversation, it resets to the original static version.”
Of course, Copilot can search new files in Microsoft Graph as things change, but it doesn’t accumulate information.
Your proprietary business data isn’t sent to Microsoft and it’s not used to train the AI model. While ChatGPT gathers your chat data by default – posing a data security risk – Copilot uses Microsoft 365’s built-in data security features, and nothing leaves your secure tenant.
Can organisations see Copilot search history?
This is an interesting question, and one we don’t have confirmation on yet.
In workplace cultures where knowledge is tied to value, it’s understandable that some people may worry about Copilot, and what it means for how others perceive them at work.
We know from our experience in digital change management and technology adoption that people are often wary of new tools – even if they’re designed to help. For example, one of the most frequently asked questions around Viva Insights is whether other people can see the working hours and meeting insights it generates.
There’s a balance to be struck between privacy for individuals and transparency for leaders about how people are using AI tools.
At an initial price of $30 per user per month, Copilot isn’t going to be cheap, so many organisations will want to see analytics around adoption, usage and ROI. Although details aren’t out yet, it’s plausible that Microsoft will offer Copilot analytics as they do with Viva Insights: team-level data on usage and trends, without identifying individual users.