Not Using Microsoft Copilot Yet?
Let’s Get You Started.


According to Microsoft’s latest Work Trend Index report, 75% of global knowledge workers are using generative AI. That raises the question: Have you provided generative AI solutions to 75% of your end users? If the answer is no, it’s time to acknowledge that your end users are bringing their own generative AI solutions to work. In fact, according to the same study, 78% of AI users bring their own tools to work.

Right now, businesses are focusing their efforts on AI strategy and all the privacy, security, and compliance concerns that come with it. At the same time, end users are turning to unsanctioned AI tools that leave organizations vulnerable. Many leaders worry about the risks of providing generative AI tools, but end users are already using them regardless of any restrictive policies, which creates more risk for their organizations in the long run.

Generative AI is proliferating in your organization whether you’ve approved solutions or not, and so is the risk it creates. Thankfully, there are a couple of very easy things you can do to mitigate these concerns while you encourage responsible AI use and work out the particulars of your AI strategy.

1. Craft an end-user policy and adoption strategy

Start by engaging your AI council to create a general end-user policy. Don’t have an AI council? Microsoft has a great approach to help you build one. The first step to mitigating AI-related risk created by end users is to clearly communicate your expectations.

When you announce your policy, make sure you give end users access to the right tools and resources. Position them to learn more about AI solutions that you want them to use. This includes a full adoption plan that teaches them about your policies, responsible AI principles, prompting, and more.

Be sure to also give users a place to collaborate with experts and ask questions. Expect users to be hungry for official guidance from their leaders and ready to share their ideas about AI and what they have learned. This is a great way to identify early adopters who will help drive successful adoption in the future.

2. Use what you have – Microsoft Copilot with Enterprise Data Protection (EDP)

Microsoft recently announced changes to its Microsoft Copilot (with Commercial Data Protection) service, offering enterprise data protection for any user signed in with an Entra account at no additional cost. Microsoft Copilot addresses concerns over public AI solutions in several ways. All security, privacy, and compliance commitments that Microsoft makes for Microsoft 365 Copilot now also apply to Microsoft Copilot. This means that:

  • Data is secured through encryption at rest and in transit, physical security controls, and data isolation between tenants.
  • Any interactions your users have with Microsoft Copilot while signed in with an Entra account, along with any of your data, are private.
  • Microsoft Purview capabilities apply to end-user prompts, which are logged and retained.
  • Microsoft helps safeguard against things like harmful content (go ahead, try to ask it anything controversial!) and prompt injections.
  • Your data does not go into any public LLM and isn’t used to train any models.

Microsoft Copilot provides a secure solution for your organization while giving your users access to Designer, an image generation tool that adheres to copyright laws. Check out the full announcement here.

Microsoft Copilot with EDP is a great low-cost, low-risk tool in your end-user AI journey. It’s built for organizations still crafting their AI strategy, as well as those with more mature AI practices. Microsoft even provides great resources to help you with Copilot adoption.

Here are a few things you should know before officially rolling out Microsoft Copilot with EDP:

  • There are several ways to access Microsoft Copilot with EDP detailed in Microsoft’s Copilot documentation. Select one to communicate to end users to eliminate confusion.
  • There is a public version of Microsoft Copilot that does not offer the same protections. Be sure to thoroughly read this article, which provides important information about enforcing enterprise data protection.
  • Entra ID sign-in is required. Our recommendation for enforcing Entra ID login is device-based policy enforcement in the Edge browser (see the sketch after this list).
  • Technical and end-user learning resources can be found here.
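
As a rough illustration of what device-based enforcement in Edge can look like, the sketch below writes two documented Microsoft Edge policies, BrowserSignin (a value of 2 forces browser sign-in) and RestrictSigninToPattern, to the local machine policy registry key. The contoso.com pattern is a placeholder for your own tenant domain, and in practice you would deliver these same policy values through Group Policy or Intune rather than a script; treat this as a minimal sketch of the settings involved, not a deployment method.

    import winreg  # Windows-only standard library module for registry access

    # Local policy key that Group Policy / Intune-managed Edge settings write to.
    EDGE_POLICY_KEY = r"SOFTWARE\Policies\Microsoft\Edge"

    def enforce_edge_sign_in(tenant_pattern: str) -> None:
        """Force Edge browser sign-in and restrict it to accounts matching tenant_pattern."""
        # Requires administrator rights, since the key lives under HKEY_LOCAL_MACHINE.
        with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, EDGE_POLICY_KEY, 0, winreg.KEY_SET_VALUE) as key:
            # BrowserSignin = 2 requires users to sign in before they can use the browser.
            winreg.SetValueEx(key, "BrowserSignin", 0, winreg.REG_DWORD, 2)
            # RestrictSigninToPattern limits browser sign-in to accounts matching the pattern.
            winreg.SetValueEx(key, "RestrictSigninToPattern", 0, winreg.REG_SZ, tenant_pattern)

    if __name__ == "__main__":
        # Placeholder pattern: replace contoso.com with your own Entra tenant domain.
        enforce_edge_sign_in(r".*@contoso\.com")

With those two values in place, Edge requires an organizational sign-in before the browser can be used, which helps ensure that Copilot sessions in Edge carry the Entra identity that enterprise data protection depends on.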

Anna Donnelly

Services Product Manager, Insight

Specializing in communications and collaboration technology with more than 10 years of experience in the Microsoft space, Anna strives to help organizations understand how modern workplace technology can positively impact their businesses and employees.

Information in this blog is current as of Sept. 2024 and is subject to change over time.