What a technology professional looks like in the context of Microsoft Business Applications can differ from one company to another. In a small or mid-market business (SMB), it could be someone who has been handed the role of Dynamics 365 or Power Platform admin because they are tech-savvy and understand the business (and you may be that person!).
In a large-scale enterprise organization, a technology professional could be someone on a team that has the responsibility of managing security (and only security) for the entire company and across multiple geographies.
In both cases, there are things that need to be understood and managed when it comes to Microsoft Copilot: implementation, licensing, security considerations, configuration, and much more. This post outlines a few of the things you need to know and should consider.
Note: Any references to technology requirements, standards, or needs are subject to change by Microsoft. Please refer to Microsoft’s documentation for details.
Prerequisites
Before diving into your Copilot implementation, there are a few prerequisites you should be aware of. These prerequisites are scattered across various Microsoft sites, so I’ve assembled a few of the key items here.
Additionally, in a blog post, Microsoft indicated that “your organization must meet some technical requirements and have some features enabled. Copilot users must have either a Microsoft 365 E3 or E5 license and an Azure Active Directory account, which gives them access to the Microsoft 365 apps and services that work with Copilot including Word, Excel, PowerPoint, OneDrive, Outlook, Loop, and more. Once available to your organization, your users will need to be on the Current Channel or Monthly Enterprise Channel for Microsoft 365 apps to have access to Copilot in desktop clients.”
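To make the license prerequisite concrete, here is a minimal sketch (in Python, calling the Microsoft Graph licenseDetails endpoint) that checks whether a given user already holds a qualifying Microsoft 365 E3/E5 license. The token handling and the SKU part numbers are assumptions for illustration only; confirm the SKUs that actually apply in your tenant (for example via GET /subscribedSkus) before relying on a check like this.

```python
# Minimal sketch: check whether a user holds a Microsoft 365 E3/E5 license via
# the Microsoft Graph licenseDetails endpoint. The access token and the SKU
# part numbers below are assumptions -- verify the SKUs in your own tenant.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<app-access-token-with-User.Read.All>"   # hypothetical placeholder
COPILOT_READY_SKUS = {"SPE_E3", "SPE_E5"}         # assumed Microsoft 365 E3/E5 part numbers

def has_qualifying_license(user_id: str) -> bool:
    """Return True if the user holds one of the assumed qualifying SKUs."""
    resp = requests.get(
        f"{GRAPH}/users/{user_id}/licenseDetails",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    skus = {d["skuPartNumber"] for d in resp.json().get("value", [])}
    return bool(skus & COPILOT_READY_SKUS)

if __name__ == "__main__":
    print(has_qualifying_license("adele.vance@contoso.com"))
```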
Enable Users
Once you have set up the proper license types and Azure Active Directory (now Microsoft Entra ID), you will need to select which users will be allowed to use Copilot within your organization. I would suggest asking these questions as you plan your user setup (a minimal license-assignment sketch follows at the end of this section):
- Which users should have access to Copilot?
  - Ensure you’ve implemented the proper app permission policies.
  - Review the Copilot setup guide (admin center access required).
- Based on the users’ roles, which Copilot should they have access to?
- Are all the users trained to use Copilot properly?
  - If not, I would highly recommend setting up training sessions for users based on their roles and the type of Copilot they will be using.
In summary, there are multiple Copilot options, which allows you to set up the one that is right for specific users. Further, I would suggest that users are not only trained (as indicated above) but also understand the purpose of Copilot and its intended use within your organization. By combining these elements, you can personalize the internal training and help increase user adoption.
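On the mechanical side of enabling users, licenses can be assigned in the Microsoft 365 admin center or through automation. Below is a minimal sketch, assuming an add-on Copilot SKU, that assigns the license to an approved user via the Microsoft Graph assignLicense action; the token and skuId are placeholders you would look up in your own tenant.

```python
# Minimal sketch: assign a Copilot (or any add-on) license to an approved user
# with the Microsoft Graph assignLicense action. The token and SKU id are
# placeholders -- look up the real skuId for your tenant via GET /subscribedSkus.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<app-access-token-with-User.ReadWrite.All>"  # hypothetical placeholder
COPILOT_SKU_ID = "<skuId-guid-from-/subscribedSkus>"  # hypothetical placeholder

def assign_copilot_license(user_id: str) -> None:
    """Add the assumed Copilot SKU to the user's license assignments."""
    resp = requests.post(
        f"{GRAPH}/users/{user_id}/assignLicense",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={
            "addLicenses": [{"skuId": COPILOT_SKU_ID, "disabledPlans": []}],
            "removeLicenses": [],
        },
        timeout=30,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    assign_copilot_license("adele.vance@contoso.com")
```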
Security, Privacy, and Governance
In my prior post “Security and Microsoft Copilot – What You Need to Know”, I shared three key concepts that you need to understand when you implement AI/Copilot. However, if you are in a technology role, there are additional areas that need to be considered.
So, what do I mean by this?
First, many see security, privacy, and governance as hindrances to new technology rollouts or adoption, while others see them as enablers of long-term success. Rethink security as more than just a means to restrict: by implementing the right policies, you and your organization can:
- Provide safe usage of the Copilot/AI applications.
- Protect internal data, IP, and other critical digital assets from being exposed publicly.
- Mitigate potential security risks (internal and external) that could be detrimental.
- Reduce future potential technical debt.
- Foster innovation and creativity.
Second, all security decisions should be rooted in a zero-trust mindset: provide access only when and where it’s needed, and verify every request rather than trusting anyone by default. This may seem to contradict my last point above, but one bad actor can cause irreparable harm.
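As one concrete expression of that mindset, the sketch below creates a report-only Conditional Access policy requiring MFA for a hypothetical “Copilot users” group through Microsoft Graph. The group, token, and policy scope are illustrative assumptions rather than a prescription for your controls; report-only mode lets you measure impact before enforcing.

```python
# Minimal sketch: a report-only Conditional Access policy that requires MFA for
# a (hypothetical) "Copilot users" group, created through Microsoft Graph.
# Token, group id, and policy scope are assumptions -- adapt to your own controls.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<app-access-token-with-Policy.ReadWrite.ConditionalAccess>"  # placeholder
COPILOT_USERS_GROUP_ID = "<entra-group-object-id>"                    # placeholder

policy = {
    "displayName": "Require MFA for Copilot users (report-only)",
    "state": "enabledForReportingButNotEnforced",  # evaluate impact before enforcing
    "conditions": {
        "clientAppTypes": ["all"],
        "applications": {"includeApplications": ["All"]},
        "users": {"includeGroups": [COPILOT_USERS_GROUP_ID]},
    },
    "grantControls": {"operator": "OR", "builtInControls": ["mfa"]},
}

resp = requests.post(
    f"{GRAPH}/identity/conditionalAccess/policies",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=policy,
    timeout=30,
)
resp.raise_for_status()
print("Created policy:", resp.json().get("id"))
```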
However, there is a caveat that you should be aware of when you are implementing security policies. Microsoft noted that there are occasions when your data could potentially leave service boundaries.
“When using Microsoft Copilot for Microsoft 365, your organization’s data might leave the Microsoft 365 service boundary under the following circumstances:
- When you allow Microsoft Copilot for Microsoft 365 chat experiences to reference public web content. The query sent to Bing might include your organization’s data. For more information, see Microsoft Copilot for Microsoft 365 and public web content.
- When you’re using plugins to help Microsoft Copilot for Microsoft 365 to provide more relevant information. Check the privacy statement and terms of use of the plugin to determine how it will handle your organization’s data. For information, see Extensibility of Microsoft Copilot for Microsoft 365.”
After digging a bit further, I found that you can control whether Copilot is allowed to reference public web content. However, the tradeoff is that the AI grounding may not provide the best outcomes, since Copilot will not be able to draw on public data for semantics, sentiment, and phrasing. This could result in a longer “learning curve” if Copilot is grounded solely in internal data.
My suggestion, considering all the above information, is to:
- Turn on Copilot with public web access for non-critical data locations to test the pros/cons.
- Reevaluate the degree of risk your company can take – Risk Appetite vs. Risk Tolerance.
Extended Copilot
Yes, you can be very creative working within the boundaries outlined above. However, many companies need the option to extend Copilot to connect to custom-built apps and services or create custom connections to SaaS apps that are not provided out of the box.
According to Microsoft, “When you extend Copilot for Microsoft 365, you maximize the efficiency of your apps and data with AI, by:
- Enriching the data estate of your enterprise with industry-leading AI.
- Keeping your users in the flow of their work, start to finish.
- Inheriting world-class security, compliance, and privacy policies.”
In short, the intent of this section is not to tell you how to extend Copilot. Rather, it’s intended to help create awareness of certain things as you plan your extensibility. As such, here are a few things to consider:
- Referencing the above image, I would recommend putting a security layer after the “External data” steps. This means the data hits your security policies/tools (whichever you use) before it goes into the Microsoft Graph connector and/or the plugin (see the connector sketch after this list).
- Joining the “Microsoft 365 Developer Technology Adoption Program (TAP)” could:
  - Help you learn about upcoming features.
  - Allow you to provide technical feedback.
  - Connect you with other like-minded developers.
- Don’t extend Copilot just because you can. While extending a service or app might be easy for you to do or your “go-to” method, consider other alternatives before you decide to go down the extensibility path.
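To illustrate the security-layer point above, here is a minimal sketch, assuming a hypothetical helpdesk data source, that registers a Microsoft Graph connector (external connection), registers a minimal schema, and pushes one item, with a placeholder sanitize() hook standing in for whatever DLP/classification checks you run before data leaves your boundary. The connection id, names, and the hook are assumptions, not a production design.

```python
# Minimal sketch: registering a Microsoft Graph connector ("external connection"),
# registering a minimal schema, and pushing one item into it, with a placeholder
# "security layer" hook applied before any data leaves your boundary.
# Connection id/name, token, and sanitize() are illustrative assumptions.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<app-access-token-with-ExternalConnection.ReadWrite.OwnedBy>"  # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def sanitize(record: dict) -> dict:
    """Hypothetical security layer: apply your DLP/classification checks here
    before external data is handed to the Graph connector."""
    return record  # placeholder -- plug in your own policies/tools

# 1. Create the external connection (id and name are assumed for illustration).
requests.post(
    f"{GRAPH}/external/connections",
    headers=HEADERS,
    json={"id": "contosohelpdesk", "name": "Contoso Helpdesk",
          "description": "Sample helpdesk articles"},
    timeout=30,
).raise_for_status()

# 2. Register a minimal schema (required before items can be ingested; this call
#    completes asynchronously, so wait for provisioning before step 3).
requests.patch(
    f"{GRAPH}/external/connections/contosohelpdesk/schema",
    headers=HEADERS,
    json={"baseType": "microsoft.graph.externalItem",
          "properties": [{"name": "title", "type": "String",
                          "isSearchable": True, "isRetrievable": True}]},
    timeout=30,
).raise_for_status()

# 3. Push a single item once it has passed the security layer.
item = sanitize({"title": "Reset your VPN password", "text": "Step-by-step guide..."})
requests.put(
    f"{GRAPH}/external/connections/contosohelpdesk/items/kb-0001",
    headers=HEADERS,
    json={
        "acl": [{"type": "everyone", "value": "everyone", "accessType": "grant"}],
        "properties": {"title": item["title"]},
        "content": {"type": "text", "value": item["text"]},
    },
    timeout=30,
).raise_for_status()
```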
Closing Thoughts
While I covered only three areas in this post, these are critical considerations for every IT pro to dig into when it comes to Copilot, whether you are a business applications system administrator, a cybersecurity specialist/architect, or a professional developer.
Further, I would recommend collaborating closely with other business decision-makers to ensure the configurations and policies that are implemented align with the company’s goals and objectives, especially when it comes to understanding the degree of risk the company can tolerate.
Lastly, a couple of things:
- Keep yourself up-to-date through ongoing training and/or certifications to ensure your decisions aren’t based on old information.
- Join the MS Copilot User Group community to learn, share, and collaborate with many others as they navigate their Copilot journey.