Microsoft 365 Copilot is starting to feel more personal. Instead of answering like a blank-slate chatbot each time, it is being designed to draw on more context from your day-to-day work.
That is exciting, because it can make Copilot more helpful. But it also comes with a simple truth we see again and again. Copilot will only be as smart and safe as the data foundations in your Microsoft 365 environment.
In this blog, I will walk through what “personalised Copilot” means, what the Microsoft Graph work graph has to do with it, and why information structure, permissions, and governance matter more than ever. I will also share practical steps we recommend so businesses can move forward with confidence.
From Generic AI to “My” AI
Copilot is moving from generic answers to answers that are more tailored to the person asking.
Microsoft has been introducing enhancements to Microsoft 365 Copilot that use signals from your Microsoft Graph work data to return more relevant, more contextual responses. Put simply, Copilot is getting better at understanding your projects, documents, and working relationships, instead of acting like it is starting fresh every time.
Microsoft is also adding settings that make it easier for users to see and manage what Copilot “remembers”, and to adjust personalisation preferences if something feels off.
So before we get into the “how”, let’s make sure we are clear on the “what”.
What Is the Microsoft Graph Work Graph?
Every day, your people create a “work graph” just by using Microsoft 365. It is not something they build on purpose. It is the natural trail of work that happens across email, meetings, files, and collaboration.
What the Work Graph Includes
The work graph can include signals from:
Emails, meetings, and calendar invites in Outlook
Files in OneDrive and SharePoint
Chats, channels, and meetings in Teams
People, roles, and relationships across your business
Copilot already uses this work graph to answer questions and draft content, based on each user’s existing permissions. What is changing is the depth and continuity of that experience.
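To make the work graph idea more concrete, the sketch below reads a few of the same signal types through the Microsoft Graph API. This is a minimal illustration of where the signals live, not how Copilot itself is implemented. It assumes a delegated sign-in with permissions such as Mail.Read, Calendars.Read, People.Read, and Files.Read, and an access token obtained from your own auth flow.

```python
# Minimal sketch: reading a few "work graph" signals via the Microsoft Graph REST API.
# Assumes a delegated access token with Mail.Read, Calendars.Read, People.Read, and
# Files.Read consented. Copilot's own use of these signals is internal to Microsoft 365;
# this only illustrates where the signals live.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<access-token-from-your-auth-flow>"  # placeholder
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

def get(path, params=None):
    resp = requests.get(f"{GRAPH}{path}", headers=headers, params=params)
    resp.raise_for_status()
    return resp.json().get("value", [])

# Recent email subjects (Outlook)
for msg in get("/me/messages", {"$top": 5, "$select": "subject,receivedDateTime"}):
    print("Mail:", msg["subject"])

# Upcoming meetings (Outlook calendar)
for event in get("/me/events", {"$top": 5, "$select": "subject,start"}):
    print("Meeting:", event["subject"])

# Recently used files (OneDrive and SharePoint)
for item in get("/me/drive/recent")[:5]:
    print("File:", item.get("name"))

# People you work with most often (relevance-ranked by Microsoft Graph)
for person in get("/me/people", {"$top": 5, "$select": "displayName"}):
    print("Person:", person["displayName"])
```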
Microsoft has signalled that enhanced Copilot memory can learn patterns over time, which can make responses to follow-up prompts richer and more specific.
And here is the part many teams miss. If your information structure is messy, your work graph is messy too. When that happens, the AI experience can quickly feel noisy, confusing, or even risky.
So what is actually changing for users?
What’s New in Copilot Personalisation?
Microsoft’s recent and upcoming updates point to three key shifts for end users. Each one sounds simple on its own. Together, they change how Copilot fits into daily work.
1) Deeper Context From Your Work
Copilot can use more Microsoft Graph signals to tailor responses, such as which documents you have been working on recently and who you collaborate with most often.
That means prompts like “draft an update for our ESG project” are more likely to reference the right files and the right people, rather than pulling in random content from across your Microsoft 365 environment.
2) Enhanced Copilot Memory and User Controls
Microsoft is introducing “enhanced memory” so Copilot can remember things like ongoing topics, preferences, and tasks, not just what is in the current chat window.
Microsoft is also adding updated settings that let users view and manage what Copilot remembers, and adjust how personal or general their Copilot experience should be.
This is where many business leaders start asking the right questions. If Copilot can remember more, what should it remember? And how do we keep that helpful without crossing a line?
I will come back to that in the governance section, but first there is a third change that matters for accuracy.
3) Richer Ways to Bring in Context
Microsoft has been expanding the ways users can feed context into Copilot.
For example, users are gaining the ability to upload additional file types such as .eml and .msg into Copilot Chat and the Copilot app, so email threads can be used as a reference without manual copying.
Copilot Chat is also gaining “content source” controls, allowing users to scope answers to chosen sources like particular sites or files. This is a big deal, because it helps keep answers grounded in the right place.
What This Shift Really Means
Taken together, these changes move Copilot from “ask a question, get an answer” to something closer to an ongoing working relationship with each person.
That is exciting. It also means your data quality and governance no longer sit in the background. They directly shape the AI experience your people get every day.
So before businesses race ahead with personalisation, it is worth getting the foundations right.
Why Your Data Foundations Matter More Than Ever
When Copilot becomes more personal, poor data hygiene does not just frustrate IT teams. It frustrates everyone.
Here are some of the most common issues we see in Microsoft 365 environments:
Teams and SharePoint sites created for one-off projects, then never cleaned up
Files with vague names (like “Final_v5_REAL_FINAL.docx”) stored in random locations
Sensitive content sitting in spaces with overly broad permissions
No clear pattern for how departments structure and store information
With enhanced memory and deeper use of Microsoft Graph signals, these problems can show up in very practical ways:
Irrelevant suggestions, because Copilot can see too many similar documents
Confusing answers, because Copilot blends old work with current work
Real risk, where Copilot draws from content that should have been locked down
On the other hand, businesses that invest in a clean information structure and clear access controls often see faster adoption and higher trust in Copilot.
In plain terms, the less noise in your environment, the more likely it is that Copilot will feel like a helpful colleague rather than a random content generator.
And that naturally leads into the next topic. If personalisation is going to increase, governance needs to keep pace.
Governance: Making Personalisation Safe
As Copilot becomes more personal, strong governance and security are not optional. They are part of using Copilot responsibly.
Microsoft has been expanding controls across the Microsoft 365 stack, including capabilities tied to sensitivity labels, access controls, and protections across apps. Microsoft has also talked about AI watermarking options for some media types.
The key point is this. If Copilot is going to work across your content and collaboration, your rules for content and collaboration need to be clear.
Key Governance Questions to Answer
Here are the questions I recommend business and IT leaders work through early:
What should Copilot be allowed to remember?
Decide which types of “memory” are acceptable for your industry and your risk appetite, and where boundaries are needed.
Who controls personalisation settings?
Agree how much control sits with users versus central policy. This is especially important in regulated sectors.
Are permissions truly “need to know”?
Review SharePoint, Teams, and OneDrive permissions and align them to reality before Copilot starts surfacing summaries and recommendations across them.
How will you manage AI-generated content?
Consider policies for labelling and protecting AI-generated documents, presentations, and recordings, especially where sensitive data may be involved.
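As one concrete illustration of the labelling question, the sketch below applies an existing sensitivity label to a file through Microsoft Graph. It is a minimal sketch only, and not a recommendation of a specific automation design: it assumes an app registration with Files.ReadWrite.All, access to Graph's metered assignSensitivityLabel action (which has extra licensing and billing requirements), and placeholder drive, item, and label IDs.

```python
# Minimal sketch: applying an existing sensitivity label to a document via Microsoft Graph.
# Assumes an app registration with Files.ReadWrite.All and access to the metered
# assignSensitivityLabel action; all IDs below are placeholders.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<app-only-access-token>"       # placeholder
DRIVE_ID = "<drive-id>"                        # placeholder: the document library
ITEM_ID = "<item-id>"                          # placeholder: the AI-generated document
LABEL_ID = "<sensitivity-label-guid>"          # placeholder: e.g. your "Internal" label

resp = requests.post(
    f"{GRAPH}/drives/{DRIVE_ID}/items/{ITEM_ID}/assignSensitivityLabel",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={
        "sensitivityLabelId": LABEL_ID,
        "assignmentMethod": "standard",
        "justificationText": "Label applied by governance automation",
    },
)
resp.raise_for_status()
# The call is accepted asynchronously; Graph returns a monitor URL in the Location header.
print("Label assignment accepted:", resp.headers.get("Location"))
```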
For many Australian businesses, these governance decisions also sit alongside local privacy, security, and compliance expectations. In some environments, frameworks like the Essential Eight will shape what “good” looks like.
The good news is you do not need to solve everything in one week. You just need a practical starting point and a clear plan. That is where the next section helps.
Practical Steps to Get Ready for Personalised Copilot
To make the most of personalisation, we usually recommend five practical steps. These steps are designed to improve the quality of Copilot results while reducing risk.
1) Run a Quick Information Architecture Health Check
Start by reviewing how key departments store and share content today, focusing on Teams, SharePoint, and OneDrive.
Look for “high-risk mess” areas where sensitive content is mixed with general content, or where old project sites are still active and still broadly accessible.
A simple goal here is to identify where Copilot could accidentally “see too much” and start responding in ways that cause confusion or concern.
2) Tidy Up Permissions Before You Turn on More AI
Use Microsoft tools and reports to find overly broad access, such as “Everyone except external users” on critical sites.
Then tighten access in priority areas first. In many businesses, that usually means:
Finance
HR
Board and executive content
Major client projects
You do not need to rebuild everything from scratch. But you do want to reduce the obvious risk areas before personalisation goes further.
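To show what "find overly broad access" can look like in practice, here is a rough sketch that walks document libraries with Microsoft Graph and flags sharing links scoped to the whole organisation or to anonymous users. It is an illustration only, assuming an app registration with the Sites.Read.All application permission, and it only inspects sharing links on items at the root of each site's default library, so treat it as a starting point rather than a full audit.

```python
# Rough sketch: flag sharing links scoped to "organization" or "anonymous" across SharePoint sites.
# Assumes an app-only token with Sites.Read.All; this only inspects items at the root of each
# site's default document library, so it is a starting point, not a complete permissions audit.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<app-only-access-token>"  # placeholder
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

def get(path, params=None):
    resp = requests.get(f"{GRAPH}{path}", headers=headers, params=params)
    resp.raise_for_status()
    return resp.json().get("value", [])

BROAD_SCOPES = {"organization", "anonymous"}

for site in get("/sites", {"search": "*"}):
    site_id = site["id"]
    try:
        items = get(f"/sites/{site_id}/drive/root/children")
    except requests.HTTPError:
        continue  # site may not have a default document library
    for item in items:
        for perm in get(f"/sites/{site_id}/drive/items/{item['id']}/permissions"):
            link = perm.get("link") or {}
            if link.get("scope") in BROAD_SCOPES:
                print(f"{site.get('displayName')}: '{item.get('name')}' "
                      f"has a {link['scope']}-scoped sharing link")
```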
3) Define Your Copilot Personalisation Policy
Work with business, IT, and risk stakeholders to define what “good” looks like for Copilot memory and context use.
Decide how much user control you are comfortable with, and where admin-level defaults should apply.
This is also a good moment to agree on simple internal guidance, like when staff should use content scoping and when they should avoid using sensitive content in prompts.
4) Create Simple Patterns for Sites and Teams
This step is about making the “good way” the easy way.
Standardise how you name Teams and SharePoint sites, and where different types of content live. Then provide templates for common needs, like:
Client projects
Internal initiatives
Department workspaces
When new spaces start in a good state, you spend less time fixing problems later. You also improve the work graph that Copilot depends on.
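One way to make the good way the easy way is to provision new spaces from a standard template with the naming pattern baked in. The sketch below creates a Team from Microsoft Graph's built-in "standard" template. It is a minimal illustration that assumes a delegated token with the Team.Create permission, and the PRJ- naming pattern is only an example of a convention you might agree internally.

```python
# Minimal sketch: create a new Team from Microsoft Graph's built-in "standard" template,
# applying an agreed naming convention up front. Assumes a delegated token with the
# Team.Create permission; the naming pattern below is an internal convention, not a
# Microsoft requirement.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<delegated-access-token>"  # placeholder

def create_project_team(client_name: str, project_name: str) -> str:
    """Create a client-project Team using a standard naming pattern, e.g. PRJ-Contoso-Website-Refresh."""
    display_name = f"PRJ-{client_name}-{project_name}".replace(" ", "-")
    body = {
        "template@odata.bind": f"{GRAPH}/teamsTemplates('standard')",
        "displayName": display_name,
        "description": f"Client project workspace for {client_name}: {project_name}",
    }
    resp = requests.post(
        f"{GRAPH}/teams",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json=body,
    )
    resp.raise_for_status()
    # Team creation is asynchronous; Graph returns 202 Accepted with a Location header to poll.
    return resp.headers.get("Location", "")

print(create_project_team("Contoso", "Website Refresh"))
```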
5) Invest in User Education, Not Just Licences
This is a big one. Copilot results improve when people know how to work with it.
Teach staff how to use content scoping in Copilot, such as “use only this site” or “use this file as context”, so answers stay grounded in the right source.
Then build short, role-based training that shows what personalisation looks like in real work. For example:
A salesperson preparing an account update
A project manager drafting a status report
A finance analyst summarising a month-end pack
When people can see the value in their own workflow, adoption becomes easier and safer at the same time.
These steps help ensure enhanced memory feels helpful, not uncomfortable, and that AI-powered outputs stay aligned with your governance standards.
So where does CG TECH fit into all this?
How We Help Australian Businesses
Many Australian businesses do not have the time or in-house capacity to line up information structure, security, and change management before switching on more Copilot capability.
That is where we can help.
At CG TECH, we focus on practical work that improves outcomes and reduces risk.
Typical Ways We Support Customers
Microsoft 365 and Copilot readiness assessment
We review your current environment, roadmap, and risk profile against Microsoft 365 and Copilot capability changes.
Information architecture and security uplift
We help clean up sites, set sensible structures, and apply the right mix of sensitivity labels and access controls.
AI governance design
We help you set clear policies for Copilot memory, content handling, and compliance expectations, in a way that works in real life.
Role-based Copilot enablement
We deliver targeted training and adoption support so people change how they work, not just watch a launch video.
Because we work across Microsoft 365, Azure, security, and data platforms, we can also connect your Copilot work to broader efforts like data platform modernisation and analytics, where that makes sense.
Bringing It All Together
Enhanced memory and deeper use of the Microsoft Graph work graph are clear signs of where Microsoft is heading: AI that understands your business context, not just generic content.
For business leaders, that means the value you get from Copilot is now closely tied to the quality, structure, and safety of your Microsoft 365 environment.
If you get the foundations right, including information structure, permissions, governance, and change, Copilot can start to feel like a digital colleague who understands your context and your priorities.
If you ignore those foundations, you can end up with confusion, low trust, and compliance headaches.
Now is a smart time to review your Microsoft 365 environment, set a clear approach to personalisation, and plan how your people will work alongside AI every day. If you want a clear roadmap and practical safeguards, we are ready to help.
About the Author
Carlos Garcia is the Founder and Managing Director of CG TECH, where he leads enterprise digital transformation projects across Australia.
With deep experience in business process automation, Microsoft 365, and AI-powered workplace solutions, Carlos has helped businesses in government, healthcare, and enterprise sectors streamline workflows and improve efficiency.
He holds Microsoft certifications in Power Platform and Azure and regularly shares practical guidance on Copilot readiness, data strategy, and AI adoption.