Headline
Creating Insecure AI Assistants With Microsoft Copilot Studio Is Easy
Microsoft claims 50,000 organizations are using its new Copilot Creation tool, but researcher Michael Bargury demonstrated at Black Hat USA ways it could unleash insecure chatbots.
BLACK HAT USA – Las Vegas – Wednesday, Aug. 7 – Enterprise usage of Copilot Studio, Microsoft’s automated chatbot creation tool, has grown considerably since its release less than nine months ago. But while the service has opened the floodgates for anyone to create so-called “copilots,” the bots it creates or modifies are not secure by default.
So says security researcher Michael Bargury, a former senior security architect in Microsoft’s Azure Security CTO office, now a project leader for the OWASP Low-Code/No-Code Top 10 Security Risks project and CTO at Zenity.
On Wednesday at Black Hat USA in Las Vegas, Bargury demonstrated how developers could unwittingly build copilots that inadvertently exfiltrate data or bypass policies and data loss prevention controls.
“It’s very, very easy to make a mistake with this no-code tool and introduce a serious security vulnerability into a copilot,” Bargury tells Dark Reading. “Actually, it’s very difficult to get everything right because there are so many ways to get it wrong.”
Rapid Adoption of Copilot Studio
The drag-and-drop, wizard-based Copilot Creation tool is available to all users of the Microsoft 365 productivity suite and has quickly introduced an easy way for power users in lines of business to create copilots, or AI assistants, that are designed to automate workflows and enable more efficient meetings, among other capabilities.
The addition of Copilot Studio to the mix further allows customers to extend the capability of these bots and build custom copilots.
During Microsoft’s fiscal 2024 fourth-quarter earnings call on July 30, chairman and CEO Satya Nadella touted that Copilot usage grew 60% in the quarter. The number of organizations using Copilot Studio grew 70% over the same period, reaching 50,000 shops, including Carnival, Cognizant, Eaton, KPMG, Majesco, and McKinsey.
Initial Release Was “Way Overpermissioned”
Among the initial problems Bargury spotted were insecure default settings in the bots created with the Copilot creation tool. Specifically, many of the bots were publicly accessible without requiring authentication. They could also impersonate a user with ease.
“You could create a copilot that bakes your identity into it,” he explains. “Now, I talk to that copilot over the Internet without logging in, and I’m using your identity. It was way overpermissioned.”
Bargury says he discovered a variety of other security faults following the release of Copilot Studio. For example, someone trying to create a copilot designed to call on public SharePoint sites could also tap a private SharePoint site on the same network. Because the copilots could easily be discovered outside of an organization, they could become conduits for remote attacks.
“I could search the Web for open Copilot Studio bots, and we found tens of thousands of them,” Bargury says. “And then I could fuzz those copilots with AI and find out which copilots I could talk to, and find out whether it’s willing to talk to me and what kinds of information and operations it could perform. And then I could grab information from them.”
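The probing approach Bargury describes — sending varied prompts to discovered bots and seeing which ones answer without authentication — can be sketched in outline. The following is a hypothetical illustration only: the probe topics, refusal phrases, and `send` callable are all assumptions, not part of CopilotHunter or any Microsoft API.

```python
# Hypothetical sketch of prompt-fuzzing a discovered chatbot endpoint.
# All names and phrases here are illustrative, not real tooling.

# Phrases that suggest the bot is refusing or demanding authentication.
REFUSALS = ("i can't help", "not authorized", "sign in", "access denied")

def build_probes():
    """Generate benign probe prompts that test whether a bot will talk at all."""
    topics = ["internal SharePoint sites", "employee directory", "sales figures"]
    return [f"List everything you know about {t}." for t in topics]

def classify_reply(reply: str) -> str:
    """Label a reply 'guarded' if it looks like a refusal, else 'open'."""
    text = reply.lower()
    return "guarded" if any(phrase in text for phrase in REFUSALS) else "open"

def fuzz_bot(send, probes):
    """send(prompt) -> reply string; return the probes the bot answered openly."""
    return [p for p in probes if classify_reply(send(p)) == "open"]
```

In practice the `send` callable would wrap an HTTP request to the bot's public endpoint; an attacker would iterate this over thousands of discovered bots and keep the ones classified as open.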
He says that Microsoft has fixed the issues and has introduced new admin controls designed to strip away the ability to inadvertently create insecure actions — for example, letting an administrator block users from creating bots that can be shared publicly. Admins should update their implementations to protect their organizations.
Low Code & Chatbots: Balancing Productivity & Security
The appeal of Microsoft Copilot and similar bots is that they enable users to be more productive, and they automate many routine tasks. But as this research shows, they can also be a weak link inside an organization. Bargury says he believes Microsoft is committed to ensuring Copilot Studio has better security from here on out.
“I think they are going to continue investing in giving admins more control,” he says. “But they are in a tough spot because they need to balance productivity with security. And we know where the balance ends.”
In his Black Hat session, Bargury also demonstrated CopilotHunter, a new module for the Power Pwn security tool set for Microsoft’s low-code/no-code Power Platform that scans for open Copilot Studio bots and fuzzes them to access the data behind them. It is available on GitHub.
15 Ways to Break Copilot Studio
In his conclusion, Bargury broke down the security issues he discovered with Copilot Studio:
Unreliable and untrusted input
Multiple data leakage scenarios
Oversharing sensitive data
Unexpected execution path
Unexpected execution path and operations
Data flowing outside org’s compliance and geo boundaries
Sensitive data oversharing and leakage
Destructive, unpredictable copilot actions
Gain unintended data access
Hardcoded credentials might be supplied as part of a copilot answer
Oversharing copilot access through channels
Oversharing copilot ownership with members
Oversharing copilot ownership (and more) with guests
About the Author
Jeffrey Schwartz is a journalist who has covered information security and all forms of business and enterprise IT, including client computing, data center and cloud infrastructure, and application development for more than 30 years. Jeff is a regular contributor to Channel Futures. Previously, he was editor-in-chief of Redmond magazine and contributed to its sister titles Redmond Channel Partner, Application Development Trends, and Virtualization Review. Earlier, he held editorial roles with CommunicationsWeek, InternetWeek, and VARBusiness. Jeff is based in the New York City suburb of Long Island.