Microsoft Invests Heavily in AI App Plug-ins

Microsoft intends to expand its ecosystem of AI-powered apps and services, dubbed “copilots,” with third-party developer plug-ins.

Microsoft announced today at its annual Build conference that it is adopting the same plug-in standard that its close collaborator, OpenAI, introduced for ChatGPT, its AI-powered chatbot — allowing developers to create plug-ins that work across ChatGPT, Bing Chat (on the web and in the Microsoft Edge sidebar), Dynamics 365 Copilot, Microsoft 365 Copilot, and the recently launched Windows Copilot.

“I think over the coming years, this will become an expectation for how all software works,” Microsoft’s CTO Kevin Scott stated in a blog post shared with TechCrunch last week.

Bold statements aside, the new plug-in framework enables interaction between Microsoft's family of "copilots" and a variety of other programs and services. Copilots are AI-powered apps that help users with tasks like email writing and image generation. Using IDEs like Visual Studio, Codespaces, and Visual Studio Code, developers can create plug-ins that retrieve real-time data, incorporate a company's or other business's data, and act on behalf of users.

The Microsoft 365 Copilot, for instance, might use a plug-in to ask a website like WolframAlpha to solve an equation, make travel plans in accordance with a company’s travel policy, or respond to inquiries about how certain legal concerns at a company were handled in the past.

In the coming weeks, users of the Microsoft 365 Copilot Early Access Program will be able to access additional plug-ins from partners like Atlassian, Adobe, ServiceNow, Thomson Reuters, Moveworks, and Mural. Meanwhile, Bing Chat will add plug-ins from Instacart, Kayak, Klarna, Redfin, and Zillow to its current collection, and the same Bing Chat plug-ins will be made available on Windows through Windows Copilot.

The OpenTable plug-in, for example, enables Bing Chat to search across restaurants for available bookings, whereas the Instacart plug-in enables the chatbot to take a dinner menu, convert it to a shopping list, and place an order to have the supplies delivered. Meanwhile, the new Bing plug-in integrates the Bing site and search data into ChatGPT, replete with citations.

A new framework: Scott defines plug-ins as a connection between an AI system, such as ChatGPT, and data that a third party wants to keep private or proprietary. A plug-in grants an AI system access to certain confidential files, allowing it to answer questions regarding business-specific data, for example.
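In the OpenAI plug-in standard that Microsoft is adopting, that bridge is concrete: a plug-in is described by a manifest (served from the third party's domain) that points the AI system at an OpenAPI spec for the third party's API, and the model calls that API rather than ingesting the private data itself. The sketch below is illustrative only; the manifest field names follow OpenAI's published format, while the example service ("Acme HR Policies") and its data are invented for illustration.

```python
import json

# Hypothetical plug-in manifest, in the shape of OpenAI's ai-plugin.json
# (normally served at https://<plugin-domain>/.well-known/ai-plugin.json).
# The descriptions tell the model when to invoke the plug-in; the "api"
# entry points at an OpenAPI spec describing the callable endpoints.
manifest = {
    "schema_version": "v1",
    "name_for_human": "Acme HR Policies",
    "name_for_model": "acme_hr_policies",
    "description_for_human": "Answers questions about Acme's internal HR policies.",
    "description_for_model": (
        "Look up Acme's confidential HR policy documents when the user "
        "asks about company-specific rules, e.g. travel or expenses."
    ),
    "auth": {"type": "none"},
    "api": {"type": "openapi", "url": "https://acme.example/openapi.yaml"},
}

# The "bridge" in practice: the model never sees this private corpus in its
# training data; it calls the plug-in's API, which answers from data the
# company keeps in-house.
PRIVATE_POLICIES = {
    "travel": "Economy class for flights under 6 hours; book via the approved portal.",
    "expenses": "Submit receipts within 30 days of purchase.",
}

def query_policies(topic: str) -> str:
    """Endpoint logic a copilot would invoke via the plug-in's OpenAPI spec."""
    return PRIVATE_POLICIES.get(topic.lower(), "No policy found for that topic.")

print(json.dumps(manifest["api"]))
print(query_policies("travel"))
```

Because the confidential data stays behind the API, the company controls access and auditing, which is precisely the property Scott describes.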

There is undoubtedly an increasing demand for such a bridge as privacy becomes a key concern with generative AI, which has a tendency to leak sensitive data from the datasets on which it was trained, such as phone numbers and email addresses. To reduce risk, businesses such as Apple and Samsung have prohibited employees from using ChatGPT and similar AI applications, citing fears that employees would mishandle and disclose secret data to the system.

Teams message extensions, which allow users to connect with a web service using Teams buttons and forms, are not new. Neither are Power Platform connectors, which operate as a shell around an API that lets the underlying service "talk" to apps in Microsoft's Power Platform portfolio (for example, Power Automate). However, Microsoft is broadening their reach by allowing developers to use new and existing message extensions and connectors to enhance Microsoft 365 Copilot, the company's assistant feature for Microsoft 365 apps and services such as Word, Excel, and PowerPoint.

For instance, Microsoft’s “Dataverse,” a service that stores and manages data used by internal business apps, can be populated with structured data using Power Platform connectors, which Microsoft 365 Copilot can then access. At Build, Microsoft demonstrated how the advertising company Dentsu used Microsoft 365 Copilot along with a Jira plug-in and information from Atlassian’s Confluence without writing new code.

Developers will be able to create and debug their own plug-ins in a variety of ways, according to Microsoft, including through its Azure AI family of apps, which is providing features to run and test plug-ins on private enterprise data. Azure OpenAI Service, Microsoft’s managed, enterprise-focused product that provides businesses with access to OpenAI technologies while also providing additional governance features, will also enable plug-ins. In addition, the Teams Toolkit for Visual Studio will receive support for piloting plug-ins.

Transitioning to a platform: According to Microsoft, developers will be able to configure, publish, and manage plug-ins through the Developer Portal for Teams, among other places. They will also be able to monetize them, though the business did not specify how pricing will work.

In any event, Microsoft is playing for keeps in the extremely competitive generative AI race with plug-ins. Plug-ins essentially turn the company's "copilots" into aggregators, putting them on track to become one-stop shops for both business and consumer customers.

Microsoft undoubtedly sees the lock-in prospect as becoming more and more important as the firm deals with competition from startups and tech giants alike who are developing generative AI, such as Google and Anthropic. As apps and services increasingly rely on generative AI, one might see plug-ins developing into a lucrative new revenue stream in the future. Additionally, it might ease the concerns of corporations who assert that generative AI developed using their data infringes on their legal rights; companies like Getty Images and Reddit have taken action to stop this from happening.

Competitors will likely release their own plug-in frameworks to counter Microsoft's and OpenAI's. But like OpenAI with ChatGPT, Microsoft benefits from being first to market.