Microsoft Build 2023, the company’s annual flagship event for developers, showcased AI-centric announcements across the Microsoft portfolio. Microsoft provided a wealth of information over two days, much of it focused on what it has done with its OpenAI investment.
Microsoft got a head start on its generative AI (GAI) initiatives at a much smaller February event it hosted in Redmond, with just a handful of analysts and media in attendance. Although the company’s integration of AI into Bing was exciting, I was somewhat skeptical about how GAI would meaningfully take hold, especially in the enterprise.
In the months since then, Microsoft has expanded its use of AI across its apps and services, including Microsoft 365 (which I wrote about here), Bing Chat integrated into Windows 11, Bing, Edge and more. Moving forward, I don’t imagine there will be any part of Microsoft without some element of AI. Given the massive investments the company has made in OpenAI, it makes sense that it would continue to go all in on AI integrations.
AI was certainly front and center—and placed squarely where people do their work—at Microsoft Build, with integrations embedded at the point of need rather than merely being scattered across disparate apps. In a nutshell, GAI incorporated across Microsoft’s platform now acts as a centralized assistant that empowers users to collaborate and complete tasks regardless of which application they’re using. With this approach, Microsoft is working to solidify its AI first-mover advantage, meaningfully taking its story from buzzwords to business value.
This article outlines some of the highlights from Microsoft Build. I’ll also examine how the company’s latest AI developments fit into two general themes from Build 2023: plugins and copilots.
Plugging into the developer community
Microsoft announced it would adopt the same open plugin standard that OpenAI introduced for ChatGPT, expanding the AI plugin ecosystem so that plugins built for ChatGPT also work across Microsoft’s services. This means that developers can now use one platform to build plugins that work across consumer and business touchpoints, including ChatGPT, Bing, Dynamics 365 Copilot and Microsoft 365 Copilot.
Any plugins for AI applications built on the Azure OpenAI Service will be interoperable with this same plugin standard. This ups the ante for developers to create experiences that enable people to interact with apps using text and language prompts the same way they would use a chatbot. This is another example of meeting people where they are and providing the tools they need to drive better outcomes.
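For a sense of what this shared standard looks like in practice: a ChatGPT-style plugin is described by a small manifest file (conventionally `ai-plugin.json`) that points the model at an OpenAPI description of the plugin’s API. The field names below follow OpenAI’s published manifest format; the to-do-list plugin itself and its URLs are a hypothetical illustration.

```json
{
  "schema_version": "v1",
  "name_for_human": "TODO Plugin",
  "name_for_model": "todo",
  "description_for_human": "Manage your to-do list.",
  "description_for_model": "Plugin for managing a user's to-do list; can add, remove and view items.",
  "auth": { "type": "none" },
  "api": {
    "type": "openapi",
    "url": "https://example.com/openapi.yaml"
  },
  "logo_url": "https://example.com/logo.png",
  "contact_email": "support@example.com",
  "legal_info_url": "https://example.com/legal"
}
```

Because the manifest simply describes an existing REST API, the same plugin definition can, in principle, be surfaced in ChatGPT, Bing Chat or a Copilot without per-host rework.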
Microsoft also announced support for new plugins for Bing Chat. These add-ons interact with a wide range of platforms including Atlassian, Adobe, Instacart, Zillow, Klarna and many others, along with the already announced OpenTable and WolframAlpha. Microsoft expects thousands of plugins by the time Copilot is generally available. The vast user experience improvement of “interacting” with an app in this way has yet to be realized. Still, once the models are better trained—particularly with tenant data—having a chatbot in OpenTable will be like having a concierge that can make recommendations and reservations. Similarly, an Atlassian chatbot could become a scrum master (or scrum copilot) capable of organizing dev teams’ workflows.
Building the groundwork for enterprise generative AI with Copilots
As I mentioned, one central theme of Microsoft’s announcement was copilots. At the event, Microsoft showcased updated features for copilots that cater to a wide range of users. These include Dynamics 365 Copilot, Microsoft 365 Copilot and Copilot for Power Platform.
Microsoft’s approach to each copilot builds on the belief that AI’s current place in the workforce is to complement people in their roles rather than replace them. Integrating copilots directly into users’ workflows makes information readily accessible in the context of use rather than requiring the user to toggle between tools to accomplish an AI-assisted task.
Microsoft’s plan for copilots in workflows is nicely illustrated (pun intended) with the DALL·E-powered Bing Image Creator now functioning within Bing Chat. The company has opened a full public preview of the platform so that anyone with a Microsoft account can create images using a text prompt.
Microsoft also announced the expansion of the new AI-powered Bing to the Windows 11 taskbar, mobile and Skype.
Windows Copilot: It goes to (Windows) 11
Throughout the past year, Windows has experienced remarkable growth, primarily driven by the widespread adoption of Windows 11. Particularly noteworthy in fueling this growth has been developer engagement. Microsoft reported a notable 24% year-over-year increase in the usage of devices dedicated to development purposes.
Building on the integration into Windows 11 back in February that brought the new AI-powered Bing to the taskbar, Windows Copilot now makes Windows the first PC platform to centralize AI assistance. Using Bing Chat and first-party and third-party plugins, users can concentrate on realizing ideas, completing projects and collaborating effectively rather than expending energy searching for, launching and working with multiple applications.
Clicking Copilot in the Windows 11 taskbar opens the Copilot sidebar, which can help with tasks such as summarizing and explaining content. It provides a productivity boost and offers rudimentary (at least for now) IT support, since users can ask it to adjust their computer’s settings. Microsoft will start testing Windows Copilot for Windows 11 publicly in June before a wider rollout.
Dev Home makes Windows dev machines easier to use
In one of many announcements catering to the development community, Microsoft showed that it is making it easier for developers to set up and use Windows dev machines. Dev Home is designed to allow developers to get a quick overview of their projects through GitHub widgets that surface GitHub issues and pull requests. Microsoft said it would eventually add the Xbox GDK to Dev Home to expand functionality to game developers.
The whole thing is essentially self-contained for developers, so that hopefully, spinning up a dev environment on a personal system will be much less clunky and less likely to foul up the system. A new Dev Home section of Windows 11 is now available in preview.
Microsoft also announced that Windows Terminal (a developer tool that enables multiple command-line apps or shells to run side-by-side in a customizable environment) would have an AI-powered chatbot. Through an integration with GitHub, developers who use GitHub Copilot can now use the chatbot directly within Windows Terminal to receive code recommendations and explanations for errors as well as to perform other actions. Microsoft says it’s also exploring integrating GitHub Copilot with other developer tools.
Consumer announcements from Microsoft Build 2023
Microsoft made several consumer announcements not specific to developers at Build, including naming Bing as ChatGPT’s default search engine. Paid ChatGPT Plus users will now see citations for the chatbot’s responses when they are surfaced by Bing. This is not surprising, given Microsoft’s multi-billion-dollar investment in OpenAI. With all the talk and media hysteria about chatbots “hallucinating,” Bing citations will help users discern real information and increase their confidence in ChatGPT’s results.
Microsoft is also bringing 365 Copilot to its Edge web browser. 365 Copilot will live within the browser’s sidebar, where it can pull content from the web into projects in Microsoft 365 apps. Again, this allows for less toggling and more focused work, something any Microsoft 365 user should appreciate.
Developments for the cloud
Microsoft also announced the implementation of its Hybrid Loop (initially introduced at last year’s Microsoft Build), designed to enhance AI development across different platforms. Hybrid Loop uses ONNX Runtime as a gateway to Windows AI, along with Olive, Microsoft’s toolchain that makes it easier to optimize models for different devices. With ONNX Runtime, third-party developers can use the same tools Microsoft uses to run AI models on Windows or other devices, whether on CPU, GPU or NPU, or in hybrid fashion with Azure. The goal is to support AI development from Azure down to client devices, enabling hybrid AI that builds for both ends of the spectrum. Hybrid inferencing refers to using local resources when possible, with the ability to switch to the cloud when needed.
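The local-first, cloud-fallback idea behind hybrid inferencing can be sketched in a few lines. This is my own minimal illustration of the pattern, not Microsoft’s implementation; the helpers `run_local` and `run_cloud` are hypothetical stand-ins for, say, an on-device ONNX Runtime session and an Azure-hosted endpoint.

```python
class LocalCapacityError(Exception):
    """Raised when the device cannot serve the request (no NPU, model too large, etc.)."""


def run_local(prompt: str) -> str:
    # Placeholder for on-device inference: pretend the device
    # can only handle short prompts within its compute budget.
    if len(prompt) > 32:
        raise LocalCapacityError("prompt exceeds on-device budget")
    return f"[local] {prompt}"


def run_cloud(prompt: str) -> str:
    # Placeholder for a call to a cloud-hosted model.
    return f"[cloud] {prompt}"


def hybrid_infer(prompt: str) -> str:
    """Prefer on-device inference; fall back to the cloud when needed."""
    try:
        return run_local(prompt)
    except LocalCapacityError:
        return run_cloud(prompt)


print(hybrid_infer("hello"))  # short prompt, served on-device
print(hybrid_infer("a much longer prompt that spills over to the cloud"))
```

The appeal of the pattern is that the calling code never needs to know where the model actually ran, which is exactly the developer experience Hybrid Loop is aiming for.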
Although I expected to see more on-device AI integrations from Microsoft, its partnership with Qualcomm to deploy the Qualcomm AI Engine for efficient machine learning at the edge highlights the increasing adoption of hybrid AI, which distributes inference between the cloud and the edge. This shift is driven by the demand for security, low latency and high performance, as well as the desire to run AI at the edge, where data is collected. Microsoft has also partnered with AMD, Intel and Nvidia for new silicon support.
Microsoft showcased its Azure AI Content Safety service that facilitates the establishment of secure online environments. Leveraging AI models, it identifies and categorizes offensive, violent, sexual and self-harm content in images and text, assigning severity scores to aid businesses in restricting content and prioritizing moderation. Azure AI Content Safety can comprehend nuance and context, minimizing false positives and alleviating the burden on content moderation teams. This is especially important as regulators try to figure out what constitutes “responsible” use of AI and how to regulate it via policy.
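To show how per-category severity scores translate into moderation decisions, here is a rough sketch. The category names and 0–6 scale are modeled on Azure AI Content Safety’s documented output, but the thresholds and three-way decision logic are my own illustration, not the service’s API.

```python
# Illustrative thresholds only: a business would tune these per category
# to match its own moderation policy.
THRESHOLDS = {"Hate": 2, "SelfHarm": 2, "Sexual": 4, "Violence": 4}


def moderation_decision(scores: dict) -> str:
    """Return 'block', 'review' or 'allow' from per-category severity scores."""
    # Well over a category's limit: reject automatically.
    if any(scores.get(cat, 0) >= limit + 2 for cat, limit in THRESHOLDS.items()):
        return "block"
    # At or near a limit: route to a human moderator.
    if any(scores.get(cat, 0) >= limit for cat, limit in THRESHOLDS.items()):
        return "review"
    return "allow"


print(moderation_decision({"Hate": 0, "Violence": 0}))  # allow
print(moderation_decision({"Sexual": 4}))               # review
print(moderation_decision({"Violence": 6}))             # block
```

The “review” tier is the interesting part: severity scores let borderline content go to humans while clear-cut cases are handled automatically, which is how such a service alleviates the burden on moderation teams.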
Microsoft Fabric to unify analytics stacks
Microsoft introduced Microsoft Fabric, an end-to-end unified analytics solution. Fabric is designed to help enterprises eliminate data silos and duplication and reduce the time it takes to turn raw data into business intelligence. A unified solution consolidating the necessary data provisioning, transformation, modeling and analysis services into one UI is a smart move for Microsoft, one that will help enterprises extract more value from their data while laying a foundation for the AI era.
Microsoft will continue to offer enterprise-grade PaaS solutions for data analytics. More than just repackaging existing tools, Fabric’s value proposition represents an evolution of those offerings in the form of a simplified SaaS solution (Fabric) that can connect to existing PaaS offerings such as Azure Synapse Analytics and Azure Data Factory. At the core of the new platform is Microsoft’s OneLake data lake. However, the platform can integrate data from Amazon S3 and will soon support data from Google Cloud as well. I think enterprises will appreciate streamlining their data infrastructure without being forced to rely exclusively on one cloud vendor.
The GAI race started with search, but that was just the tip of the iceberg. Once a mild skeptic, I’m now convinced AI is going to change nearly everything, particularly workflows. With these latest announcements, Microsoft is focusing on targeted use cases and UX refinement for developers and consumers. Microsoft did an excellent job showing the potential to unlock new opportunities that all these evolving technologies bring. The sheer number of use cases for developers addressed by Microsoft’s services, devices and applications made this year’s event one of the most exciting Microsoft Builds I’ve seen.
Note: This analysis contains significant contributions from Melody Brue, Modern Work Vice President and Principal Analyst.