Would an OpenAI work assistant put the company in conflict with Microsoft?

 

By Mark Sullivan

On Monday, The Information reported that OpenAI plans to market a “personalized assistant for work,” a move that would seemingly put the buzzy startup in competition with its chief backer, Microsoft. Such an assistant would compete directly with the AI “copilots” Microsoft is integrating into its enterprise products as fast as it can.

Microsoft invested $10 billion in OpenAI in January and owns an estimated 49% of the company. It has also based much of its enterprise AI offering on OpenAI’s research. But there may be room in the world for both an OpenAI workbot and Microsoft’s copilots.

The Information cited unnamed sources who said OpenAI CEO Sam Altman privately told investors of a plan to turn ChatGPT into a “supersmart personal assistant for work.” That assistant might do things like draft documents or summarize meetings. The trick is that the assistant would also know a lot about its user (the way they write, for example) as well as the work they write about (proprietary corporate data or plans, perhaps).

And there’s the rub. An AI assistant is only as good as the information it can tap into. 

ChatGPT is trained only on mountains of data scraped from the public internet. Microsoft’s “copilots” combine that generalized knowledge (and great language skills) with nonpublic workplace data. That’s why the partnership between a leader in business software and cloud services and a leader in bleeding-edge generative AI makes sense. 
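That combination is, at bottom, a prompting pattern: fetch the nonpublic data, then pass it to the general-purpose model along with the user’s question. Here is a minimal sketch of the idea using OpenAI’s Python SDK; the model name, the meeting notes, and the question are invented placeholders, not anything Microsoft or OpenAI has published.

```python
# Sketch: "grounding" a general-purpose LLM with nonpublic workplace data.
# The meeting notes and question below are invented placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Private data that never appeared in any public training set.
meeting_notes = """Q3 sales were down 8% in EMEA.
Decision: shift the launch of Product X to January."""

response = client.chat.completions.create(
    model="gpt-4",  # any capable chat model would do
    messages=[
        # The private context rides along in the prompt; the model itself is unchanged.
        {"role": "system", "content": f"Answer using these meeting notes:\n{meeting_notes}"},
        {"role": "user", "content": "Summarize what was decided about Product X."},
    ],
)

print(response.choices[0].message.content)
```

Whoever hosts that call, and whoever can read the prompt, is the party the company has to trust with its notes.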

Let’s say a company that uses the Microsoft cloud and productivity apps holds a meeting in which sales data is discussed and product strategy decisions are made. That company may be comfortable capturing that data in the Microsoft cloud and even making it available to a large language model hosted there, trusting that the data never leaves Microsoft’s infrastructure. It might feel far less comfortable sharing the data with an LLM hosted outside those walls, on servers controlled by OpenAI, even if those servers ultimately run on Azure.

There’s no doubt that OpenAI is looking for ways into the lucrative enterprise market. Lots of people already use ChatGPT at work. The company already sells access to its GPT language models via an API, and it’s building out the safety and security side of its enterprise offering. Companies that aren’t invested in Microsoft’s cloud and productivity suite might be more willing, for example, to train a ChatGPT assistant on some of their proprietary data, and even to let that assistant be hosted on OpenAI’s servers (provided certain security requirements are met).
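As for what “training an assistant on proprietary data” could look like in practice, the fine-tuning endpoints OpenAI already exposes through that API are one route. Here’s a rough sketch; the JSONL file of example conversations is hypothetical and stands in for the corporate data.

```python
# Sketch: fine-tuning an OpenAI model on a company's own examples via the API.
# "support_transcripts.jsonl" is a hypothetical file of example conversations.
from openai import OpenAI

client = OpenAI()

# Upload the proprietary training examples (JSONL, one chat example per line).
training_file = client.files.create(
    file=open("support_transcripts.jsonl", "rb"),
    purpose="fine-tune",
)

# Start a fine-tuning job; the resulting model is private to the account.
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-3.5-turbo",
)

print(job.id, job.status)
```

Where that file and the resulting model live is exactly the security question a would-be enterprise customer has to answer.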

 

Time will tell how high on OpenAI’s already-long priority list a ChatGPT work assistant really is. It’s doubtful that OpenAI’s plans for such a product come as any surprise to Microsoft. It’s common knowledge that LLM chatbots will mainly be used either for work or for consumer applications, and at least right now there’s more money in selling to enterprises. So the possibility of an OpenAI-branded work assistant was probably discussed before Microsoft spent $10 billion on the partnership in January.

Fast Company