This article is part of our series that explores the business of artificial intelligence.
Since the launch of GPT-3, OpenAI has been carefully crafting a business model centered around large language models (LLMs). The release of ChatGPT offered a glimpse of the potential mass market for these models.
ChatGPT’s unprecedented growth rate made it the fastest-growing application in history, igniting a surge of interest in LLMs and inspiring a wave of competitors, including numerous open-source alternatives.
To fortify its position in this burgeoning market, OpenAI has employed a variety of strategies. These include the introduction of innovative features and offering free access to less advanced models, thereby broadening its user base. At its recent DevDay conference, OpenAI revealed its latest strategy, which hinges on leveraging network effects to dominate the LLM market. This approach, whether beneficial or detrimental, is poised to significantly shape the trajectory of the field in the coming months and years.
Network effects are phenomena where a product or service’s value to its users escalates with each additional user. Consider a messaging app: its utility to you is contingent on your friends and family also using it. The more contacts you have on the platform, the more you’re likely to use it. This is a direct network effect.
However, some network effects are more complex, involving multiple types of users. Take online shopping platforms as an example. A merchant is inclined to sell their goods where potential customers are. Accordingly, customers gravitate towards platforms offering the goods they need. This interdependence makes it challenging to establish a new online shopping platform, especially with giants like Amazon dominating the market. This is an example of a two-sided network effect.
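The intuition behind both kinds of network effect can be made concrete with a back-of-the-envelope model. The article doesn't cite it, but Metcalfe's law is the classic illustration: a direct network's value grows with the number of possible connections between users, while a two-sided platform's value grows with the number of possible pairings between the two groups. The sketch below is purely illustrative, not a claim about how OpenAI or Amazon measure value.

```python
# Illustrative only: Metcalfe's law models a direct network's value as
# proportional to the number of possible user pairs, n * (n - 1) / 2.
# A simple two-sided analogue multiplies the sizes of the two groups.

def direct_network_value(users: int) -> int:
    """Possible pairwise connections among users (Metcalfe's law)."""
    return users * (users - 1) // 2

def two_sided_network_value(merchants: int, customers: int) -> int:
    """Possible merchant-customer pairings on a two-sided platform."""
    return merchants * customers

# Doubling the user base roughly quadruples the value, which is why
# early, aggressive user acquisition pays off disproportionately:
print(direct_network_value(100))   # 4950 possible connections
print(direct_network_value(200))   # 19900 (about 4x, not 2x)
print(two_sided_network_value(50, 1000))  # 50000 possible pairings
```

The superlinear growth in the direct case is the key point: each new user makes the network more valuable to every existing user, which is what makes an entrenched network so hard to displace.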
Creating the right network effects requires speed and market capture before other players can establish a foothold. This is why companies often adopt aggressive growth strategies, even at the cost of significant financial losses for several years.
The goal is to establish their network effects swiftly and solidly. Once these network effects are in place, it becomes exceedingly difficult for competitors to erode their market share.
This strategy, while risky, can lead to a dominant position in the market, making the initial losses a worthwhile investment. Network effects are among the most effective “moats,” which protect companies’ market share against competitors.
Network effects for ChatGPT
In their basic form, large language models like GPT-4 don’t inherently possess network effects. Each user’s interaction with the model is independent, devoid of any influence from other users. This lack of network effects makes it easy for users to switch to another LLM offering similar features and accuracy or better pricing. Consequently, LLMs risk becoming a commodity, with competition hinging on factors like pricing and availability.
OpenAI initially held an edge over competitors due to the superior performance of its models and its substantial financial resources, largely courtesy of Microsoft’s funding. This allowed OpenAI to spend huge sums on improving its models and offer free access to some of its models, attracting a broad user base.
However, without a moat, OpenAI risked losing its market to other competitors. OpenAI planted the seeds of network effects with the introduction of ChatGPT plugins in March. This move enabled developers to integrate ChatGPT into their apps, expanding the model’s use cases.
Now, there are two key players: users and developers. As users discover more applications for LLMs, they’ll seek the service providers that offer access to more tools. Simultaneously, developers creating plugins or LLM tools will gravitate towards the services with more users. This dynamic lays the groundwork for network effects akin to those seen in the Apple App Store and Google Play Store.
But creating network effects is not a one-time effort. OpenAI must continue to enhance its models and introduce new features to attract more users and developers. To complete the network effect loop, OpenAI unveiled two significant features at the DevDay conference: the GPT Store and a revenue-sharing scheme.
The GPT Store provides developers with a platform to offer their LLM apps to ChatGPT users. These could be app integrations, assistants, or other innovative applications that we’re sure to see in the coming weeks and months. The GPT Store is a strategic move, mirroring the familiar pattern of the Apple App Store and Google Play Store.
The revenue-sharing scheme offers further incentives for developers to remain on OpenAI’s platform. However, its success hinges on users’ willingness to pay for GPT assistants. With OpenAI making it easy for users to create their own GPTs, it will be intriguing to see how much people are willing to pay for GPTs versus creating their own.
OpenAI might also consider limited-duration incentive programs, such as paying developers to create GPT agents on the platform, to generate momentum. Once the platform reaches a critical mass and establishes a full ecosystem, it will organically attract more users and developers, much like the App Store. It can also trigger data network effects, where OpenAI gets access to exclusive high-quality user interaction data that enables it to train superior models. This strategy, if successful, could solidify OpenAI’s position in the LLM market.
What happens next
The ultimate form of the AI assistant experience remains uncertain. Will it exist as a browser-based assistant, seamlessly integrating with all your apps, from email and shopping to health? Or will it manifest as an AI agent embedded within every app interface? Could it take an entirely different form, like Humane’s wearable AI Pin, or perhaps a specialized phone, such as the one OpenAI is reportedly developing in collaboration with former Apple chief designer Jony Ive? While the answers to these questions are yet to unfold, one thing is clear: OpenAI, backed by Microsoft, is strategically positioning itself to exert control over the entire AI assistant stack.