As large language models become more widespread, who captures most of the value from these products? A brief look at some possibilities.

Incumbents that can layer generative AI into products with existing distribution

The most obvious winners are incumbents that can leverage their existing distribution to enhance their products: Notion's new Notion AI, Microsoft Word's writing assistant supercharged by AI suggestions, AI-driven features in Google Search. The incumbents are not only aware of the advances in AI, but are driving them, and they can afford the most R&D spend.

Hardware/cloud providers

The first layer of "selling shovels": model training and inference require significant amounts of specialized hardware, from large machine clusters to cutting-edge GPUs.

API providers

OpenAI, Stability AI, Midjourney, and the many inference-as-an-API companies springing up are well positioned to capture value. Usage is already skyrocketing, and usage-based APIs are a well-understood business model.

New platforms

Just as developer-focused companies built valuable platforms on top of AWS, there's value in the user-experience layer here: platform companies can combine raw inference APIs into more useful building blocks.
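As a rough illustration of what "combining raw inference APIs into building blocks" can mean in practice, here is a minimal sketch. The endpoint, field names, and `summarize` helper are all hypothetical, not any real provider's API; the point is that the platform layer hides prompt construction and transport details behind a task-shaped function.

```python
# Hypothetical example: wrapping a raw text-completion API into a
# higher-level "summarize" building block. The URL and JSON fields
# below are illustrative assumptions, not a real provider's API.
import json
import urllib.request


def raw_completion(prompt: str,
                   api_url: str = "https://api.example.com/v1/complete") -> str:
    """Call a (hypothetical) raw inference API and return its text output."""
    payload = json.dumps({"prompt": prompt, "max_tokens": 256}).encode()
    req = urllib.request.Request(
        api_url, data=payload, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["text"]


def summarize(document: str, completion_fn=raw_completion) -> str:
    """A platform-style building block: the user supplies a document,
    and prompt engineering is handled behind the scenes."""
    prompt = ("Summarize the following document in three sentences:\n\n"
              + document)
    return completion_fn(prompt)
```

The design choice is the interesting part: end users call `summarize(text)` rather than writing prompts against a raw completion endpoint, which is where a platform's user-experience value lives.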

Vertical solutions

While generative AI can address a broad range of problems, many industries will need solutions tailored to their specific workflows.