AI Pricing: Operating Costs will play a big role in pricing AI functionality

Steven Forth is a Managing Partner at Ibbaka. See his Skill Profile on Ibbaka Talio.

SaaS companies are used to having very high operating margins. Anything under 80% is frowned on and operating margins of more than 90% are common. This conditions many of the choices made by SaaS leaders and investors.

  • Prioritize revenue growth

  • Treat ARR (Annual Recurring Revenue) as the key metric

  • Base valuations on ARR multiples

More recently, a couple of things have been added to this.

  • Optimize Net Revenue Retention

  • Operate profitably (but continue to invest for top-line growth)

The changes of the past two years are forcing us to reconsider these priorities. Two things happened.

  1. Capital became more expensive and harder to come by

  2. AI and particularly generative AI emerged into the mainstream

Interest rates are moderating, and with them the fear of inflation, but we will not get back to the extremely low interest rates we saw after the 2008 global financial crisis until global population decline really begins to bite, and that is a couple of decades away. Investors have other options, and the surge of money into VC funds is subsiding.

VC investing is cyclical and this trend will no doubt right itself over the next decade. Innovation continues to drive the economy and innovation needs risk capital. There is no obvious substitute for the capital VCs provide.


The same cannot be said for the dominance of AI. Virtually all serious B2B SaaS companies are investing in some form of AI. Software companies that do not develop functionality leveraging AI will be mostly gone in five years. Those that remain will be relegated to niche markets and low margins.

AI is expensive. It is not just expensive to develop (venture capital can take care of that); it is expensive to operate.

Boris Gamazaychikov at Salesforce has done some interesting analysis on the cost of training a large language model. See Reducing the Carbon Footprint of Generative AI.

LLMs are not just expensive to train. They are expensive to query. Prompts are computationally expensive to process and require specially designed chips like those from Nvidia or Google’s TPUs (Tensor Processing Units). A lot of data gets moved around, especially with advanced architectures like RAG (Retrieval-Augmented Generation), where the context size of the input has a big impact on results (cost is the main reason OpenAI puts such strict limits on the size of a prompt).

This is not the world of applications built using the Model View Controller pattern with database back ends and lots of simple SQL queries. The models are orders of magnitude bigger. The queries (prompts) are more complex and can be more frequent. These systems are expensive to operate.
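To make the inference cost concrete, here is a minimal back-of-the-envelope sketch in Python. The per-token prices, token counts, and usage profile are illustrative assumptions, not any provider’s actual rates; substitute your own model’s pricing and traffic.

```python
# Back-of-the-envelope estimate of LLM inference cost per query.
# All prices and token counts are assumptions for illustration only.

INPUT_PRICE_PER_1K = 0.01    # assumed $ per 1,000 input (prompt) tokens
OUTPUT_PRICE_PER_1K = 0.03   # assumed $ per 1,000 output (completion) tokens

def cost_per_query(input_tokens: int, output_tokens: int) -> float:
    """Estimate the cost of a single LLM call from its token counts."""
    return (input_tokens / 1000) * INPUT_PRICE_PER_1K \
         + (output_tokens / 1000) * OUTPUT_PRICE_PER_1K

# A RAG-style call: retrieved context inflates the prompt, so input tokens dominate.
rag_query = cost_per_query(input_tokens=6_000, output_tokens=500)

# Monthly variable cost for one active user making 40 such calls a day.
monthly_cost_per_user = rag_query * 40 * 30

print(f"Cost per query:        ${rag_query:.4f}")              # ~$0.075
print(f"Monthly cost per user: ${monthly_cost_per_user:.2f}")  # ~$90
```

Even at these modest assumed rates, a heavy RAG workload adds tens of dollars of variable cost per user per month, a line item that simply does not exist in a conventional database-backed application.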

Depending on the configuration and use case, we expect operating margins in B2B SaaS companies to trend down to 50% as AI-driven functionality comes to dominate.

Plenty of industries would be glad to have 50% operating margins, but this is not the world that most of us grew up in. Companies like Amazon, which are used to operating on thin margins, may have a cultural advantage. Companies that are AI-native will be built around these margins. For the rest of us, it is going to be a shock to the system.
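To see how those margins compress, here is a minimal sketch with assumed per-account figures; they are illustrative, not any company’s actuals.

```python
# Illustrative margin compression when AI inference is added to a SaaS product.
# All dollar figures are assumptions for illustration only.

revenue_per_account = 1_000.0   # assumed annual revenue per account
classic_cogs = 150.0            # assumed annual cost to serve (hosting, support) per account
ai_inference_cost = 350.0       # assumed annual AI inference cost per account

classic_margin = (revenue_per_account - classic_cogs) / revenue_per_account
ai_margin = (revenue_per_account - classic_cogs - ai_inference_cost) / revenue_per_account

print(f"Margin without AI functionality: {classic_margin:.0%}")  # 85%
print(f"Margin with AI functionality:    {ai_margin:.0%}")       # 50%
```

The point is not the specific numbers but the mechanism: a meaningful per-account inference cost comes straight out of margin unless price or value moves with it.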

Leading VCs have noticed the change. In Navigating The Cloud Landscape: A Recap of 2023 and Priorities for 2024 and Beyond, Cota Capital founder Bobby Yazdani says:

“Keeping in mind industry trends that crystallized in 2023, it is clear that the cloud sector is headed toward a future in which cost visibility, control, and optimization will become top priorities for companies of every size and at every stage.”

How does all of this impact Ibbaka and its customers? What does it mean when variable costs go up and gross margins go down?

The first thing to realize is not just that costs have gone up, but that the cost curve has changed. The cost curve describes how costs change at different scales. It is generally assumed that costs are not linear and that per-unit costs fall with scale; the greater the proportion of fixed costs, the more costs flatten off as volume grows. It is not clear that this will happen with AI. Fixed costs (training the model) will be high, though they will come down over time. Variable costs are also likely to be higher, much higher. AI actions are more computationally expensive, so each time the AI is used there is a meaningful cost. This requires a new way of thinking for SaaS executives.
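A minimal sketch of that change in the cost curve, with assumed numbers: in a conventional SaaS product, fixed costs dominate and per-user cost falls steeply with scale; once a per-use AI cost is added, the curve flattens out at that variable cost instead of trending toward zero.

```python
# Illustrative per-user cost curves at different scales.
# All figures are assumptions chosen to show the shape of the curves.

FIXED_COSTS = 2_000_000           # assumed annual fixed costs (infrastructure, staff, model training)
CLASSIC_VARIABLE_PER_USER = 5     # assumed annual variable cost per user, conventional SaaS
AI_VARIABLE_PER_USER = 120        # assumed annual variable cost per user with AI inference added

def per_user_cost(users: int, variable_per_user: float) -> float:
    """Total annual cost divided by the number of users."""
    return FIXED_COSTS / users + variable_per_user

for users in (1_000, 10_000, 100_000, 1_000_000):
    classic = per_user_cost(users, CLASSIC_VARIABLE_PER_USER)
    ai = per_user_cost(users, AI_VARIABLE_PER_USER)
    print(f"{users:>9,} users   classic ${classic:>8,.2f}/user   AI-heavy ${ai:>8,.2f}/user")
```

The conventional curve keeps falling as fixed costs are amortized; the AI-heavy curve flattens near its variable cost, which is why pricing has to be designed against the curve rather than assumed away.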

Part of this will be cost management and AI cloud optimization. Microsoft has formed a team dedicated to bringing down the costs of generative AI, and we should expect many companies to emerge offering to manage and reduce AI costs. An example is Charshift, which has an article, Understanding the True Cost of Large Language Models, on its website. Bobby Yazdani emphasizes this in the post quoted above.

But cost management is not going to be enough. AI-based SaaS solutions are going to need to

  1. Raise prices

  2. Have variable pricing

Raising prices requires that more value be delivered. The value cycle of value creation, communication, delivery, documentation, and capture will need to be formally managed. The critical questions for AI-based applications are: What value am I creating? Who am I creating it for? How can I document that value, and how can I capture a fair part of it?

Those are all generic questions. When investing in AI, which we must do if we are to survive, we also have to ask whether we can create and capture enough value to cover the additional operating costs of AI.

Variable pricing will initially be driven by costs. We can see this in the pricing of large AI companies where input and output tokens are key pricing variables. (The other main pricing variable is model parameters.) Old-school industrial companies have been down this road and it does not work. Cost-based pricing gives the advantage to the low-cost provider and makes it difficult to raise prices.

There is a simple relationship that must be maintained across all scales of operation.

Value > Price > Cost

SaaS leaders transitioning to AI need to focus on both value and cost. Value has to increase faster than costs (or, if it does not, sales have to stop before the cost line crosses the value line). If you know your value and cost curves, you can design pricing to fit. If you do not know these curves, and most B2B SaaS companies do not, you are going to get your pricing wrong. You cannot afford to get pricing wrong. Your company will fail.
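Here is a minimal sketch of that check, with assumed value and cost curves (illustrative shapes only): value per account grows with usage but with diminishing returns, cost per account grows linearly because every AI action is metered, and a flat price is tested against both at each usage level.

```python
# Illustrative check of Value > Price > Cost across usage levels.
# The value and cost functions are assumed shapes for illustration only.

import math

PRICE_PER_MONTH = 500.0   # assumed flat subscription price per account

def value_per_account(ai_actions: int) -> float:
    """Assumed value curve: grows with usage, with diminishing returns."""
    return 300.0 * math.log1p(ai_actions)

def cost_per_account(ai_actions: int) -> float:
    """Assumed cost curve: fixed serving cost plus a per-action inference cost."""
    return 50.0 + 0.15 * ai_actions

for actions in (100, 1_000, 5_000, 20_000):
    value = value_per_account(actions)
    cost = cost_per_account(actions)
    holds = value > PRICE_PER_MONTH > cost
    print(f"{actions:>6} actions   value ${value:>6.0f}   price ${PRICE_PER_MONTH:.0f}   "
          f"cost ${cost:>8.2f}   Value > Price > Cost: {holds}")
```

With a flat price and metered per-action costs, the relationship holds at low usage and breaks as usage grows, which is exactly why usage-linked pricing stops being optional.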

 