We’re reviewing the earnings calls of 81 publicly traded SaaS companies to understand the impact AI is having on SaaS. Here we summarize the earnings calls of Snowflake, Salesforce, MongoDB, ServiceTitan, ServiceNow, Microsoft, AppFolio, and Palantir. We will release more posts like this as the review continues. Below are the big takeaways from these first eight calls.
-AI is not profitable yet; the goal for the moment is driving margin-neutral revenue. Microsoft, Salesforce, ServiceNow, and nearly every other company described margin pressures from deploying AI products. AI workloads are simply very expensive at the moment.
-Direct revenue from AI is still in its very early stages, even at some very large players, but it is growing fast, especially among existing customers adopting AI products built by their already-critical software providers.
-Some, like Snowflake, ServiceNow, and MongoDB, will win because their platforms provide the infrastructure AI runs on. Others, like ServiceTitan and Salesforce, are building AI products that actually execute actions.
-Quite a few of these companies, like MongoDB and ServiceNow, closed some of their largest deals ever in Q4.
-Moats for software are being built around data, workflows, governance, security, and operational knowledge of the existing customer. AI cannot stand alone; it needs to sit on top of software that manages data and processes in an enterprise-friendly manner.
The individual calls are summarized below.
Snowflake
Snowflake’s Q4 call told a story of a company in genuine transition, moving from being the place enterprises store and query data, to being the platform where they actually build and run AI. The numbers: product revenue grew 30% year-over-year, driven primarily by AI workloads, and accounts using AI features rose to more than 9,100, with Snowflake Intelligence, their flagship agentic product, scaling to over 2,500 accounts, nearly doubling quarter-over-quarter.
What makes Snowflake’s AI story somewhat distinct from peers is that AI is benefiting the business on both sides of the income statement. On the revenue side, larger and more strategic deals are getting done; Snowflake signed the largest deal in company history at over $400 million in total contract value, and closed seven nine-figure contracts in the quarter versus two in the same period last year. On the cost side, management reported 40% to 50% higher project margins and compressed delivery cycles through their own internal use of Snowflake Intelligence and Cortex Code, a credible proof point that software companies will be big beneficiaries of AI.
For customers, the value proposition is centered on two products. Cortex Code is used by over 4,400 customers, enabling faster development and deployment of AI workloads. Snowflake Intelligence, meanwhile, is the more agentic play, allowing enterprises to build AI-native workflows directly on top of their existing Snowflake data, without moving it elsewhere. The strategic logic is powerful: enterprises already trust Snowflake with their most sensitive data, and Snowflake is betting they’ll prefer to run AI on top of that data in place rather than pipe it out to a separate system.
The gross margin picture is the caveat. Management acknowledged that newly launched AI products currently carry lower margin profiles, and that margin expansion remains a near-term priority as efficiency improvements are realized. This is the same tension playing out across the AI infrastructure sector: serving AI workloads is compute-intensive, and the economics improve over time but aren’t fully mature yet. Snowflake guided FY27 product gross margin to 75%, down slightly from FY26’s 75.8%.
The bigger strategic bet is on ecosystem. Snowflake expanded partnerships with Anthropic, OpenAI (a $200 million expansion), and Google Cloud, giving customers native access to leading AI models directly within the platform. Rather than picking a single model winner, Snowflake is positioning itself as the neutral data layer that works with all of them, a benefit to its enterprise customers who don’t want to be locked into a single AI vendor. If that positioning holds, Snowflake doesn’t just benefit from the AI wave; it becomes critical infrastructure beneath it.
Salesforce
Salesforce is at an inflection point with AI. Agentforce, Salesforce’s agentic AI platform, closed 29,000 deals in its first 15 months, with marquee enterprise customers like Amazon, Ford, AT&T, and Moderna signing on. Deals over $1 million were up 26% year-over-year, suggesting that AI may be driving larger, more strategic contracts.
What’s most notable is how Salesforce is trying to reframe what AI does for customers. Rather than talking about AI in terms of features or capabilities, Benioff introduced a new metric, Agentic Work Units (AWUs), to show that AI agents on the platform are completing 2.4 billion discrete units of actual work: updating records, triggering workflows, making decisions. The message was deliberate: this isn’t AI that thinks or suggests, it’s AI that executes.
On the revenue side, Agentforce ARR hit ~$800 million, up 169% year-over-year, and the broader Agentforce and Data Cloud bundle crossed $2.9 billion ARR, up over 200%. Salesforce is monetizing AI through a mix of premium SKUs, new seat additions, and consumption-based flex credits, which gives them multiple vectors to capture value as usage scales.
The candid note was around gross margins. Token costs remain a real input expense, and Salesforce acknowledged it’s working to optimize efficiency to keep AI gross margins neutral in the near term. AI offerings are not profitable at the moment.
The bigger strategic picture is that Salesforce is betting AI transforms it from a system of record into a system of action where agents handle customer service, sales workflows, and operational decisions autonomously. If that holds, it strengthens their competitive moat considerably and raises switching costs for existing customers.
MongoDB
MongoDB is positioning to be the foundational data layer for the AI era, and the company’s job is to make sure the world knows it. Customers are excited about MongoDB’s platform strength and its ability to serve as an integrated data layer for AI agents, combining search, vector search, and embeddings in a single offering.
The most important thing to understand about MongoDB’s AI story is where they sit in the stack. Unlike Salesforce or Braze, MongoDB isn’t selling AI applications to end users. It’s providing the data infrastructure that AI applications run on. At its flagship MongoDB.local San Francisco event, the company announced the integration of its core database with embedding and reranking models from Voyage AI, creating a unified data intelligence layer for production AI that allows developers to build sophisticated applications at scale with reduced hallucination risk and no requirement to move or duplicate data. That no-data-movement angle is a meaningful competitive argument: enterprises with sensitive data in MongoDB don’t have to copy it to a separate vector store or AI pipeline, which simplifies architecture and lowers risk. For large financial institutions and regulated industries, which are among MongoDB’s most strategic customers, that matters enormously.
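To make the no-data-movement pattern concrete, here is a minimal sketch of what querying operational documents and their vector embeddings in a single MongoDB aggregation looks like, using the Atlas `$vectorSearch` stage. The collection, index name, and field names are illustrative assumptions, not details from the call.

```python
# Hypothetical sketch: vector similarity search directly against operational
# data in MongoDB Atlas, with no separate vector store. The index name,
# embedding field, and projected fields are assumptions for illustration.

def build_vector_search_pipeline(query_embedding, limit=5):
    """Return an aggregation pipeline that retrieves the documents most
    similar to `query_embedding`, shaped for consumption by an AI agent."""
    return [
        {
            "$vectorSearch": {
                "index": "tickets_vector_index",   # assumed Atlas search index
                "path": "embedding",               # assumed field holding vectors
                "queryVector": query_embedding,
                "numCandidates": limit * 20,       # oversample candidates for recall
                "limit": limit,
            }
        },
        {
            # Keep only the fields an agent needs, plus the similarity score.
            "$project": {
                "_id": 0,
                "title": 1,
                "body": 1,
                "score": {"$meta": "vectorSearchScore"},
            }
        },
    ]

pipeline = build_vector_search_pipeline([0.1] * 1024)
# Against a live Atlas cluster this would run as:
#   results = db.tickets.aggregate(pipeline)
```

Because the embeddings live alongside the documents they describe, the same aggregation framework that serves transactional queries also serves retrieval for AI workloads, which is the architectural point MongoDB is making.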
The candid acknowledgment from management, however, is that AI is not yet a material driver to results, though they are encouraged by the growth they are seeing with customers leveraging AI capabilities. The number of customers using vector search and Voyage embedding models nearly doubled year over year, and AI natives, digital natives, and large enterprises are all contributing to the growth across the customer base.
What’s actually driving the strong financials right now is a combination of AI-adjacent tailwinds and MongoDB’s core enterprise momentum. In Q4 they signed an approximately $90 million deal with a large tech company planning to expand both core and AI workloads on Atlas, and a greater than $100 million deal with a large financial institution for Enterprise Advanced, the largest total contract value deal in company history. The significance of that financial institution deal is hard to overstate: large banks are notoriously conservative with infrastructure decisions, and a nine-figure commitment to MongoDB speaks to how seriously enterprises are treating their data platform choices in the context of AI readiness.
AI actually amplifies the case for MongoDB’s multi-model, developer-friendly architecture. As enterprises move from traditional transactional applications toward AI-powered agentic workflows, the data requirements become more complex, requiring search, vector similarity, document storage, and real-time access all in one place. MongoDB’s goal is to become the generational data platform of choice in the AI and multi-cloud era, and if AI application development scales the way most expect, the demand for an integrated, performant, cloud-native data layer like Atlas should grow with it. The risk is that the revenue inflection from AI workloads takes longer to materialize than investors hope, but the Q4 numbers suggest the underlying business is strong enough to carry that bet comfortably while AI catches up.
ServiceTitan
The AI story here is unusually concrete. Where most SaaS companies are talking about AI in terms of platform strategy and future optionality, ServiceTitan is reporting actual customer outcomes from a live product, and those numbers are striking.
The centerpiece is Max, which ServiceTitan is positioning not as an AI feature but as an agentic operating system for the trades, meaning it’s designed to autonomously orchestrate end-to-end workflows across demand generation, dispatch, quoting, payments, and back-office operations for HVAC, plumbing, electrical, and other field service businesses. The first deployment cohort of Max customers saw a 50% increase in average ticket size, one customer achieved over 50% revenue growth in a single month, and another increased EBITDA margin from 18% to 30% while reducing office headcount. And management went further: customers on Max will roughly double their monthly subscription revenue when fully ramped, an effect not driven by technician expansion, meaning the value is coming from operational efficiency and higher revenue capture per job, not simply from adding more workers.
What makes ServiceTitan’s AI positioning particularly defensible is the data moat underlying it. The company leverages proprietary structured data from over $80 billion in annual transaction volume to drive automation and outcome improvements. This is a critical competitive point that CEO Ara Mahdessian leaned into when analysts asked about AI-native startups competing for the same customers. A point-solution AI tool built for the trades can optimize one workflow but it has no context on how that workflow connects to the rest of the business. ServiceTitan’s integrated platform, spanning marketing, scheduling, dispatch, invoicing, and payroll, creates a contextual data layer that generic AI tools simply can’t replicate from the outside.
The second AI product gaining traction is Virtual Agents: AI-based modules that handle inbound call management and appointment booking, especially during call surges or after normal business hours. For a trades business, missed calls during peak season or after hours are direct revenue losses. A potential customer who can’t book gets routed to a competitor. Virtual Agents directly plug that gap, and because it’s priced as a consumption product, it creates a new usage revenue stream that management believes could grow faster than GTV in fiscal 2027.
The honest caveat is that Max is still early-stage and capacity-constrained. ServiceTitan plans to double Max capacity in Q1 FY2027, with scaling tied to onboarding efficiency and customer success, which is a signal that the bottleneck right now is deployment and training, not demand. To accelerate on all fronts, the company hired Abhishek Mathur from Figma, Meta, and Microsoft as Chief Technology and Product Officer, specifically to drive organizational and technology velocity around AI initiatives. FY2027 is also expected to represent the company’s largest R&D investment ever, with explicit focus on AI inference and internal tooling. The trades are not typically an industry associated with cutting-edge software, which is precisely why ServiceTitan’s moat both in data and in customer relationships could prove so durable as AI raises the stakes for what field service software can actually do.
ServiceNow
If there’s one company in enterprise software that has the most fully-formed AI story right now, it’s ServiceNow. CEO Bill McDermott came into this call on offense, explicitly addressing the “AI will eat software” narrative that has spooked investors across the sector and flipping it on its head. His argument was direct: enterprise AI will be the largest driver of return on the multitrillion-dollar super cycle of AI infrastructure investment, and the real payoff comes when tokens move beyond pilots and get embedded directly into the workflows where business decisions are made, with ServiceNow serving as the semantic layer that makes AI ubiquitous in the enterprise.
The numbers behind Now Assist, ServiceNow’s AI product suite, are the most concrete AI monetization metrics in this cohort. Now Assist ACV surpassed $600 million, more than doubling year-over-year in Q4, with deals over $1 million nearly tripling quarter-over-quarter and 35 such deals closing in Q4 alone. Enterprises aren’t just adding AI as a line item but are making meaningful commitments to it within the ServiceNow platform. The AI control tower, which allows enterprises to govern and orchestrate AI agents across the business, grew to over 4x its 2025 targets, another signal that customers are moving from individual AI use cases to thinking about AI management as a platform-level problem.
What ServiceNow is really selling is AI governance as much as AI capability. As enterprises deploy more agents someone has to be in charge of orchestrating them, monitoring them, and ensuring they don’t go rogue or create compliance exposure. ServiceNow is positioning its platform as that orchestration layer, what they call the “universal agentic network” built on MCP and Workflow Data Fabric. Monthly active users grew 25% and the number of workflows and transactions processed on the platform increased over 33% each, reaching 80 billion workflows and 6.4 trillion transactions.
ServiceNow doesn’t want to bet on any single model winning; rather, it wants to be the workflow layer that works with all of them, insulating itself from model commoditization while capturing value from enterprise adoption regardless of which AI providers customers prefer.
The honest tension in the ServiceNow story is pricing and gross margin. Management acknowledged some gross margin headwinds from hyperscaler and AI infrastructure choices, and the shift to a hybrid pricing model, combining traditional subscription with consumption-based AI usage, introduces some revenue variability that investors are still getting comfortable with. But with $15.5 billion in subscription revenue guided for 2026 at 19–20% growth, a 32% operating margin, and 36% free cash flow margins, ServiceNow is arguably the most financially powerful pure-play on enterprise AI adoption in the market right now. The core business is robust enough that they can absorb the cost of building out AI infrastructure while competitors are still figuring out their strategy.
Microsoft
Microsoft’s call was a declaration that AI is now the gravitational center of the entire business. Satya Nadella opened with a striking statement: “We are only at the beginning phases of AI diffusion and already Microsoft has built an AI business that is larger than some of our biggest franchises.” When you consider that Microsoft’s “biggest franchises” include Office, Windows, and Xbox, each a multi-billion-dollar business, that’s an extraordinary claim. And the numbers behind it are hard to argue with: Microsoft Cloud crossed $50 billion in quarterly revenue for the first time, up 26%, while Azure grew 39%, the acceleration driven explicitly by AI workloads.
The AI story at Microsoft operates on three distinct layers, each reinforcing the others. The first is infrastructure. Capital expenditures hit $37.5 billion in the quarter, with roughly two-thirds allocated to short-lived assets like GPUs and CPUs. Management introduced a new internal metric, tokens per watt per dollar, as their guiding optimization target for AI infrastructure, and reported a 50% increase in throughput on OpenAI inferencing due to infrastructure advances.
The second layer is the Copilot product suite, where AI is directly touching customers at scale. Microsoft 365 Copilot reached 15 million paid seats, with over 160% seat growth year-over-year and a tripling in the number of customers with over 35,000 seats, a signal that enterprise adoption is moving from departmental pilots to company-wide deployments. Average user conversations doubled and daily active users grew 10x. GitHub Copilot reached 4.7 million paid subscribers, up 75% year-over-year, with individual Copilot Pro Plus subscriptions growing 77% sequentially.
The honest tension is on margins. Gross margins declined slightly, driven by continued AI infrastructure investments and growing AI product usage, and management guided for operating margins to be down slightly year-over-year in Q3. Microsoft is, in effect, spending today to capture a revenue curve that extends well into the next decade, and the Q2 numbers suggest that curve is steeper than almost anyone expected.
AppFolio
AppFolio’s Q4 call told a quietly compelling story about how AI actually changes the economics of a vertical SaaS business. AppFolio serves property managers, a customer base that has historically been underserved by software and skeptical of hype. The fact that 98% of AppFolio’s customers are already actively using one or more AI capabilities included in the platform, against an industry backdrop where half of AI users in property management report they can’t actually rely on the AI features in their core system, is a striking market differentiation signal. It suggests AppFolio has threaded a needle that most vertical SaaS companies are still struggling with: making AI functional and trusted at the ground level.
AppFolio is repositioning its platform around three layers, a system of record, a system of action, and a system of growth, with agentic AI embedded directly into daily operations and the explicit goal of enabling customers to evolve from property managers to performance managers. That reframing reflects a fundamental shift in what AppFolio is selling: not software that helps you manage properties, but a platform that actively improves the financial performance of your property management business. Adoption of premium tiers has already exceeded 25%, a meaningful indicator that customers are upgrading to capture AI-driven capabilities.
The business impact of AI is showing up directly in AppFolio’s financials. Non-GAAP operating margin expanded to 24.9% in Q4, up from 20.2% a year earlier, a nearly 500 basis point improvement that reflects both revenue growth and the operating leverage that comes from AI-driven efficiency gains across the platform itself.
What makes AppFolio particularly interesting from an investment lens is that their AI advantage is compounding in a way that’s structurally hard to replicate. 45% of survey respondents in the property management industry say they plan to consolidate their software solutions, and AppFolio is the natural consolidation destination, precisely because they’ve embedded AI deeply enough that customers would lose significant operational capability by leaving. The moat here isn’t just the product; it’s the AI-trained workflows, the resident data, and the operational patterns that accumulate the longer a customer stays on the platform. For a vertical SaaS business approaching $1 billion in revenue, that’s a durable competitive position.
Palantir
Palantir’s Q4 2025 call was unlike any other earnings call in enterprise software this cycle, partly because of the numbers, which were objectively extraordinary, and partly because Alex Karp delivers earnings calls the way a wartime general addresses troops, not the way a CFO addresses analysts. But strip away the theater and what you find is a company that has, almost overnight, become one of the defining stories of what enterprise AI actually looks like when it works at production scale.
The headline numbers require context to be fully appreciated. Q4 revenue grew 70% year-over-year, the highest growth rate since Palantir went public, representing a 3,400 basis point acceleration versus Q4 of the prior year. This isn’t a company maintaining a high growth rate; it’s a company where the growth rate itself is accelerating sharply. Full-year 2025 revenue grew 56%, and the company is guiding full-year 2026 revenue of $7.19 billion, representing 61% growth.
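The acceleration figure is worth unpacking, since it implies the prior-year growth rate: 3,400 basis points is 34 percentage points, so Q4 of the prior year grew roughly 36%. A quick check of the arithmetic, using only the figures quoted on the call:

```python
# Sanity-check of the growth-acceleration arithmetic from the call.
# 1 percentage point = 100 basis points.
q4_growth_pct = 70        # Q4 2025 revenue growth, per the call
acceleration_bps = 3_400  # quoted acceleration vs. Q4 of the prior year

# Implied Q4 growth rate one year earlier, in percentage points.
prior_year_growth_pct = q4_growth_pct - acceleration_bps / 100
print(prior_year_growth_pct)  # 36.0
```

In other words, the same quarter a year earlier was already growing at a pace most software companies would envy, and Palantir nearly doubled it.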
The company closed 61 deals greater than $10 million in the quarter, and management cited multiple examples of customers signing $80 to $96 million contracts within months of initial engagement, a compressed deal cycle that reflects genuine organizational conviction.
The US vs. international divergence on the call was striking and strategically revealing. US revenue grew 93% year-over-year and 22% sequentially, while international commercial revenue grew just 8% year-over-year, a massive gap that management was candid about.
The government side of the business is also worth understanding in the context of AI impact. US government revenue grew 66% year-over-year, driven in part by a massive $10 billion Army software contract signed last summer and a $448 million Navy contract for shipbuilding supply chain modernization. Karp was unambiguous about what these contracts represent: not just data analytics or workflow tools, but AI systems that are actively changing the operational capabilities of the US military. He argued that AI implementations in the defense context have changed what warfighters are able to do.
The risk most frequently raised by skeptical analysts on the call is whether the US commercial acceleration is sustainable or whether it represents a concentrated burst of pent-up demand that will moderate. Palantir’s answer is essentially that AI-driven enterprise transformation is still in very early innings, that their pipeline of committed deal value reached $11.2 billion (up 105% year-over-year), and that the real constraint on growth is their own capacity to onboard and deliver for customers, not demand.