The April 2026 FinOps Foundation Virtual Summit made one thing clear: AI isn’t a future state for FinOps teams — it’s already here, and the practitioners who are figuring it out now will have a significant advantage.
The FinOps Foundation gathered practitioners, platform vendors, and cloud providers for its April 2026 virtual summit focused entirely on AI for FinOps. From crawl-walk-run frameworks to Google Cloud’s latest spend controls, the conversation was equal parts practical and forward-looking. Here are the five takeaways worth bringing back to your team.
1. Your Data Foundation Has to Come First
If there’s one theme that ran through every session, it was this: AI is only as good as the data underneath it.
Jonathan Moley of the FinOps Foundation put it simply — without a data foundation built to operationalize your information, AI will struggle for context. That means standardized unit economics, clean tagging strategies, and consistent cost allocation before you start layering in AI capabilities.
Amit Kanha of DoiT reinforced this from the practitioner side. When teams start using a conversational AI layer on top of messy cost data, they quickly discover how much they’ve been papering over — inconsistent tags, tribal knowledge that lives in someone’s head or a Confluence page, showback figures with negotiated adjustments that were never formally documented. The opportunity, as Amit framed it, is to use AI adoption as the forcing function to finally do the cleanup. Migrate to FOCUS. Simplify your tagging. Make your data legible to a system that doesn’t have two months of institutional context.
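A minimal sketch of what that tag cleanup can look like in practice. The record shape, tag keys, and alias table below are illustrative assumptions, not from any specific billing export — the point is that casing drift and naming drift are mechanically detectable once you commit to a canonical set:

```python
from collections import defaultdict

# Canonical tag keys and known non-canonical aliases (illustrative names).
CANONICAL = {"team", "env"}
ALIASES = {"environment": "env", "Team": "team"}

def audit_tags(records):
    """Group resource IDs by tag problem: alias drift or missing required keys."""
    findings = defaultdict(list)
    for rec in records:
        keys = set(rec["tags"])
        # Any key that appears in the alias table is drift from the canonical set.
        if keys & ALIASES.keys():
            findings["alias_used"].append(rec["resource_id"])
        # Normalize aliases before checking for genuinely missing keys.
        normalized = {ALIASES.get(k, k) for k in keys}
        missing = CANONICAL - normalized
        if missing:
            findings["missing:" + ",".join(sorted(missing))].append(rec["resource_id"])
    return dict(findings)

records = [
    {"resource_id": "vm-1", "tags": {"team": "payments", "env": "prod"}},
    {"resource_id": "vm-2", "tags": {"Team": "payments", "environment": "prod"}},
    {"resource_id": "db-1", "tags": {"env": "prod"}},
]
print(audit_tags(records))
```

Running an audit like this before layering in AI is exactly the "make your data legible" step: the model then reasons over one tag vocabulary instead of guessing that "environment" and "env" mean the same thing.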
2. Generative AI and Agentic AI Are Not the Same Thing — and That Distinction Matters
The summit drew a clear line between two modes of AI that often get lumped together.
Generative AI functions as a high-level assistant. It answers questions when asked, handles one task at a time, and keeps a human in the loop. Think: ask it to explain a cost spike, generate a report, or write a SQL query against your billing data.
Agentic AI is the autonomous layer. It monitors continuously, investigates issues, and can take action across a technology ecosystem without waiting for a prompt. It collapses multi-step manual workflows into single automated steps — sometimes performed without human intervention at all.
Most organizations are operating in generative AI territory today, and that’s a reasonable place to start. But the practitioners building the most leverage are thinking ahead to where agentic capabilities fit — and, critically, what governance needs to be in place before they get there.
3. The Real ROI Is in Automating the Boring Stuff First
A consistent message from practitioners on the ground: don’t start with the high-stakes automation.
Amit Kanha was direct about it: “You’ll want to automate the wrong things first. You’ll want to automate the big moving pieces like commitments. But if you’re wrong, you’re going to shut the whole program down.”
The practitioners seeing results are starting with tagging enforcement, idle resource cleanup, non-prod shutdown schedules, and personalized Slack outreach to resource owners. One senior practitioner at a North American technology company reported a 50%+ action rate on agent-generated Slack messages identifying unused resources. The personalization is what makes it land — it doesn’t feel like an automated blast; it feels like a relevant conversation.
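A rough sketch of the personalization step, assuming an upstream detector has already produced a list of idle-resource findings. The field names and message wording are hypothetical, not from the summit:

```python
def draft_message(finding):
    """Turn one idle-resource finding into a message addressed to its owner."""
    return (
        f"Hi {finding['owner']}, {finding['resource_id']} in {finding['region']} "
        f"has been idle for {finding['idle_days']} days "
        f"(~${finding['monthly_cost']:.0f}/month). "
        f"Reply 'keep' to snooze for 30 days, or 'release' to schedule cleanup."
    )

findings = [
    {"owner": "dana", "resource_id": "vol-0a1b", "region": "us-east-1",
     "idle_days": 45, "monthly_cost": 82.40},
]

for f in findings:
    print(draft_message(f))
    # A real agent would deliver this via a Slack integration and record
    # the owner's reply, which is the action-rate signal cited above.
```

Addressing a named owner about a specific resource with a concrete dollar figure, plus a one-word reply path, is what separates this from a generic cost report nobody reads.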
The principle is the same one that made FinOps work in the first place: start small, demonstrate value, build trust, then scale.
4. Token Economics Is the New Cloud Bill
FinOps Foundation Executive Director J.R. Storment said it plainly: “The new job of the FinOps team in an organization is to maximize the value of the tokens.”
That framing is showing up everywhere. FinOps teams at Barclays are already using cost-per-message metrics in executive trade-off conversations. Teams at Snowflake are using AI-built commitment modeling to inform negotiations. SAP is presenting at FinOps X on how to build a token factory effectiveness framework at enterprise scale.
This isn’t theoretical. Organizations are watching token usage skyrocket and are building control planes to manage it. The metrics that matter are shifting — cost per inference call, input versus output token spend, model routing efficiency. FinOps teams that get ahead of this now are positioning themselves as the function that governs AI value, not just cloud costs.
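Those metrics are simple arithmetic once usage is captured per call. A minimal sketch with made-up prices and field names — real per-token rates vary by model and provider, so treat everything below as placeholder values:

```python
# Example prices per 1M tokens (illustrative only; check your provider's sheet).
PRICE_PER_M = {"model-a": {"input": 3.00, "output": 15.00}}

def call_cost(model, input_tokens, output_tokens):
    """Cost of one inference call, split into input-token and output-token spend."""
    p = PRICE_PER_M[model]
    cost_in = input_tokens / 1_000_000 * p["input"]
    cost_out = output_tokens / 1_000_000 * p["output"]
    return cost_in, cost_out

# Two sample calls: (model, input tokens, output tokens).
calls = [("model-a", 1200, 400), ("model-a", 900, 2500)]
total_in = sum(call_cost(m, i, o)[0] for m, i, o in calls)
total_out = sum(call_cost(m, i, o)[1] for m, i, o in calls)
cost_per_call = (total_in + total_out) / len(calls)
print(f"input ${total_in:.4f}, output ${total_out:.4f}, per call ${cost_per_call:.4f}")
```

The input/output split matters because output tokens are typically several times more expensive, which is also why model routing efficiency — sending a request to the cheapest model that can handle it — shows up on the same list of metrics.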
As Storment noted, many organizations are already self-funding AI through optimization savings in their existing spend footprint: “That reminds me a lot of FinOps early days — how many of our FinOps teams were funded.”
5. The Cloud Providers Are Building FinOps Controls for AI — and It’s Moving Fast
Google Cloud’s session at the summit was a signal of where the broader market is headed. The announcements were notable:
- FinOps AI Explainability Agent — proactively scans your billing footprint and tells you why costs changed before you ask. Breaks down spend by model, modality (text vs. image vs. video), and token consumption (input vs. output).
- Spend Caps on Google Cloud Budgets — a one-click, non-destructive cost boundary that stops runaway AI spend without deleting resources or impacting other workloads. Private preview launches in May.
- Billing Account Groups and Commitment Tracking Reporting — consolidated spend visibility and commitment drawdown tracking for enterprise agreement customers.
The FOCUS specification is also keeping pace. FOCUS 1.4 — which wraps consistency review this month — adds invoice reconciliation, expanded contract commitment categorization, and per-row commitment program eligibility. Conformance for 1.3 is now open, and MongoDB, Oracle Cloud, Tencent Cloud, and Snowflake are in the pipeline for 1.3 support.
The infrastructure for governing AI spend is being built in real time. FinOps teams that engage with these tools early will have a head start on the organizations that wait for them to mature.
What This Means If You’re Managing Hybrid Infrastructure
The summit’s focus was largely on cloud-native AI spend — but the underlying challenge is broader than any single environment.
For organizations running workloads across on-premises infrastructure and cloud, the visibility problem is compounded. Token economics, AI model costs, and cloud-native controls matter — but so does what’s happening in the data center, on IBM Power, in your storage environment, and across the legacy commitments that don’t show up in a cloud billing console.
That’s exactly the gap Visual One Intelligence® was built to close. VOI’s Hybrid FinOps platform brings cloud and on-premises cost data together in a single view — giving FinOps practitioners, IT Finance Managers, and leadership the full picture they need to make decisions on budgeting, driver identification, and run vs. innovate spend trade-offs.
The FinOps Foundation is right: the shift to AI-driven cost management is already underway. But for hybrid organizations, that shift has to start with complete visibility across the entire infrastructure footprint — not just the cloud bill.
Visual One Intelligence® is a Hybrid FinOps and IT Operations Management platform. Learn more at visualoneintelligence.com.
Source: FinOps Foundation April 2026 Virtual Summit, “AI for FinOps: Helping FinOps Work Faster, Smarter.”
Check out The FinOps Foundation to learn more about FinOps and FinOps practices.
