
AI stack in motion: models, infrastructure, data – everything is being redistributed

  • Writer: Ralph Schwehr
  • Dec 18, 2025
  • 6 min read

Just a few days until Christmas: fairy lights outside, a fireworks display of releases inside. While many are switching into year-end mode, the big AI players are shifting into high gear: within just a few days, Google, NVIDIA, OpenAI, Microsoft, Amazon, IBM & Co. are moving the focus of the AI stack from standard models and infrastructure to data flows and location policy.


Before we dive in: thank you for joining us week after week, reading this newsletter, sharing it in Teams channels, and using it as a basis for decision-making in your projects. That's precisely why we write it. If you like this issue, feel free to forward it to colleagues, clients, or partners; the more people understand technological change, the better the decisions will be.


By the way: Our newsletters are also available in English 🌍 Simply set the website language to English and presto, the text will be available in that language too. Thanks to AI 😉


What we will see this week before Christmas:

  • Models are becoming strategic defaults in search, apps & workflows.

  • Compute and energy are becoming the scarcest resources.

  • Real-time data streams are becoming the backbone of agent systems.

  • And politics and location issues are finally moving into the AI stack.

These four themes run through the ten topics of this issue and show what will really matter in 2026.



1. Models: Defaults become a strategic weapon

Google is ramping up its efforts: Gemini 3 Flash is becoming the new default model in the Gemini app and AI Search. This model is expected to outperform the previous Flash model not only in speed and cost, but also in reasoning, and is being rolled out broadly to developers, from AI Studio and the Gemini API to Vertex AI.

Google is sending two messages with this:

  1. Defaults win: whoever controls the default setting controls usage.

  2. Low latency + high IQ is the new sweet spot: no longer a choice between "fast" and "smart".


OpenAI responds at the model level with GPT-5.2 in three variants: Instant, Thinking, and Pro. The new generation aims for improved planning, coding, long-context understanding, and production-ready outputs (spreadsheets, financial models, presentations), and follows an internal "Code Red" prioritization to counter the pressure from Gemini.


For companies, this means:

  • Model decisions are becoming less religious and more operational.

  • More important than "which model?" is the question of how to integrate different profiles (fast vs. deep) into processes and agent architectures; a minimal routing sketch follows right below.
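
To make "fast vs. deep" concrete, here is a minimal routing sketch in Python. It is illustrative only: the model names, the complexity heuristic, and the call_model() helper are assumptions, not any vendor's actual API.

```python
# Minimal sketch: route requests between a fast default profile and a
# deeper reasoning profile. Model names and helpers are placeholders,
# not a real vendor API.

FAST_MODEL = "fast-default"      # e.g. a low-latency default model
DEEP_MODEL = "deep-reasoning"    # e.g. a slower, more capable variant


def estimate_complexity(task: str) -> float:
    """Crude heuristic: long, multi-step prompts count as 'complex'."""
    signals = ["plan", "analyze", "compare", "multi-step", "spreadsheet"]
    score = len(task) / 2000 + 0.3 * sum(word in task.lower() for word in signals)
    return min(score, 1.0)


def route(task: str) -> str:
    """Pick a model profile based on estimated task complexity."""
    return DEEP_MODEL if estimate_complexity(task) > 0.5 else FAST_MODEL


def call_model(model: str, task: str) -> str:
    """Placeholder for the actual API call (Gemini, OpenAI, etc.)."""
    return f"[{model}] response to: {task[:40]}"


if __name__ == "__main__":
    for task in ["Summarize this email in two sentences",
                 "Plan and analyze a multi-step data center migration"]:
        print(call_model(route(task), task))
```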


2. Compute & Infrastructure: “Infra is King” becomes reality

While public attention remains focused on models, real power shifts to the layers below.


Google + Meta: TorchTPU: With the "TorchTPU" project, Google and Meta are working to make TPUs the go-to solution for PyTorch workloads. The goal: to reduce dependence on the NVIDIA ecosystem around CUDA and make TPUs more attractive for the dominant framework, including potential open-source components.
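
TorchTPU itself has no public interface yet; today's route from PyTorch to a TPU runs through the existing torch_xla backend, which is roughly the path the project aims to streamline. A minimal sketch, assuming a TPU VM with torch and torch_xla installed; the model and data are arbitrary examples:

```python
# Minimal sketch: one PyTorch training step on a TPU via the existing
# torch_xla backend (TorchTPU's own interface is not public yet).
import torch
import torch_xla.core.xla_model as xm

device = xm.xla_device()                      # resolves to the attached TPU

model = torch.nn.Linear(128, 10).to(device)   # arbitrary example model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(32, 128, device=device)
y = torch.randint(0, 10, (32,), device=device)

loss = torch.nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
xm.mark_step()                                # materialize the lazily built XLA graph
print(loss.item())
```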


NVIDIA + SchedMD (Slurm): NVIDIA is acquiring SchedMD, the developer of the open-source scheduler Slurm, which already orchestrates a large portion of the biggest AI and HPC clusters. While Slurm will remain open source, it will be more tightly integrated into NVIDIA's stack, from the GPU to workload optimization.


Microsoft: “Hundreds of billions” as the entry ticket: Mustafa Suleyman, CEO of Microsoft AI, puts it bluntly: Anyone who wants to keep up on the frontier must invest “hundreds of billions” in computing power, energy, and talent over the next 5-10 years. Microsoft now describes itself, somewhat ironically, as a modern construction company building gigawatts of AI infrastructure.


Hut 8 + Fluidstack + Anthropic: At the same time, the lineup on the infrastructure side is shifting: Hut 8, formerly a Bitcoin miner, has secured a 15-year, $7 billion deal with Fluidstack and Anthropic to build a large AI data center in Louisiana, with an option for significantly more capacity. AI computing is becoming its own asset class with long-term leasing structures.


Frontier tech under one roof: Amazon is consolidating a new unit under Peter DeSantis, combining AI models, its own chips (Graviton, Trainium), and quantum computing. The goal: to shorten innovation cycles, strengthen vertical integration, and align AWS infrastructure even more closely with agent-based workloads.


The pattern is clear: compute, energy, and location policy are becoming boardroom topics.



3. Data Streams & M&A: Streaming Becomes the Backbone of Agent Systems

Models without data streams remain ineffective. Accordingly, money is being invested aggressively in data platforms.


IBM + Confluent (US$11 billion): IBM is acquiring Confluent for US$11 billion to integrate a Kafka-based real-time streaming platform into its hybrid cloud and AI portfolio. The goal is a "Smart Data Platform" approach in which AI agents can continuously access trusted, governance-enabled real-time data across applications and APIs.
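
What "agents on trusted real-time data" can look like at the code level: a minimal sketch using the open-source confluent-kafka Python client. The broker address, topic name, and handle_event() logic are illustrative assumptions, not details of IBM's or Confluent's roadmap.

```python
# Minimal sketch: an agent-style loop consuming a real-time Kafka stream.
# Broker address, topic, and handle_event() are illustrative only.
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # assumed local broker
    "group.id": "agent-demo",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])               # hypothetical topic with JSON events


def handle_event(event: dict) -> None:
    """Placeholder for agent logic, e.g. calling a model or triggering a workflow."""
    print("agent saw event:", event)


try:
    while True:
        msg = consumer.poll(1.0)             # wait up to 1s for the next record
        if msg is None:
            continue
        if msg.error():
            print("consumer error:", msg.error())
            continue
        handle_event(json.loads(msg.value()))
finally:
    consumer.close()
```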


$800 million in M&A firepower: SaaS provider Freshworks has announced that it will strategically deploy its $800 million in cash for AI acquisitions, focusing on AI-native incident management, IT operations, and employee experience. With deals like the acquisition of FireHydrant, Freshworks is building an agentic ServiceOps platform designed to prevent disruptions proactively rather than simply reacting to them.


The message for enterprise IT:

  • Real-time streaming and AI are merging into one infrastructure category.

  • Mid-cap SaaS players are becoming serious M&A players in the AI ecosystem.


4. Politics & Location: “Stargate” becomes global and highly political

Finally, perhaps the most visible shift: AI is leaving the purely technical sphere and becoming part of infrastructure and location policy.


OpenAI has recruited former British Chancellor George Osborne as Managing Director for "OpenAI for Countries" and as the face of its global Stargate program. Osborne will be responsible for managing the international expansion of the $500 billion infrastructure project, including data centers outside the US and partnerships with governments worldwide.


This makes it clear:

  • AI infrastructure is negotiated like airports or ports, with questions about security, energy, education and sovereignty.

  • Tech companies are professionalizing their “Government & Infra” teams with political heavyweights.


For companies, this means that location, energy and data strategies can no longer be considered separately from AI roadmaps.



💡 Key takeaways in brief

  • Shifting Defaults: With Gemini 3 Flash and GPT-5.2, the race for the default models in apps and search is open again.

  • Infrastructure is King: TorchTPU, NVIDIA x Slurm, and capital requirements on the scale of "hundreds of billions" show that compute and energy are the bottleneck.

  • AI Compute is becoming an asset class: Deals like Hut 8's and Amazon's bundling of AI, chips, and quantum are turning data centers into a strategic investment asset.

  • Data streams are the new backbone: IBM + Confluent and Freshworks' M&A appetite are anchoring real-time streaming in the enterprise AI stack.

  • Politics moves into the stack: With Osborne and Stargate, AI infrastructure becomes a geopolitical issue and a top priority for governments.




🎯 Conclusion & Call to Action

The new power distribution in the AI stack is no longer decided in the prompt window, but rather:


  • in data centers and gigawatt plans,

  • in M&A deals for data and streaming platforms,

  • in government programs like Stargate & Co.


The central question is not: "Which model is the best?" but rather: "How do you orchestrate models, infrastructure and data flows to create robust products, services and business models?"


👉 If you want to sort exactly that out for your company, get in touch with OAK AI directly.


And one more little preview 🎁: Our interactive AI Readiness Check is about to launch. With it, you can see in just a few minutes where your company really stands in the AI stack, from strategy and data to infrastructure.


More on this in the coming weeks in this newsletter.

Until then: If you want to take your AI strategy from buzzword level to the reality of infrastructure, data, and business models, contact OAK AI at any time.

Yours truly, Ralph Schwehr


 
 
 
