AT2k Design BBS Message Area

From: VRSS
To: All
Subject: Cloud-Native Computing Is Poised To Explode
Date/Time: November 18, 2025, 6:40 PM

Feed: Slashdot
Feed Link: https://slashdot.org/
---

Title: Cloud-Native Computing Is Poised To Explode

Link: https://slashdot.org/story/25/11/18/2242231/c...

An anonymous reader quotes a report from ZDNet: At KubeCon North America 2025
in Atlanta, leaders of the Cloud Native Computing Foundation (CNCF) predicted
an enormous surge in cloud-native computing, driven by the explosive growth
of AI inference workloads. How much growth? They're predicting hundreds of
billions of dollars in spending over the next 18 months. [...]

Cloud-native computing and AI inference come together when AI is no longer a
separate track from cloud-native computing. Instead, AI workloads,
particularly inference tasks, are fueling a new era in which intelligent
applications require scalable and reliable infrastructure. That era is
unfolding because, said [CNCF Executive Director Jonathan Bryce], "AI is
moving from a few 'Training supercomputers' to widespread 'Enterprise
Inference.' This is fundamentally a cloud-native problem. You, the platform
engineers, are the ones who will build the open-source platforms that unlock
enterprise AI."

"Cloud native and AI-native development are merging, and it's really an
incredible place we're in right now," said CNCF CTO Chris Aniszczyk. The data
backs up this opinion. For example, Google has reported that its internal
inference jobs have recently processed 1.33 quadrillion tokens per month, up
from 980 trillion just months before. [...]
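
As a rough check on those numbers, the jump from 980 trillion to 1.33
quadrillion tokens per month is roughly 36 percent growth. A minimal Python
sketch of the arithmetic (only the two token counts come from the report; the
30-day month used for the per-second figure is an assumption):

    # Back-of-the-envelope check on the reported Google inference volumes.
    # Only the two token counts come from the article; the 30-day month is assumed.
    previous_tokens_per_month = 980e12   # 980 trillion
    current_tokens_per_month = 1.33e15   # 1.33 quadrillion

    growth = (current_tokens_per_month - previous_tokens_per_month) / previous_tokens_per_month
    avg_tokens_per_second = current_tokens_per_month / (30 * 24 * 3600)

    print(f"Growth over the period: {growth:.1%}")                     # ~35.7%
    print(f"Average tokens per second: {avg_tokens_per_second:,.0f}")  # ~513 million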
Aniszczyk added that cloud-native projects, especially Kubernetes, are
adapting to serve inference workloads at scale: "Kubernetes is obviously one
of the leading examples; as of the last release, the dynamic resource
allocation feature enables GPU and TPU hardware abstraction in a Kubernetes
context."
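
For context on that quote: dynamic resource allocation (DRA) lets a pod
request devices such as GPUs or TPUs through claims in the resource.k8s.io
API group, with the device described by a driver-provided device class rather
than a hard-coded node resource name. The Python sketch below only
illustrates the shape of such a request; the v1beta1 API version and the
gpu.example.com device class are assumptions that depend on the cluster's
Kubernetes release and installed DRA driver, not details from the article.

    # Illustrative sketch of a Kubernetes dynamic resource allocation (DRA) request.
    # Assumptions (not from the article): the resource.k8s.io/v1beta1 group version,
    # which varies by Kubernetes release, and a DRA driver exposing "gpu.example.com".
    import json

    resource_claim = {
        "apiVersion": "resource.k8s.io/v1beta1",
        "kind": "ResourceClaim",
        "metadata": {"name": "inference-gpu"},
        "spec": {
            "devices": {
                "requests": [
                    # Ask for one device from the driver's device class.
                    {"name": "gpu", "deviceClassName": "gpu.example.com"},
                ],
            },
        },
    }

    pod = {
        "apiVersion": "v1",
        "kind": "Pod",
        "metadata": {"name": "inference-server"},
        "spec": {
            # The pod references the claim; the scheduler allocates a matching device.
            "resourceClaims": [{"name": "gpu", "resourceClaimName": "inference-gpu"}],
            "containers": [
                {
                    "name": "model-server",
                    "image": "example.com/inference:latest",  # hypothetical image
                    "resources": {"claims": [{"name": "gpu"}]},
                },
            ],
        },
    }

    # Emit JSON manifests; kubectl apply accepts JSON as well as YAML.
    print(json.dumps(resource_claim, indent=2))
    print(json.dumps(pod, indent=2))

The indirection through a device class is essentially the hardware
abstraction the quote refers to: the pod asks for a class of device, and the
scheduler and DRA driver pick the concrete GPU or TPU.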
To better meet the demand, the CNCF announced the Certified Kubernetes AI
Conformance Program, which aims to make AI workloads as portable and reliable
as traditional cloud-native applications. "As AI moves into production, teams
need a consistent infrastructure they can rely on," Aniszczyk stated during
his keynote. "This initiative will create shared guardrails to ensure AI
workloads behave predictably across environments. It builds on the same
community-driven standards process we've used with Kubernetes to help bring
consistency as AI adoption scales."

What all this effort means for business is that AI inference spending on
cloud-native infrastructure and services will reach into the hundreds of
billions within the next 18 months. That investment will come, CNCF leaders
predict, because enterprises will race to stand up reliable, cost-effective
AI services.

Read more of this story at Slashdot.

---
VRSS v2.1.180528
