
AI Workloads Are Reshaping Enterprise Networks

23.04.2025 08:45 AM
As businesses increasingly deploy AI in production, the rapid expansion of the technology — both at the edge and within data centers — is placing unprecedented demands on bandwidth, latency, and network architecture. Traditional networks were not built to support this scale and complexity.

According to new data from Omdia, AI-related traffic, generated both by newly built AI applications and by the AI features of enhanced applications, accounted for 39 exabytes of network traffic in 2024. The non-AI traffic from those AI-enhanced applications added another 131 exabytes, while conventional application traffic reached 308 exabytes. The figures were shared by Brian Washburn, Omdia's research director.

Looking ahead, Omdia projects that AI traffic will surge to 79 exabytes in 2025, more than doubling in a single year. Washburn anticipates this growth will continue to far outpace that of traditional traffic, eventually surpassing it by 2031.
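As a rough sanity check on that trajectory, the short sketch below extrapolates only the figures cited above. It assumes AI traffic keeps growing at the 2024-to-2025 rate and, purely for illustration, that conventional traffic grows about 10 percent a year (a rate not given by Omdia).

# Back-of-envelope check on the Omdia trajectory, using only the figures
# cited in this article. The 10% annual growth assumed for conventional
# traffic is illustrative, not an Omdia number.

ai_eb = 39.0              # AI-related traffic in 2024 (exabytes)
ai_growth = 79.0 / 39.0   # ~2x per year, implied by the 2025 projection
conv_eb = 308.0           # conventional application traffic in 2024 (exabytes)
conv_growth = 1.10        # hypothetical 10% annual growth, for comparison

year = 2024
while ai_eb < conv_eb:
    year += 1
    ai_eb *= ai_growth
    conv_eb *= conv_growth

print(f"AI traffic overtakes conventional traffic around {year}")
# With these crude assumptions the crossover lands around 2028; Omdia's
# own 2031 estimate presumably models a tapering growth rate.

Even under conservative assumptions for AI growth, the crossover arrives within a handful of years, which is the point Washburn is making.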

New AI-driven traffic includes use cases like visual processing, surveillance, next-gen games and media, and generative AI content. AI-enhanced traffic encompasses smart transcription, content summarization, code review, intelligent analytics, natural language querying, and content moderation. Notably, these statistics exclude fully private networks such as those used by hyperscalers, on-premise deployments, and enterprise campuses.

A recent report from Zscaler highlights the growing AI footprint in enterprise environments, noting a staggering 3,464% year-over-year increase in AI-related activity. Over the final 11 months of 2024 alone, 3,624 terabytes of data were exchanged with more than 800 AI tools, including ChatGPT.

Salesforce is one example of a company already feeling the shift. With the integration of both generative and agent-based AI into its cloud CRM platform, the demand for real-time inference and model training has skyrocketed.

“We’re seeing a major uptick in data processing and movement, especially as we work with larger datasets,” says Paul Constantinides, executive vice president of engineering at Salesforce. “This calls for higher bandwidth, lower latency, and a more resilient networking infrastructure.”

As AI adoption accelerates, enterprises will need to reassess the way they approach data center networking, cloud integration, edge computing, and network security.

AI Networking Demands Inside the Data Center

Within data centers, AI introduces unique networking requirements, especially during model training when heavy communication between GPUs and servers occurs.
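To give a concrete sense of where that traffic comes from, here is a minimal sketch of the collective operation at the heart of distributed training, gradient all-reduce. It uses PyTorch's distributed package with the CPU-friendly "gloo" backend and a single process so it runs anywhere; real clusters use NCCL across many GPU servers, and that synchronization traffic is precisely what the "AI factories" described below must absorb.

# Minimal sketch of the gradient synchronization behind training's
# east-west traffic. Single process, CPU-only "gloo" backend for
# portability; production jobs run this across many GPU nodes.

import os
import torch
import torch.distributed as dist

def main() -> None:
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group(backend="gloo", rank=0, world_size=1)

    # Stand-in for the gradient tensor each worker produces in its backward pass.
    grad = torch.randn(4)

    # all_reduce sums gradients across every worker. With N workers and P model
    # parameters, each training step pushes parameter-sized messages between
    # all N nodes over the network fabric, step after step.
    dist.all_reduce(grad, op=dist.ReduceOp.SUM)
    print("synchronized gradient:", grad)

    dist.destroy_process_group()

if __name__ == "__main__":
    main()

Because this exchange repeats on every training step, the network fabric between accelerators can become the bottleneck long before compute does.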

“The need for vast compute resources — particularly CPUs and GPUs — is leading to the emergence of dedicated AI zones within enterprise data centers,” explains Lori MacVittie, a distinguished engineer and chief evangelist in the CTO’s office at F5 Networks. “These so-called ‘AI factories’ require advanced traffic steering, smarter networking solutions, enhanced security, and the ability to process significantly larger data volumes.”
