Akamai Unveils Cloud Inference: Boosting AI Performance with Edge Computing and Cost Efficiency
March 27, 2025
Akamai has launched Akamai Cloud Inference, a service that improves AI application performance by running inference closer to end users rather than in distant, centralized data centers.
The service runs on Akamai Cloud, which the company describes as the world's most distributed platform, positioning it to address the limitations of centralized cloud models for inference workloads.
Akamai says the platform delivers a threefold increase in throughput, up to 2.5x lower latency (roughly a 60% reduction), and 86% lower costs compared to traditional hyperscale infrastructure.
Adam Karon, COO of Akamai, emphasized the need to execute actionable AI tasks at the edge, an area where the company argues its distributed infrastructure gives it a competitive advantage over other cloud providers.
The growing need for distributed cloud solutions is underscored by Gartner's prediction that, by 2025, 75% of enterprise-generated data will be created and processed outside centralized data centers and clouds, driving demand for real-time insights closer to where data originates.
Akamai Cloud Inference supports a variety of use cases, including in-car voice assistance, AI-driven crop management, and automated customer feedback analysis, showcasing its versatility across different industries.
The platform offers a comprehensive range of computing options, including CPUs, GPUs, and specialized ASIC VPUs, and integrates with Nvidia's AI Enterprise ecosystem to optimize AI inference performance.
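NVIDIA AI Enterprise includes serving components such as Triton Inference Server, a common way to expose GPU-backed models over the network. The following is a minimal sketch assuming a hypothetical Triton endpoint; the URL, model name, and tensor names are illustrative and not taken from Akamai's announcement.

```python
# Minimal sketch: querying a (hypothetical) Triton Inference Server endpoint
# running on GPU-backed edge infrastructure. The URL, model name, and tensor
# names are illustrative assumptions, not details from Akamai's announcement.
import numpy as np
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="inference.example.com:8000")

# One batch of FP32 features, shaped to whatever the deployed model expects.
features = np.random.rand(1, 128).astype(np.float32)

infer_input = httpclient.InferInput("input__0", list(features.shape), "FP32")
infer_input.set_data_from_numpy(features)
requested_output = httpclient.InferRequestedOutput("output__0")

response = client.infer(
    model_name="demo_model",
    inputs=[infer_input],
    outputs=[requested_output],
)
print(response.as_numpy("output__0"))
```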
The platform's WebAssembly (Wasm) capabilities let developers execute inference directly from serverless applications at the edge, simplifying the development of latency-sensitive AI features.
With over 4,200 points of presence worldwide, Akamai Cloud ensures high throughput for data-intensive workloads and low-latency AI inference on a global scale.
AI workloads are containerized and orchestrated with Kubernetes, enabling demand-based autoscaling and improved resilience across hybrid and multicloud environments.
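As a concrete illustration of that autoscaling model, the sketch below uses the official Kubernetes Python client to attach a HorizontalPodAutoscaler to a hypothetical inference Deployment; the deployment name, namespace, replica bounds, and CPU target are assumptions for illustration only.

```python
# Minimal sketch: demand-based autoscaling for a containerized inference
# service, using the official Kubernetes Python client. The deployment name,
# namespace, replica bounds, and CPU target are illustrative assumptions.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod

hpa = client.V2HorizontalPodAutoscaler(
    api_version="autoscaling/v2",
    kind="HorizontalPodAutoscaler",
    metadata=client.V1ObjectMeta(name="inference-autoscaler"),
    spec=client.V2HorizontalPodAutoscalerSpec(
        scale_target_ref=client.V2CrossVersionObjectReference(
            api_version="apps/v1", kind="Deployment", name="inference-server"
        ),
        min_replicas=2,
        max_replicas=20,
        metrics=[
            client.V2MetricSpec(
                type="Resource",
                resource=client.V2ResourceMetricSource(
                    name="cpu",
                    target=client.V2MetricTarget(
                        type="Utilization", average_utilization=70
                    ),
                ),
            )
        ],
    ),
)

client.AutoscalingV2Api().create_namespaced_horizontal_pod_autoscaler(
    namespace="default", body=hpa
)
```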
Akamai has partnered with VAST Data to enhance data management capabilities, enabling real-time access to critical data for AI applications and integration with leading vector database vendors.
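Vector databases typically serve retrieval-augmented workflows, where relevant documents are fetched by embedding similarity and supplied to the model at inference time. The sketch below shows that retrieval step conceptually in plain NumPy, with placeholder documents and random embeddings; a real deployment would delegate the similarity search to one of the vector database vendors referenced above.

```python
# Conceptual sketch of retrieval-augmented inference: embed a query, find the
# most similar stored vectors, and build a grounded prompt. The documents and
# embeddings below are illustrative placeholders; a real deployment would use
# a vector database for the similarity search.
import numpy as np

# Pretend corpus: each document has a precomputed embedding (here random).
documents = ["sensor manual excerpt", "crop disease guide", "support FAQ entry"]
doc_embeddings = np.random.rand(len(documents), 384).astype(np.float32)

def top_k(query_embedding: np.ndarray, k: int = 2) -> list[str]:
    """Return the k documents whose embeddings are most cosine-similar."""
    norms = np.linalg.norm(doc_embeddings, axis=1) * np.linalg.norm(query_embedding)
    scores = doc_embeddings @ query_embedding / norms
    return [documents[i] for i in np.argsort(scores)[::-1][:k]]

query_embedding = np.random.rand(384).astype(np.float32)  # stand-in for a real embedding
context = "\n".join(top_k(query_embedding))
prompt = f"Answer using this context:\n{context}\n\nQuestion: ..."
print(prompt)
```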
As AI adoption matures, enterprises are increasingly shifting their focus from large, general-purpose language models (LLMs) to more practical, industry-specific AI solutions that optimize inference performance and improve ROI.
Summary based on 4 sources
Sources

Yahoo Finance • Mar 27, 2025
Akamai Sharpens Its AI Edge with Launch of Akamai Cloud Inference
Analytics Insight • Mar 28, 2025
Akamai Sharpens Its AI Edge with Launch of Akamai Cloud Inference
SD Times • Mar 27, 2025
Akamai launches new platform for AI inference at the edge
Curated - BLOX Digital Content Exchange • Mar 27, 2025
Akamai Sharpens Its AI Edge with Launch of Akamai Cloud Inference