India, March 9 -- Akamai is expanding its distributed cloud infrastructure with the deployment of thousands of NVIDIA Blackwell GPUs, creating what the company describes as a unified AI platform designed to support research, model optimisation and large-scale inference workloads.

The initiative forms the foundation of an Akamai AI platform built on NVIDIA Blackwell GPUs, which aims to route AI inference tasks across compute resources distributed throughout Akamai's global network. The approach is intended to reduce latency and address the data transfer challenges that arise when AI workloads rely solely on centralised datacentres.

According to the company, the architecture allows AI workloads to be processed closer to users and connected devices, e...