India, March 18 -- Akamai Technologies has announced what it describes as the first global-scale implementation of the NVIDIA AI Grid reference design, marking a shift in how artificial intelligence infrastructure is deployed and managed. By integrating NVIDIA's AI systems into its own network and applying workload orchestration across its distributed infrastructure, Akamai aims to move beyond centralised AI processing towards a more distributed model for inference.
The development builds on the company's Inference Cloud platform, introduced last year. As part of the rollout, Akamai is deploying thousands of NVIDIA RTX PRO 6000 Blackwell Server Edition GPUs. The platform is designed to support emerging AI applications, including agent-ba...