
Cloudflare doubles down on inference workloads

December 5, 2023

Via: CIO Dive

Cloudflare joined the industrywide race to deploy AI-optimized graphics processing units in the cloud last month.

“Right now, there are members of the Cloudflare team traveling the world with suitcases full of GPUs, installing them throughout our network,” CEO Matthew Prince said during the company’s Q3 2023 earnings call in November.

The cloud service provider had inference-optimized GPUs running in 75 cities and was on schedule to have the chips installed in 100 cities by the end of the year. Prince aims to make Cloudflare "the most widely distributed cloud-based AI inference platform," he told CIO Dive in an interview Thursday.
