
Tech · Mar 5 · 3 min read
Cloudflare Launches Edge AI Inference Platform With Sub-10ms Latency Worldwide
Cloudflare is pushing AI inference to the network edge with a new platform that runs models at 300+ global locations, targeting sub-10ms response times.