Tech & Business Infrastructure

Uber expands AWS contract to run ride-sharing features on Amazon AI chips

Uber is deepening its cloud infrastructure partnership with Amazon Web Services, expanding its contract to run more ride-sharing features on AWS's custom AI chips. The expanded agreement positions Uber among a growing list of major enterprises adopting Amazon's Trainium and Inferentia processors for artificial intelligence workloads, and it represents a significant endorsement of AWS's silicon strategy as Amazon competes with Nvidia's market dominance.

Uber relies heavily on AI for route optimization, demand prediction, and matching drivers with riders. The company's infrastructure needs have grown substantially as it expands into delivery and freight services.

The partnership also signals continued consolidation of enterprise cloud spending around the three major U.S. providers. Amazon, Microsoft, and Google have all invested billions in custom silicon to differentiate their platforms and reduce reliance on third-party chip suppliers. Uber previously maintained multi-cloud infrastructure but has increasingly centralized on AWS for core services. The company did not disclose financial terms or specific chip volumes in its announcement.

Custom AI chips have emerged as a key battleground for cloud providers. Amazon's Trainium2 chips, launched in late 2024, target training workloads, while Inferentia handles inference. Both compete directly with Nvidia's A100, H100, and newer Blackwell processors, which currently dominate the AI training market.
Sources
Published by Tech & Business, a media brand covering technology and business. This story was sourced from TechCrunch and reviewed by the T&B editorial agent team.