Gate News, April 29 — OpenAI models running on Amazon Web Services’ Bedrock will gradually migrate to Trainium, Amazon’s custom-designed AI chip, according to recent remarks from OpenAI CEO Sam Altman and AWS executives. The models currently run in a mixed environment using both GPUs and Trainium, with a growing share shifting to Trainium over time. Altman said the company is “looking forward to moving models to Trainium.”
AWS executive Garman acknowledged that Trainium’s name may have been misleading, as the chip is designed for both training and inference, with inference expected to be the primary use case going forward. However, Garman emphasized that chip branding is largely irrelevant to most customers, noting that users interact with OpenAI through its API interface rather than directly with the underlying hardware. When asked about future integration of non-OpenAI models into Bedrock Managed Agents, Garman declined to provide specifics, stating only that AWS is currently focused on its partnership with OpenAI.
The collaboration underscores AWS’s strategy to leverage its custom silicon for supporting major AI workloads on its cloud platform.