Amazon plans to deploy AI models on massive Cerebras chips.

Amazon and Cerebras Join Forces to Accelerate Large Language Models

Amazon Web Services (AWS) announced that by mid-2026 it will begin using chips from startup Cerebras Systems Inc. alongside its own Trainium processors. According to the company, this will create “optimal conditions” for launching and maintaining large language models (LLMs). Financial details of the deal have not yet been disclosed.

What Will Happen
* AWS’s Trainium 3 chips will handle user requests—“understanding” their meaning.
* Then Cerebras’ Wafer‑Scale Engine (WSE) chips will generate the responses.

Thus, two specialized accelerators work in tandem to provide inference calculations for LLMs.
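The tandem described above can be sketched as a "disaggregated" inference pipeline: one backend processes the prompt (prefill), another generates the response token by token (decode). A minimal illustration, assuming everything here is hypothetical toy code and not actual AWS or Cerebras software:

```python
# Toy sketch of two-stage LLM inference: a prefill backend "understands"
# the request, a separate decode backend generates the answer.
# All names and logic are illustrative assumptions, not real APIs.

from dataclasses import dataclass

@dataclass
class PrefillResult:
    """State handed from stage 1 to stage 2 (stands in for a KV cache)."""
    prompt_tokens: list

def prefill(prompt: str) -> PrefillResult:
    """Stage 1 (the Trainium role in the article): process the request."""
    return PrefillResult(prompt_tokens=prompt.split())

def decode(state: PrefillResult, max_tokens: int) -> list:
    """Stage 2 (the Cerebras WSE role): emit tokens one by one.
    This is where per-token latency matters most.
    Toy generation: echo the prompt tokens back, capped at max_tokens."""
    return state.prompt_tokens[:max_tokens]

def serve(prompt: str, max_tokens: int = 8) -> str:
    """Route each stage to its specialized backend and join the output."""
    state = prefill(prompt)             # runs on the "understanding" accelerator
    tokens = decode(state, max_tokens)  # runs on the "generation" accelerator
    return " ".join(tokens)

print(serve("write code step by step"))  # → write code step by step
```

The design point is that the hand-off (`PrefillResult`) adds coordination overhead, which is exactly the slowdown the AWS quote below refers to; the bet is that faster decoding more than compensates.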

> “Interaction between different components usually slows down the process,” notes Nafea Bshara, AWS vice president. “But we aim to gain an advantage by using chips that handle inference faster.”

The benefit is especially noticeable where response speed matters: for example, step‑by‑step code writing or real‑time text generation.

Why It Matters
* Amazon is one of the largest cloud providers and one of Nvidia's biggest GPU customers. The company is now developing its own AI chips to improve data-center efficiency and offer clients unique services.
* For Cerebras, the partnership with AWS is its first major contract with a data-center giant, boosting brand recognition among potential customers. It also matters ahead of the startup's planned IPO.

Takeaway
AWS and Cerebras are jointly building new infrastructure for large language models: Trainium 3 chips process requests while WSE chips generate the answers. Although a standalone Trainium service might be cheaper, the combined solution promises significant acceleration where "time is money." The move strengthens Amazon's position relative to Nvidia, on which it still heavily depends, while advancing the company's own AI chips toward higher efficiency.
