Cerebras Unleashes WSE-3 AI Chip: 4 Trillion Transistors Powering Next-Gen Supercomputing
March 13, 2024
Cerebras Systems introduced the Wafer Scale Engine 3 (WSE-3) AI chip, doubling the performance of the prior WSE-2 model.
The WSE-3 features an unprecedented 4 trillion transistors and 900,000 AI cores, along with 44 GB of on-chip SRAM.
Cerebras' CS-3 supercomputer, built on the WSE-3, supports training of AI models with up to 24 trillion parameters and can be clustered in configurations of up to 2,048 systems for extensive scaling.
Enhancements in the Cerebras Software Framework include PyTorch 2.0 support and dynamic sparsity, which can speed up AI training by up to eightfold; a generic sketch of the weight-sparsity idea follows this summary.
Strategic partnerships with organizations such as Argonne National Laboratory and Mayo Clinic underscore industry confidence, as does a significant backlog of orders for the CS-3.
The company's advancements aim to revolutionize AI chip technology with improvements in performance, scalability, and power efficiency, targeting applications in healthcare and beyond.
CEO Andrew Feldman's vision emphasizes reducing the cost of generative AI model inference and delivering a comprehensive, high-performance AI platform.
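To illustrate what weight sparsity means in practice, here is a minimal, generic PyTorch sketch: a mask zeroes a fraction of a layer's weights during training, so hardware that skips zero operands (as Cerebras claims its chips do) can avoid those multiply-accumulates. This is not Cerebras's software stack or API; the layer sizes, sparsity level, and training loop are illustrative assumptions only.

```python
# Generic illustration of training with a weight-sparsity mask (plain PyTorch).
# All values here (layer size, 50% sparsity, SGD settings) are assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)

model = nn.Linear(512, 512)
sparsity = 0.5  # fraction of weights forced to zero (assumed)

# Fixed random mask for simplicity; "dynamic" sparsity schemes instead update
# the mask as training progresses rather than keeping it static.
mask = (torch.rand_like(model.weight) > sparsity).float()

optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
x = torch.randn(64, 512)
target = torch.randn(64, 512)

for step in range(10):
    # Re-apply the mask so pruned weights stay at zero between optimizer steps.
    with torch.no_grad():
        model.weight.mul_(mask)
    loss = nn.functional.mse_loss(model(x), target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(f"final loss: {loss.item():.4f}, "
      f"zero weights: {(model.weight == 0).float().mean().item():.2%}")
```

On commodity GPUs this masking mainly reduces model quality loss from pruning rather than wall-clock time; the reported speedups depend on hardware that natively skips zero-valued operands.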
Summary based on 9 sources
Sources

Forbes • Mar 13, 2024
Cerebras Partners With Qualcomm, Launches 3rd-Gen Wafer-Scale AI
IEEE Spectrum • Mar 13, 2024
Cerebras Unveils Its Next Waferscale AI Chip
ZDNET • Mar 13, 2024
AI startup Cerebras unveils the WSE-3, the largest chip yet for generative AI