Oregon State's AI Chip Slashes Energy Use for Large Language Models by 50%

May 9, 2025
  • High-speed data transmission often corrupts data in transit, and correcting those errors with conventional equalizers consumes substantial energy, underscoring the need for more efficient approaches.

  • Researchers at Oregon State University have developed a groundbreaking AI chip that reduces energy consumption for large language models (LLMs) like Gemini and GPT-4 by 50% compared to traditional designs.

  • Doctoral student Ramin Javadi is now working on a more advanced version of the chip to achieve even greater energy efficiency, reflecting an ongoing commitment to sustainable AI development.

  • The integration of energy-efficient designs and smaller models presents a promising pathway for sustainable AI development, balancing performance, cost, and environmental impact.

  • The technology was unveiled by doctoral student Ramin Javadi and Associate Professor Tejasvi Anand at the IEEE Custom Integrated Circuits Conference in Boston, where Javadi also received the Best Student Paper Award.

  • The chip applies AI principles to make signal processing more efficient and recovers corrupted data on-chip, significantly reducing power consumption during data transmission.

  • Associate Professor Tejasvi Anand emphasized that while data transmission rates are increasing, the energy required to transmit data has not decreased at the same pace, leading to high power usage in data centers.

  • The new chip addresses this by using an on-chip classifier, trained with AI, to recognize and correct data-transmission errors at lower energy cost; an illustrative sketch of that idea follows this list.

  • The project received support from notable organizations including the Defense Advanced Research Projects Agency, the Semiconductor Research Corporation, and the Center for Ubiquitous Connectivity.

  • As generative AI technologies like ChatGPT continue to gain popularity, sustainable AI solutions become increasingly critical to mitigating their environmental impact.

  • Smaller language models (SLMs), typically with 10 to 15 billion parameters, are emerging as cost-effective and secure alternatives to LLMs, offering improved privacy and lower resource demands.

  • The urgent need for energy-efficient AI solutions is underscored by the increasing energy demands of applications like Gemini and GPT-4, which have significant environmental implications.
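
For readers who want a concrete picture of what "an on-chip classifier that recognizes and corrects transmission errors" can mean, the sketch below is a minimal software analogy, not the published design. The sources do not describe the chip's architecture, channel, or training procedure, so everything here (the toy PAM-4 link, the assumed ISI channel taps and noise level, and the use of scikit-learn's logistic regression as the classifier) is an illustrative assumption. It only shows a classifier learning to recover transmitted symbols from noisy received samples, the job a conventional equalizer plus slicer would otherwise do.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy wireline link: PAM-4 symbols distorted by inter-symbol interference
# (ISI) and additive Gaussian noise. The channel taps and noise level are
# arbitrary assumptions for illustration, not values from the paper.
levels = np.array([-3.0, -1.0, 1.0, 3.0])
n = 20000
tx = rng.integers(0, 4, size=n)               # transmitted symbol indices
channel = np.array([0.3, 1.0, 0.3])           # assumed ISI channel taps
rx = np.convolve(levels[tx], channel, mode="same")
rx += rng.normal(scale=0.4, size=n)           # additive channel noise

# Each decision uses a short window of received samples around the symbol
# of interest (the same information a conventional equalizer would see).
half = 2
pad = np.pad(rx, half)
X = np.stack([pad[i:i + 2 * half + 1] for i in range(n)])

# Train a small classifier on half of the data to map each noisy window
# back to the transmitted symbol; it stands in for the chip's on-chip
# classifier in this software analogy.
m = n // 2
clf = LogisticRegression(max_iter=1000).fit(X[:m], tx[:m])

# Evaluate on held-out symbols against a naive slicer that just picks the
# nearest nominal level for each raw sample (no equalization, no learning).
pred_clf = clf.predict(X[m:])
pred_naive = np.abs(rx[m:, None] - levels[None, :]).argmin(axis=1)
print("naive slicer symbol-error rate:      ", np.mean(pred_naive != tx[m:]))
print("learned classifier symbol-error rate:", np.mean(pred_clf != tx[m:]))
```

On the real chip, this kind of decision is made in dedicated hardware during transmission, which is where the reported energy savings come from; a software model like this only illustrates the underlying classification problem.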

Summary based on 3 sources



Sources


New Chip Cuts Energy Use For AI Models By 50%

The Pinnacle Gazette • May 9, 2025

OSU’s New AI Chip Shrinks Energy Consumption by Half

The Corvallis Advocate • May 9, 2025
