Solving Big AI’s Big Energy Problem
Posted by: Brij Bhushan
Tuesday, 16 March 2021

In AI, the more ground-breaking a deep learning model is, the more massive it tends to be. This summer’s most buzzed-about model for natural language processing, GPT-3, is a perfect example. To reach the accuracy and speed needed to write like a human, the model required 175 billion parameters, 350 GB of memory and $12 million to train (think of training as the “learning” phase). But beyond cost alone, big AI models like this have a big energy problem. UMass Amherst researchers found that the computing power needed to train a large AI model can produce over 600,000 pounds of carbon dioxide…
This story continues at The Next Web
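As a quick back-of-the-envelope check on the 350 GB figure quoted above: if each of GPT-3’s 175 billion parameters is stored as a 2-byte (16-bit) value — an assumption here, since the article doesn’t specify the precision — the arithmetic lands on exactly 350 GB. A minimal sketch:

```python
# Back-of-the-envelope check of GPT-3's quoted memory footprint.
# Assumes each parameter is stored as a 16-bit (2-byte) float --
# the article does not state the precision, so fp16 is an assumption.
PARAMS = 175_000_000_000   # 175 billion parameters
BYTES_PER_PARAM = 2        # fp16 assumption

total_gb = PARAMS * BYTES_PER_PARAM / 1e9
print(f"{total_gb:.0f} GB")  # prints "350 GB", matching the quoted figure
```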