A change in the way we construct neural networks could dramatically reduce energy consumption.
Researchers have found that replacing silicon with magnetic nanowires in neural training networks (hardware or software systems that function like human brains) could slash energy consumption by a factor of 20 to 30. That would be a huge saving: AI data processing is expected to account for a mind-boggling 20% of global electricity consumption by 2025, which, at a rough calculation, would equal about 5.5% of all CO2 emissions. (oilprice.com)
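The jump from 20% of electricity to 5.5% of CO2 presumably reflects the power sector's share of global emissions. A back-of-envelope sketch, where the ~27.5% electricity-to-emissions share is our assumption rather than a figure from the article:

```python
# Back-of-envelope check of the figures above. The electricity sector's
# share of global CO2 emissions (~27.5%) is an assumed value, not taken
# from the article itself.
ai_share_of_electricity = 0.20    # AI's projected share of global electricity
power_share_of_co2 = 0.275        # assumed: power generation's share of global CO2

ai_share_of_co2 = ai_share_of_electricity * power_share_of_co2
print(f"AI's implied share of global CO2: {ai_share_of_co2:.1%}")  # ~5.5%

# What a 20-30x efficiency gain would leave of that footprint
for factor in (20, 30):
    print(f"{factor}x reduction -> {ai_share_of_co2 / factor:.2%} of global CO2")
```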
Energy implications of AI
AI systems and silicon neural training networks can consume unimaginable amounts of energy. Researchers at the University of Massachusetts Amherst, in a paper titled “Energy and Policy Considerations for Deep Learning in NLP,” found that training a single AI model could give off over 626,000 pounds of CO2 equivalent. That is roughly five times the lifetime emissions of an average American car, its manufacture included.
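That comparison is easy to sanity-check. A minimal calculation, assuming the roughly 126,000-pound car-lifetime figure (manufacture and fuel included) that the paper uses as its baseline:

```python
# Verify the "five times a car's lifetime emissions" comparison.
# The 126,000 lb car-lifetime figure (manufacture and fuel) is the
# approximate baseline cited in the paper; treat both numbers as rough.
training_emissions_lbs = 626_000
car_lifetime_lbs = 126_000

ratio = training_emissions_lbs / car_lifetime_lbs
print(f"Training emits ~{ratio:.1f}x a car's lifetime CO2e")  # ~5.0x
```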
An indication of what’s in store can be seen in the sizzling trajectory of the AI software market, which is expected to grow by 154% year-on-year and to maintain at least double-digit growth over the next five years.
As AI systems and their applications multiply, so does their energy consumption, with knock-on consequences for global warming and climate change.
Rethinking neural networks
Researchers at the Cockrell School of Engineering at the University of Texas at Austin have discovered that they can induce “lateral inhibition” in neural networks by carefully spacing magnetic nanowires that act as artificial neurons.
Lateral inhibition mimics the behavior of human neurons, in which a strongly firing neuron suppresses the activity of its neighbors. Compared to silicon, magnetic components can achieve this while dramatically reducing space, cost and energy requirements.
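In software terms, lateral inhibition amounts to each neuron's output being dampened by its neighbors' activity, so the strongest responses stand out. A minimal sketch of the idea; the neighborhood radius and inhibition strength here are illustrative assumptions, not parameters from the UT Austin work:

```python
import numpy as np

def lateral_inhibition(activations: np.ndarray,
                       strength: float = 0.5,
                       radius: int = 1) -> np.ndarray:
    """Dampen each neuron's output by the activity of its neighbors.

    A software analogue of what spaced magnetic nanowires do physically:
    strongly firing neurons suppress nearby ones, sharpening the
    contrast between winners and losers without extra circuitry.
    """
    out = activations.astype(float).copy()
    for i in range(len(activations)):
        lo, hi = max(0, i - radius), min(len(activations), i + radius + 1)
        neighbor_sum = activations[lo:hi].sum() - activations[i]  # exclude self
        out[i] = max(0.0, activations[i] - strength * neighbor_sum)
    return out

# The weak responses sandwiched between strong ones are driven to zero.
print(lateral_inhibition(np.array([0.9, 0.4, 0.8, 0.1])))  # [0.7  0.   0.55 0.  ]
```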
If the Cockrell researchers succeed in commercializing magnetic nanowire technology for neural networks, it would mark a huge advance in lowering emissions from AI processing.