Artificial Intelligence: MIT CSAIL Researchers Find a Way to Reduce Emissions From AI
Though AI’s use cases abound, its sustainability issues raise concern.
Researchers at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) have found a more environmentally friendly way to train and operate AI models. By improving computational efficiency, they claim they can significantly reduce the carbon emissions associated with AI. (VentureBeat)
The emission implications from operating the vast banks of powerful computers that run AI applications are only now sinking in.
AI, energy, and emissions
MIT researchers, in a paper titled “Energy and Policy Considerations for Deep Learning in NLP,” found that training a single AI model could emit over 626,000 pounds of CO2 equivalent. That is roughly five times the lifetime emissions of an average American car, including its manufacture.
AI data processing is expected to account for a mind-boggling 20% of global electricity consumption by 2025. At a rough calculation, that would equal about 5.5% of all CO2 emissions.
“If rapid progress in AI is to continue, we need to reduce its environmental impact,” said IBM fellow and member of the MIT-IBM Watson AI Lab John Cohn, referring to the study. “The upside of developing methods to make AI models smaller and more efficient is that the models may also perform better.”
The energy-saving solution from MIT CSAIL
The MIT CSAIL researchers propose training a single large AI model that is made up of many pre-trained AI sub-modules, which can be deployed immediately across a wide range of platforms without retraining.
These sub-modules can also operate independently. The AI model automatically deploys the sub-module best suited to the target hardware.
The method significantly reduces the energy that would be consumed to separately train each neural network.
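The idea described above can be sketched in a few lines. This is a minimal, hypothetical illustration, not the researchers' actual implementation: it assumes sub-networks that share a slice of one parent layer's weights, and a deploy-time selector (here called pick_subnet, an invented name) that chooses the widest sub-network fitting a hardware cost budget.

```python
class OnceForAllLayer:
    """Illustrative layer whose smaller sub-layers reuse a prefix of the
    full weight matrix, so one trained model contains many sub-models."""

    def __init__(self, in_features, max_out_features):
        # One shared weight matrix; sub-networks slice its rows.
        self.weights = [[0.01 * (i + j) for j in range(in_features)]
                        for i in range(max_out_features)]

    def forward(self, x, out_features):
        # A sub-network uses only the first `out_features` output rows;
        # no retraining is needed to shrink the model.
        return [sum(w * xi for w, xi in zip(row, x))
                for row in self.weights[:out_features]]


def pick_subnet(candidate_widths, cost_per_unit, budget):
    """Deploy-time selection (hypothetical heuristic): choose the widest
    sub-network whose estimated cost fits the target hardware's budget."""
    feasible = [w for w in candidate_widths if w * cost_per_unit <= budget]
    return max(feasible) if feasible else min(candidate_widths)


layer = OnceForAllLayer(in_features=4, max_out_features=8)
width = pick_subnet([2, 4, 8], cost_per_unit=10, budget=45)
outputs = layer.forward([1.0, 2.0, 3.0, 4.0], width)
print(width, len(outputs))
```

Because every sub-network shares the parent's weights, only one training run is paid for, which is where the energy saving in the paragraph above comes from.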
Example: computer vision
The researchers’ system was tested on a computer vision application. Its training produced approximately 1/1,300 of the carbon emissions of conventional neural architecture search methods.
“Searching efficient neural network architectures has until now had a huge carbon footprint,” says Song Han, an assistant professor in the Department of Electrical Engineering and Computer Science. “But we reduced that footprint by orders of magnitude with these new methods.”
The process was also much more efficient. Although the computer vision model’s search space contained over 10 quintillion architectural settings, the MIT CSAIL approach proved far more efficient than individually training each sub-network, and the resulting models remained accurate when benchmarked on several mobile devices.
Hardware and software
The researchers implemented their project on Satori, an efficient computing cluster donated to MIT by IBM that is capable of performing 2 quadrillion calculations per second.
They used a recent AI advance called AutoML that eliminates manual network design.