Artificial Intelligence: An AI Bot Will Design the Next-Gen AI Chips
Google’s neural network will drastically reduce the time and effort spent on the placement step of chip design.
Chips designed specifically for artificial intelligence applications are very time-intensive to design and develop. Unfortunately, by the time these chips are ready for use, machine learning algorithms have moved on, becoming faster and more advanced. The result: the chips are no longer relevant by the time they arrive.
Cutting down the time taken during the design cycle could fix this lag. Google (NASDAQ: GOOGL) hit upon a solution: use AI for the design process itself. (IEEE SPECTRUM)
Virtuous AI cycle
In other words, AI would help design an AI chip, which would in turn help design even more advanced AI chips – a virtuous AI cycle. Google puts it thus:
“We believe that it is AI itself that will provide the means to shorten the chip design cycle, creating a symbiotic relationship between hardware and AI, with each fueling advances in the other.”
Training neural networks…differently
Google researchers identified a key bottleneck in faster AI chip design: placement. This involves strategically placing blocks of logic and memory (or clusters of those blocks, called macros) on the chip in such a way that the chip’s power efficiency and performance are maximized while its area is kept to a minimum.
There is a third trade-off in placement: the density of interconnects.
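To see what placement actually optimizes, here is a minimal Python sketch of how one candidate placement might be scored. Everything in it is an illustrative assumption: the Macro class, the cost terms, and the weights are not Google’s objective, and real flows also score routing congestion (the interconnect-density trade-off above), which is omitted here for brevity.

```python
# Hypothetical sketch of scoring a chip placement: shorter wires and a
# smaller footprint are both rewarded. Names and weights are illustrative.
from dataclasses import dataclass

@dataclass
class Macro:
    x: float   # lower-left corner of the block on the chip canvas
    y: float
    w: float   # block width
    h: float   # block height

def hpwl(nets: list[list[Macro]]) -> float:
    """Half-perimeter wirelength (HPWL), a standard proxy for power and timing."""
    total = 0.0
    for net in nets:
        xs = [m.x for m in net]
        ys = [m.y for m in net]
        total += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return total

def area(macros: list[Macro]) -> float:
    """Bounding-box area of the whole placement (smaller is better)."""
    x1 = min(m.x for m in macros)
    y1 = min(m.y for m in macros)
    x2 = max(m.x + m.w for m in macros)
    y2 = max(m.y + m.h for m in macros)
    return (x2 - x1) * (y2 - y1)

def placement_cost(macros, nets, w_wire=1.0, w_area=0.3):
    """Weighted trade-off between wirelength (power/performance) and area."""
    return w_wire * hpwl(nets) + w_area * area(macros)

# Example: two macros connected by a single net.
a, b = Macro(0, 0, 2, 2), Macro(5, 1, 2, 2)
print(placement_cost([a, b], [[a, b]]))
```

A placement tool searches over positions for every macro to drive a score like this down, which is what makes the problem so onerous at the scale of a real chip.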
Could they hand over the onerous placement process to AI?
Reinforcement learning
Azalia Mirhoseini, a senior research scientist, and Anna Goldie, a senior software engineer at Google, decided to train the neural network using reinforcement learning. This kind of machine learning differs from typical big-data-based deep learning. A reinforcement learning system learns by trial and error – by doing something. When it succeeds, it receives a reward, signaling that it did the right thing. It adjusts itself and moves on to the next task. Over time, it becomes more and more accurate.
The researchers trained the neural network to optimize the trade-off among power reduction, performance improvement, and area reduction.
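To make the trial-and-error loop concrete, here is a toy Python sketch of reinforcement learning applied to placement. It is purely illustrative: a tiny grid canvas, a single chain “net” whose negative wirelength stands in for the power/performance/area reward, and a simple policy table updated with the REINFORCE rule. Google’s actual system uses far richer neural networks and reward functions; none of these names or numbers come from it.

```python
# Toy REINFORCE agent that learns to place macros close together on a grid.
# Illustrative only; not Google's method or reward function.
import numpy as np

rng = np.random.default_rng(0)
GRID, N_MACROS = 4, 4                        # 4x4 canvas, 4 macros to place
logits = np.zeros((N_MACROS, GRID * GRID))   # one row of cell scores per step

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def episode():
    """Place macros one at a time; return chosen cells and log-prob gradients."""
    used, cells, grads = set(), [], []
    for step in range(N_MACROS):
        p = softmax(logits[step])
        p[list(used)] = 0.0                  # mask already-occupied cells
        p /= p.sum()                         # renormalize over free cells
        cell = int(rng.choice(GRID * GRID, p=p))
        onehot = np.zeros(GRID * GRID)
        onehot[cell] = 1.0
        grads.append(onehot - p)             # gradient of log pi(cell | step)
        used.add(cell)
        cells.append(cell)
    return cells, grads

def reward(cells):
    """Negative Manhattan wirelength of a chain net through the macros."""
    xy = [(c % GRID, c // GRID) for c in cells]
    return -sum(abs(a[0] - b[0]) + abs(a[1] - b[1]) for a, b in zip(xy, xy[1:]))

baseline = 0.0
for _ in range(2000):
    cells, grads = episode()
    r = reward(cells)
    baseline += 0.05 * (r - baseline)        # running baseline reduces variance
    for step, g in enumerate(grads):
        logits[step] += 0.1 * (r - baseline) * g   # REINFORCE update

cells, _ = episode()
print("sample placement:", cells, "reward:", reward(cells))
```

Placements with better rewards nudge the policy toward the choices that produced them, which is the trial-and-error feedback loop described above; a production system would swap the chain-net reward for a learned combination of power, performance, and area metrics.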
Now, having learned from enough chip designs, the neural network can generate a placement for a Google Tensor Processing Unit in less than 24 hours.