By now, AI’s addiction to energy is well known. Not only does it take an ungodly amount of energy to train and set up an AI model, but it also takes a tremendous amount of energy to run AI services. Take OpenAI’s GPT-4, which reportedly took over 50 GWh of energy to train, while ChatGPT burns through around 500,000 kWh every day servicing user queries. In total, its energy footprint is comparable to that of a medium-sized city. And as AI models grow more capable, their energy use is climbing steeply. The IEA predicts that data centres, driven largely by AI, will consume a staggering 1,000 TWh of electricity a year by 2026! That is roughly Japan’s annual electricity consumption, and it will only continue to grow from there.

Obviously, such energy usage isn’t sustainable: it will slow our net zero transition, incur huge carbon emissions, and could even strain power supplies and disrupt industry. Sam Altman, CEO of OpenAI, has, rather questionably, suggested that an energy breakthrough like nuclear fusion is needed to unlock the future of AI. But researchers from the University of Minnesota Twin Cities may have just saved the future of AI with a far more tenable solution.
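For a rough sense of scale, here is a back-of-envelope calculation in Python. The daily serving and training figures are the ones quoted above; the per-household consumption is an assumed round number for a typical home, so the result is purely illustrative.

```python
# Back-of-envelope: ChatGPT's reported operational energy use, annualised.
# The 500,000 kWh/day and ~50 GWh figures are the ones quoted above; the
# household figure (~10,000 kWh/year) is an assumed round number for a
# typical home's electricity use, for illustration only.

DAILY_USE_KWH = 500_000          # reported daily energy to serve queries
TRAINING_KWH = 50_000_000        # reported ~50 GWh to train GPT-4
HOUSEHOLD_KWH_PER_YEAR = 10_000  # assumed average household electricity use

annual_serving_kwh = DAILY_USE_KWH * 365
equivalent_households = (annual_serving_kwh + TRAINING_KWH) / HOUSEHOLD_KWH_PER_YEAR

print(f"Annual serving energy: {annual_serving_kwh / 1e6:.0f} GWh")
print(f"Roughly equivalent to {equivalent_households:,.0f} households' worth of electricity")
```

Under those assumptions, serving queries alone comes to roughly 180 GWh a year, which, with training included, is on the order of tens of thousands of homes' worth of electricity.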
This solution is a new way of computing known as Computational Random-Access Memory (CRAM), which could train and run AI up to 1,000 times more energy-efficiently than current systems!
How? Well, today’s AI hardware must rapidly and constantly shuttle data back and forth between memory and processors, both during training and when answering queries, and that data movement consumes a tremendous amount of energy. CRAM removes this bottleneck by performing computation directly within the memory array, so data is processed where it is stored rather than being transferred, dramatically reducing energy use.
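To see why moving data dominates the energy bill, here is a toy Python model. The per-operation figures are rough, widely cited ballpark values from the computer architecture literature (an off-chip memory access costs orders of magnitude more than the arithmetic itself); they are not measurements of the Minnesota CRAM hardware, so treat the output as illustrative only.

```python
# Toy model: energy cost of one multiply-accumulate (MAC), the core operation
# in neural networks, on a conventional processor versus computed in memory.
# Energy values are rough, illustrative ballpark figures (picojoules), NOT
# measurements of the University of Minnesota CRAM device.

DRAM_ACCESS_PJ = 640.0   # assumed energy to fetch one 32-bit operand off-chip
MAC_OP_PJ = 1.0          # assumed energy of the 32-bit arithmetic itself

def conventional_mac_energy() -> float:
    """Fetch two operands from memory, compute, write the result back."""
    return 3 * DRAM_ACCESS_PJ + MAC_OP_PJ

def in_memory_mac_energy(overhead_factor: float = 2.0) -> float:
    """Compute where the data already sits; pay only an assumed logic overhead."""
    return MAC_OP_PJ * overhead_factor

ratio = conventional_mac_energy() / in_memory_mac_energy()
print(f"Conventional MAC: {conventional_mac_energy():.0f} pJ")
print(f"In-memory MAC:    {in_memory_mac_energy():.0f} pJ")
print(f"Illustrative saving: ~{ratio:.0f}x")
```

Even with these crude assumptions, the saving works out to roughly three orders of magnitude, which is why a 1,000-fold improvement from eliminating data movement is at least plausible.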
The team that developed this technology has patented its work and is now looking to collaborate with semiconductor industry leaders to produce hardware designed specifically to advance AI.
So, how will this impact the AI world? If, hypothetically, the entire AI industry could adopt this technology by 2026, it would use roughly 1 TWh of energy per year, or about 2.5% of London’s annual electricity use, rather than the predicted 1,000 TWh. Not only would this make operating AI far cheaper and potentially make the technology decently profitable, but it would also go a long way towards solving the energy supply and carbon emission issues.
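A quick sanity check of that arithmetic in Python. The 1,000 TWh projection and the 1,000x efficiency gain come straight from the figures above; London’s annual electricity consumption is an assumed round number of roughly 40 TWh, so the percentage is indicative rather than exact.

```python
# Sanity-check the industry-wide claim: a 1,000x efficiency gain applied to
# the IEA's ~1,000 TWh projection. London's annual electricity use is assumed
# here to be roughly 40 TWh; it is an illustrative round number.

PROJECTED_TWH_2026 = 1_000     # IEA data-centre projection cited above
EFFICIENCY_GAIN = 1_000        # claimed CRAM improvement
LONDON_TWH_PER_YEAR = 40       # assumed annual electricity consumption

with_cram_twh = PROJECTED_TWH_2026 / EFFICIENCY_GAIN
share_of_london = with_cram_twh / LONDON_TWH_PER_YEAR

print(f"With CRAM: ~{with_cram_twh:.0f} TWh/year")
print(f"That is about {share_of_london:.1%} of London's annual electricity use")
```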
So, has this saved the AI industry’s future? Well, not entirely.
Firstly, we don’t yet know whether CRAM can be manufactured at scale or whether it will be commercially viable. But even if it can, it doesn’t solve all of AI’s problems. As I have written before, for AI to continue developing at its current rate, both its energy usage and its training datasets must keep growing exponentially. As such, these ultra-efficient computers would only delay the energy problem, not eliminate it. Nor does CRAM do anything about the training data problem: there is an upper ceiling on how much high-quality data is available to train these AIs, so their development still faces huge hurdles.
Nonetheless, this technology could make the future of AI considerably brighter. It isn’t a silver bullet, but it might be exactly what the AI industry needs to secure its growth in the near term.
Thanks for reading! Content like this doesn’t happen without your support. So, if you want to see more like this, don’t forget to Subscribe and help get the word out by hitting the share button below.
Sources: Techspot, Planet Earth & Beyond, BBC, Business Insider, Forbes