For the better part of a decade, artificial intelligence (AI) has been recognized by the general public for its astonishing performance on a wide range of tasks, to the point where it has become an integral part of most, if not all, fields of science and economics. Its capabilities span everything from data analysis to image recognition, stock trading, and game playing.
For a long time, the main focus in AI research has been to attain the best possible accuracy in any given case – in other words, high performance and utmost precision, at all costs. With large companies having practically no limitations on available resources, there has seemingly been no end to how accurate these models can be made. Perhaps surprisingly, this has also spurred research by smaller groups with vastly different resource constraints, which have taken it upon themselves to challenge the big companies on innovation and core AI ideas rather than on raw performance alone. As a result, the AI community as a whole has managed to significantly push the envelope of what is possible. So one question remains: what is stopping them from continuing to push the forefront of AI?
To put things into perspective, a May 2018 blog post by Dario Amodei and Danny Hernandez estimated that the amount of computation used in the largest deep learning research projects – deep learning being one of the most popular subfields of recent AI – increased by a factor of 300,000 from 2012 to 2018. This includes AI initiatives like AlphaGo Zero, a successor to the AlphaGo system that beat the human world champion in the 4,000-year-old board game Go. The problem with these kinds of projects is that the cost of computation and hardware has not fallen nearly fast enough over the last decade to compensate for the increase in AI training, causing a massive spike in the carbon footprint of the industry as a whole. A 2019 study from the College of Information and Computer Sciences showed that the CO2 emissions from training just one high-end natural language processing model could be up to five times as large as the average emissions of a car over its entire lifetime, fuel included.
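To get a feel for what a 300,000x increase means, here is a quick back-of-the-envelope calculation (my own arithmetic, not a figure from the blog post), which shows that this growth corresponds to roughly 18 doublings of compute over those six years:

```python
import math

# A 300,000x growth in compute corresponds to log2(300,000) doublings.
growth_factor = 300_000
doublings = math.log2(growth_factor)
print(f"{doublings:.1f} doublings")  # roughly 18.2

# Averaged over the 2012-2018 window, that is one doubling every few months.
months_per_doubling = 6 * 12 / doublings
print(f"one doubling every ~{months_per_doubling:.1f} months")
```

For comparison, Moore's law-style hardware improvement doubles roughly every two years, which is far too slow to absorb growth of this pace.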
In the summer of 2019, a group of researchers at the Allen Institute for AI took on the problem in an article appropriately called “Green AI”, in which they proposed a whole new approach to AI research. In the article, they make a compelling argument for research groups to reconsider the main focus of their work as a practical response to rising carbon emissions. They propose that instead of aiming solely for accuracy, researchers should also treat the efficiency of their models as a criterion for how well their AI performs. In other words, going forward, research should minimize the number of computations needed to reach a given accuracy. This way, they suggest, the industry will correct for its own overuse of computational power. They also suggest a so-called “price tag” that could accompany the results of proposed models, taking the resources spent by the research groups into account.
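To make the idea concrete, here is a minimal sketch of what such a “price tag” might look like in code. Everything in it is my own illustrative assumption – the function name, the FLOP counts, and the accuracy-per-gigaFLOP score are not from the article, which leaves the exact form of the price tag open:

```python
def price_tag(accuracy: float, total_flops: float) -> dict:
    """Bundle a model's result with its computational cost.

    accuracy:    fraction of correct predictions (0.0 to 1.0)
    total_flops: estimated floating-point operations used for training
    """
    return {
        "accuracy": accuracy,
        "total_flops": total_flops,
        # One possible efficiency score: accuracy per gigaFLOP of training.
        "accuracy_per_gflop": accuracy / (total_flops / 1e9),
    }

# Two hypothetical models: the second is slightly less accurate but far
# cheaper to train, so its efficiency score is orders of magnitude higher.
big = price_tag(accuracy=0.95, total_flops=5e15)
small = price_tag(accuracy=0.93, total_flops=2e13)
```

Reported this way, the cheaper model's near-identical accuracy becomes a selling point rather than a shortcoming, which is exactly the shift in incentives the article argues for.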
Evidently, these changes are not merely relevant to the big companies investing in large-scale operations but especially to the smaller research groups that do not have access to the same resources and computational power. Students, in particular, will benefit greatly from this change of focus, as their biggest bottlenecks are laptops with limited computational power and often-limited time frames. More efficient AI would mean more time for experimenting with the algorithms and a greater overall outcome from courses – and with the growing number of courses across all fields that include some kind of programming, machine learning, or AI, the importance of these kinds of initiatives is immense.
If you have come this far, I urge you to look up and explore these topics – AI and machine learning in general – as they will surely be a big part of your near future, if they are not already.