_Happy boy Posted January 19, 2021

A group of international AI researchers and data scientists has collaborated to design software capable of estimating the carbon footprint of computing operations. The open-source software package, called CodeCarbon, was designed by a consortium of AI and data-science companies. The hope is that the software will enable and incentivize programmers to make their code more efficient and reduce the amount of CO2 generated by the use of computing resources.

According to ITP, the new CodeCarbon software package was developed by a team of AI research groups led by the AI research institute Mila, along with Comet.ml, Haverford College in Pennsylvania, and GAMMA. Not only does the software estimate the amount of CO2 produced by the use of computing resources, but it also provides developers with advice for reducing their carbon footprint.

Training AI models can require a lot of energy. As explained by Ars Technica, researchers from the University of Massachusetts Amherst estimated the total cost of creating and training certain AI models, and the team found that training the natural-language network BERT once generated approximately as much carbon as a round-trip flight between San Francisco and New York. Meanwhile, training the model multiple times until it is optimized could generate as much CO2 as 315 passengers each taking that same flight.

Why exactly do AI models consume so much energy and generate so much CO2 as a byproduct? Part of the answer lies in how AI models are trained and optimized. To get even small improvements over existing state-of-the-art algorithms, AI researchers might train their model thousands of times over, making slight tweaks to the model every time until an optimal model architecture is discovered. AI models are also growing in size all the time, becoming more complex every year.
The most powerful machine learning models, like GPT-3, BERT, and VGG, have millions or even billions of parameters and are trained for weeks at a time, amounting to hundreds or thousands of hours of training time. GPT-2 had approximately 1.5 billion parameters within the network, whereas GPT-3 has around 175 billion weights. A single training run can end up producing hundreds of kilograms of CO2.

CodeCarbon has a tracking module that logs the amount of power used by cloud providers and data centers. The system then uses data pulled from publicly available sources to estimate the volume of CO2 generated, checking statistics from the electrical grid that the hardware is connected to. The tracker estimates the CO2 produced for every experiment using a particular AI model, storing the emissions data for both individual projects and the entire organization.

The founder of Mila, Yoshua Bengio, explained that while AI is an incredibly powerful tool that can tackle many problems, it often requires a substantial amount of computing power. Sylvain Duranton, Managing Director at the Boston Consulting Group, argued that computing and AI will continue to grow at exponential rates around the world. The idea is that CodeCarbon will help AI and computing companies rein in their carbon footprint as they continue to grow.

CodeCarbon will generate a dashboard that allows companies to easily see the amount of emissions generated by the training of their machine learning models. It will also express the emissions in metrics developers can easily understand, such as miles driven in a car, hours of TV watched, and the typical energy consumption of a US household.

The CodeCarbon developers expect that the software will not only encourage AI researchers to reduce their own carbon footprint, but also that it will encourage greater transparency regarding emissions overall. Developers will be able to quantify and report on the emissions generated by a range of different AI and computing experiments.
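The estimation principle described above can be sketched in a few lines: multiply the energy a job draws (in kWh) by the carbon intensity of the local grid (kg CO2 per kWh), then convert the result into a familiar unit such as miles driven. This is a minimal illustration of the idea, not CodeCarbon's actual code; the constants below are rough assumed figures, and real grid intensity varies widely by region and time of day.

```python
# Illustrative sketch of the emissions-estimation principle:
# CO2 (kg) ≈ energy used (kWh) × grid carbon intensity (kg CO2 / kWh).
# Both constants are assumptions for demonstration, not CodeCarbon values.

ASSUMED_GRID_INTENSITY = 0.4   # kg CO2 per kWh; varies by region and hour
ASSUMED_KG_CO2_PER_MILE = 0.4  # rough figure for an average passenger car

def estimate_emissions_kg(power_watts: float, hours: float,
                          grid_intensity: float = ASSUMED_GRID_INTENSITY) -> float:
    """Estimate kg of CO2 for a job drawing `power_watts` for `hours`."""
    energy_kwh = (power_watts / 1000.0) * hours
    return energy_kwh * grid_intensity

def as_miles_driven(kg_co2: float) -> float:
    """Express emissions as equivalent car miles, as the dashboard does."""
    return kg_co2 / ASSUMED_KG_CO2_PER_MILE

# Example: a single 300 W GPU training for 100 hours.
kg = estimate_emissions_kg(300, 100)   # 30 kWh * 0.4 = 12.0 kg CO2
print(f"{kg:.1f} kg CO2, roughly {as_miles_driven(kg):.0f} miles driven")
```

Scaling the same arithmetic to weeks of training across many accelerators is what pushes large models into the hundreds-of-kilograms range the article mentions.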
The team responsible for creating CodeCarbon hopes that other developers will take their open-source tool and enhance it with new features that will help AI engineers and researchers curb their environmental impact even further.