Prospects and Evolution of Artificial Intelligence in the backdrop of Climate Change

This article explores the prospects and evolution of artificial intelligence against the backdrop of climate change.

The emerging discussion on the impact of large and complex AI on energy consumption, CO2 emissions and overall climate risk has not yet captured the general awareness of AI investors and the user community. Energy consumption by compute-intensive AI applications has grown massively as the hunt for near-perfect accuracy of these models continues to intensify. A combination of soaring demand for AI applications, a shift from research to industrialisation, and a lack of reporting and monitoring of the energy efficiency of algorithms has intensified the challenge. Emerging demand for AutoML-type solutions to democratise such applications threatens to compound the problem to an altogether different scale. A dual approach – an industry baseline protocol for energy efficiency and a proportionate distribution of investment between research and industrialisation – is seen as key to the sustainable growth of Artificial Intelligence over the longer time horizon.

Swagatam Sen

Introduction

When posterity looks back at the nature of our times and civilisation through that unrelenting lens of retrospect, there is little doubt that two defining trends will dominate the overall narrative – the soaring growth of Artificial Intelligence and a heady plunge into climate catastrophe.

However, discourses around these topics often have only a limited and peripheral association in the public view. For instance, relative co-interest in AI in association with the environmental narrative has consistently remained below 5%, and has indeed stagnated below 2% over the last five years.

Even within that limited context of mutual influence, the prevalent theme has been how AI can help tackle the ever-mounting challenges posed by a rapidly changing climate [1][2][3]. There is undoubtedly great merit in that discussion, and even greater emphasis should be given to such efforts to invigorate the campaign to address environmental risks using AI.

However, there is another context which draws the two topics together in an arena that is just as important but which, with few exceptions [4][5], remains relatively obscure both to the consumers of AI and to the frontbench of serious climate research – how much adverse impact does complex, compute-intensive AI have on an already worsening climate scenario?

AI Summer

Artificial Intelligence draws a lot of attention and controversy regarding the true nature of progress in the field. But what is beyond any doubt is the extent to which it has shaped and coloured the world in the first two decades of the 21st century.

The last decade in particular has seen explosive growth in investment around Artificial Intelligence and its subordinate and peripheral technologies such as big data, machine learning and deep learning. Corporate investment in AI in 2020 stood at a staggering USD 68 billion – a jump of as much as 40% on the previous year [6]. This comes off the back of a strong trend since 2015 and even earlier.

Such tremendous growth comes with the expectation of competitive returns for businesses. More than 57% of the respondents in a 2018 Deloitte survey of AI early adopters worldwide [7] expected AI to transform their companies within the next three years. At the same time, only 23% of them had actually realised a return from AI investment at a transformative scale. Clearly, there is an expectation for AI to deliver increasingly better returns over the years.

Irrespective of whether that expectation manifests into reality, such a gap between expectation and reality was a clear sign of the AI summers of the past [8]. While the 'summers' of the 1960s and 1980s came to a halt as the promise of a generally intelligent machine proved hard to scale beyond mere toy applications [9][10], the current boom in AI rides on a wave of increased compute speed and storage that feeds larger-scale applications and solves more complex problems [11].

With the cost of technology going down and the scope of AI applications around us ever expanding, the general expectation of a transformative return on AI investment feels justified. However, what often gets omitted from investment-return calculations are the hidden costs of a technology that is still in its infancy, namely the human cost and the environmental cost [12].

Environmental Effect of AI

The urgency of the problem of a rapidly changing climate cannot be overstated, nor can it be overlooked any longer as a secondary priority behind the conventional drivers of the economy – investment, labour and consumption [13][14][15][16][17]. After years of effort to understand the economic impact of environmental policies, we are now poised at a juncture where it is critical to do it the other way round – to embed the environment as a factor in general economic policies and business strategies [18]. As we move closer to the milestones set by the 2015 Paris Accord, we are likely to see the transformative power of this changing approach. Strategies that once seemed optimal and profitable may not remain so given their impact on the environment. Investments in Artificial Intelligence cannot be an exception to that change.

In fact, it is ever more likely that AI will come under scrutiny. In the list of human actions with a significant impact on climate change, AI would definitely not be among the also-rans. In a telling comparison with some of the better-known drivers of climate change, the CO2 emissions from training a complex AI model (with a massive number of hyperparameters) have been found to be five times those of a standard US car over its full lifetime, and more than 300 times those of a return flight between New York and San Francisco [5]. This statistic, though overused, continues to be a sobering yet often overlooked reminder of the environmental cost of running super-complex AI algorithms.

Consumption                                              CO2 emission (lbs)
Air travel, 1 passenger, NY to SF (return)                            1,984
Human life, average, 1 year                                          11,023
Car, average, including fuel, lifetime                              126,000
Single training of a model (GPU):
  NLP pipeline with tuning and experimentation                       78,468
  Transformer (big) with neural architecture search                 626,155

Having said that, an accurate measurement of the impact these algorithms have on the climate remains to a great extent elusive [19]. Energy consumption and carbon emissions for AI systems are not yet adequately regulated or transparently reported, and thus are not captured in the wider consumer imagination when a technology is chosen for use. This is akin to buying an electronics product off the shelf without knowing its energy efficiency.

Part of the reason for this gap is the innate obscurity of the algorithms themselves. While for most users the output manifests simply on a mobile screen or a tablet, the process behind such outputs entails a massive supply chain involving human labour, data and resources [12]. Some of the studies that have tried to unpack this complex process and quantify its carbon cost have shown that the environmental impact can vary based on several factors [20], which combine roughly as in the sketch after this list –

  1. Location of the server
  2. Type of energy grid used (renewable or conventional)
  3. Compute power
  4. Hardware used etc
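
To make these factors concrete, the back-of-the-envelope sketch below multiplies hardware power draw, training time, data-centre overhead (PUE) and the carbon intensity of the local grid – the last two being where server location and grid type enter the picture. Every number in it is an illustrative assumption rather than a figure from the cited studies.

```python
# Back-of-the-envelope CO2 estimate for a single training run.
# Every constant below is an illustrative assumption, not a measured
# figure from the cited studies.

def training_co2_kg(gpu_count: int,
                    gpu_power_kw: float,
                    hours: float,
                    pue: float,
                    grid_kg_co2_per_kwh: float) -> float:
    """Estimate CO2 (kg) as hardware power x training time x data-centre
    overhead (PUE) x carbon intensity of the local energy grid."""
    energy_kwh = gpu_count * gpu_power_kw * hours * pue
    return energy_kwh * grid_kg_co2_per_kwh

# Example: 64 GPUs drawing ~0.3 kW each for 80 hours, with a PUE of 1.6,
# on a grid emitting ~0.45 kg CO2 per kWh.
print(round(training_co2_kg(64, 0.3, 80, 1.6, 0.45)))   # ~1106 kg
```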

Interestingly, compute usage itself can be well defined and measured. In fact, in the last decade we have seen an unprecedented, exponential boost in compute usage as measured by the number of operations performed per unit time [19].

However, such compute power is often obtained by throwing more hardware, more clusters, more GPUs and TPUs at the problem, drawing ever more energy to reduce training time for complex algorithms. Compute is therefore not the sole driver of carbon cost, and growth in compute does not by itself imply an improvement in the sustainability of such algorithms.

Yet when we are presented with the story of how AI is changing our world, we hear of great feats – and all of them are true – but we rarely hear the efficiency story.

No Free Lunch – Efficiency vs Accuracy

The predominant focus of AI research, and of the literature in general, has been on improving accuracy – and the precision-recall trade-off – for pattern-recognition problems. It has been a relentless drive, a burst of positivism towards a machine that can predict with near certitude: a perfect machine. Scientific positivism – like AI summers – is not a historical novelty. From Archimedes to Bertrand Russell, there runs a tradition of belief in a hypothetical framework, or a machine, that can explain and codify the universe – with all its tangibles, its intangibles and that most intangible of all: the human mind.

However, one of the biggest barriers to such positivism has always been a fundamental law of the universe: the law of entropy. On the face of it, AI models seem impervious to such a law. After all, we have seen arrays of models with ever-improving accuracy starting to threaten that sacred 100% mark. With incremental costs going down, it appears we have finally found the self-improving machine that runs itself. It is a veritable free lunch.

Such an illusion fades when we consider the true efficiency – not accuracy – of these models. An AI model that uses massive amounts of compute and data to push up its predictive accuracy is sure to produce some amount of energy waste. But a single-minded focus on accuracy as the measure of performance will continue to hide the nature and extent of that waste.

Studies comparing multiple AI algorithms operating in similar thematic domains (e.g. image recognition or NLP) show that the most computationally intensive algorithms tend to produce the most accurate results, but with diminishing incremental accuracy for each unit of additional resource used [21]. This indicates that to truly understand the innate efficiency of these algorithms we need to ask a key question: what is the relative improvement in accuracy per unit of energy spent?

While such a definition of efficiency would resonate with well-known conceptualisations of efficiency in the literature [22][23][24][25], it has rarely been discussed within AI research. For example, consider two natural-language-processing models, ELMo and BERTbase, both trained on GPU hardware (3 P100s for ELMo and 64 V100s for BERTbase); it is well documented that the latter provides a significant lift in accuracy on the same tasks involving sentence-level constructions. The Stanford Sentiment Treebank is a single-sentence classification task built from human-annotated sentiment in movie reviews. On that data, a comparison of ELMo and BERTbase shows accuracies of 90.4% and 93.5% respectively, which amounts to a relative increase of about 3.3% [26].

However, a single training run of ELMo emits only 262 lbs of CO2, whereas the same soars to 1,438 lbs for BERTbase [5]. In other words, to gain an incremental accuracy of 3.3% we are paying a price of 1,176 lbs of CO2 into the environment. This amounts to an efficiency of about 2.82 × 10⁻⁵ per lb. And that considers only the CO2 emission from a single training run of the model. The efficiency calculation should also account for the R&D experiments run before the model design is finalised, as well as the ongoing tuning of the model once deployed. In addition, deployment of such models requires reconfiguration of the underlying architecture for the varying hardware platforms (e.g. phones, tablets, laptops) on which they run. Once the multiplicative effect of these parts of the process on total energy expenditure is taken into account, one can reasonably expect the efficiency to plunge even further.
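
The arithmetic behind that figure can be reproduced in a few lines. The sketch below simply re-runs the numbers quoted above; the efficiency ratio is the metric proposed in this article, not an established benchmark.

```python
# Reproducing the efficiency arithmetic above with the figures quoted
# in the text ([5][26]). The efficiency ratio itself is this article's
# proposed metric, not an established benchmark.

elmo_acc, elmo_co2_lbs = 0.904, 262        # single training run
bert_acc, bert_co2_lbs = 0.935, 1_438      # single training run

rel_accuracy_gain = (bert_acc - elmo_acc) / bert_acc   # ~3.3% relative lift
extra_co2_lbs = bert_co2_lbs - elmo_co2_lbs            # 1,176 lbs of CO2

efficiency = rel_accuracy_gain / extra_co2_lbs
print(f"{efficiency:.2e} relative accuracy gain per lb of CO2")   # ~2.82e-05
```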

Clearly, there is a need for individual and corporate users of these technologies to be aware of such an efficiency metric so that they have the opportunity to make an environment-aware choice. Not only that, but the mounting impact on the climate also necessitates that such a metric become part of future AI research and development as an upstream constraint to be managed.

There is an increasingly critical role for regulation in this space. A clear benchmark, in the form of an Efficiency Protocol, should be set as a minimum standard for the industrial deployment of AI algorithms. That would enable users to identify algorithms that are still at an experimental stage because they do not meet the Efficiency Protocol. Even those algorithms that do meet the protocol should carry a star-based rating, so that users are aware of the best-in-class not just in accuracy but in overall efficiency.
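
As a purely illustrative sketch of how such a protocol and star rating might be expressed – every threshold below is hypothetical and chosen only to show the shape of the idea:

```python
# Purely hypothetical sketch of the proposed Efficiency Protocol:
# a minimum efficiency baseline for industrial deployment plus a
# star rating above it. The threshold and band values are invented
# for illustration only.

PROTOCOL_BASELINE = 1e-5                           # hypothetical baseline
STAR_BANDS = [1e-5, 5e-5, 1e-4, 5e-4, 1e-3]        # hypothetical 1-5 star bands

def rate_algorithm(efficiency: float) -> str:
    """Label an algorithm 'Experimental' if it misses the baseline;
    otherwise award one star per band it clears."""
    if efficiency < PROTOCOL_BASELINE:
        return "Experimental (below Efficiency Protocol)"
    stars = sum(efficiency >= band for band in STAR_BANDS)
    return "*" * stars

print(rate_algorithm(2.82e-5))   # '*' under these made-up bands
```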

This in turn adds further impetus to algorithmic innovation as a counterweight to the juggernaut of AI industrialisation, driven by the burgeoning demand of an increasingly individualised consumerist society – often at the risk of over-consumption.

AI Consumerism and Role of Research

History, as they say, repeats itself. But sometimes it simply tries to outsmart itself by taking a sidestep. If we wind back the clock to one of the last great technological revolutions, we find ourselves in late-18th-century Europe and America, right in the middle of the Industrial Revolution. The revolution, with its concept of a 'central engine' – usually a water wheel or a steam engine – ushered in an era in which a whole new array of hitherto luxury products became accessible to the wider populace. Production grew cheaper, in turn swelling a wave of new 'mass' demand for such products [27].

However, the means of production – the machines – remained largely in the hands of the few, even as the strong winds of democracy in the 19th century spread consumerist culture from the upper echelons of the aristocracy to a wider public. It took nearly a century for the machines themselves to become available en masse, when Henry Ford finally came along with the concept of mass production. At the heart of this new innovation was the meta-level idea of automating the industrial process itself and decentralising the power supply so that it was embedded in each part of the machinery [28].

This was only possible because of a hundred years of research that transformed the early water wheels into much more efficient electric motors. It is hard to imagine a scenario in which early industrial managers had been immediately subjected to 'mass' demand that could only be catered to with early prototypes of the steam engine. Thankfully, such a scenario was not a logical possibility then. However, individualist demand in the post-modern 21st century has skyrocketed [29] and, coupled with the AI summer, poses an interesting riddle.

We are only a decade or so into a potential AI revolution, but individualistic consumerism is threatening to outpace the evolution of AI into maturity – on past experience, a clear risk of another AI winter [30]. In response there is already a growing trend – indeed a historical sidestep – to automate the 'water wheels' of AI in order to reduce the cost of production. This is the so-called AutoML wave [31][32][33] that is hitting the corporate AI market, which seems poised to lap it up.

AutoML, as the name suggests, automates the development and training of machine-learning models. These models are one of the cornerstones of AI and normally require skilled data scientists to run multiple experiments to decide the best architecture for the model. Once the design is decided, the hyperparameters of the chosen architecture, which for complex problems can have millions of possible combinations, need to be tuned. Finally, once the model is deployed there is an ongoing maintenance cost that keeps cutting deeper into the corporate purse. Aside from the operational cost, we have also discussed in the previous section how the entire development and deployment lifecycle significantly multiplies the environmental cost of the process. In addition, there is a growing realisation of a skill gap in the market to fulfil the exponentially growing demand for individualised AI products [35][36][37][38].

AutoML tooling – popularised by Google's Cloud AutoML product – tries to fill that void, and Google is not alone in that journey. A number of players are picking up on the idea and trying to be early movers. They essentially attempt a dual purpose: taking the machinery of AI out of the hands of a few data scientists and engineers to democratise it, and in the process making it cheaper and faster to market. It is an idea that could propel the AI market to unprecedented product maturity; but it uses the same approach Ford took a century ago – automation.

But given the general lack of awareness of the environmental costs of these processes, it is unsurprising that AutoML and related solutions largely fail to deliver sustainable efficiency in their underlying processes [5]. In fact, they often compound the problem. At the heart of AutoML lies the relatively new technique called neural architecture search, or NAS for short. It effectively searches for the best model architecture among all possible architecture combinations, which can run into astronomical proportions for more complex problems [19].
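
To see why the search itself is the expensive step, consider the deliberately naive sketch below. The search space and the per-run energy figure are invented for illustration, and production NAS systems prune far more aggressively than this exhaustive loop; the point is the multiplicative structure.

```python
# Deliberately naive sketch of why neural architecture search (NAS)
# multiplies energy cost: every candidate architecture pays roughly
# the price of a full training run, and the search space grows
# combinatorially. Real NAS systems prune far more aggressively; the
# search space and per-run energy figure below are invented.
import itertools

SEARCH_SPACE = {
    "layers": [6, 12, 24],
    "hidden_size": [256, 512, 1024],
    "attention_heads": [4, 8, 16],
}
KWH_PER_TRAINING_RUN = 2_500   # illustrative assumption

candidates = [dict(zip(SEARCH_SPACE, combo))
              for combo in itertools.product(*SEARCH_SPACE.values())]

# Exhaustive search: one full training run per candidate architecture.
total_kwh = len(candidates) * KWH_PER_TRAINING_RUN
print(f"{len(candidates)} candidates -> ~{total_kwh:,} kWh")   # 27 -> ~67,500 kWh
```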

Currently, on average, NAS produces approximately 600 times more CO2 than BERTbase on the same data [5]. This clearly indicates a need for further research to improve the algorithmic efficiency of the search before such products line up for mass distribution [20]. That would invariably mean accepting a deviation from the fundamental tenets of a demand-driven economy in the case of AI. As in the Industrial Revolution, it is paramount that research be allowed time to catch up with the progress of industrialisation. In fact, it can be argued that, since research productivity diminishes over time ('ideas get scarcer'), attaining an equivalent level of research maturity will require ever more research effort [39].

There has indeed been a growth in AI research since 2017. There are now more than 120,000 peer-reviewed papers linked with AI, which is 3.8% of all peer-reviewed papers worldwide [2]. Some strong results are already coming through, including attempts to optimise the network search within a NAS framework [40]. However, the rate of growth in algorithmic research remains linear compared to the exponential growth in investment in AI products. Moreover, only a fraction of that research focuses on algorithmic improvements, with most funded research moving towards interdisciplinary AI applications [41].

 

On the other hand, there is a clear shift of AI PhDs moving from academia to industry which, while not conclusive, does indicate a move away from research towards industrialisation, driven by the trail of new investments promising quick returns.

In that backdrop, to maintain a balance between the pace of research and that of industrialisation, it is recommended that AI investors adopt a balanced approach: for every dollar invested in the industrialisation of AI applications that are not best-in-class under the Efficiency Protocol, a fraction of that funding should be reserved for algorithmic research to improve the efficiency of future AI applications. By allocating a research reserve of capital, the measure would enable a certain level of control over the true progress of AI through algorithmic innovation rather than an exponentially mounting use of compute and energy. The required investment in research – whether in academia or in industry – would pave the way to breaking the cycle of AI summers and winters, and would lend sustainability to technological progress.
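
Expressed as a rule of thumb, the proposal might look like the sketch below; the 20% reserve rate is our own hypothetical placeholder, since the article argues only for 'a fraction'.

```python
# Minimal sketch of the proposed 'research reserve'. The 20% rate is a
# hypothetical placeholder; the article argues only for 'a fraction'.

RESEARCH_RESERVE_RATE = 0.20   # hypothetical, for illustration only

def split_investment(amount_usd: float, best_in_class: bool) -> tuple:
    """Return (industrialisation budget, research reserve)."""
    if best_in_class:              # protocol-compliant: no reserve required
        return amount_usd, 0.0
    reserve = amount_usd * RESEARCH_RESERVE_RATE
    return amount_usd - reserve, reserve

print(split_investment(1_000_000, best_in_class=False))   # (800000.0, 200000.0)
```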

Conclusions

It is beyond a shadow of doubt that any discussion of Artificial Intelligence in the context of climate change is going to be a complicated one. On one hand, AI has brought to life tremendous possibilities for improving the overall condition of people's lives on this planet. Indeed, there are many AI applications that can assist us in tackling climate-change problems that are fast moving beyond the reach of human abilities.

On the other hand, as we have established here, it is still a technology in its nascency. There is a great deal of variance in the AI market among products and algorithms of differing accuracy, efficiency and scalability. Some of these algorithms suffer from energy efficiency far below the best-in-class, yet this goes largely unreported. Hence there is a case for safeguarding the continued progress of mature AI applications by identifying the more experimental AI algorithms, so that the latter are allowed to evolve through further research to an adequate level of maturity.

We have strongly argued here that it is expedient to create a standard Efficiency Protocol (AI/EP) which can help potential users identify 'experimental' AI – AI products that are yet to meet the efficiency baseline. For those that do meet the baseline criteria, a star-based system has been proposed to grade them on efficiency.

While that would mean that, unless the skill gap in the data-science field is bridged, a considerable amount of the mounting demand for AI-based applications – demand that experimental AI products currently meet – might go unaddressed, it would ensure sustained growth in the field without running down resources or leaving long-term effects on the planet's health. Given the demand gap, public-sector needs for AI that improves people's lives (e.g. healthcare) could reasonably be prioritised.

However, as discussed in the previous section, part of that gap can also be filled by allowing investors to fund applications of 'experimental' AI that are yet to attain a satisfactory status under the Efficiency Protocol, while a fraction of that budget goes back into algorithmic research to keep improving the efficiency of those algorithms for future use cases.

There is a wider narrative around the need for regulation in the space of artificial intelligence. The discussion in this article reinforces, at least in part – from an environmental perspective – the need to raise awareness among investors of the sustainability of some of the expectations around AI, and to regulate investment patterns accordingly. Failure to do so would either cause irreversible damage to the climate, or see investment optimism start to fade behind the veils of yet another AI winter.