Green AI

@article{Schwartz2020GreenA,
  title={Green AI},
  author={Roy Schwartz and Jesse Dodge and Noah Smith and Oren Etzioni},
  journal={Communications of the ACM},
  year={2020},
  volume={63},
  pages={54--63}
}
Creating efficiency in AI research will decrease its carbon footprint and increase its inclusivity, as deep learning study should not require the deepest pockets.
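
The paper's concrete proposal is to report computational cost, such as total floating-point operations, alongside accuracy. As a minimal sketch of what FLOP accounting looks like for a dense layer (the layer sizes below are arbitrary, and only the forward pass is counted):

```python
# FLOP accounting sketch: a dense layer costs roughly one multiply and one
# add per weight per example. Layer sizes here are illustrative only.
def dense_flops(in_features, out_features, batch_size=1):
    return 2 * in_features * out_features * batch_size

total = dense_flops(784, 256) + dense_flops(256, 10)
print(f"{total:,} FLOPs per example (forward pass only)")
```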


Citations

GreenML: A methodology for fair evaluation of machine learning algorithms with respect to resource consumption

TLDR
A methodology, GreenML, is presented for the fair evaluation of machine learning algorithms with respect to their resource consumption.

Greening the Artificial Intelligence for a Sustainable Planet: An Editorial Commentary

TLDR
An editorial commentary on greening artificial intelligence in the service of a sustainable planet.

The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink

Machine learning (ML) workloads have rapidly grown, raising concerns about their carbon footprint. We show four best practices to reduce ML training energy and carbon dioxide emissions. …

Data-Centric Green AI: An Exploratory Empirical Study

TLDR
Evidence is shown that, by modifying only the datasets, energy consumption can be drastically reduced, often with negligible or no accuracy decline, which calls for a research agenda focused on data-centric techniques to improve AI energy efficiency.

Towards the Systematic Reporting of the Energy and Carbon Footprints of Machine Learning

TLDR
A framework is introduced that makes accounting easier by providing a simple interface for tracking real-time energy consumption and carbon emissions, as well as generating standardized online appendices, and a leaderboard for energy-efficient reinforcement learning algorithms is created to incentivize responsible research.
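
A rough sketch of what such a tracking interface can look like; the names and the constant-power assumption are hypothetical, and a real tracker would read hardware power counters (e.g., RAPL or NVML) rather than assuming a fixed draw:

```python
import time
from contextlib import contextmanager

# Illustrative constants, not measured values.
ASSUMED_POWER_W = 250        # assumed average draw of the machine
GRID_KGCO2_PER_KWH = 0.4     # assumed grid carbon intensity

@contextmanager
def emissions_tracker(label):
    # Integrate assumed power over wall-clock time; a real framework would
    # sample actual hardware counters here.
    start = time.monotonic()
    yield
    hours = (time.monotonic() - start) / 3600
    kwh = ASSUMED_POWER_W * hours / 1000
    print(f"{label}: {kwh:.6f} kWh, {kwh * GRID_KGCO2_PER_KWH:.6f} kg CO2e")

with emissions_tracker("toy-training-run"):
    sum(i * i for i in range(10**6))  # stand-in for a training loop
```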

The AI Gambit — Leveraging Artificial Intelligence to Combat Climate Change: Opportunities, Challenges, and Recommendations

TLDR
It is argued that leveraging the opportunities offered by AI for global climate change whilst limiting its risks is a gambit which requires responsive, evidence-based and effective governance to become a winning strategy.

Unraveling the hidden environmental impacts of AI solutions for environment

TLDR
The different types of AI impacts are reviewed, the methodologies used to assess those impacts are presented, and the application of life cycle assessment to AI services is demonstrated.

Sustainable AI: Environmental Implications, Challenges and Opportunities

TLDR
The carbon footprint of AI computing is characterized by examining the model development cycle across industry-scale machine learning use cases and, at the same time, considering the life cycle of system hardware.

Measuring the Carbon Intensity of AI in Cloud Instances

TLDR
This paper provides a framework for measuring software carbon intensity, proposes to measure operational carbon emissions by using location-based and time-specific marginal emissions data per energy unit, and provides recommendations for how machine learning practitioners can use software carbon intensity information to reduce environmental impact.
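
A toy illustration of location-based, time-specific accounting: each hour's energy use is weighted by that hour's marginal grid intensity. All figures below are invented for illustration:

```python
# Time-weighted operational emissions: sum of hourly energy times the
# corresponding hourly marginal carbon intensity (values assumed).
hourly_energy_kwh = [2.1, 2.0, 2.2, 1.9]           # assumed meter readings
marginal_kgco2_per_kwh = [0.45, 0.52, 0.38, 0.30]  # assumed hourly grid data

emissions = sum(e * ci for e, ci in zip(hourly_energy_kwh,
                                        marginal_kgco2_per_kwh))
print(f"{emissions:.2f} kg CO2e")
```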
...

References

Energy and Policy Considerations for Deep Learning in NLP

TLDR
This paper quantifies the approximate financial and environmental costs of training a variety of recently successful neural network models for NLP and proposes actionable recommendations to reduce costs and improve equity in NLP research and practice.
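
A sketch of the style of accounting such a quantification implies: component power draws are combined, scaled by a datacenter overhead factor, and multiplied by training time. All constants here are illustrative assumptions, not the paper's measurements:

```python
# Back-of-envelope training-energy accounting (all values assumed).
pue = 1.58                                      # datacenter overhead factor
p_cpu_w, p_dram_w, p_gpu_w = 85.0, 25.0, 250.0  # assumed average draws
num_gpus, hours = 4, 120

total_w = pue * (p_cpu_w + p_dram_w + num_gpus * p_gpu_w)
kwh = total_w * hours / 1000
print(f"{kwh:.0f} kWh, {kwh * 0.433:.0f} kg CO2e "
      f"(assumed average grid intensity of 0.433 kg/kWh)")
```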

Tackling Climate Change with Machine Learning

TLDR
From smart grids to disaster management, high-impact problems where existing gaps can be filled by ML are identified, in collaboration with other fields, to join the global effort against climate change.

Revisiting Unreasonable Effectiveness of Data in Deep Learning Era

TLDR
It is found that performance on vision tasks increases logarithmically with the volume of training data, and it is shown that representation learning (or pre-training) still holds a lot of promise.
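
A quick illustration of the reported trend, fitting accuracy against the logarithm of dataset size; the data points below are made up for illustration only:

```python
import math

# Fit acc ≈ a*log10(n) + b by least squares (no numpy needed).
sizes = [1e6, 1e7, 1e8, 3e8]       # assumed training-set sizes
accs = [62.0, 67.5, 73.0, 75.5]    # assumed accuracies

xs = [math.log10(n) for n in sizes]
n = len(xs)
mx, my = sum(xs) / n, sum(accs) / n
a = sum((x - mx) * (y - my) for x, y in zip(xs, accs)) \
    / sum((x - mx) ** 2 for x in xs)
b = my - a * mx
print(f"acc ≈ {a:.1f}·log10(n) + {b:.1f}")
```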

Deep Learning for Wildlife Conservation and Restoration Efforts

Climate change and environmental degradation are causing species extinction worldwide. Automatic wildlife sensing is an urgent requirement to track biodiversity losses on Earth. …

An Analysis of Deep Neural Network Models for Practical Applications

TLDR
This work presents a comprehensive analysis of metrics important in practical applications: accuracy, memory footprint, parameter count, operation count, inference time, and power consumption, providing a compelling set of information to help design and engineer efficient DNNs.
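
A sketch of how two of these metrics, parameter count and CPU inference time, can be measured for an off-the-shelf model; resnet18 is an arbitrary choice, and the snippet assumes PyTorch and torchvision are available:

```python
import time
import torch
import torchvision.models as models

# Measure parameter count and average CPU inference latency.
model = models.resnet18().eval()
params = sum(p.numel() for p in model.parameters())

x = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    model(x)                          # warm-up pass
    start = time.perf_counter()
    for _ in range(10):
        model(x)
    ms = (time.perf_counter() - start) / 10 * 1000

print(f"{params / 1e6:.1f}M parameters, {ms:.1f} ms/inference (CPU)")
```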

ShuffleNet V2: Practical Guidelines for Efficient CNN Architecture Design

TLDR
This work proposes to evaluate the direct metric (e.g., speed) on the target platform, beyond only considering FLOPs, and derives several practical guidelines for efficient network design, which it realizes in a new architecture called ShuffleNet V2.
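
The paper's first guideline states that, for a 1x1 convolution with fixed FLOPs, memory access cost MAC = hw(c1 + c2) + c1*c2 is minimized when input and output channel counts are equal. A quick numerical check (the dimensions are assumed):

```python
# Verify that MAC is minimized at c1 == c2 for a fixed FLOP budget
# B = h*w*c1*c2 (feature-map size and budget are assumed values).
h = w = 56
flops_budget = 56 * 56 * 256 * 256

for c1 in (64, 128, 256, 512, 1024):
    c2 = flops_budget // (h * w * c1)
    mac = h * w * (c1 + c2) + c1 * c2
    print(f"c1={c1:4d} c2={c2:4d} MAC={mac:,}")
# MAC is smallest when c1 == c2 == 256.
```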

Learning Time/Memory-Efficient Deep Architectures with Budgeted Super Networks

TLDR
A novel family of models called Budgeted Super Networks (BSN) is proposed, learned using gradient descent techniques applied to a budgeted learning objective function that integrates a maximum authorized cost, while making no assumption about the nature of this cost.
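
A minimal sketch of what a budgeted objective can look like: the task loss plus a penalty when an estimated cost exceeds the authorized budget. The hinge form and all names here are assumptions for illustration, not the paper's exact formulation:

```python
# Hypothetical budgeted objective: penalize cost overruns beyond max_cost.
def budgeted_loss(task_loss, model_cost, max_cost, lam=1.0):
    return task_loss + lam * max(0.0, model_cost - max_cost)

# Example with an assumed FLOP cost estimate and budget.
print(budgeted_loss(0.42, model_cost=3.2e9, max_cost=2.0e9, lam=1e-10))
```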

Aggregated Residual Transformations for Deep Neural Networks

TLDR
On the ImageNet-1K dataset, it is empirically shown that, even under the restricted condition of maintaining complexity, increasing cardinality is able to improve classification accuracy and is more effective than going deeper or wider when capacity is increased.
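
In PyTorch terms, the aggregated transformations are commonly realized as a grouped convolution inside a bottleneck block. A minimal sketch with assumed sizes (cardinality 32, width 4, as in the widely used ResNeXt-50 32x4d configuration):

```python
import torch
import torch.nn as nn

# ResNeXt-style bottleneck: the 3x3 convolution is split into `cardinality`
# parallel groups, which is equivalent to summing the grouped branches.
cardinality, width = 32, 4
block = nn.Sequential(
    nn.Conv2d(256, cardinality * width, kernel_size=1, bias=False),
    nn.Conv2d(cardinality * width, cardinality * width, kernel_size=3,
              padding=1, groups=cardinality, bias=False),  # the aggregation
    nn.Conv2d(cardinality * width, 256, kernel_size=1, bias=False),
)

x = torch.randn(1, 256, 56, 56)
print((x + block(x)).shape)  # residual add: torch.Size([1, 256, 56, 56])
```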

Squeeze-and-Excitation Networks

TLDR
This work proposes a novel architectural unit, termed the “Squeeze-and-Excitation” (SE) block, that adaptively recalibrates channel-wise feature responses by explicitly modelling interdependencies between channels, and shows that these blocks can be stacked together to form SENet architectures that generalise extremely effectively across different datasets.
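
A minimal SE block following that description: global average pooling to squeeze, a two-layer gating network with a sigmoid to excite, then channel-wise rescaling. The reduction ratio of 16 is a conventional choice, assumed here:

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    def __init__(self, channels, reduction=16):
        super().__init__()
        # Excitation: bottleneck MLP producing per-channel gates in (0, 1).
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        scale = self.fc(x.mean(dim=(2, 3)))   # squeeze: global avg pool, B x C
        return x * scale[:, :, None, None]    # recalibrate each channel

print(SEBlock(64)(torch.randn(2, 64, 8, 8)).shape)  # torch.Size([2, 64, 8, 8])
```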

ShuffleNet: An Extremely Efficient Convolutional Neural Network for Mobile Devices

TLDR
An extremely computation-efficient CNN architecture named ShuffleNet is introduced, which is designed specially for mobile devices with very limited computing power (e.g., 10-150 MFLOPs), to greatly reduce computation cost while maintaining accuracy.
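
The operation that lets ShuffleNet use cheap grouped convolutions throughout is the channel shuffle: reshape the channel dimension into (groups, channels per group), transpose, and flatten, so subsequent grouped convolutions mix information across groups. A minimal sketch:

```python
import torch

def channel_shuffle(x, groups):
    # Interleave channels across groups: (B, C, H, W) -> (B, g, C/g, H, W)
    # -> transpose group and channel axes -> flatten back to (B, C, H, W).
    b, c, h, w = x.shape
    return (x.view(b, groups, c // groups, h, w)
             .transpose(1, 2)
             .reshape(b, c, h, w))

x = torch.arange(8).float().view(1, 8, 1, 1)
print(channel_shuffle(x, groups=2).flatten().tolist())
# [0, 4, 1, 5, 2, 6, 3, 7]: channels interleaved across the two groups
```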
...