Green AI

@article{Schwartz2020GreenA,
  title={Green AI},
  author={Roy Schwartz and Jesse Dodge and Noah A. Smith and Oren Etzioni},
  journal={Communications of the ACM},
  year={2020},
  volume={63},
  pages={54-63}
}
Creating efficiency in AI research will decrease its carbon footprint and increase its inclusivity as deep learning study should not require the deepest pockets. 
GreenML: A methodology for fair evaluation of machine learning algorithms with respect to resource consumption
TLDR
It is shown that when stacking deep neural network hierarchies together, state-of-the-art results are achieved when evaluating the models with differenced models.
New universal sustainability metrics to assess edge intelligence
TLDR
Cradle-to-grave sustainability of edge intelligence models and platforms is assessed with novel deep learning lifecycle efficiency and lifecycle recognition efficiency metrics that include the number of times models are used.
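The amortization idea behind such lifecycle metrics can be illustrated with a short calculation: a one-off training cost matters less the more often a model is used. The sketch below shows that general idea only, not the paper's actual metric; the function name and example numbers are illustrative assumptions.
```python
def amortized_energy_per_use(training_kwh: float,
                             inference_kwh: float,
                             num_uses: int) -> float:
    """Spread the one-off training energy over the model's lifetime uses."""
    if num_uses <= 0:
        raise ValueError("num_uses must be positive")
    return (training_kwh + num_uses * inference_kwh) / num_uses

# Training dominates for rarely used models; inference dominates for popular ones.
print(amortized_energy_per_use(1000.0, 0.001, 100))         # ~10.0 kWh per use
print(amortized_energy_per_use(1000.0, 0.001, 10_000_000))  # ~0.0011 kWh per use
```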
Towards the Systematic Reporting of the Energy and Carbon Footprints of Machine Learning
TLDR
A framework is introduced that makes accounting easier by providing a simple interface for tracking real-time energy consumption and carbon emissions, as well as generating standardized online appendices, and creates a leaderboard for energy-efficient reinforcement learning algorithms to incentivize responsible research.
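The accounting loop such a framework performs can be sketched briefly: sample the hardware's power draw at a fixed interval, integrate it into kilowatt-hours, and convert with a grid carbon-intensity factor. The class below is an illustrative assumption, not the framework's actual interface; the constant power reading stands in for a real NVML/nvidia-smi poll, and 0.4 kg CO2e/kWh is a placeholder grid intensity.
```python
import threading
import time

def read_gpu_power_watts() -> float:
    # Placeholder: a real tracker would poll NVML / nvidia-smi here.
    return 250.0

class EnergyTracker:
    """Integrate sampled power draw into energy (kWh) and emissions (kg CO2e)."""

    def __init__(self, grid_kg_co2_per_kwh: float = 0.4, interval_s: float = 1.0):
        self.grid_kg_co2_per_kwh = grid_kg_co2_per_kwh
        self.interval_s = interval_s
        self.kwh = 0.0
        self._stop = threading.Event()

    def _sample(self) -> None:
        while not self._stop.is_set():
            # watts * seconds = joules; 1 kWh = 3.6e6 J
            self.kwh += read_gpu_power_watts() * self.interval_s / 3.6e6
            time.sleep(self.interval_s)

    def __enter__(self):
        self._thread = threading.Thread(target=self._sample, daemon=True)
        self._thread.start()
        return self

    def __exit__(self, *exc) -> None:
        self._stop.set()
        self._thread.join()
        print(f"{self.kwh:.6f} kWh, {self.kwh * self.grid_kg_co2_per_kwh:.6f} kg CO2e")

with EnergyTracker():
    time.sleep(3)  # stand-in for a training run
```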
The AI Gambit — Leveraging Artificial Intelligence to Combat Climate Change: Opportunities, Challenges, and Recommendations
In this article we analyse the role that artificial intelligence (AI) could play, and is playing, to combat global climate change. We identify two crucial opportunities that AI offers in this domain: …
Sustainable Federated Learning
TLDR
This paper proposes a practical federated learning framework that leverages intermittent energy arrivals for training, with provable convergence guarantees, and can be applied to a wide range of machine learning settings in networked environments, including distributed and federated learning in wireless and edge networks.
Interdisciplinary Research in Artificial Intelligence: Challenges and Opportunities
TLDR
Future development of AI should not only impact other scientific domains but should also take inspiration and benefit from other fields of science, and AI education should receive more attention, effort and innovation from the educational and scientific communities.
Energy Usage Reports: Environmental awareness as part of algorithmic accountability
TLDR
This paper takes analyses usually applied at the industrial level and makes them accessible to individual computer science researchers through an easy-to-use Python package, and demonstrates the use of these reports as part of model choice in a machine learning context.
Quantifying the Carbon Emissions of Machine Learning
TLDR
This work presents the Machine Learning Emissions Calculator, a tool for the community to better understand the environmental impact of training ML models, together with concrete actions that individual practitioners and organizations can take to mitigate their carbon emissions.
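The arithmetic underlying such a calculator is simple enough to reproduce: emissions scale with power draw, runtime, datacenter overhead (PUE), and the carbon intensity of the local grid. The function and the example values below are illustrative assumptions, not the calculator's code.
```python
def estimate_co2e_kg(gpu_power_kw: float, hours: float,
                     pue: float, grid_kg_co2_per_kwh: float) -> float:
    """Emissions ~ power x time x datacenter overhead (PUE) x grid carbon intensity."""
    return gpu_power_kw * hours * pue * grid_kg_co2_per_kwh

# One 300 W GPU for 24 h, PUE 1.58, grid at 0.4 kg CO2e/kWh:
print(estimate_co2e_kg(0.3, 24, 1.58, 0.4))  # ~4.55 kg CO2e
```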
Data Partition and Rate Control for Learning and Energy Efficient Edge Intelligence
TLDR
A new paradigm called learning-and-energy-efficient (LEE) EI is explored, which simultaneously maximizes the learning accuracies and energy efficiencies of multiple tasks via data partition and rate control.
CUMULATOR — a tool to quantify and report the carbon footprint of machine learning computations and communication in academia and healthcare (Tristan Trébaol, semester project report)
"The cost of training machines is becoming a problem". This is the title of an article from The Economist published in June 2020 that highlights the staggeringly unappreciated the financial impact ofExpand

References

Showing 1-10 of 98 references
Energy and Policy Considerations for Deep Learning in NLP
TLDR
This paper quantifies the approximate financial and environmental costs of training a variety of recently successful neural network models for NLP and proposes actionable recommendations to reduce costs and improve equity in NLP research and practice.
Revisiting Unreasonable Effectiveness of Data in Deep Learning Era
TLDR
It is found that performance on vision tasks increases logarithmically with the volume of training data, and it is shown that representation learning (or pre-training) still holds a lot of promise.
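The reported trend, accuracy growing linearly in the logarithm of the training-set size, can be checked with a one-line fit. The data points below are synthetic, chosen only to illustrate the log-linear shape, not taken from the paper.
```python
import numpy as np

# Synthetic points illustrating log-linear scaling, not the paper's data.
n = np.array([1e4, 1e5, 1e6, 1e7, 1e8])        # training-set sizes
acc = np.array([0.55, 0.63, 0.70, 0.78, 0.85])  # task accuracy

b, a = np.polyfit(np.log10(n), acc, deg=1)  # fit acc ~ a + b * log10(n)
print(f"acc ~ {a:.2f} + {b:.3f} * log10(n)")
# Every 10x increase in data buys roughly the same absolute gain b.
```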
Tackling Climate Change with Machine Learning
TLDR
From smart grids to disaster management, it is described how machine learning can be a powerful tool in reducing greenhouse gas emissions and helping society adapt to a changing climate.
Deep Learning for Wildlife Conservation and Restoration Efforts
Climate change and environmental degradation are causing species extinction worldwide. Automatic wildlife sensing is an urgent requirement to track biodiversity losses on Earth. Recent improvements …
An Analysis of Deep Neural Network Models for Practical Applications
TLDR
This work presents a comprehensive analysis of important metrics in practical applications: accuracy, memory footprint, parameters, operations count, inference time and power consumption and believes it provides a compelling set of information that helps design and engineer efficient DNNs.
ShuffleNet V2: Practical Guidelines for Efficient CNN Architecture Design
TLDR
This work proposes to evaluate the direct metric on the target platform, beyond only considering FLOPs, and derives several practical guidelines for efficient network design, called ShuffleNet V2.
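The paper's central guideline, measuring the direct metric (latency on the target platform) rather than the indirect one (FLOPs), is easy to sketch. The helper below is a generic timing loop in that spirit, not code from the paper; the layer and input shape are arbitrary examples.
```python
import time

import torch
import torch.nn as nn

def measure_latency_s(model: nn.Module, x: torch.Tensor, iters: int = 50) -> float:
    """Direct metric: wall-clock time per forward pass on this platform.
    FLOP counts ignore memory-access cost and platform characteristics."""
    model.eval()
    with torch.no_grad():
        for _ in range(5):  # warm-up
            model(x)
        start = time.perf_counter()
        for _ in range(iters):
            model(x)
    return (time.perf_counter() - start) / iters

conv = nn.Conv2d(64, 64, kernel_size=3, padding=1)
x = torch.randn(1, 64, 56, 56)
print(f"{measure_latency_s(conv, x) * 1e3:.2f} ms per forward pass")
```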
Learning Time/Memory-Efficient Deep Architectures with Budgeted Super Networks
TLDR
A novel family of models called Budgeted Super Networks (BSN) is proposed, learned using gradient descent techniques applied on a budgeted learning objective function which integrates a maximum authorized cost, while making no assumption on the nature of this cost.
Aggregated Residual Transformations for Deep Neural Networks
TLDR
On the ImageNet-1K dataset, it is empirically shown that, even under the restricted condition of maintaining complexity, increasing cardinality is able to improve classification accuracy and is more effective than going deeper or wider when capacity is increased.
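Cardinality, the number of parallel transformation paths, is implemented compactly as a grouped convolution. The bottleneck below is a minimal sketch of a ResNeXt-style block (the residual connection and downsampling are omitted for brevity); the channel widths follow the commonly cited 32x4d configuration but are otherwise assumptions.
```python
import torch.nn as nn

def resnext_bottleneck(channels: int = 256, width: int = 128,
                       cardinality: int = 32) -> nn.Sequential:
    """1x1 reduce -> grouped 3x3 (the 'cardinality' paths) -> 1x1 expand."""
    return nn.Sequential(
        nn.Conv2d(channels, width, kernel_size=1, bias=False),
        nn.BatchNorm2d(width), nn.ReLU(inplace=True),
        nn.Conv2d(width, width, kernel_size=3, padding=1,
                  groups=cardinality, bias=False),
        nn.BatchNorm2d(width), nn.ReLU(inplace=True),
        nn.Conv2d(width, channels, kernel_size=1, bias=False),
        nn.BatchNorm2d(channels),
    )
```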
Squeeze-and-Excitation Networks
TLDR
This work proposes a novel architectural unit, termed the "Squeeze-and-Excitation" (SE) block, that adaptively recalibrates channel-wise feature responses by explicitly modelling interdependencies between channels, and shows that these blocks can be stacked together to form SENet architectures that generalise extremely effectively across different datasets.
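The SE block itself is compact: a global average pool squeezes each channel to a scalar, a small bottleneck MLP with a sigmoid produces per-channel gates, and the input is rescaled channel-wise. The module below is a minimal sketch of that published recipe; the reduction ratio of 16 is the paper's default.
```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze (global average pool) and excitation (gated bottleneck MLP)."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, _, _ = x.shape
        scale = self.fc(x.mean(dim=(2, 3)))  # squeeze: (N, C, H, W) -> (N, C)
        return x * scale.view(n, c, 1, 1)    # recalibrate channels

print(SEBlock(64)(torch.randn(2, 64, 8, 8)).shape)  # torch.Size([2, 64, 8, 8])
```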
ShuffleNet: An Extremely Efficient Convolutional Neural Network for Mobile Devices
TLDR
An extremely computation-efficient CNN architecture named ShuffleNet is introduced, which is designed specially for mobile devices with very limited computing power (e.g., 10-150 MFLOPs), to greatly reduce computation cost while maintaining accuracy.
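ShuffleNet's signature operation, the channel shuffle that lets information cross the branches of cheap grouped convolutions, is a reshape-transpose-reshape. The function below is the standard formulation of that operation; the tensor shapes in the example are arbitrary.
```python
import torch

def channel_shuffle(x: torch.Tensor, groups: int) -> torch.Tensor:
    """Interleave channels across groups: (N, g, C/g, H, W) -> transpose -> flatten."""
    n, c, h, w = x.shape
    assert c % groups == 0, "channel count must be divisible by groups"
    return (x.view(n, groups, c // groups, h, w)
             .transpose(1, 2)
             .reshape(n, c, h, w))

x = torch.randn(1, 8, 4, 4)
print(channel_shuffle(x, groups=2).shape)  # torch.Size([1, 8, 4, 4])
```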