There are undeniable gains in using machine learning to build effectively on the accumulated data of mankind. But in a world of finite resources, and in light of a circular economy, we may all need to talk about how this energy is used and to what end.
The compute used for AI landmarks (AlphaZero, GPT-3, etc.) increased by a factor of 300,000 between 2012 and 2018*.
Estimates suggest that as early as 2030, around 20% of the world's electricity consumption could come from ICT (information and communications technology). That is roughly 9,000 terawatt hours! The ingredients of AI require an enormous amount of energy, so it is legitimate to question its efficiency and purpose.
[further reading: wired 2020-01-21]
*Footnote: Over the same period, our stochastic topology optimizer (STO) reduced the previously required computation time from around 15 days to a few minutes. There is plenty of room for HI (human intelligence) to contribute efficiently to new and better designs.