- AI-fication is an unstoppable phenomenon, and it will likely increase the overall energy consumption associated with digital infrastructures and services. At the same time, the increasing adoption of AI can bring significant benefits, in terms of energy efficiency and sustainability, to a large set of application domains. It is unclear whether we'll reach a sweet spot representing the optimal 'amount of AI' to apply.
- AI is still rather energy-hungry: early attempts to optimize AI methods and implementations for minimal energy consumption do exist (optimized chipsets, AI on silicon, etc.), but the field is still very young and relatively immature.
- AI will move out of the datacenter and make its way to the fog and to the edge. On the one hand, this opens up new challenges in how to effectively 'downscale' solutions developed for the cloud so that they can run at the edge, i.e., closer to where data is generated and consumed. On the other hand, this means new opportunities for players that have missed the datacenter train; in particular, European companies may be a step ahead and profit from a first-mover advantage in this arena.
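To make the idea of 'downscaling' concrete: one common ingredient (a hypothetical illustration, not something the post itself describes) is post-training quantization, where a model's 32-bit floating-point weights are stored as 8-bit integers plus a scale factor, shrinking memory and bandwidth needs roughly fourfold. A minimal sketch in plain Python:

```python
# Hypothetical sketch of 8-bit post-training quantization, one way to shrink
# a cloud-trained model so it fits on constrained edge hardware.

def quantize(weights, bits=8):
    """Map float weights to signed integers in [-(2^(bits-1)-1), 2^(bits-1)-1]."""
    qmax = 2 ** (bits - 1) - 1
    # Scale chosen so the largest-magnitude weight maps to +/- qmax.
    scale = max(abs(w) for w in weights) / qmax or 1.0
    return [round(w / scale) for w in weights], scale

def dequantize(quantized, scale):
    """Recover approximate float weights from the integer representation."""
    return [q * scale for q in quantized]

weights = [0.12, -0.87, 0.45, 0.03]
q, scale = quantize(weights)
restored = dequantize(q, scale)
# Each restored weight is within one quantization step of the original.
assert all(abs(a - b) <= scale for a, b in zip(weights, restored))
```

Real deployments layer pruning, distillation, and hardware-specific compilation on top of this, but the storage-versus-precision trade-off sketched here is the core of the downscaling problem.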
- Greening the fog, i.e., a distributed datacenter made of a large number of small nodes, often under the administrative control of different entities, is far more difficult than greening the datacenter (where economics alone make the adoption of energy-efficiency solutions rather straightforward). Non-monetary incentives, including new regulations, may be required to achieve energy sustainability in such scenarios.
If you are interested in knowing more, here are the slides I presented on *The evaporating cloud: from data centers to fog computing*.