Watt Matters in AI
- Date: 26 Nov - 01 Jan
- Time: 9:00 - 18:00
- Location: Microstad, Eindhoven, the Netherlands
- Entry fee: Early bird €75,00; Regular €100,00; Student €40,00

Artificial Intelligence (AI) can help address a wide range of societal problems and support progress toward the UN Sustainable Development Goals. However, the anticipated massive use of AI will severely strain energy resources, creating new problems of its own.
To master this conundrum, new paradigms are required at many levels, from new hardware solutions to regulations on the use of AI, and everything in between. This goes far beyond the incremental improvements being made to present-day solutions: we need a holistic approach leading to radical breakthroughs.
This conference intends to explore the potential of these new paradigms, providing a basis for AI with vastly improved energy efficiency. Recent developments in AI will be reviewed, and promising upcoming concepts such as brain-inspired neuromorphic computing, physics-based computing, and approximate computing will be addressed.
Watt really matters in AI; bringing multidisciplinary experts together at this conference can help establish an ecosystem that pushes innovations to new levels.
Why This Topic Is Important
The problem:
- Estimated energy consumption: Training GPT-4 is estimated to have consumed between 51,773 MWh and 62,319 MWh of electricity. This is over 40 times the energy used to train its predecessor, GPT-3.
- Comparison to household energy use: This amount of energy is comparable to the electricity consumed by approximately 1,000 average U.S. households over 5 to 6 years.
- AI's energy appetite is exploding: Multiply that by the thousands of models trained worldwide and the environmental impact becomes enormous.
- Unsustainable trends: Without intervention, the energy costs of AI will conflict with climate goals and widen global inequalities in access to AI.
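The household comparison above can be sanity-checked with a few lines of arithmetic. This sketch assumes an average U.S. household uses roughly 10.5 MWh of electricity per year; that per-household figure is an assumption for illustration, not a number from this announcement.

```python
# Rough sanity check of the GPT-4 training-energy comparison.
# Assumption: ~10.5 MWh of electricity per average U.S. household per year.
HOUSEHOLD_MWH_PER_YEAR = 10.5
NUM_HOUSEHOLDS = 1_000

# Estimated GPT-4 training energy range, in MWh (from the figures above).
gpt4_low_mwh, gpt4_high_mwh = 51_773, 62_319

# How many years would 1,000 such households take to use that much energy?
years_low = gpt4_low_mwh / (NUM_HOUSEHOLDS * HOUSEHOLD_MWH_PER_YEAR)
years_high = gpt4_high_mwh / (NUM_HOUSEHOLDS * HOUSEHOLD_MWH_PER_YEAR)

print(f"{years_low:.1f} to {years_high:.1f} years for 1,000 households")
```

Under this assumption the range works out to roughly 5 to 6 years, consistent with the comparison above.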
These figures underscore the significant environmental impact of training large-scale AI models. As AI systems become more advanced and widespread, addressing their energy efficiency becomes increasingly critical.
Key Questions This Conference Will Address
- Can we make AI sustainable and robust?
- What hardware or algorithmic innovations offer 10x or 100x efficiency gains?
- What are the ethical and policy responsibilities around energy use in AI?
- How can Europe take the lead in energy-aware AI innovation?
- What infrastructure is needed to support low-power AI, from chips to data centers to regulation?