AI uses too much energy—nanotech is the solution | Dr. Mark Hersam | TEDxChicago

  • AI is everywhere, and while it brings many benefits, it also consumes a significant amount of energy.
  • By 2027, AI is expected to consume 100 terawatt-hours (TWh) of electricity each year, roughly the annual electricity consumption of Argentina (see the quick arithmetic after this list).
  • That energy consumption also generates a significant amount of waste heat, which in turn requires large amounts of water for data-center cooling.
  • Dr. Hersam proposes nanotechnology as a solution to this problem. By understanding matter at the smallest scales, we can create more energy-efficient hardware for AI.
  • Current computing technology is energy-hungry because data must constantly move back and forth between memory and a central processing unit (the von Neumann bottleneck). This is especially costly for AI, which works with very large amounts of data; a rough energy sketch appears after this list.
  • Dr. Hersam's lab is developing neuromorphic (brain-like) computing as a more energy-efficient alternative. The brain is more energy-efficient because memory and information processing are located in the same place, minimizing the need to move data.
  • Nanomaterials can be used to create reconfigurable, energy-efficient devices for AI. These devices can perform AI-based machine learning using roughly 100 times less power than a conventional digital computer.
  • This technology could allow AI to be performed directly on portable electronic devices, reducing the need for energy-hungry data centers.
  • The ultimate goal is to create sophisticated AI that can emulate cognitive function. This requires a diverse range of devices that can be dynamically reconfigured, similar to the heterogeneous subsystems of the brain.
  • While it's difficult to predict exactly when this technology will be widely deployed, Dr. Hersam is optimistic that it can be delivered in time to prevent the negative consequences of conventional AI's energy consumption.
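
For a rough sense of scale (back-of-the-envelope arithmetic, not a figure from the talk), 100 TWh per year works out to a continuous average draw of roughly 11 GW:

```latex
\frac{100\ \text{TWh/yr}}{8760\ \text{h/yr}} \approx 11.4\ \text{GW}
```

That is on the order of ten large (~1 GW) power plants running around the clock.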

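The following is a minimal sketch, not anything from Dr. Hersam's lab: it only illustrates why the data movement described above dominates the energy budget. The per-operation energies are approximate, widely cited order-of-magnitude estimates for an older (~45 nm) process, the function names are made up, and the "in-memory" case is an idealization in which weights never leave the place where they are used.

```python
# Illustrative only: compare the energy of a classic fetch-then-multiply
# (von Neumann) workload with an idealized in-memory / neuromorphic one.
# The constants are rough order-of-magnitude figures; real hardware varies
# enormously.

DRAM_READ_PJ = 640.0  # approx. energy (picojoules) to read a 32-bit word from off-chip DRAM
FP_MULT_PJ = 3.7      # approx. energy (picojoules) for one 32-bit floating-point multiply


def von_neumann_energy_pj(num_weights: int) -> float:
    """Each weight is fetched from DRAM, then multiplied."""
    return num_weights * (DRAM_READ_PJ + FP_MULT_PJ)


def in_memory_energy_pj(num_weights: int) -> float:
    """Idealized case: weights stay where the computation happens,
    so only the multiply-like operation is paid."""
    return num_weights * FP_MULT_PJ


if __name__ == "__main__":
    n = 1_000_000  # a hypothetical million-weight layer
    vn = von_neumann_energy_pj(n)
    im = in_memory_energy_pj(n)
    print(f"von Neumann : {vn / 1e6:.1f} microjoules")
    print(f"in-memory   : {im / 1e6:.1f} microjoules")
    print(f"ratio       : {vn / im:.0f}x less energy when data does not move")
```

With these assumed numbers the ratio comes out well above 100x, but the exact figure is not the point; the point is that memory traffic, not arithmetic, dominates the energy bill, which is why co-locating memory and processing pays off.
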
via AI uses too much energy—nanotech is the solution | Dr. Mark Hersam | TEDxChicago