Project

2D Neuromorphic Devices for Sustainable Artificial Intelligence

The energy consumption of artificial intelligence (AI) systems is projected to grow at an alarming rate over the next two decades. A recent study estimates that, at the current growth rate of AI systems, the total energy spent on computation will reach ~10^27 J by 2040, far more than the total energy that humans may be able to generate by then. Moreover, global information technology (IT), flourishing on the Internet of Things and artificial intelligence paradigms, is rapidly emerging as a major consumer of the world's primary electricity, generation of which remains the second-largest contributor to greenhouse gas emissions worldwide. Thus, there is a critical need to find solutions at the material, device, and architecture levels to reduce the energy consumption of computing hardware. One way forward is to replace traditional von Neumann computing hardware with technologies such as neuromorphic and stochastic computing, which are better suited to AI applications. Neuromorphic devices and architectures mimic the biological brain, an extremely energy-efficient neural network, so that memory and logic operations can be performed locally. The energy losses (and latency) associated with the billions of data retrieval and storage cycles in a neural network can thus be eliminated. Here, we are developing devices based on 2D magnetic materials to form the building blocks of neuromorphic and stochastic computing architectures. The use of correlated systems such as ferromagnets provides a path toward low-energy device switching, while the 2D nature of the materials allows ultimate scalability and tunability of the magnetic and electrical transport properties. Our benchmarking results show that a neural network built from our 2D neuromorphic devices can achieve a more than 10,000× reduction in energy compared to a CMOS-based implementation when performing machine learning tasks.
Thus, our technology can address the energy crisis of the computing industry, lead to a massive reduction in greenhouse gas emissions to help combat climate change, and enable environmentally sustainable "Green" AI.
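The energy argument above rests on performing the multiply-accumulate where the weights are stored. A minimal numerical sketch of this idea (not our device physics; the conductance values and sizes here are arbitrary illustrations) models a crossbar array in which each device's conductance encodes a synaptic weight, so that applied row voltages produce column currents equal to an analog matrix-vector product by Ohm's and Kirchhoff's laws:

```python
import numpy as np

# Hypothetical crossbar: G[i, j] is the conductance (synaptic weight) of the
# device connecting row i to column j. Applying voltages V on the rows gives
# column currents I = G^T V in one analog step, with no weight ever fetched
# from a separate memory.
rng = np.random.default_rng(0)
G = rng.uniform(0.0, 1.0, size=(4, 3))   # device conductances (arbitrary units)
V = np.array([0.2, 0.5, 0.1, 0.9])       # input voltages (activations)

I = G.T @ V                              # in-place analog multiply-accumulate

# A von Neumann machine computes the same result, but pays one memory
# read per weight -- the data-movement cost the text describes:
I_ref = np.zeros(3)
for j in range(3):
    for i in range(4):
        I_ref[j] += G[i, j] * V[i]       # fetch weight, then multiply-accumulate

assert np.allclose(I, I_ref)
```

Both paths yield the same currents; the difference is that the crossbar eliminates the per-weight retrieval cycles, which is where the projected energy (and latency) savings come from.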

Our project is highlighted at the MIT AI Hardware program.