The high energy consumption of training artificial neural networks is one of the biggest hurdles to the broad use of Artificial Intelligence (AI), especially in mobile applications. One approach to solving this problem can be gleaned from knowledge about the human brain.
Although the brain has computing power comparable to a supercomputer, it needs only about 20 watts, roughly a millionth of a supercomputer's energy consumption.
One of the reasons for this is the efficient transfer of information between neurons in the brain. Neurons send short electrical impulses (spikes) to other neurons—but, to save energy, only as often as absolutely necessary.
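This sparse, event-driven communication is the idea behind spiking neuron models. The following is a minimal sketch of a leaky integrate-and-fire neuron, a standard simplified model (not taken from the article itself); all function and parameter names here are illustrative assumptions. The neuron's membrane potential slowly leaks away, accumulates incoming current, and emits a spike only when a threshold is crossed, so output is produced only when enough input has arrived.

```python
import numpy as np

def simulate_lif(input_current, dt=1.0, tau=20.0,
                 v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Illustrative leaky integrate-and-fire neuron (names and values are assumptions)."""
    v = v_rest
    spike_times = []
    for step, current in enumerate(input_current):
        # Membrane potential leaks toward rest and integrates the input drive
        v += dt / tau * (v_rest - v) + dt * current
        if v >= v_thresh:
            spike_times.append(step)  # emit a short impulse (spike) only now
            v = v_reset               # reset after spiking
    return spike_times

# Weak, noisy input: the neuron fires only occasionally, when enough input accumulates
rng = np.random.default_rng(0)
spikes = simulate_lif(rng.uniform(0.0, 0.1, size=200))
print(f"{len(spikes)} spikes in 200 time steps, at steps {spikes}")
```

Because the neuron stays silent most of the time and communicates only through rare spikes, far less information has to be transmitted than in a conventional artificial neuron that outputs a value at every step, which is the energy-saving principle the article describes.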