How Brain Cells Could Inspire Self-Repairing AI and Machine Learning Hardware


According to a team of Penn State researchers, the function of a brain cell called an astrocyte can be emulated in the physics of hardware devices, which may yield artificial intelligence (AI) and machine learning systems that self-repair on their own and consume far less energy than today's technologies.

Now, what are astrocytes? Astrocytes are star-shaped glial cells that act as support cells for neurons in the brain. They play a major role in brain functions such as learning, memory, self-repair, and synchronization.

Abhronil Sengupta, an assistant professor of electrical engineering and computer science, said the project stemmed from recent observations in computational neuroscience: much effort has gone into understanding how the brain works, and researchers have been revising the simplistic model of neuron-synapse connections. “It turns out there is a third component in the brain, but its role in machine learning and neuroscience has kind of been overlooked.”

AI and machine learning are in great demand these days. According to a recent report, demand for AI and machine learning skills is projected to grow at a compound rate of 71% through 2025. But alongside that boom come challenges: as the use of AI and machine learning grows, so does the energy these systems consume.

“An often-underestimated issue of AI and machine learning is the amount of power consumption of these systems,” says Sengupta. A few years ago, he notes, IBM tried to simulate the brain activity of a cat and ended up consuming a few megawatts of power. Extrapolating that figure to simulating the brain activity of a human being on today's best supercomputer, the power consumption would be even higher, he adds.

This power usage comes from the complex interplay of semiconductors, switches, and electrical and mechanical processes involved in computer processing, and it increases as workloads become as complex as modern AI and machine learning demand. One proposed solution is neuromorphic computing, which imitates brain functions. Many researchers are pursuing neuromorphic computing because the human brain has evolved to use far less energy for its processes than a computer does, so imitating those basic functions could make AI and machine learning much more energy-efficient.

Another key brain function with potential for neuromorphic computing is the brain's ability to self-repair damaged neurons and synapses, a process in which astrocytes play a vital role. New prototype neuromorphic hardware tends to have many hardware-level faults. Sengupta says that insights from computational neuroscience about how astrocyte glial cells help the brain self-repair can be applied to make neuromorphic hardware repair such faults on its own.
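To give a feel for the idea, here is a minimal toy sketch (an illustration only, not the team's actual model or hardware): an "astrocyte" monitor detects faulty neurons in a layer and scales up the healthy ones so the layer's overall activity is preserved, loosely mimicking astrocyte-mediated synaptic upscaling. All function and variable names are made up for this example.

```python
def layer_output(weights, inputs, faulty):
    # A faulty neuron contributes nothing to the layer's activity.
    return [0.0 if i in faulty else w * x
            for i, (w, x) in enumerate(zip(weights, inputs))]

def astrocyte_repair(weights, faulty):
    # Redistribute the lost contribution uniformly across healthy
    # neurons, a crude stand-in for astrocyte-driven self-repair.
    healthy = [i for i in range(len(weights)) if i not in faulty]
    lost = sum(weights[i] for i in faulty)
    boost = lost / len(healthy)
    return [w + boost if i in healthy else w
            for i, w in enumerate(weights)]

weights = [1.0, 1.0, 1.0, 1.0]
inputs = [1.0, 1.0, 1.0, 1.0]
faulty = {3}  # neuron 3 has a hardware fault

before = sum(layer_output(weights, inputs, faulty))    # activity drops to 3.0
repaired = astrocyte_repair(weights, faulty)
after = sum(layer_output(repaired, inputs, faulty))    # restored to 4.0
```

In real neuromorphic hardware the repair signal would come from device physics rather than explicit bookkeeping, but the principle is the same: compensate for faults locally instead of halting the system.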

Sengupta's lab works with spintronic devices, a form of electronics that processes information using the spin of electrons. The research was part of a study published in January in Frontiers in Neuroscience. Temporal information binding is how the brain makes sense of the relations between separate events that happen at different times and orders them into a sequence, a function that is also significant for AI and machine learning.

To understand how this might be achieved, the researchers built neuroscience models of astrocytes to determine which aspects of astrocyte function were most relevant for their research. They then developed theoretical models of the potential spintronic devices.
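One of the astrocyte functions the article highlights is synchronization. A very rough sketch of that idea (an assumption for illustration, not the study's model) is a population of neuron-like oscillators pulled into phase by a shared, astrocyte-like global signal; synchrony of this kind is one way separate events can be bound together in time:

```python
import math

def simulate(phases, coupling, steps=2000, dt=0.01):
    # Each oscillator drifts toward the population's mean phase,
    # standing in for a shared astrocyte-like coupling signal.
    n = len(phases)
    for _ in range(steps):
        mean_x = sum(math.cos(p) for p in phases) / n
        mean_y = sum(math.sin(p) for p in phases) / n
        mean_phase = math.atan2(mean_y, mean_x)
        phases = [p + dt * coupling * math.sin(mean_phase - p)
                  for p in phases]
    return phases

def coherence(phases):
    # Order parameter: 1.0 means the oscillators are fully in phase.
    n = len(phases)
    x = sum(math.cos(p) for p in phases) / n
    y = sum(math.sin(p) for p in phases) / n
    return math.hypot(x, y)

start = [0.0, 1.0, 2.0, 3.0]   # initially out of phase
synced = simulate(start, coupling=2.0)
```

Running this, `coherence(start)` is well below 1 while `coherence(synced)` approaches 1, showing the desynchronized population locking together under the shared signal.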

“We needed to understand the device physics, and that involved a lot of theoretical modeling of the devices. Then we looked into how we could develop an end-to-end, cross-disciplinary modeling framework including everything from neuroscience models to algorithms to device physics,” says Sengupta. The research opens the door to more significant AI and machine learning work on power-constrained devices such as smartphones.

“AI and machine learning are revolutionizing the world around us every day; you see it from your smartphone recognizing pictures of your friends and family to machine learning's huge impact on medical diagnosis for different kinds of diseases,” Sengupta said. “At the same time, studying astrocytes for the type of self-repair and synchronization functionalities they can enable in neuromorphic computing is really in its infancy. There's a lot of potential opportunities with these kinds of components.”
