One of the biggest challenges facing artificial intelligence is the interplay between computer memory and processing power. While an algorithm runs, data shuttles rapidly between these two components, and because AI models rely on massive amounts of data, that traffic creates bottlenecks.
A paper published Monday in the journal Frontiers in Science by researchers at Purdue University and the Georgia Institute of Technology proposes a new approach to building computing architectures for artificial intelligence models using brain-inspired algorithms. According to the researchers, designing algorithms this way could reduce the energy costs associated with AI models.
“Over the past four years, the size of language-processing models has increased 5,000-fold,” Kaushik Roy, lead author of the study and a professor of computer science at Purdue University, said in a statement. “With such incredibly rapid expansion, it is important to make AI as efficient as possible. That means fundamentally rethinking how computers are designed.”
Most modern computers are based on a design first described in 1945, the von Neumann architecture, which separates processing and memory. That separation is where the slowdown occurs: as more people around the world rely on data-intensive artificial intelligence models, the mismatch between computer processing power and memory performance may become more significant.
IBM researchers drew attention to this problem in a post earlier this year. Computer engineers call it the “memory wall.”
Breaking the memory wall
The memory wall refers to the gap between memory and processing speeds: in essence, a computer’s memory struggles to keep up with its processor. This is not a new problem; two University of Virginia researchers coined the term back in the 1990s.
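To see the gap in practice, here is a minimal Python sketch (our illustration, not from the study) that times a memory-bound operation against a compute-bound one on the same machine. The array sizes, matrix dimension, and repetition count are arbitrary choices:

```python
import time
import numpy as np

# Memory-bound workload: an elementwise add over arrays far larger than
# any CPU cache. One floating-point op per element, but every operand
# must be streamed in from DRAM, so memory bandwidth sets the speed.
n = 20_000_000                      # ~160 MB per float64 array
a, b = np.random.rand(n), np.random.rand(n)
t0 = time.perf_counter()
c = a + b
t_mem = time.perf_counter() - t0

# Compute-bound workload: a small matrix multiply whose operands fit
# in cache, repeated many times. The data barely moves; the ALUs do.
m = np.random.rand(256, 256)
reps = 50
t0 = time.perf_counter()
for _ in range(reps):
    np.dot(m, m)
t_cpu = time.perf_counter() - t0

print(f"streamed add : {n / t_mem / 1e9:6.2f} GFLOP/s (memory-bound)")
print(f"cached matmul: {reps * 2 * 256**3 / t_cpu / 1e9:6.2f} GFLOP/s (compute-bound)")
```

On typical hardware, the cached matmul sustains far more arithmetic per second because its operands stay on-chip, while the streamed add spends most of its time waiting on DRAM: the memory wall in miniature.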
But as AI becomes more widespread, memory constraints are costing the computers that run AI models ever more time and energy. In the paper, the researchers argue for experimenting with new computer architectures that integrate memory and processing.
The AI algorithms described in the paper, known as spiking neural networks, are inspired by the way our brains work. In the past, these algorithms were often criticized as slow and inaccurate, but some computer scientists say they have shown significant improvement over the last few years.
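The energy appeal comes from the fact that a spiking neuron only “fires” when its accumulated input crosses a threshold, so it stays silent, and does no downstream work, most of the time. Here is a minimal sketch (ours, not code from the study) of a leaky integrate-and-fire neuron, one common building block of spiking neural networks; the threshold and leak values are purely illustrative:

```python
import numpy as np

def simulate_lif(input_current, threshold=1.0, leak=0.9):
    """Simulate a single leaky integrate-and-fire (LIF) neuron.

    The membrane potential accumulates input and decays ("leaks") each
    step; whenever it crosses the threshold the neuron emits a binary
    spike and resets. Parameter values here are illustrative only.
    """
    potential = 0.0
    spikes = []
    for current in input_current:
        potential = leak * potential + current  # integrate with leak
        if potential >= threshold:
            spikes.append(1)   # fire a spike...
            potential = 0.0    # ...and reset the membrane potential
        else:
            spikes.append(0)   # stay silent: no spike, no downstream work
    return np.array(spikes)

# A weak, noisy input produces only occasional spikes: activity (and
# hence energy use) is event-driven rather than continuous.
rng = np.random.default_rng(0)
print(simulate_lif(rng.uniform(0.0, 0.4, size=20)))
```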
The researchers propose pairing AI models based on spiking neural networks (SNNs) with compute-in-memory (CIM) hardware, a concept that is still relatively new in the field of artificial intelligence.
“CIM provides a promising solution to the memory wall problem by integrating computing capabilities directly into the memory system,” the authors write in the study abstract.
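One common way to realize CIM is a resistive crossbar: a network’s weights are stored as device conductances inside a memory array, and applying input voltages to the rows makes each column’s output current equal to a dot product, so the matrix-vector multiply happens where the weights live instead of shuttling them to a processor. The toy simulation below illustrates that general idea (our sketch, not the authors’ design; the matrix G and vector v are made-up values):

```python
import numpy as np

# A crossbar stores the weight matrix as one conductance per
# cross-point. Ohm's law at each cross-point (I = G * V) plus
# Kirchhoff's current law along each column means the column currents
# form a matrix-vector product, computed inside the memory array.

rng = np.random.default_rng(1)
G = rng.uniform(0.0, 1.0, size=(4, 3))   # conductances = stored weights
v = rng.uniform(0.0, 1.0, size=4)        # input voltages on the rows

column_currents = G.T @ v                # the in-place multiply-accumulate
print(column_currents)
```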
Medical devices, transportation systems, and drones are among the areas the researchers believe could be improved by integrating computer processing and memory into a single system.
“AI is one of the most transformative technologies of the 21st century,” study co-author Tanvi Sharma, a Purdue University researcher, said in a statement. “But taking AI out of data centers and into the real world requires significant reductions in energy consumption.”
“With less data transfer and more efficient processing, artificial intelligence can be put into cheaper devices with longer battery life,” Sharma said.
