Engineers develop a brain-like computer using magnets

University of Texas at Dallas engineers have created a neuromorphic computer prototype that mimics human brain learning, using magnetic components to achieve dramatically improved energy efficiency compared to traditional artificial intelligence systems. The breakthrough, published in August in Nature Communications Engineering, represents a significant step toward bringing AI processing directly to smartphones and other mobile devices.

Revolutionary Magnetic Brain Technology

Dr. Joseph S. Friedman and his team at UT Dallas developed the prototype using magnetic tunnel junctions (MTJs), nanoscale devices that function like biological synapses by strengthening or weakening connections based on activity patterns. Unlike conventional computers that separate memory and processing, this neuromorphic system integrates both functions, mimicking how the human brain operates.
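To make the idea concrete, here is a minimal toy sketch (in Python, not taken from the paper) of an array of binary, MTJ-like synapses: each weight sits in one of two states, and the same array that stores the weights also performs the weighted summation, so storage and computation are not separated. All names, sizes, and thresholds are illustrative assumptions, not the UT Dallas design.

```python
import numpy as np

# Toy model: each synapse is an MTJ approximated as a binary weight,
# either a low-resistance (1) or high-resistance (0) state. The array
# that holds the weights also carries out the weighted summation.
rng = np.random.default_rng(0)

n_inputs, n_neurons = 8, 4
weights = rng.integers(0, 2, size=(n_inputs, n_neurons))  # one MTJ per connection

def forward(spikes: np.ndarray) -> np.ndarray:
    """Accumulate input spikes through the binary synapse array and
    fire any output neuron whose summed current crosses a threshold."""
    currents = spikes @ weights          # in-array weighted sum
    threshold = n_inputs // 2            # illustrative firing threshold
    return (currents >= threshold).astype(int)

if __name__ == "__main__":
    spikes = rng.integers(0, 2, size=n_inputs)
    print("input spikes :", spikes)
    print("output spikes:", forward(spikes))
```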

"Our work shows a potential new path for building brain-inspired computers that can learn on their own," said Friedman, associate professor of electrical and computer engineering. "Since neuromorphic computers do not need massive amounts of training computations, they could power smart devices without huge energy costs."​

The prototype demonstrated remarkable efficiency in testing, achieving over 600 trillion operations per second per watt—more than six times the efficiency of existing memristor systems and thousands of times more efficient than standard graphics processors. When tested on handwritten digit recognition using the MNIST dataset, the system achieved 90% accuracy while consuming significantly less power than traditional AI hardware.

Industry Collaboration and Federal Support

The research emerged from a collaboration with industry partners Everspin Technologies and Texas Instruments, positioning the technology for potential commercial applications. Dr. Sanjeev Aggarwal, president and CEO of Everspin Technologies, co-authored the study alongside Friedman.

The project gained additional momentum in September 2025 when the U.S. Department of Energy awarded Friedman a $498,730 two-year grant to advance the neuromorphic computing research. This federal investment underscores growing government interest in energy-efficient computing alternatives as AI systems consume ever-larger amounts of electricity.

The research builds on Hebbian learning principles, often summarized as "neurons that fire together, wire together," allowing the artificial system to adapt and learn from experience rather than requiring extensive pre-training. The MTJ devices' binary switching capabilities provide reliability advantages over previous neuromorphic approaches that suffered from stability issues.
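The Hebbian rule itself can be sketched in a few lines. The update below is an illustrative assumption rather than the published circuit: a synapse whose input and output neurons fire at the same time has some probability of switching to its strong binary state, while other synapses occasionally decay back to the weak state. The switching probabilities are made-up parameters for the sketch.

```python
import numpy as np

# Hedged sketch of Hebbian learning with binary, MTJ-like synapses.
rng = np.random.default_rng(1)
n_pre, n_post = 6, 3
weights = rng.integers(0, 2, size=(n_pre, n_post))  # binary synapse states

P_SET = 0.3     # assumed chance of strengthening when both neurons fire
P_RESET = 0.05  # assumed chance of weakening otherwise

def hebbian_step(pre: np.ndarray, post: np.ndarray) -> None:
    """'Neurons that fire together, wire together': a synapse whose pre-
    and post-synaptic neurons are both active may switch to the strong
    (1) state; other synapses occasionally decay to the weak (0) state."""
    coactive = np.outer(pre, post).astype(bool)   # which pairs fired together
    flips = rng.random(weights.shape)             # random switching events
    weights[coactive & (flips < P_SET)] = 1
    weights[~coactive & (flips < P_RESET)] = 0

if __name__ == "__main__":
    for _ in range(5):
        pre = rng.integers(0, 2, size=n_pre)
        post = rng.integers(0, 2, size=n_post)
        hebbian_step(pre, post)
    print(weights)
```

Because every update is a simple binary switch driven by local activity, this style of learning avoids the large, centralized training computations the article contrasts it with.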

According to recent analyses, training large language models can consume as much electricity as 300 homes use in an entire year, while the human brain processes complex information using roughly the same power as a 20-watt light bulb. Neuromorphic chips could reduce energy consumption by up to 80% compared to conventional AI systems, making them particularly valuable for edge computing applications where power efficiency is critical.