Researchers put a new spin on machine learning


Wednesday, 10 January, 2024

Researchers at Tohoku University have demonstrated a proof of concept of an energy-efficient computer compatible with current AI. It utilises the stochastic behaviour of nanoscale spintronics devices and is suitable for probabilistic computation problems such as inference and sampling.

With the slowing down of Moore’s Law, there has been an increasing demand for domain-specific hardware. Probabilistic computers with naturally stochastic building blocks (probabilistic bits, or p-bits) are a representative example, owing to their potential to efficiently address various computationally hard tasks in machine learning (ML) and artificial intelligence (AI). Much as quantum computers are suited to inherently quantum problems, room-temperature probabilistic computers are suited to intrinsically probabilistic algorithms, which are used to train machines and to tackle computationally hard problems in optimisation and sampling.
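As background on how a p-bit behaves, the sketch below emulates the update rule commonly used in the p-bit literature, m = sgn(tanh(I) − r) with r drawn uniformly from [−1, 1]; a software random-number generator stands in for the sMTJ's thermal fluctuations, and the specific inputs are illustrative assumptions rather than details of the Tohoku device.

```python
import numpy as np

def p_bit(input_current: float, rng: np.random.Generator) -> int:
    """Single p-bit: returns +1 or -1 with a bias set by tanh of its input.

    Implements the standard p-bit relation m = sgn(tanh(I) - r), r ~ U(-1, 1).
    Here a pseudo-random generator emulates the sMTJ's thermal randomness
    (an assumption for illustration, not the hardware mechanism itself).
    """
    r = rng.uniform(-1.0, 1.0)
    return 1 if np.tanh(input_current) > r else -1

rng = np.random.default_rng(0)
# Zero input gives an unbiased coin flip; a strong positive input biases
# the p-bit towards +1 without ever making it fully deterministic.
print(sum(p_bit(0.0, rng) for _ in range(1000)))  # near 0
print(sum(p_bit(2.0, rng) for _ in range(1000)))  # near +960 (tanh(2) ~ 0.96)
```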

The researchers showed that robust and fully asynchronous (clockless) probabilistic computers can be realised at scale by using a probabilistic spintronic device, the stochastic magnetic tunnel junction (sMTJ), interfaced with powerful field-programmable gate arrays (FPGAs). Until now, sMTJ-based probabilistic computers have only been capable of implementing recurrent neural networks.
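For context on the recurrent networks that earlier sMTJ-based demonstrations were limited to, the sketch below emulates a small, symmetrically coupled p-bit network in the style of a Boltzmann machine. The coupling matrix, biases and sequential sweep are illustrative assumptions; hardware p-bits would update asynchronously, without any such software loop.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical symmetric couplings (zero diagonal) and biases for a small
# recurrent p-bit network, i.e. a Boltzmann-machine-style model.
J = np.array([[0.0,  1.0, -0.5],
              [1.0,  0.0,  0.8],
              [-0.5, 0.8,  0.0]])
h = np.array([0.2, -0.1, 0.0])

m = np.where(rng.uniform(-1.0, 1.0, size=3) < 0.0, 1, -1)  # random start

# Gibbs-style sweeps: each p-bit reads the instantaneous state of the others
# and flips stochastically via m_i = sgn(tanh(J_i . m + h_i) - r). In hardware
# these updates happen asynchronously; the sequential loop is only a stand-in.
for _ in range(1000):
    for i in range(len(m)):
        current = J[i] @ m + h[i]
        m[i] = 1 if np.tanh(current) > rng.uniform(-1.0, 1.0) else -1

print("final state:", m)
```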

“As feedforward neural networks underpin most modern AI applications, augmenting probabilistic computers toward this direction should be a pivotal step to hit the market and enhance the computational capabilities of AI,” said Professor Kerem Camsari, the Principal Investigator of the project.

In the recent development, the researchers made two advances. First, they leveraged earlier work by the Tohoku University team on stochastic magnetic tunnel junctions at the device level to demonstrate the fastest p-bits yet at the circuit level, using in-plane sMTJs. Second, by enforcing an update order at the computing hardware level and leveraging layer-by-layer parallelism, they demonstrated the basic operation of a Bayesian network as an example of a feedforward stochastic neural network.
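To illustrate what layer-by-layer parallelism means in this context, the sketch below samples a small feedforward (Bayesian) network with p-bits: every node in a layer is drawn in parallel, but only after its parent layer has been fixed. The network shape, weights and biases are hypothetical; in the actual work this ordering is enforced by FPGA hardware driving sMTJ-based p-bits.

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_layer(parents, weights, biases):
    """Draw every p-bit in one layer in parallel, conditioned on its parents.

    Each node is +1 with probability (1 + tanh(I)) / 2, where I = W @ parents + b,
    mirroring the p-bit relation m = sgn(tanh(I) - r) with r ~ U(-1, 1).
    """
    currents = weights @ parents + biases
    r = rng.uniform(-1.0, 1.0, size=currents.shape)
    return np.where(np.tanh(currents) > r, 1, -1)

# Hypothetical 2 -> 3 -> 1 feedforward Bayesian network (weights are assumptions).
W1 = np.array([[0.8, -0.5], [0.3, 0.9], [-0.7, 0.2]])
b1 = np.zeros(3)
W2 = np.array([[0.6, 0.6, -0.4]])
b2 = np.array([0.1])

outputs = []
for _ in range(10_000):
    roots = np.where(rng.uniform(-1.0, 1.0, size=2) < 0.0, 1, -1)  # unbiased roots
    hidden = sample_layer(roots, W1, b1)   # sampled only after the roots are fixed
    out = sample_layer(hidden, W2, b2)     # sampled only after the hidden layer
    outputs.append(out[0])

print("P(output = +1) ≈", np.mean(np.array(outputs) == 1))
```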

Professor Shunsuke Fukami from Tohoku University said that while the current demonstrations are small-scale, their designs can be scaled up by making use of CMOS-compatible Magnetic RAM technology. This could enable advances in machine learning applications while unlocking the potential for efficient hardware realisation of deep/convolutional neural networks.


