“Researchers have developed a new type of memory device that they say could reduce the energy consumption of artificial intelligence (AI) by at least [a factor of] 1,000.

Called computational random-access memory (CRAM), the new device performs computations directly within its memory cells, eliminating the need to transfer data across different parts of a computer.

In traditional computing, data constantly moves between the processor (where data is processed) and the memory (where data is stored), which in most computers is the RAM module. This shuttling is particularly energy-intensive in AI applications, which typically involve complex computations on massive amounts of data…

In a peer-reviewed study published July 25 in the journal npj Unconventional Computing, researchers demonstrated that CRAM could perform key AI tasks like scalar addition and matrix multiplication in 434 nanoseconds, using just 0.47 microjoules of energy. That is roughly 2,500 times less energy than conventional memory systems with separate logic and memory components, the researchers said.”

From Live Science.
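As a quick sanity check on the figures quoted above: the 0.47-microjoule figure and the ~2,500× reduction are from the study as reported; the implied energy for a conventional system is derived here and is not stated in the excerpt.

```python
# Figures as reported in the quoted study (npj Unconventional Computing)
cram_energy_uj = 0.47    # CRAM energy for the benchmark task, in microjoules
savings_factor = 2500    # reported reduction vs. separate logic/memory systems

# Implied conventional-system energy (derived, not stated in the excerpt):
conventional_uj = cram_energy_uj * savings_factor
print(f"Implied conventional energy: {conventional_uj:.0f} microjoules")
# 0.47 µJ × 2,500 = 1,175 µJ, i.e. on the order of a millijoule per task
```

This back-of-the-envelope figure is consistent with the article's framing: the savings come from eliminating data transfer between separate logic and memory components, not from faster arithmetic itself.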