DeepMind breaks 50-year math record using AI

AlphaTensor discovers better algorithms for matrix math, inspiring another improvement from afar.

In 1969, the German mathematician Volker Strassen discovered what was, until now, the best-known algorithm for multiplying 4×4 matrices, one that reduces the number of multiplication steps the calculation requires. Multiplying two 4×4 matrices with the traditional schoolroom method takes 64 multiplications, while Strassen's algorithm can perform the same feat in 49 multiplications.
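To make those counts concrete, here is a minimal sketch in Python (not DeepMind's code) of Strassen's trick: seven block multiplications replace the usual eight for a 2×2 split, and applying the trick recursively to a 4×4 matrix yields 7 × 7 = 49 scalar multiplications instead of 64. The helper names and the multiplication counter are illustrative choices, not anything from the paper.

```python
def strassen(A, B, count):
    """Multiply square matrices A and B (lists of lists, size a power of 2),
    tallying scalar multiplications in count[0]."""
    n = len(A)
    if n == 1:
        count[0] += 1
        return [[A[0][0] * B[0][0]]]

    h = n // 2
    # Split each matrix into four quadrants and define elementwise helpers.
    def quad(M, r, c):
        return [row[c:c + h] for row in M[r:r + h]]
    def add(X, Y):
        return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]
    def sub(X, Y):
        return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

    A11, A12, A21, A22 = quad(A, 0, 0), quad(A, 0, h), quad(A, h, 0), quad(A, h, h)
    B11, B12, B21, B22 = quad(B, 0, 0), quad(B, 0, h), quad(B, h, 0), quad(B, h, h)

    # Strassen's seven block products (instead of the usual eight).
    M1 = strassen(add(A11, A22), add(B11, B22), count)
    M2 = strassen(add(A21, A22), B11, count)
    M3 = strassen(A11, sub(B12, B22), count)
    M4 = strassen(A22, sub(B21, B11), count)
    M5 = strassen(add(A11, A12), B22, count)
    M6 = strassen(sub(A21, A11), add(B11, B12), count)
    M7 = strassen(sub(A12, A22), add(B21, B22), count)

    # Recombine the products into the four quadrants of the result.
    C11 = add(sub(add(M1, M4), M5), M7)
    C12 = add(M3, M5)
    C21 = add(M2, M4)
    C22 = add(add(sub(M1, M2), M3), M6)

    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bottom = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bottom

count = [0]
A = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12], [13, 14, 15, 16]]
I = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
C = strassen(A, I, count)
print(C == A)      # True: matches the ordinary product
print(count[0])    # 49 scalar multiplications for a 4x4 product
```

Running the sketch prints 49, the count AlphaTensor improved on.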

Using a neural network called AlphaTensor, DeepMind discovered a way to reduce that count to 47 multiplications (working in modular arithmetic), and its researchers published the result in Nature.

Going from 49 steps to 47 doesn’t sound like much, but when you consider how many trillions of matrix calculations take place in a GPU every day, even incremental improvements can translate into large efficiency gains, allowing AI applications to run more quickly on existing hardware.

Credit: nature.com, arstechnica.com
