New Breakthrough Brings Matrix Multiplication Closer to Ideal – Quanta Magazine

That's where Strassen's laser method finally comes into play. The laser method typically works very well and "generally finds a good way to kill a subset of blocks to remove the overlap," Le Gall said. After the laser has eliminated, or "burned away," all the overlaps, you can construct the final product matrix, C.

Putting these various techniques together results in an algorithm for multiplying two matrices with a deliberately stingy number of multiplications overall, at least in theory. The laser method is not intended to be practical; it's just a way to think about the ideal way to multiply matrices. "We never run the method [on a computer]," Zhou said. "We analyze it."
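The idea of saving multiplications traces back to Strassen's original 1969 discovery, which the laser method builds on: a 2-by-2 matrix product can be computed with seven multiplications instead of the naive eight. A minimal sketch (the function name and list-of-lists representation are illustrative choices, not from the article):

```python
def strassen_2x2(A, B):
    """Multiply two 2x2 matrices (as nested lists) with Strassen's
    7 multiplications instead of the naive 8."""
    (a, b), (c, d) = A
    (e, f), (g, h) = B

    # Strassen's seven products
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)

    # Recombine into the product matrix C = A * B
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4,           m1 - m2 + m3 + m6]]
```

Applied recursively to larger matrices split into blocks, this trade of one multiplication for extra additions is what drives the exponent down from 3 to log2(7), about 2.81, and later refinements of the same block-saving idea push omega lower still.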

And that analysis is what led to the biggest improvement to omega in more than a decade.

Last summer's paper, by Duan, Zhou and Wu, showed that Strassen's process could still be sped up significantly. It was all because of a concept they called a hidden loss, buried deep within previous analyses, "a result of unintentionally killing too many blocks," Zhou said.

The laser method works by labeling blocks with overlaps as garbage, slated for disposal; other blocks are deemed worthy and will be saved. The selection process, however, is somewhat randomized. A block rated as garbage may, in fact, turn out to be useful after all. This wasn't a total surprise, but by examining many of these random choices, Duan's team determined that the laser method was systematically undervaluing blocks: More blocks should be saved and fewer thrown out. And, as is usually the case, less waste translates into greater efficiency.

"Being able to keep more blocks without overlap thus leads to a faster matrix multiplication algorithm," Le Gall said.

After proving the existence of this loss, Duan's team modified the way that the laser method labeled blocks, reducing the waste substantially. As a result, they set a new upper bound for omega at around 2.371866, an improvement over the previous upper bound of 2.3728596, set in 2020 by Josh Alman and Vassilevska Williams. That may seem like a modest change, lowering the bound by just about 0.001. But it's the single biggest improvement scientists have seen since 2010. Vassilevska Williams and Alman's 2020 result, by comparison, only improved upon its predecessor by 0.00001.

But what's most exciting for researchers isn't just the new record itself, which didn't last long. It's also the fact that the paper revealed a new avenue for improvement that, until then, had gone totally unnoticed. "For nearly four decades, everyone has been relying upon the same laser method," Le Gall said. "Then they found that, well, we can do better."

The January 2024 paper refined this new approach, enabling Vassilevska Williams, Zhou and their co-authors to further reduce the hidden loss. This led to an additional improvement of omega's upper bound, reducing it to 2.371552. The authors also generalized that same technique to improve the multiplication process for rectangular (n-by-m) matrices, a procedure that has applications in graph theory, machine learning and other areas.

Some further progress along these lines is all but certain, but there are limits. In 2015, Le Gall and two collaborators proved that the current approach, the laser method coupled with the Coppersmith-Winograd recipe, cannot yield an omega below 2.3078. To make any further improvements, Le Gall said, "you need to improve upon the original [approach] of Coppersmith and Winograd that has not really changed since 1987." But so far, nobody has come up with a better way. There may not even be one.

"Improving omega is actually part of understanding this problem," Zhou said. "If we can understand the problem well, then we can design better algorithms for it. [And] people are still in the very early stages of understanding this age-old problem."
