Cool Multiplying Matrices Faster Than Coppersmith-Winograd Ideas


Coppersmith and Winograd used dense sets of integers with no three-term arithmetic progressions (Salem and Spencer [1942]) to get an algorithm with running time ≈ O(n^2.376), published as "Matrix multiplication via arithmetic progressions".

[Image: Coppersmith–Winograd algorithm, from Semantic Scholar (www.semanticscholar.org)]

Also see the references section, which lists some pointers to even faster methods. The algorithm appeared in the Journal of Symbolic Computation [1990]. These methods are faster than Strassen's earlier O(n^2.81) algorithm.

Quoting Directly From Their 1990 Paper.


The Coppersmith–Winograd algorithm relies on a certain identity which we call the Coppersmith–Winograd identity. The conceptual starting point is the same as Strassen's: treating the input n×n matrices as block 2 × 2 matrices, one multiplies the blocks recursively using fewer block products than the obvious method. As of April 2014 the asymptotically fastest algorithm runs in \mathcal{O}(n^{2.3728639}) time [1].

But the algorithm is not very practical, so I recommend either naive multiplication, which runs in \mathcal{O}(n^3), or Strassen's algorithm [2], which runs in \mathcal{O}(n^{2.8}).
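For reference, naive multiplication is just three nested loops. The sketch below is an illustrative implementation (the function name `naive_matmul` is mine, not from the text); in practice you would call an optimized BLAS routine, which is also \mathcal{O}(n^3) but with far better constants.

```python
# Naive O(n^3) matrix multiplication over lists of lists.
def naive_matmul(A, B):
    n, m, p = len(A), len(B), len(B[0])
    assert all(len(row) == m for row in A), "inner dimensions must match"
    C = [[0] * p for _ in range(n)]
    for i in range(n):
        for k in range(m):      # i-k-j loop order walks B's rows contiguously
            a = A[i][k]
            for j in range(p):
                C[i][j] += a * B[k][j]
    return C

# naive_matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]) -> [[19, 22], [43, 50]]
```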


You are unlikely to find an implementation, as the algorithm is not practical; it's unlikely anyone has bothered to actually code it. (The 2.3728639 bound is from Le Gall, Proc. International Symposium on Symbolic and Algebraic Computation, 2014.)

The Conceptual Idea Of These Algorithms Is Similar To Strassen's Algorithm:


Pan presents some algorithms for matrix multiplication for which the exponent ω is below Strassen's 2.81.
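The shared template — split each matrix into 2 × 2 blocks and recurse with seven block products instead of eight — can be sketched as follows. This is a minimal illustrative Strassen implementation assuming n is a power of two (the function names are mine):

```python
def strassen(A, B):
    """Strassen's O(n^2.81) multiplication for n x n matrices, n a power of 2."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    h = n // 2
    # Split a matrix into its four h x h blocks.
    def split(M):
        return ([row[:h] for row in M[:h]], [row[h:] for row in M[:h]],
                [row[:h] for row in M[h:]], [row[h:] for row in M[h:]])
    def add(X, Y): return [[x + y for x, y in zip(r, s)] for r, s in zip(X, Y)]
    def sub(X, Y): return [[x - y for x, y in zip(r, s)] for r, s in zip(X, Y)]
    A11, A12, A21, A22 = split(A)
    B11, B12, B21, B22 = split(B)
    # Seven recursive products instead of the obvious eight.
    M1 = strassen(add(A11, A22), add(B11, B22))
    M2 = strassen(add(A21, A22), B11)
    M3 = strassen(A11, sub(B12, B22))
    M4 = strassen(A22, sub(B21, B11))
    M5 = strassen(add(A11, A12), B22)
    M6 = strassen(sub(A21, A11), add(B11, B12))
    M7 = strassen(sub(A12, A22), add(B21, B22))
    # Recombine into the four result blocks.
    C11 = add(sub(add(M1, M4), M5), M7)
    C12 = add(M3, M5)
    C21 = add(M2, M4)
    C22 = add(sub(add(M1, M3), M2), M6)
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bot = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bot
```

Seven multiplications per level instead of eight is exactly what drives the exponent down from 3 to log2(7) ≈ 2.81; Coppersmith–Winograd-style algorithms push the same trade much further with far more elaborate identities.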

Salem–Spencer Sets [1942] Are Used To Get An Algorithm With Running Time ≈ O(n^{2.376}).


Le Gall, Powers of tensors and fast matrix multiplication, Proc. ISSAC 2014. Vassilevska Williams, Multiplying matrices faster than Coppersmith–Winograd, Proceedings of the 44th Symposium on Theory of Computing (STOC), 2012.

As It Can Multiply Two (n × n) Matrices In O(n^{2.375477}) Time.


However, you can do much better for certain kinds of matrices.
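As one illustration of this point (my example, not from the text): if one factor is diagonal, the product needs only \mathcal{O}(n^2) work, since row i of the result is just d[i] times row i of the other matrix.

```python
# Multiply diag(d) @ B in O(n^2) instead of O(n^3):
# scale each row of B by the corresponding diagonal entry.
def diag_matmul(d, B):
    return [[d[i] * x for x in row] for i, row in enumerate(B)]

# diag_matmul([2, 3], [[1, 2], [3, 4]]) -> [[2, 4], [9, 12]]
```

Similar savings apply to sparse, banded, and other structured matrices, where specialized routines beat any general dense algorithm.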