This section presents Strassen's remarkable recursive algorithm for multiplying n × n matrices, which runs in Θ(n^{lg 7}) = O(n^{2.81}) time. For sufficiently large n, therefore, it outperforms the naive Θ(n^{3}) matrix-multiplication algorithm MATRIX-MULTIPLY from Section 26.1.
Strassen's algorithm can be viewed as an application of a familiar design technique: divide and conquer. Suppose we wish to compute the product C = AB, where each of A, B, and C is an n × n matrix. Assuming that n is an exact power of 2, we divide each of A, B, and C into four n/2 × n/2 submatrices, rewriting the equation C = AB as follows:

( r  s )     ( a  b )   ( e  g )
( t  u )  =  ( c  d ) · ( f  h )     (9)
(Exercise 22 deals with the situation in which n is not an exact power of 2.) For convenience, the submatrices of A are labeled alphabetically from left to right, whereas those of B are labeled from top to bottom, in agreement with the way matrix multiplication is performed. Equation (9) corresponds to the four equations

r = ae + bf ,     (10)
s = ag + bh ,     (11)
t = ce + df ,     (12)
u = cg + dh .     (13)

Each of these four equations specifies two multiplications of n/2 × n/2 matrices and the addition of their n/2 × n/2 products. Using these equations to define a straightforward divide-and-conquer strategy, we derive the following recurrence for the time T(n) to multiply two n × n matrices:

T(n) = 8T(n/2) + Θ(n^{2}) .     (14)

Unfortunately, recurrence (14) has the solution T(n) = Θ(n^{3}), and thus this method is no faster than the ordinary one.
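The straightforward strategy just described can be sketched as follows (a minimal illustration with my own helper names, not the book's pseudocode; n is assumed to be an exact power of 2 and matrices are stored as lists of lists):

```python
def mat_add(X, Y):
    """Entrywise sum of two equally sized matrices."""
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def split(M):
    """Split an n x n matrix into its four n/2 x n/2 quadrants:
    (top-left, top-right, bottom-left, bottom-right)."""
    h = len(M) // 2
    return ([row[:h] for row in M[:h]], [row[h:] for row in M[:h]],
            [row[:h] for row in M[h:]], [row[h:] for row in M[h:]])

def join(r, s, t, u):
    """Reassemble quadrants r, s, t, u into one n x n matrix."""
    return [rr + sr for rr, sr in zip(r, s)] + [tr + ur for tr, ur in zip(t, u)]

def naive_dc_multiply(A, B):
    """Divide-and-conquer product using 8 recursive multiplications,
    so T(n) = 8T(n/2) + Theta(n^2) = Theta(n^3)."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    a, b, c, d = split(A)
    e, g, f, h = split(B)   # B's quadrants labeled top to bottom: e, f, g, h
    r = mat_add(naive_dc_multiply(a, e), naive_dc_multiply(b, f))
    s = mat_add(naive_dc_multiply(a, g), naive_dc_multiply(b, h))
    t = mat_add(naive_dc_multiply(c, e), naive_dc_multiply(d, f))
    u = mat_add(naive_dc_multiply(c, g), naive_dc_multiply(d, h))
    return join(r, s, t, u)
```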
Strassen discovered a different recursive approach that requires only 7 recursive multiplications of n/2 × n/2 matrices and Θ(n^{2}) scalar additions and subtractions, yielding the recurrence

T(n) = 7T(n/2) + Θ(n^{2}) .     (15)

Strassen's method has four steps:
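To see why this recurrence yields the claimed exponent, one can apply case 1 of the standard master theorem (a derivation I am supplying here, not text from the book):

```latex
% Case 1 of the master theorem applied to T(n) = 7T(n/2) + \Theta(n^2):
% here a = 7, b = 2, f(n) = \Theta(n^2), and
n^{\log_b a} = n^{\log_2 7} = n^{\lg 7} \approx n^{2.807}.
% Since f(n) = n^2 = O(n^{\lg 7 - \epsilon}) for any 0 < \epsilon \le \lg 7 - 2,
% case 1 applies, giving
T(n) = \Theta(n^{\lg 7}) = O(n^{2.81}).
```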
1. Divide the input matrices A and B into n/2 × n/2 submatrices, as in equation (9).
2. Using Θ(n^{2}) scalar additions and subtractions, compute 14 matrices A_{1}, B_{1}, A_{2}, B_{2}, . . . , A_{7}, B_{7}, each of which is n/2 × n/2.
3. Recursively compute the seven matrix products P_{i} = A_{i}B_{i} for i = 1, 2, . . . , 7.
4. Compute the desired submatrices r, s, t, u of the result matrix C by adding and/or subtracting various combinations of the P_{i} matrices, using only Θ(n^{2}) scalar additions and subtractions.
Such a procedure satisfies the recurrence (15). All that we have to do now is fill in the missing details.
It is not clear exactly how Strassen discovered the submatrix products that are the key to making his algorithm work. Here, we reconstruct one plausible discovery method.
Let us guess that each matrix product P_{i} can be written in the form

P_{i} = A_{i} · B_{i}
      = (α_{i1}a + α_{i2}b + α_{i3}c + α_{i4}d) · (β_{i1}e + β_{i2}f + β_{i3}g + β_{i4}h) ,     (16)

where the coefficients α_{ij}, β_{ij} are all drawn from the set {−1, 0, 1}. That is, we guess that each product is computed by adding or subtracting some of the submatrices of A, adding or subtracting some of the submatrices of B, and then multiplying the two results together. While more general strategies are possible, this simple one turns out to work.
If we form all of our products in this manner, then we can use this method recursively without assuming commutativity of multiplication, since each product has all of the A submatrices on the left and all of the B submatrices on the right. This property is essential for the recursive application of this method, since matrix multiplication is not commutative.
For convenience, we shall use 4 × 4 matrices to represent linear combinations of products of submatrices, where each product combines one submatrix of A (indexing the rows) with one submatrix of B (indexing the columns), as in equation (16). For example, we can rewrite equation (10) as

          e  f  g  h
      a ( +  ·  ·  · )
r  =  b ( ·  +  ·  · )
      c ( ·  ·  ·  · )
      d ( ·  ·  ·  · )

The last expression uses an abbreviated notation in which '+' represents +1, '·' represents 0, and '−' represents −1. (From here on, we omit the row and column labels.) Using this notation, we have the following equations for the other submatrices of the result matrix C:

      ( ·  ·  +  · )
s  =  ( ·  ·  ·  + )
      ( ·  ·  ·  · )
      ( ·  ·  ·  · )

      ( ·  ·  ·  · )
t  =  ( ·  ·  ·  · )
      ( +  ·  ·  · )
      ( ·  +  ·  · )

      ( ·  ·  ·  · )
u  =  ( ·  ·  ·  · )
      ( ·  ·  +  · )
      ( ·  ·  ·  + )
We begin our search for a faster matrix-multiplication algorithm by observing that the submatrix s can be computed as s = P_{1} + P_{2}, where P_{1} and P_{2} are computed using one matrix multiplication each:

P_{1} = A_{1} · B_{1} = a · (g − h) = ag − ah ,
P_{2} = A_{2} · B_{2} = (a + b) · h = ah + bh .
The matrix t can be computed in a similar manner as t = P_{3} + P_{4}, where

P_{3} = A_{3} · B_{3} = (c + d) · e = ce + de ,
P_{4} = A_{4} · B_{4} = d · (f − e) = df − de .
Let us define an essential term to be one of the eight terms appearing on the right-hand side of one of the equations (10)-(13). We have now used 4 products to compute the two submatrices s and t, whose essential terms are ag, bh, ce, and df. Note that P_{1} computes the essential term ag, P_{2} computes the essential term bh, P_{3} computes the essential term ce, and P_{4} computes the essential term df. Thus, it remains for us to compute the remaining two submatrices r and u, whose essential terms are the diagonal terms ae, bf, cg, and dh, without using more than 3 additional products. We now try the innovation P_{5} in order to compute two essential terms at once:

P_{5} = A_{5} · B_{5} = (a + d) · (e + h) = ae + ah + de + dh .
In addition to computing both of the essential terms ae and dh, P_{5} computes the inessential terms ah and de, which need to be cancelled somehow. We can use P_{4} and P_{2} to cancel them, but two other inessential terms then appear:

P_{5} + P_{4} − P_{2} = ae + dh + df − bh .
By adding an additional product

P_{6} = A_{6} · B_{6} = (b − d) · (f + h) = bf + bh − df − dh ,

however, we obtain

r = P_{5} + P_{4} − P_{2} + P_{6} = ae + bf .
We can obtain u in a similar manner from P_{5} by using P_{1} and P_{3} to move the inessential terms of P_{5} in a different direction:

P_{5} + P_{1} − P_{3} = ae + dh + ag − ce .
By subtracting an additional product

P_{7} = A_{7} · B_{7} = (a − c) · (e + g) = ae + ag − ce − cg ,

we now obtain

u = P_{5} + P_{1} − P_{3} − P_{7} = cg + dh .
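As a sanity check (mine, not the text's), the seven products can be verified on scalar entries, since every identity above is a term-by-term polynomial identity in a, . . . , h:

```python
import random

def strassen_2x2(a, b, c, d, e, f, g, h):
    """Scalar form of the seven products, with A = [[a, b], [c, d]] and
    B = [[e, g], [f, h]] (B's entries labeled from top to bottom)."""
    p1 = a * (g - h)           # ag - ah
    p2 = (a + b) * h           # ah + bh
    p3 = (c + d) * e           # ce + de
    p4 = d * (f - e)           # df - de
    p5 = (a + d) * (e + h)     # ae + ah + de + dh
    p6 = (b - d) * (f + h)     # bf + bh - df - dh
    p7 = (a - c) * (e + g)     # ae + ag - ce - cg
    r = p5 + p4 - p2 + p6      # = ae + bf
    s = p1 + p2                # = ag + bh
    t = p3 + p4                # = ce + df
    u = p5 + p1 - p3 - p7      # = cg + dh
    return r, s, t, u

# Compare against equations (10)-(13) on random integer inputs.
random.seed(1)
for _ in range(100):
    a, b, c, d, e, f, g, h = (random.randint(-9, 9) for _ in range(8))
    r, s, t, u = strassen_2x2(a, b, c, d, e, f, g, h)
    assert (r, s, t, u) == (a*e + b*f, a*g + b*h, c*e + d*f, c*g + d*h)
```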
The 7 submatrix products P_{1}, P_{2}, . . . , P_{7} can thus be used to compute the product C = AB, which completes the description of Strassen's method.
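Putting the pieces together, the whole method can be sketched as a short recursive routine (a sketch with my own helper names, not the book's pseudocode; n is assumed to be an exact power of 2):

```python
def add(X, Y):
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def sub(X, Y):
    return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def quadrants(M):
    """Return the four n/2 x n/2 quadrants of an n x n matrix:
    (top-left, top-right, bottom-left, bottom-right)."""
    h = len(M) // 2
    return ([row[:h] for row in M[:h]], [row[h:] for row in M[:h]],
            [row[:h] for row in M[h:]], [row[h:] for row in M[h:]])

def strassen(A, B):
    """Multiply n x n matrices with 7 recursive products,
    so T(n) = 7T(n/2) + Theta(n^2) = Theta(n^lg 7)."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    a, b, c, d = quadrants(A)
    e, g, f, h = quadrants(B)          # B's quadrants labeled top to bottom
    p1 = strassen(a, sub(g, h))
    p2 = strassen(add(a, b), h)
    p3 = strassen(add(c, d), e)
    p4 = strassen(d, sub(f, e))
    p5 = strassen(add(a, d), add(e, h))
    p6 = strassen(sub(b, d), add(f, h))
    p7 = strassen(sub(a, c), add(e, g))
    r = add(sub(add(p5, p4), p2), p6)  # r = ae + bf
    s = add(p1, p2)                    # s = ag + bh
    t = add(p3, p4)                    # t = ce + df
    u = sub(sub(add(p5, p1), p3), p7)  # u = cg + dh
    return ([rr + sr for rr, sr in zip(r, s)]
            + [tr + ur for tr, ur in zip(t, u)])
```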
The large constant hidden in the running time of Strassen's algorithm makes it impractical unless the matrices are large (n at least 45 or so) and dense (few zero entries). For small matrices, the straightforward algorithm is preferable, and for large, sparse matrices, there are special sparse-matrix algorithms that beat Strassen's in practice. Thus, Strassen's method is largely of theoretical interest.
By using advanced techniques beyond the scope of this text, one can in fact multiply n × n matrices in better than Θ(n^{lg 7}) time. The current best upper bound is approximately O(n^{2.376}). The best lower bound known is just the obvious Ω(n^{2}) bound (obvious because we must fill in n^{2} elements of the product matrix). Thus, we currently do not know how hard matrix multiplication really is.
Strassen's algorithm does not require that the matrix entries be real numbers. All that matters is that the number system form an algebraic ring. If the matrix entries do not form a ring, however, other techniques can sometimes be brought to bear that allow his method to apply. These issues are discussed more fully in the next section.
Use Strassen's algorithm to compute the matrix product
Show your work.
How would you modify Strassen's algorithm to multiply n × n matrices in which n is not an exact power of 2? Show that the resulting algorithm runs in time Θ(n^{lg 7}).
What is the largest k such that if you can multiply 3 × 3 matrices using k multiplications (not assuming commutativity of multiplication), then you can multiply n × n matrices in time o(n^{lg 7})? What would the running time of this algorithm be?
V. Pan has discovered a way of multiplying 68 × 68 matrices using 132,464 multiplications, a way of multiplying 70 × 70 matrices using 143,640 multiplications, and a way of multiplying 72 × 72 matrices using 155,424 multiplications. Which method yields the best asymptotic running time when used in a divide-and-conquer matrix-multiplication algorithm? Compare it with the running time for Strassen's algorithm.
How quickly can you multiply a kn × n matrix by an n × kn matrix, using Strassen's algorithm as a subroutine? Answer the same question with the order of the input matrices reversed.
Show how to multiply the complex numbers a + bi and c + di using only three real multiplications. The algorithm should take a, b, c, and d as input and produce the real component ac − bd and the imaginary component ad + bc separately.
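One classical three-multiplication scheme (often attributed to Gauss) fits this exercise; the sketch below is one possible answer, not text from this book:

```python
def complex_mul3(a, b, c, d):
    """Return (real, imag) of (a + bi)(c + di) using 3 real multiplications."""
    m1 = a * c
    m2 = b * d
    m3 = (a + b) * (c + d)             # = ac + ad + bc + bd
    return (m1 - m2, m3 - m1 - m2)     # (ac - bd, ad + bc)
```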