Block Wiedemann algorithm


The block Wiedemann algorithm for computing kernel vectors of a matrix over a finite field is a generalisation of an algorithm due to Don Coppersmith.

Coppersmith's algorithm

Let $M$ be an $n \times n$ square matrix over some finite field $F$, let $x_{\mathrm{base}}$ be a random vector of length $n$, and let $x = M x_{\mathrm{base}}$. Consider the sequence of vectors $S = [x, Mx, M^2x, \ldots]$ obtained by repeatedly multiplying the vector by the matrix $M$; let $y$ be any other vector of length $n$, and consider the sequence of finite-field elements $S_y = [y \cdot x,\; y \cdot Mx,\; y \cdot M^2x, \ldots]$.
We know that the matrix $M$ has a minimal polynomial; by the Cayley–Hamilton theorem we know that this polynomial is of degree (which we will call $n_0$) no more than $n$. Say $\sum_{r=0}^{n_0} p_r M^r = 0$. Then $\sum_{r=0}^{n_0} y \cdot (p_r (M^r x)) = 0$; so the minimal polynomial of the matrix annihilates the sequence $S$ and hence $S_y$.
But the Berlekamp–Massey algorithm allows us to calculate relatively efficiently some sequence $q_0 \ldots q_L$ with $\sum_{r=0}^{L} q_r S_y[r+i] = 0$ for all $i$. Our hope is that this sequence, which by construction annihilates $S_y$, actually annihilates $S$; so we have $\sum_{r=0}^{L} q_r M^r x = 0$. We then take advantage of the initial definition of $x$ to say $M \sum_{r=0}^{L} q_r M^r x_{\mathrm{base}} = 0$, and so $\sum_{r=0}^{L} q_r M^r x_{\mathrm{base}}$ is a hopefully non-zero kernel vector of $M$.
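To make the procedure concrete, here is a minimal Python sketch of the scalar method just described, working over a small prime field GF(P). The names P, mat_vec, berlekamp_massey and wiedemann_kernel_vector are illustrative rather than taken from any library, a dense matrix is used for clarity even though the method is only worthwhile for sparse matrices, and the "hope" above is exactly the step at which the returned vector may fail to lie in the kernel.

```python
import random

P = 10007  # a small prime; all arithmetic is in the field GF(P)

def mat_vec(M, v):
    """Matrix-vector product over GF(P) (dense here for clarity)."""
    return [sum(mij * vj for mij, vj in zip(row, v)) % P for row in M]

def berlekamp_massey(s):
    """Return c[0..L] with c[0] = 1 and sum_j c[j]*s[i-j] == 0 (mod P) for all i >= L."""
    n = len(s)
    C, B = [0] * n, [0] * n
    C[0] = B[0] = 1
    L, m, b = 0, 0, 1
    for i in range(n):
        m += 1
        d = s[i] % P                          # discrepancy of the current recurrence at i
        for j in range(1, L + 1):
            d = (d + C[j] * s[i - j]) % P
        if d == 0:
            continue
        T = C[:]
        coef = d * pow(b, P - 2, P) % P
        for j in range(m, n):
            C[j] = (C[j] - coef * B[j - m]) % P
        if 2 * L <= i:
            L, B, b, m = i + 1 - L, T, d, 0
    return C[:L + 1]

def wiedemann_kernel_vector(M):
    """Try to produce a non-zero kernel vector of the square matrix M over GF(P)."""
    n = len(M)
    x_base = [random.randrange(P) for _ in range(n)]
    x = mat_vec(M, x_base)                    # x = M * x_base
    y = [random.randrange(P) for _ in range(n)]
    seq, v = [], x[:]
    for _ in range(2 * n):                    # S_y[t] = y . M^t x for t = 0 .. 2n-1
        seq.append(sum(yi * vi for yi, vi in zip(y, v)) % P)
        v = mat_vec(M, v)
    q = berlekamp_massey(seq)[::-1]           # reversed, so sum_r q[r]*S_y[t+r] = 0
    w, v = [0] * n, x_base[:]
    for qr in q:                              # w = sum_r q_r M^r x_base
        w = [(wi + qr * vi) % P for wi, vi in zip(w, v)]
        v = mat_vec(M, v)
    return w                                  # then M*w = sum_r q_r M^r x, hopefully 0

if __name__ == "__main__":
    # Example: a singular 4x4 matrix (last row equals the first), so the kernel is non-trivial.
    M = [[1, 2, 3, 4],
         [5, 6, 7, 8],
         [9, 1, 2, 3],
         [1, 2, 3, 4]]
    w = wiedemann_kernel_vector(M)
    print(w, mat_vec(M, w))   # the second vector should be all zeros when the hope holds
```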

The block Wiedemann algorithm

The natural implementation of sparse matrix arithmetic on a computer makes it easy to compute the sequence S in parallel for a number of vectors equal to the width of a machine word; indeed, it will normally take no longer to compute for that many vectors than for one. If you have several processors, you can compute the sequence S for a different set of random vectors in parallel on all the computers.
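For instance, over GF(2) with a 64-bit word, coordinate j of all 64 vectors can be stored as a single integer, so each non-zero matrix entry costs one XOR no matter how many vectors are being advanced. The sketch below assumes this packed representation; the helper name sparse_apply_gf2 is hypothetical.

```python
W = 64  # machine word width: we advance W vectors over GF(2) at once

def sparse_apply_gf2(rows, packed):
    """rows[i] lists the column indices of the non-zero entries in row i of the sparse
    matrix M (entries over GF(2)); packed[j] is a W-bit integer whose k-th bit is
    coordinate j of the k-th vector.  Returns the packed coordinates of M*v for all
    W vectors at once: addition in GF(2) is XOR, applied to whole words."""
    result = []
    for cols in rows:
        acc = 0
        for j in cols:
            acc ^= packed[j]   # one word XOR serves all W vectors simultaneously
        result.append(acc)
    return result

# Iterating t times yields the packed vectors M^t x for W starting vectors x,
# at the cost of t sparse passes rather than W * t.
```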
It turns out, by a generalization of the Berlekamp–Massey algorithm to provide a sequence of small matrices, that you can take the sequence produced for a large number of vectors and generate a kernel vector of the original large matrix. You need to compute $y_i \cdot M^t x_j$ for some $i = 0 \ldots i_{\max}$, $j = 0 \ldots j_{\max}$, $t = 0 \ldots t_{\max}$, where $i_{\max}$, $j_{\max}$ and $t_{\max}$ need to satisfy $t_{\max} > \frac{n}{i_{\max}} + \frac{n}{j_{\max}} + O(1)$ and the $y_i$ are a series of vectors of length $n$; but in practice you can take the $y_i$ as a sequence of unit vectors and simply write out the first $i_{\max}$ entries in your vectors $M^t x_j$ at each time $t$.
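Here is a short sketch of how that sequence of small matrices might be collected, under the convention just stated that the $y_i$ are unit vectors, so the $(i, j)$ entry at time $t$ is simply the $i$-th coordinate of $M^t x_j$. The helper name block_sequence is hypothetical, and the generalized (matrix) Berlekamp–Massey step that consumes its output is not shown.

```python
import random

def block_sequence(mat_vec, n, p, i_max, j_max, t_max):
    """Collect the i_max-by-j_max matrices A_t with A_t[i][j] = (M^t x_j)[i]
    for t = 0 .. t_max, i.e. the first i_max entries of every vector M^t x_j.
    mat_vec(v) must compute M*v over GF(p); the x_j are chosen at random."""
    xs = [[random.randrange(p) for _ in range(n)] for _ in range(j_max)]
    sequence = []
    for _ in range(t_max + 1):
        sequence.append([[xs[j][i] for j in range(j_max)] for i in range(i_max)])
        xs = [mat_vec(x) for x in xs]   # advance every column: x_j <- M * x_j
    return sequence
```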