
**At Play With J**

# 15. Oh, No, Not Eigenvalues Again!

*By* Eugene McDonnell. *First published in* Vector, 14, 1, *(July 1997), 135-139*.

I can't explain why it is that I keep running into problems associated with eigenvalues. I don't seek them out, and have no interest in them, but there it is--they keep cropping up before me and I have somehow to find a way to drive a stake through their hearts before I can go on to something else.

When the eigenvalue problem last loomed before me, I found that I had to go back to basics in order to come to terms with it. In the course of doing so, I happened upon a technique in J for finding eigenvalues which is eminently satisfying pedagogically, since it can be used for small matrices to show all the theory of eigenvalues, even though it is staggeringly inefficient, becoming dreadfully slow for matrices of size 5 or 6 or larger. It is its pedagogical merit that I commend to you.

The basics are simple. If A is a square numeric matrix, we replace each diagonal atom a<sub>ii</sub> with the two-atom list a<sub>ii</sub> _1, the coefficients, in ascending order, of the linear polynomial a<sub>ii</sub> - x. What we are doing is replacing the problem of evaluating the determinant of a numeric matrix with that of evaluating the determinant of a matrix of polynomials. Of course, the non-diagonal terms are the simplest of polynomials, that is, constants, but the diagonal terms are all linear polynomials. Our problem is to evaluate the determinant of this polynomial matrix. In J we evaluate the determinant of a numeric matrix with the monad of the dot conjunction:

       det =: -/ . *                    NB. (1)

For example,

       ]m =: 2 3,:5 8
    2 3
    5 8
       det m
    1
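For readers without a J interpreter at hand, the determinant can be cross-checked with a small pure-Python sketch. The function name `det` mirrors the J verb, but the method here is ordinary Laplace (cofactor) expansion, not J's internal algorithm:

```python
# Determinant by Laplace (cofactor) expansion down the first column;
# a pedagogical sketch, inefficient in just the way the article warns.
def det(m):
    if len(m) == 1:
        return m[0][0]
    total = 0
    for i in range(len(m)):
        # minor: delete row i and column 0
        minor = [row[1:] for k, row in enumerate(m) if k != i]
        total += (-1) ** i * m[i][0] * det(minor)
    return total

print(det([[2, 3], [5, 8]]))  # 1, matching  det m  above
```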

The verb det is not directly suited to our purpose, but with a few maneuvers we'll get where we want to go. First, let's take a suitable matrix for our example (the example and much of the approach I have borrowed from Cornelius Lanczos's book, Applied Analysis, Prentice-Hall, 1956, chapter II):

       ]A =: 33 16 72 , _24 _10 _57 ,: _8 _4 _17
     33  16  72
    _24 _10 _57
     _8  _4 _17

We can obtain the diagonal atoms of A using the monad d [JP 3.B.m4, meaning phrase m4 from section B of chapter 3 of the book J Phrases].

       d =: (<0 1)&|:                   NB. JP 3.B.m4
       ]a0 =: d A
    33 _10 _17

We append _1 to each of these using J's stitch (,.):

       ]a1 =: a0 ,. _1
     33 _1
    _10 _1
    _17 _1

In order to be able to work with a matrix in which a single number is replaced by a list of two numbers, it will be necessary to box the rows of the 2-column matrix above, using monad B1 [JP 1.C.m11]:

       B1 =: <"1                        NB. JP 1.C.m11
       ]a2 =: B1 a1
    +-----+------+------+
    |33 _1|_10 _1|_17 _1|
    +-----+------+------+

In order to amend the diagonal atoms of A, we'll box the atoms of matrix A first, using monad B0 [JP 1.C.m10]:

       B0 =: <"0                        NB. JP 1.C.m10
       ]a3 =: B0 A
    +---+---+---+
    |33 |16 |72 |
    +---+---+---+
    |_24|_10|_57|
    +---+---+---+
    |_8 |_4 |_17|
    +---+---+---+

The amend adverb in J needs the indices of the places to be amended. We obtain these using the monad d again, modified by the adverb IR [JP 3.B.a3].

       IR =: @(i.@$@])                  NB. JP 3.B.a3
       ]a4 =: (d IR) A
    0 4 8

We can use amend now to obtain the matrix we want:

       ]a5 =: a2 d IR } a3
    +-----+------+------+
    |33 _1|16    |72    |
    +-----+------+------+
    |_24  |_10 _1|_57   |
    +-----+------+------+
    |_8   |_4    |_17 _1|
    +-----+------+------+

For use later, we'll collect the steps taken so far to form the verb db:

       db =: B1@(_1 ,.~ d) d IR } B0
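In Python terms, what `db` builds is a matrix whose entries are polynomials represented as ascending coefficient lists. A hypothetical helper (the name `char_matrix` is mine, not the article's) might look like this:

```python
# Build the polynomial matrix: every entry becomes a coefficient list
# in ascending order. Off-diagonal entries are constants [c]; each
# diagonal atom a[i][i] becomes [a[i][i], -1], i.e. a[i][i] - x.
def char_matrix(a):
    n = len(a)
    return [[[a[i][j], -1] if i == j else [a[i][j]]
             for j in range(n)] for i in range(n)]

A = [[33, 16, 72], [-24, -10, -57], [-8, -4, -17]]
print(char_matrix(A)[0])  # [[33, -1], [16], [72]], the first row of a5
```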

Now we are in position to evaluate the determinant of this matrix. We can't use the det verb from above, since it can only deal with matrices whose atoms are simple numbers. My first thought was to use the definition of the Determinant conjunction u . v from the J Introduction and Dictionary:

       u =: -/
       v =: *
       DET =: v/@,`({."1 u .v$:@minors)@.(1&<@{:@$)"2
       minors =: }."1@(1&([\.))

replacing the definitions of u and v by polynomial difference and polynomial product, dyads dif and ppr [JP 9.C.d1 and 9.C.d2]. However, I hadn't proceeded far when I suddenly realized that, since DET was the definition of the monadic form of the dot conjunction, I ought to be able to use dot itself directly, and not mess with its sibylline definition. The first thing I did was to provide myself with modified forms of dif and ppr, ones which operated on boxed atoms:

       dif =: -/@,:                     NB. JP 9.C.d1
       ppr =: +//.@(*/)                 NB. JP 9.C.d2
       difb =: dif&.>
       pprb =: ppr&.>
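The two coefficient-list operations have direct Python analogues (the function names are mine): `dif` pads both lists to a common length and subtracts termwise, and `ppr` is the convolution of coefficient lists, which is what `+//.@(*/)` computes:

```python
# Polynomial difference on ascending coefficient lists
# (analogue of J's -/@,: with padding).
def dif(p, q):
    n = max(len(p), len(q))
    p = p + [0] * (n - len(p))
    q = q + [0] * (n - len(q))
    return [a - b for a, b in zip(p, q)]

# Polynomial product: convolution of coefficient lists
# (analogue of J's +//.@(*/)).
def ppr(p, q):
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

print(ppr([33, -1], [-10, -1]))  # [-330, -23, 1]: (33-x)(-10-x)
```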

Taking this tack, I could now write, in direct analogy with (1),

       detp =: difb/ . pprb

and then apply this to matrix a5:

       detp a5
    +----------+
    |6 _11 6 _1|
    +----------+

I wanted an open, not a boxed result, so modified the definition of detp:

       detp =: >@(difb/ . pprb)
       ]a6 =: detp a5
    6 _11 6 _1
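What `difb/ . pprb` accomplishes can be sketched as a recursive cofactor expansion over polynomial entries. This self-contained Python version (all names mine; it sketches the effect, not J's implementation) reproduces a6:

```python
# Determinant of a matrix of polynomials (ascending coefficient lists)
# by recursive cofactor expansion down the first column.
def dif(p, q):                    # polynomial difference, p - q
    n = max(len(p), len(q))
    p = p + [0] * (n - len(p))
    q = q + [0] * (n - len(q))
    return [a - b for a, b in zip(p, q)]

def ppr(p, q):                    # polynomial product (convolution)
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def detp(m):
    if len(m) == 1:
        return m[0][0]
    total = [0]
    for i in range(len(m)):
        minor = [row[1:] for k, row in enumerate(m) if k != i]
        term = ppr(m[i][0], detp(minor))
        if i % 2:                 # odd-indexed rows are subtracted,
            total = dif(total, term)
        else:                     # even-indexed rows are added
            total = dif(total, [-c for c in term])
    return total

A = [[33, 16, 72], [-24, -10, -57], [-8, -4, -17]]
pm = [[[A[i][j], -1] if i == j else [A[i][j]] for j in range(3)]
      for i in range(3)]
print(detp(pm))  # [6, -11, 6, -1], matching a6
```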

This looked promising, but was it indeed the characteristic polynomial of my matrix? One way to find out was to use the Cayley-Hamilton theorem, which says that a matrix satisfies its own characteristic equation. To see whether this was so, I found the first four powers of A, using the monad I [JP 9.A.m0] to form the 0th power of the matrix, that is, the identity matrix:

       I =: =@i.@#                      NB. JP 9.A.m0
       ip =: +/ . *                     NB. inner product
       ]a7 =: A&ip^:(i.@>:@#`]) I A
       1    0    0
       0    1    0
       0    0    1

      33   16   72
     _24  _10  _57
      _8   _4  _17

     129   80  240
     _96  _56 _189
     _32  _20  _59

     417  304  648
    _312 _220 _507
    _104  _76 _161

then multiplied these by the supposed coefficients of the characteristic equation:

       ]a8 =: a6 * a7
        6     0     0
        0     6     0
        0     0     6

     _363  _176  _792
      264   110   627
       88    44   187

      774   480  1440
     _576  _336 _1134
     _192  _120  _354

     _417  _304  _648
      312   220   507
      104    76   161

and lastly, summed a8, trusting that the result would be the zero matrix:

       +/a8
    0 0 0
    0 0 0
    0 0 0
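The Cayley-Hamilton check is easy to replay outside J; a sketch with plain nested lists (helper names mine):

```python
# Cayley-Hamilton check: with ascending coefficients a6 = 6 _11 6 _1,
# the sum 6*I - 11*A + 6*A^2 - 1*A^3 should be the zero matrix.
def matmul(x, y):
    return [[sum(x[i][k] * y[k][j] for k in range(len(y)))
             for j in range(len(y[0]))] for i in range(len(x))]

A = [[33, 16, 72], [-24, -10, -57], [-8, -4, -17]]
I3 = [[1 if i == j else 0 for j in range(3)] for i in range(3)]

powers = [I3]                     # A^0, A^1, A^2, A^3
for _ in range(3):
    powers.append(matmul(powers[-1], A))

a6 = [6, -11, 6, -1]              # ascending coefficients from detp
zero = [[sum(c * p[i][j] for c, p in zip(a6, powers))
         for j in range(3)] for i in range(3)]
print(zero)  # [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
```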

From here it was a short step to the eigenvalues: they are the roots of this polynomial, so I could use J's polynomial primitive (p.):

       p. detp a5
    +--+-----+
    |_1|3 2 1|
    +--+-----+

The result I wanted was the open (>) of the tail ({:) of this one, so I made another modification:

       ]a9 =: >@{:@p.@detp a5
    3 2 1
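J's p. does the actual root-finding. Without it, we can at least confirm that the claimed roots annihilate the ascending-coefficient polynomial, using Horner's rule (a sketch; the name `peval` is mine):

```python
# Evaluate a polynomial given as an ascending coefficient list,
# by Horner's rule; each eigenvalue should yield zero.
def peval(coeffs, x):
    acc = 0
    for c in reversed(coeffs):
        acc = acc * x + c
    return acc

a6 = [6, -11, 6, -1]              # 6 - 11x + 6x^2 - x^3
print([peval(a6, r) for r in (3, 2, 1)])  # [0, 0, 0]
```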

The parts can now be assembled to give the eigenvalue finder cm:

       cm =: > @ {: @ p. @ detp @ db
       cm A
    3 2 1

If these are the eigenvalues, their product should be equal to the determinant of the matrix.

       */ cm A
    6
       det A
    6

So far, so good. Furthermore, if these are the eigenvalues, if any one of them is subtracted from the diagonals of A, the determinant of the result should be zero. Is this the case?

       det A - 3 * I A
    0
       det A - 2 * I A
    0
       det A - 1 * I A
    0

Yes it is.
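Both confirmations can be sketched in Python as well, using a pure-Python cofactor determinant (names mine, not J's primitives):

```python
# Confirm: det(A - lam*I) vanishes at each eigenvalue, and the
# product of the eigenvalues equals det A.
def det(m):
    if len(m) == 1:
        return m[0][0]
    total = 0
    for i in range(len(m)):
        minor = [row[1:] for k, row in enumerate(m) if k != i]
        total += (-1) ** i * m[i][0] * det(minor)
    return total

def shift(a, lam):
    # subtract lam from the diagonal of a
    n = len(a)
    return [[a[i][j] - (lam if i == j else 0) for j in range(n)]
            for i in range(n)]

A = [[33, 16, 72], [-24, -10, -57], [-8, -4, -17]]
print([det(shift(A, lam)) for lam in (3, 2, 1)])  # [0, 0, 0]
print(det(A))                                     # 6, = 3 * 2 * 1
```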

The verb cm is a model of the monad of the c. primitive described in the J Introduction and Dictionary 1996 Edition. It differs in that the roots are given in order of descending magnitude, which is how the polynomial (p.) primitive provides them, rather than the ascending order prescribed in the Dictionary. Since c. has not yet been implemented, it's anyone's guess how p. and c. will be reconciled. I've brought this matter to the attention of the proper authorities, so they do at least know that the problem exists.
