- Path: sparky!uunet!psinntp!kepler1!andrew
- From: andrew@rentec.com (Andrew Mullhaupt)
- Newsgroups: sci.math
- Subject: Re: Tensor Diagonalization
- Message-ID: <1342@kepler1.rentec.com>
- Date: 23 Nov 92 18:51:48 GMT
- References: <1992Nov20.221459.3566@news.media.mit.edu>
- Organization: Renaissance Technologies Corp., Setauket, NY.
- Lines: 83
-
- In article <1992Nov20.221459.3566@news.media.mit.edu> perry@media.mit.edu (Chris Perry) writes:
- >
- > I'm wondering if anyone has seen a tensor of rank greater than two
- >factored into its principal components - i.e., given a tensor A of rank n,
- >is there a decomposition similar to
- >
- > S^-1 A S = L
- >
- > where L is a "diagonal" tensor of rank n, and S is a tensor of rank n that
- >is built out of the eigentensors (!) of A? Thanks for any advice or
- >references on this topic -
-
- I can help, though I look at tensor products from the mathematician's
- point of view rather than the physicist's.
-
- There are some facts about tensor products of vector spaces which are
- needed, and I give them here, using @ for the tensor product.
-
- Let U and V be vector spaces, and U@V their tensor product. If {e_i}
- is a basis for U and {f_j} is a basis for V then {e_i @ f_j} is a basis
- for U@V. The standard basis for U@V obtains when e_i and f_j are the standard
- bases for U and V, respectively.
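-
- In coordinates, the tensor product of standard basis vectors is the
- Kronecker product; a minimal numpy sketch (numpy and the dimensions 2
- and 3 are my choices, not part of the argument above):

```python
import numpy as np

# Standard bases for U = R^2 and V = R^3.
e = np.eye(2)   # rows e[0], e[1] form the basis {e_i} of U
f = np.eye(3)   # rows f[0], f[1], f[2] form the basis {f_j} of V

# The products e_i @ f_j, computed as Kronecker products, give the
# standard basis of U@V = R^6: e_i @ f_j has its single 1 at index 3*i + j.
basis = [np.kron(e[i], f[j]) for i in range(2) for j in range(3)]
assert np.allclose(np.array(basis), np.eye(6))
```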
-
- Let A and B be linear transformations on U and V, then the tensor product
- A@B is defined by the linear transformation on U@V which satisfies
-
- (A@B) (u@v) = (Au) @ (Bv)
-
- for all vectors u in U and v in V. It follows that the identity
- transformation on U@V is the tensor product of the identities on U and V.
- The inverse of A@B is also determined: when A and B are invertible, an
- inverse of A@B is A^-1 @ B^-1, since
-
- (A@B) (A^-1 @ B^-1) = (A A^-1) @ (B B^-1) = 1@1.
-
- Now the inverse is unique, so A@B is invertible iff A and B are, and
- (A@B)^-1 = A^-1 @ B^-1.
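-
- A quick numerical check of the inverse rule (a sketch; the particular
- matrices are arbitrary invertible choices of mine):

```python
import numpy as np

A = np.array([[2., 1.], [1., 1.]])                        # det = 1
B = np.array([[1., 2., 0.], [0., 1., 3.], [0., 0., 1.]])  # det = 1

# np.kron is the matrix of A@B in the standard basis of U@V,
# and (A@B)^-1 = A^-1 @ B^-1: both sides are the same 6x6 matrix.
lhs = np.linalg.inv(np.kron(A, B))
rhs = np.kron(np.linalg.inv(A), np.linalg.inv(B))
assert np.allclose(lhs, rhs)
```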
-
- Similarity transforms are given by the obvious tensor product. Suppose
- S^-1 A S = J and T^-1 B T = K. Then
-
- (S@T)^-1 (A@B) (S@T) = (S^-1 A S) @ (T^-1 B T) = J @ K.
-
- It follows that a transformation A@B is diagonalized with respect to a basis
- {e_i @ f_j} if and only if A is diagonalized in {e_i} and B is diagonalized
- in {f_j}, and the similarity is given by the obvious tensor product.
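-
- Concretely (a sketch with small symmetric matrices of my choosing): the
- diagonal entries of J @ K are the pairwise products of the eigenvalues
- of A and B, and S@T carries out the similarity.

```python
import numpy as np

A = np.array([[2., 1.], [1., 2.]])   # eigenvalues 3, 1
B = np.array([[0., 1.], [1., 0.]])   # eigenvalues 1, -1

wa, S = np.linalg.eig(A)   # S^-1 A S = diag(wa)
wb, T = np.linalg.eig(B)   # T^-1 B T = diag(wb)

# (S@T)^-1 (A@B) (S@T) = diag(wa) @ diag(wb), whose diagonal is the
# Kronecker product of the two eigenvalue lists.
ST = np.kron(S, T)
JK = np.linalg.inv(ST) @ np.kron(A, B) @ ST
assert np.allclose(JK, np.diag(np.kron(wa, wb)))
```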
-
- Note that the dual space of U@V is naturally isomorphic to U* @ V*, and
- if U and V have inner products, then so does U@V, calculated by:
-
- <u_1 @ v_1, u_2 @ v_2> = <u_1, u_2> <v_1, v_2>.
-
- This lets you determine what orthogonal vectors are in U@V: u_1 @ v_1 is
- orthogonal to u_2 @ v_2 if and only if either u_1 is orthogonal to u_2 or
- v_1 is orthogonal to v_2.
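-
- For instance (a small numerical sketch with arbitrary vectors of mine):

```python
import numpy as np

u1, u2 = np.array([1., 2.]), np.array([3., -1.])
v1, v2 = np.array([0., 1., 2.]), np.array([1., 1., 0.])

# <u1 @ v1, u2 @ v2> = <u1, u2> <v1, v2>
lhs = np.dot(np.kron(u1, v1), np.kron(u2, v2))
rhs = np.dot(u1, u2) * np.dot(v1, v2)
assert np.isclose(lhs, rhs)

# If one factor pair is orthogonal, so is the product: u3 = (2, -1)
# is orthogonal to u1, hence u3 @ v2 is orthogonal to u1 @ v1.
u3 = np.array([2., -1.])
assert np.isclose(np.dot(np.kron(u1, v1), np.kron(u3, v2)), 0.0)
```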
-
- The inner product determines what adjoints of linear transformations are;
- in particular if A+ is the adjoint of A and B+ is the adjoint of B then
-
- <u_1 @ v_1, (A@B) u_2 @ v_2> = <u_1, A u_2> <v_1, B v_2>
-
- = <A+ u_1, u_2> <B+ v_1, v_2> = <(A+ @ B+) u_1 @ v_1, u_2 @ v_2>
-
- so the adjoint of (A@B) must be (A@B)+ = (A+ @ B+).
-
- In the case of real vector spaces, this means that the tensor product of
- orthogonal transformations is orthogonal.
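-
- In an orthonormal basis the adjoint is the transpose, so both facts can
- be checked directly (the matrices here are my own illustrative picks):

```python
import numpy as np

A = np.array([[1., 2.], [3., 4.]])
B = np.array([[0., 1., 0.], [0., 0., 1.], [1., 0., 0.]])  # permutation, orthogonal

# (A@B)+ = A+ @ B+ becomes kron(A, B).T == kron(A.T, B.T).
assert np.allclose(np.kron(A, B).T, np.kron(A.T, B.T))

# The tensor product of orthogonal transformations is orthogonal:
R = np.array([[0., -1.], [1., 0.]])   # rotation by 90 degrees
Q = np.kron(R, B)
assert np.allclose(Q.T @ Q, np.eye(6))
```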
-
- Finally, we get to the classic example. If A and B are symmetric, then by
- the spectral theorem S and T can be chosen orthogonal, with S diagonalizing
- A to J and T diagonalizing B to K. Then
-
- (S@T)^-1 (A@B) (S@T) = (S^-1 A S) @ (T^-1 B T) = J @ K
-
- and since the tensor product of orthogonal transformations is orthogonal,
- the diagonalization of A@B is orthogonal as well.
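-
- A numerical sketch of this classic example (symmetric matrices of my
- choosing; np.linalg.eigh returns an orthogonal eigenvector matrix for
- symmetric input):

```python
import numpy as np

A = np.array([[2., 1.], [1., 2.]])   # symmetric
B = np.array([[0., 2.], [2., 3.]])   # symmetric

ja, S = np.linalg.eigh(A)   # S orthogonal, S.T A S = diag(ja)
jb, T = np.linalg.eigh(B)   # T orthogonal, T.T B T = diag(jb)

# S@T is orthogonal, so the inverse in the similarity is just the transpose.
ST = np.kron(S, T)
assert np.allclose(ST.T @ ST, np.eye(4))
assert np.allclose(ST.T @ np.kron(A, B) @ ST, np.diag(np.kron(ja, jb)))
```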
-
- You cannot extend this trivially to Jordan normal form: the tensor product
- of matrices in Jordan form is not in general in Jordan form, so the Jordan
- structure of A@B cannot simply be read off from those of A and B.
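-
- A concrete instance (my example, not from the original post): for the 2x2
- Jordan block J with eigenvalue 1, the nilpotent part of J @ J has index 3
- rather than 2, so J @ J is not itself in Jordan form (its Jordan form is a
- 3x3 block plus a 1x1 block).

```python
import numpy as np

J = np.array([[1., 1.], [0., 1.]])   # single 2x2 Jordan block, eigenvalue 1
# Its nilpotent part squares to zero:
assert np.allclose(np.linalg.matrix_power(J - np.eye(2), 2), 0)

# But the nilpotent part of J @ J, namely kron(J, J) - I, has index 3:
N = np.kron(J, J) - np.eye(4)
assert not np.allclose(np.linalg.matrix_power(N, 2), 0)
assert np.allclose(np.linalg.matrix_power(N, 3), 0)
```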
-
- Later,
- Andrew Mullhaupt
-