Friday, December 8, 2017

Gram-Schmidt

Reminder definitions

Orthogonal- often used interchangeably with "perpendicular" at points of intersection. Within linear algebra it is better described as two (nonzero) vectors whose dot product is zero.
If the dot product is anything other than zero, the vectors are not orthogonal.
Orthonormal- orthogonal vectors that each have a magnitude of 1; both conditions must be met for a set to be orthonormal.
Note that the squared entries of each such vector sum to 1; hence "normalized."
Note: ||v|| = (v·v)^(1/2)
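
A quick numeric check of these definitions (plain Python; the two vectors are made-up examples, not from any reference):

```python
def dot(u, v):
    """Dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def norm(v):
    """||v|| = (v . v)^(1/2), as in the note above."""
    return dot(v, v) ** 0.5

r = 1 / 2 ** 0.5
u = [r, r]    # unit vector along (1, 1)
v = [r, -r]   # unit vector along (1, -1)

print(dot(u, v))         # 0.0 -> orthogonal
print(norm(u), norm(v))  # both ~1 -> the pair is orthonormal
```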

Basis- a linearly independent spanning set of vectors within a subspace. In keeping with the subject theme, we are concerned with orthonormal bases.


Gram-Schmidt

  • The Gram-Schmidt process takes a linearly independent basis and orthogonalizes it; normalizing each resulting vector then yields an orthonormal basis.

  • The Gram-Schmidt process can also be used for polynomial bases; the process is modified slightly because such a basis can be orthogonalized easily but not orthonormalized as easily.
    • "For more abstract spaces, however, the existence of an orthonormal basis is not obvious. The Gram-Schmidt algorithm is powerful in that it not only guarantees the existence of an orthonormal basis for any inner product space, but actually gives the construction of such a basis."
  • If you are a perpetual cynic like myself, you are probably questioning the importance of this seemingly simple method.
    1. An orthonormal basis behaves like the standard basis: coordinates with respect to it are found by simple dot products.
      1. This simplifies problem sets significantly and allows problems to be solved in fewer steps (thank you, zeroes!)
    2. The process makes QR factorization possible: Q is an orthogonal matrix and R is an upper triangular matrix.
      1. Determinants and eigenvalues can then be found quickly.
      2. QR factorization is used to solve linear systems and least-squares problems without explicitly inverting matrices (of various sizes).
      3. Useful in physics: quantum mechanics and dynamics problems.

Examples
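
As a concrete example, the process described above can be sketched in plain Python (no libraries; the two-vector basis is a made-up example):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(basis):
    """Turn a linearly independent list of vectors into an
    orthonormal list spanning the same subspace."""
    ortho = []
    for v in basis:
        v = list(v)
        # subtract the projection of v onto each earlier unit vector
        for q in ortho:
            c = dot(q, v)
            v = [a - c * b for a, b in zip(v, q)]
        # normalize (assumes the input really is linearly independent,
        # so v is never the zero vector here)
        n = dot(v, v) ** 0.5
        ortho.append([a / n for a in v])
    return ortho

q1, q2 = gram_schmidt([[3, 1], [2, 2]])
print(dot(q1, q2))               # ~0: orthogonal
print(dot(q1, q1), dot(q2, q2))  # ~1: normalized
```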

QR factorization
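
A minimal QR sketch built directly from Gram-Schmidt, in plain Python (the 2x2 matrix A, given by its columns, is a made-up example). The columns of Q are the orthonormalized columns of A, and R records the dot products used along the way, which is why R comes out upper triangular and A = QR:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def qr(columns):
    """Gram-Schmidt QR of a square matrix given as a list of columns.
    Returns (Q_columns, R) with R as a list of rows."""
    n = len(columns)
    q_cols = []
    R = [[0.0] * n for _ in range(n)]
    for j, v in enumerate(columns):
        v = list(v)
        for i, q in enumerate(q_cols):
            R[i][j] = dot(q, v)                       # projection coefficient
            v = [a - R[i][j] * b for a, b in zip(v, q)]
        R[j][j] = dot(v, v) ** 0.5                    # length before normalizing
        q_cols.append([a / R[j][j] for a in v])
    return q_cols, R

Q, R = qr([[3, 1], [2, 2]])  # columns of A
print(R[1][0])               # 0.0: R is upper triangular
# reconstruct the second column of A as R[0][1]*q1 + R[1][1]*q2
a1 = [R[0][1] * Q[0][k] + R[1][1] * Q[1][k] for k in range(2)]
print(a1)                    # ~[2, 2]: QR reproduces A
```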

Practice

References

  1. Gram-Schmidt. (n.d.). [PDF] Available at: http://www.math.ucla.edu/~yanovsky/Teaching/Math151B/handouts/GramSchmidt.pdf [Accessed 8 Dec. 2017].
  2. Gram-Schmidt tutorial. (n.d.). [PDF] Available at: https://www.math.hmc.edu/calculus/tutorials/gramschmidt/gramschmidt.pdf [Accessed 7 Dec. 2017].
  3. Lecture. (n.d.). [PDF] Available at: http://www.math.usm.edu/lambers/mat415/lecture3.pdf [Accessed 8 Dec. 2017].
  4. Lesson Plan. (n.d.). [PDF] Available at: http://www.ucl.ac.uk/~ucahmdl/LessonPlans/Lesson10.pdf [Accessed 8 Dec. 2017].
  5. Practice. (n.d.). [PDF] Available at: http://www.math.ucsd.edu/~jmckerna/Teaching/14-15/Autumn/20F/practicef.pdf [Accessed 7 Dec. 2017].

Friday, October 27, 2017

Much Ado about Eigenvalues and their Vectors

Eigenvalues and Eigenvectors:



    • What is an eigenvector?
      • A nonzero vector whose direction is fixed (only scaled) under a given linear transformation.
    • What is an eigenvalue?
      • The factor by which the transformation scales its eigenvector.
⭐ The great thing about eigenvalues and eigenvectors is that they allow simple linear equations (Ax=b), referred to as steady-state problems, to become dynamic problems (involving rates of change over time) that answer a variety of questions.

Rules:

  1. A scalar lambda (λ) is an eigenvalue of a square matrix A if there exists a nontrivial solution x of Ax=λx. Each such nontrivial solution x is called an eigenvector corresponding to the eigenvalue λ.
  2. An eigenvalue can be zero, but an eigenvector cannot be the zero vector.
  3. An n×n matrix A is invertible if and only if zero is not an eigenvalue of A.
    1. Suppose A has eigenvalue λ=0. Then there must be a nontrivial solution x...
      • Meaning:
      • Ax = 0·x = 0
      1. A nontrivial solution of Ax=0 implies that A is in fact not invertible.
  4. Standard form of the equation: since Ax=λx is not in a familiar form (the unknown x appears on both sides), the identity matrix is used to rewrite it so that x can be solved for more easily.
    • Ax=λx
    • Ax-λx=0
    • Ax-λIx=0
    • (A-λI)x=0
  5. The eigenspace of a square matrix A corresponding to an eigenvalue λ consists of all eigenvectors for λ together with the zero vector.
  6. The eigenspace of λ is the null space of the matrix A-λI, which makes it a subspace of Rn.
Eigenvectors: nonzero solutions x of (A-λI)x=0               Eigenvalues: solutions λ of det(A-λI)=0

Example:

Given eigenvalue to find eigenvector


Using eigenvectors to find eigenvalues
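
A numeric sketch of that direction (plain Python; the 2x2 matrix and the eigenvector are made-up for illustration, not one of the exercises below). For a 2x2 matrix, det(A-λI)=0 is just the quadratic λ² - trace(A)·λ + det(A) = 0:

```python
a, b, c, d = 4, 1, 2, 3            # A = [4 1; 2 3]
trace, det = a + d, a * d - b * c
disc = (trace ** 2 - 4 * det) ** 0.5
lam1 = (trace + disc) / 2
lam2 = (trace - disc) / 2
print(lam1, lam2)                  # eigenvalues: 5.0 and 2.0

# check rule 1: (A - lam1*I)x = 0 has the nontrivial solution x = [1, 1]
x = [1, 1]
r = [(a - lam1) * x[0] + b * x[1],
     c * x[0] + (d - lam1) * x[1]]
print(r)                           # [0.0, 0.0]
```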
Practice:
Exercise EE.C11 Chris Black
Find the characteristic polynomial of the matrix A = [3 2 1; 0 1 1; 1 2 0].
Exercise EE.C12 Chris Black
Find the characteristic polynomial of the matrix A = [1 2 1 0; 1 0 1 0; 2 1 1 0; 3 1 0 1].


Exercise EE.C21 Robert Beezer
The matrix A below has λ=2 as an eigenvalue. Find the geometric multiplicity of λ=2 using your calculator only for row-reducing matrices. A = [18 -15 33 -15; -4 8 -6 6; -9 9 -16 9; 5 -6 9 -4]
Exercise EE.C22 Robert Beezer
Without using a calculator, find the eigenvalues of the matrix B. B = [2 1; 1 1]

Exercise EE.C23 Chris Black
Find the eigenvalues, eigenspaces, algebraic and geometric multiplicities for 

Beezer, R. (n.d.). Eigenvalues and eigenvectors. Retrieved October 27, 2017, from http://linear.ups.edu/html/section-EE.html (problem set and solutions)

Black, C. (n.d.). Eigenvalues and eigenvectors. Retrieved October 27, 2017, from http://linear.ups.edu/html/section-EE.html (problem set and solutions)


Useful applications:
Fields that benefit from eigenvalues:
  • Physics
    • Quantum mechanics
  • Engineering
  • Biology
    • Ecology-i.e. Leslie model
      • Describes the growth of a population (and its projected age distribution) in which the population is closed to migration, grows in an unlimited environment, and where only one sex, usually the female, is considered. The dominant eigenvalue of the Leslie matrix gives the population's asymptotic growth rate (the growth rate at the stable age distribution). The corresponding eigenvector provides the stable age distribution: the proportion of individuals of each age within the population. Once the stable age distribution has been reached, the population undergoes exponential growth at that rate.
    • Microbiology
  • Geology and glaciology
  • Face recognition software:
    • "In image processing, processed images of faces can be seen as vectors whose components are the brightnesses of each pixel. The dimension of this vector space is the number of pixels. The eigenvectors of the covariance matrix associated with a large set of normalized pictures of faces are called eigenfaces; this is an example of principal component analysis. They are very useful for expressing any face image as a linear combination of some of them. In the facial recognition branch of biometrics, eigenfaces provide a means of applying data compression to faces for identification purposes. Research related to eigen vision systems determining hand gestures has also been made".
      1. Eigenface. (n.d.). Retrieved October 29, 2017, from https://en.wikipedia.org/wiki/Eigenface
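
The Leslie-model claim above can be checked numerically with power iteration (plain Python; the two-age-class matrix is a made-up example, not from any reference): repeatedly multiplying a population vector by the matrix drives it toward the stable age distribution, and the per-step growth factor toward the dominant eigenvalue.

```python
# Leslie matrix: fecundities 1 and 4 in the top row,
# survival rate 0.5 on the subdiagonal (made-up numbers).
# Its eigenvalues are 2 and -1, so the dominant one is 2.
L = [[1.0, 4.0],
     [0.5, 0.0]]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

pop = [1.0, 1.0]
for _ in range(50):
    new = matvec(L, pop)
    growth = sum(new) / sum(pop)     # per-step growth factor
    total = sum(new)
    pop = [x / total for x in new]   # renormalize to proportions

print(round(growth, 6))  # -> 2.0, the dominant eigenvalue
print(pop)               # ~[0.8, 0.2], the stable age distribution
```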
Helpful resources: