
A Study of The Applications of
Matrices and R^(n) Projections
By Corey Messonnier
Matrices and R^(n) projections
Matrices are rectangular arrays of numbers, symbols, or expressions, whose individual entries are called its elements. An R^(n) projection is a mapping from an n-dimensional space to another space of n or m dimensions.
Some examples of matrices and R^(n) projections in practical applications
a) Graph theory:
b) Symmetries and transformations in physics:
c) Linear combinations of quantum states:
d) Normal modes:
e) Geometrical optics:
f) Electronics:
Some examples of matrices and R^(n) projections in mathematical applications
a) Analysis and geometry:
b) Probability theory and statistics:
c) Representations of equations:
Graph Theory:
1. The adjacency matrix of a finite graph records which vertices of the graph are connected by edges.
2. The concept can be applied to websites connected by hyperlinks or to cities connected by roads. In these cases the matrices are usually sparse matrices, which are matrices containing few nonzero entries (see the sketch below).
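As a minimal sketch of this idea (the four-page link structure below is invented purely for illustration, assuming NumPy), an adjacency matrix records which pages link to which, and most of its entries are zero:

```python
import numpy as np

# Hypothetical directed graph: an edge (i, j) means page i links to page j.
links = [(0, 1), (0, 2), (1, 2), (3, 0)]

n = 4
A = np.zeros((n, n), dtype=int)   # adjacency matrix; mostly zeros, i.e. sparse
for i, j in links:
    A[i, j] = 1

print(A)
# Powers of the adjacency matrix count walks:
# (A @ A)[i, j] is the number of two-step link paths from page i to page j.
print(A @ A)
```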
Symmetries and transformations in
physics
Examples
Some elementary particles in quantum field theory are classified as representations of the Lorentz group of special relativity and also by their behavior under the spin group. Another example is the quarks: for the three lightest quarks, there is a group-theoretical representation involving the special unitary group SU(3); for their calculations, physicists use a convenient matrix representation known as the Gell-Mann matrices. The Gell-Mann matrices are also used for the SU(3) gauge group that forms the basis of the modern description of strong nuclear interactions, called quantum chromodynamics. The Cabibbo-Kobayashi-Maskawa matrix expresses the basic quark states that are important for weak interactions; these are not the same as, but linearly related to, the basic quark states that define particles with specific and distinct masses.
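As a small numerical sketch (assuming NumPy; only the first three of the eight Gell-Mann matrices are shown), one can check the defining properties of these matrices directly:

```python
import numpy as np

# The first three of the eight Gell-Mann matrices (3x3 generators of SU(3)).
lam1 = np.array([[0, 1, 0], [1, 0, 0], [0, 0, 0]], dtype=complex)
lam2 = np.array([[0, -1j, 0], [1j, 0, 0], [0, 0, 0]], dtype=complex)
lam3 = np.array([[1, 0, 0], [0, -1, 0], [0, 0, 0]], dtype=complex)

for lam in (lam1, lam2, lam3):
    assert np.isclose(np.trace(lam), 0)      # traceless
    assert np.allclose(lam, lam.conj().T)    # Hermitian
print("traceless and Hermitian, as expected")
```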
Linear combinations of quantum
states
1. The first model of quantum mechanics represented the theory's operators by infinite-dimensional matrices acting on quantum states; this area of study is also referred to as matrix mechanics. One particular example is the density matrix, which characterizes the "mixed" state of a quantum system as a linear combination of elementary, "pure" eigenstates (see the sketch after this list).
2. Collision reactions, such as those that occur in particle accelerators, are where non-interacting particles head towards each other and collide in a small interaction zone. The result of these collision reactions is a new set of non-interacting particles, which can be described as the scalar product of outgoing particle states and a linear combination of ingoing particle states. The linear combination is given by a matrix known as the S-matrix, which encodes all information about the possible interactions between particles.
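A minimal sketch of the density-matrix idea (the two pure states and the 50/50 weights below are chosen arbitrarily, assuming NumPy): a mixed state is a weighted sum of projectors onto pure states, with unit trace and non-negative eigenvalues.

```python
import numpy as np

# Two pure qubit states: |0> and |+> = (|0> + |1>) / sqrt(2).
ket0 = np.array([[1], [0]], dtype=complex)
ketp = np.array([[1], [1]], dtype=complex) / np.sqrt(2)

# A mixed state as a linear (convex) combination of the pure-state projectors.
rho = 0.5 * (ket0 @ ket0.conj().T) + 0.5 * (ketp @ ketp.conj().T)

print(rho)
print(np.isclose(np.trace(rho), 1))               # unit trace
print(np.all(np.linalg.eigvalsh(rho) >= -1e-12))  # positive semidefinite
```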
Normal modes
1. Harmonic systems
2. Equations of Motion
3. Uses of eigenvectors in normal modes
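The outline above can be illustrated with a hedged sketch (two equal masses coupled by unit springs; all constants are set to 1 purely for illustration). The eigenvalues of the stiffness matrix give the squared normal-mode frequencies, and the eigenvectors give the mode shapes:

```python
import numpy as np

# Two unit masses, each tied to a wall and to each other by unit springs:
# equations of motion  x'' = -K x  with stiffness matrix K.
K = np.array([[2.0, -1.0],
              [-1.0, 2.0]])

# Eigenvectors of K are the normal-mode shapes; the square roots of the
# eigenvalues are the normal-mode frequencies.
eigvals, eigvecs = np.linalg.eigh(K)
print("frequencies:", np.sqrt(eigvals))   # sqrt(1) and sqrt(3)
print("mode shapes (columns):")
print(eigvecs)                            # in-phase and out-of-phase motion
```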
Geometrical optics
In geometrical optics, the wave nature of light is neglected, and light rays are modeled as geometrical rays. If the deflection of light rays by optical elements is small, the action of a lens or reflective element on a given light ray can be expressed as the multiplication of a two-component vector with a two-by-two matrix called a ray transfer matrix. The vector's components are the light ray's slope and its distance from the optical axis, while the matrix encodes the properties of the optical element. There are two kinds of matrices: (1) a refraction matrix, describing the refraction at a lens surface; and (2) a translation matrix, describing the translation of the plane of reference to the next refracting surface, where another refraction matrix is applied. An optical system consisting of a combination of lenses and/or reflective elements is described simply by the matrix resulting from the product of the component matrices.
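A minimal sketch (the focal length and propagation distance below are arbitrary, assuming NumPy): a ray described by its distance from the axis and its slope is propagated through free space and then refracted by a thin lens by multiplying the corresponding matrices.

```python
import numpy as np

ray = np.array([1.0, 0.0])    # [distance from optical axis, slope]; parallel ray

d, f = 2.0, 0.5               # propagation distance and lens focal length (arbitrary)

T = np.array([[1.0, d],       # translation matrix: free propagation over distance d
              [0.0, 1.0]])
L = np.array([[1.0, 0.0],     # thin-lens refraction matrix with focal length f
              [-1.0 / f, 1.0]])

# The whole system is the product of the component matrices
# (applied right to left: first translate, then refract).
system = L @ T
print(system @ ray)           # the ray after passing through the system
```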
Electronics
1. Mesh analysis
2. Electronic components
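The items above can be made concrete with a hedged sketch of mesh analysis (the resistor and source values are invented): writing Kirchhoff's voltage law for each mesh gives a matrix equation R·I = V, which is solved for the mesh currents.

```python
import numpy as np

# Hypothetical two-mesh resistive circuit (ohms and volts chosen arbitrarily):
# mesh 1 has a 10 V source, resistor R1 = 4, and shared resistor R3 = 2;
# mesh 2 has resistor R2 = 6 and the shared resistor R3 = 2.
R = np.array([[4.0 + 2.0, -2.0],
              [-2.0, 6.0 + 2.0]])   # mesh-resistance matrix
V = np.array([10.0, 0.0])           # source vector

I = np.linalg.solve(R, V)           # mesh currents in amperes
print(I)
```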
Analysis and geometry
1. The Hessian matrix
The Hessian matrix of a twice-differentiable function ƒ: R^(n) → R consists of the second derivatives of ƒ with respect to the several coordinate directions. That is, it encodes information about the local growth behavior of the function: given a critical point x = (x1, ..., xn), i.e., a point where the first partial derivatives of ƒ vanish, the function has a local minimum if the Hessian matrix is positive definite.
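A hedged sketch (the quadratic function below is invented for illustration): for ƒ(x, y) = x² + x·y + y², the critical point is (0, 0), the Hessian is constant, and its positive eigenvalues confirm a local minimum.

```python
import numpy as np

# f(x, y) = x**2 + x*y + y**2 has a critical point at (0, 0).
# Its Hessian (matrix of second partial derivatives) is constant:
H = np.array([[2.0, 1.0],    # d2f/dx2,  d2f/dxdy
              [1.0, 2.0]])   # d2f/dydx, d2f/dy2

eigs = np.linalg.eigvalsh(H)
print(eigs)                  # [1.0, 3.0]
print(np.all(eigs > 0))      # positive definite => local minimum at (0, 0)
```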
2. The Jacobi matrix
The Jacobi matrix of a differentiable map f: R^(n) → R^(m) is another good example. If f1, ..., fm denote the components of f, then the Jacobi matrix is defined as J_f = (∂f_i/∂x_j), with 1 ≤ i ≤ m and 1 ≤ j ≤ n. If n > m, and if the rank of the Jacobi matrix attains its maximal value m, f is locally invertible at that point by the implicit function theorem. This theorem is a tool that allows relations to be converted to functions.
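A brief sketch (the polar-to-Cartesian map below is a standard textbook example, not taken from the report): the Jacobi matrix of f(r, θ) = (r·cos θ, r·sin θ) collects the partial derivatives of each component with respect to each variable.

```python
import numpy as np

def jacobian(r, theta):
    # Jacobi matrix of f(r, theta) = (r*cos(theta), r*sin(theta)).
    return np.array([[np.cos(theta), -r * np.sin(theta)],   # dx/dr, dx/dtheta
                     [np.sin(theta),  r * np.cos(theta)]])  # dy/dr, dy/dtheta

J = jacobian(2.0, np.pi / 4)
print(J)
print(np.linalg.det(J))   # determinant equals r = 2, so f is locally invertible here
```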
3. Partial differential equations
Partial differential equations can be classified by considering the matrix of coefficients of the
highest-order differential operators of the equation. For elliptic partial differential
equations this matrix is positive definite, which has a decisive influence on the set of possible solutions of the equation in question.
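A small sketch of the classification (the two standard equations below are chosen for illustration, assuming NumPy): the Laplace equation u_xx + u_yy = 0 has a positive definite coefficient matrix and is elliptic, while the wave equation u_tt - u_xx = 0 does not.

```python
import numpy as np

# Coefficient matrices of the highest-order (second) derivatives:
laplace = np.array([[1.0, 0.0], [0.0, 1.0]])    # u_xx + u_yy = 0
wave    = np.array([[1.0, 0.0], [0.0, -1.0]])   # u_tt - u_xx = 0

for name, A in [("Laplace", laplace), ("wave", wave)]:
    eigs = np.linalg.eigvalsh(A)
    kind = "elliptic (positive definite)" if np.all(eigs > 0) else "not elliptic"
    print(name, eigs, kind)
```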
4. The finite element method is an important numerical method to solve partial differential
equations, widely applied in simulating complex physical systems. It attempts to
approximate the solution to some equation by piecewise linear functions, where the pieces
are chosen with respect to a sufficiently fine grid, which in turn can be recast as a matrix
equation.
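A rough, hedged sketch of the idea (a one-dimensional Poisson problem -u'' = 1 on [0, 1] with zero boundary values, discretized with piecewise linear elements on a uniform grid; all choices are illustrative): the discretization leads to a matrix equation K·u = f.

```python
import numpy as np

n = 5                       # number of interior grid nodes
h = 1.0 / (n + 1)           # element size

# Assembling piecewise linear elements gives a tridiagonal stiffness matrix K.
K = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h
f = np.full(n, h)           # load vector for the constant right-hand side 1

u = np.linalg.solve(K, f)   # approximate solution at the interior nodes
print(u)                    # close to the exact solution x * (1 - x) / 2
```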
Probability theory and statistics
1. Stochastic matrices
2. Random matrices
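A hedged sketch of a stochastic matrix (the transition probabilities below are invented): each row sums to 1, and repeatedly applying the matrix drives an initial distribution toward the chain's stationary distribution.

```python
import numpy as np

# A stochastic (Markov transition) matrix: each row sums to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

dist = np.array([1.0, 0.0])    # initial probability distribution
for _ in range(50):
    dist = dist @ P            # one step of the Markov chain
print(dist)                    # approaches the stationary distribution (5/6, 1/6)
```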
Representations of equations
1. Augmented matrices
2. Complex numbers can be represented by real 2 × 2 matrices under which addition and multiplication of complex numbers and matrices correspond to each other (see the sketch after this list).
3. There are at least two ways of representing the
quaternions as matrices in such a way that the quaternion
addition and multiplication correspond to matrix addition
and matrix multiplication. One way is to use 2×2 complex
matrices, and the other is to use 4×4 real matrices. In each
case, the representation given is one of a family of linearly
related representations. In the terminology of abstract
algebra, these are injective homomorphisms from H to
the matrix rings M2(C) and M4(R).
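As the sketch referenced in item 2 above (the sample values are arbitrary, assuming NumPy): the complex number a + bi corresponds to the real matrix [[a, -b], [b, a]], and matrix addition and multiplication match complex addition and multiplication.

```python
import numpy as np

def as_matrix(z):
    # Represent z = a + bi as the real 2x2 matrix [[a, -b], [b, a]].
    return np.array([[z.real, -z.imag],
                     [z.imag,  z.real]])

z, w = 1 + 2j, 3 - 1j
print(np.allclose(as_matrix(z) + as_matrix(w), as_matrix(z + w)))   # True
print(np.allclose(as_matrix(z) @ as_matrix(w), as_matrix(z * w)))   # True
```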
Works Cited
• http://en.wikipedia.org/wiki/Matrix_(mathematics)
• Notes from matrix analysis