Most of the formulas used when reading or writing papers are matrix operations. Even when I know the method, I often get stuck because I cannot remember the terminology for matrix operations. Let's review the basic matrix terms from Chapters 6~7 of Advanced Engineering Mathematics (Erwin Kreyszig), which I studied as an undergraduate.
- matrix - rectangular array of numbers enclosed in brackets
- row - horizontal line in matrix
- column - vertical line in matrix
- row|column vector - matrix which consists of a single row|column
- square matrix - matrix which has as many rows as columns
- rectangular matrix - matrix that is not square
- main(principal) diagonal - diagonal entries a11, a22, a33, ..., ann of square matrix
- transposition - the transpose of an mxn matrix A=[ajk], written A^T, is the nxm matrix that has the first row of A as its first column, the second row of A as its second column, and so on
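As a quick illustration of transposition (the matrix below is just an example), NumPy's `.T` swaps rows and columns:

```python
import numpy as np

# Transposition: row j of A becomes column j of A^T.
A = np.array([[1, 2, 3],
              [4, 5, 6]])              # 2x3
assert A.T.shape == (3, 2)             # A^T is 3x2
assert np.array_equal(A.T[:, 0], A[0, :])  # first row of A = first column of A^T
```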
- matrix addition - for matrices A and B of the same size, A+B is obtained by adding the corresponding entries
- scalar multiplication - product of mxn matrix A and any scalar c, written cA, is obtained by multiplying each entry in A by c
- zero matrix - matrix with all entries zero
- matrix multiplication - product C=AB(in this order) of an mxn matrix A and an rxp matrix B is defined if and only if r=n and is defined as the mxp matrix C with entries cjk=aj1b1k+aj2b2k+...+ajnbnk. "multiplication of rows into columns". The matrix B is premultiplied, or multiplied from the left, by A. A is postmultiplied, or multiplied from the right, by B
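The "multiplication of rows into columns" rule can be checked directly in NumPy (matrices here are illustrative):

```python
import numpy as np

# C = AB is defined only when the number of columns of A equals
# the number of rows of B: here (2x3)(3x2) -> 2x2.
A = np.array([[1, 2, 3],
              [4, 5, 6]])              # 2x3
B = np.array([[7,  8],
              [9, 10],
              [11, 12]])               # 3x2
C = A @ B                              # A postmultiplied by B

# Entry c_jk is row j of A "into" column k of B.
assert C[0, 1] == 1*8 + 2*10 + 3*12
assert C.shape == (2, 2)
```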
- triangular matrix - upper triangular matrices are square matrices that can have nonzero entries only on and above the main diagonal, whereas any entry below the diagonal must be zero. lower triangular matrices can have nonzero entris only on and below the main diagonal.
- diagonal matrix - square matrix that can have nonzero entries only on the main diagonal
- scalar matrix - all the diagonal entries of a diagonal matrix S are equal
- unit(identity) matrix - scalar matrix whose entries on the main diagonal are all 1
- inner product - single entry given by matrix multiplication if a is a row vector and b a column vector, both with n components
- stochastic matrix - a square matrix with nonnegative entries and row sums all equal to 1
- Markov process - a stochastic process for which the probability of entering a certain state depends only on the last state occupied (and on the matrix governing the process)
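A small sketch of a stochastic matrix driving a (hypothetical two-state) Markov process; the transition probabilities are made up for illustration:

```python
import numpy as np

# Stochastic matrix: nonnegative entries, every row sums to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
assert np.allclose(P.sum(axis=1), 1.0)

# One step of the Markov process: the next-state distribution depends
# only on the current distribution and on P.
p0 = np.array([1.0, 0.0])   # start in state 0 with certainty
p1 = p0 @ P                 # distribution after one step
```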
- linear system - a linear system of m equations in n unknowns x1, x2, ..., xn is a set of equations aj1x1 + aj2x2 + ... + ajnxn = bj (j = 1, ..., m)
- coefficients - the numbers ajk of the system
- homogeneous system - a linear system in which the bj are all zero
- nonhomogeneous system - a linear system in which at least one bj is not zero
- solution - a set of numbers x1, ..., xn that satisfies all the m equations
- solution vector - a vector x whose components constitute a solution of linear system
- augmented matrix - the matrix obtained by augmenting the coefficient matrix A by the column b
- Gauss elimination - a systematic elimination process and a standard method for solving linear systems
- pivot equation - an equation used in eliminating the other equations by Gauss's method
- partial pivoting - exchanging the order of equations
- total pivoting - the order of the equations and unknowns are changed
- back substitution - solving the triangularized system produced by the elimination for the unknowns in reverse order, starting from the last equation
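The elimination terms above can be sketched in one short routine; this is a minimal illustration (for a square nonsingular system), not a production solver:

```python
import numpy as np

def gauss_solve(A, b):
    """Gauss elimination with partial pivoting, then back substitution."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n - 1):
        # Partial pivoting: exchange rows so the pivot has the largest |entry|.
        p = k + np.argmax(np.abs(A[k:, k]))
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
        # Use row k as the pivot equation to eliminate x_k below it.
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # Back substitution on the resulting upper triangular (echelon) form.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.array([[2., 1.],
              [1., 3.]])
b = np.array([3., 5.])
x = gauss_solve(A, b)
assert np.allclose(A @ x, b)
```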
- elementary row operation - interchange of two rows, addition of a constant multiple of one row to another row, multiplication of a row by a nonzero constant c
- row-equivalent system - a linear system S1 is row-equivalent to a linear system S2 if S1 can be obtained from S2 by elementary row operations
- overdetermined - if a linear system has more equations than unknowns
- determined - if a linear system has as many equations as unknowns
- underdetermined - if a linear system has fewer equations than unknowns
- echelon form - the form of the system and of the matrix in the last step of the Gauss elimination
- linearly independent - vectors a(1), ..., a(m) are linearly independent if the only m-tuple of scalars satisfying the linear combination c1a(1)+c2a(2)+...+cma(m)=0 is the one with all cj zero
- linearly dependent - vectors a(1), ..., a(m) are linearly dependent if at least one of them can be expressed as a linear combination of the others
- rank of matrix - the maximum number of linearly independent row vectors of a matrix
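A quick numerical check of rank (the matrix is an illustrative example with one dependent row):

```python
import numpy as np

# Rank = maximum number of linearly independent rows (= columns).
A = np.array([[1, 2, 3],
              [2, 4, 6],    # 2x the first row -> linearly dependent
              [0, 1, 1]])
assert np.linalg.matrix_rank(A) == 2
```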
- vector space - a nonempty set V of vectors such that with any two vectors a and b in V all their linear combinations αa+βb are elements of V, and these vectors satisfy the laws - commutative, distributive, associative.
- dimension of vector space - the maximum number of linearly independent vectors in a vector space
- basis - a linearly independent set in vector space V consisting of a maximum possible number of vectors in V
- span - the set of all linear combinations of given vectors a(1),...,a(p) with the same number of components
- subspace - a nonempty subset of vector space V that itself forms a vector space with respect to the two algebraic operations defined for the vectors of V
- row|column space - the span of the row|column vectors of a matrix
- solution(null) space - vector space (of dimension n-r) formed by solutions of homogeneous linear system(mxn) whose rank is r
- nullity - dimension of null space of linear homogeneous system
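The rank-nullity relation (nullity = n - r) can be verified on a small homogeneous system; the matrix below is illustrative:

```python
import numpy as np

# m x n homogeneous system Ax = 0 with m=2, n=3 and rank r=1,
# so the null space has dimension n - r = 2.
A = np.array([[1, 2, 3],
              [2, 4, 6]])
n = A.shape[1]
r = np.linalg.matrix_rank(A)
nullity = n - r
assert nullity == 2

# One solution vector in the null space: 1*col1 - ... gives
# x = (2, -1, 0), since 2*column1 - 1*column2 = 0.
x = np.array([2, -1, 0])
assert np.allclose(A @ x, 0)
```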
- determinant - a scalar associated with an nxn matrix A, D=aj1Cj1+aj2Cj2+...+ajnCjn=a1kC1k+a2kC2k+...+ankCnk where Cjk=(-1)^(j+k)*Mjk
- submatrix - the matrix obtained from A by deleting the row and the column containing the entry ajk
- minor - the determinant Mjk of the submatrix of ajk in the determinant D
- cofactor - Cjk in determinant D
- inverse - the inverse of an nxn matrix A=[ajk], denoted A^-1, is the nxn matrix such that A*A^-1=A^-1*A=I, where I is the nxn unit matrix. The inverse of A exists if and only if rank A=n, that is, det A≠0
- nonsingular matrix - matrix which has an inverse
- singular matrix - matrix which has no inverse
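The nonsingular/singular distinction in code (both matrices are toy examples):

```python
import numpy as np

# Nonsingular: det != 0, so the inverse exists and A A^-1 = I.
A = np.array([[2., 1.],
              [1., 1.]])
assert abs(np.linalg.det(A)) > 1e-12
Ainv = np.linalg.inv(A)
assert np.allclose(A @ Ainv, np.eye(2))

# Singular: det = 0 (rank < n), so no inverse exists.
S = np.array([[1., 2.],
              [2., 4.]])
assert np.isclose(np.linalg.det(S), 0.0)
```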
- mapping(transformation or operator) - a rule that assigns to each vector x in a vector space X a unique vector y in a vector space Y
- image - the vector y in Y assigned to vector x in X
- linear mapping(linear transformation) - if for all vectors v and x in X and scalars c, F(v+x)=F(v)+F(x) and F(cx)=cF(x)
- eigenvalue(characteristic value) - a scalar λ for which the vector equation Ax=λx has a solution x≠0. The eigenvalues of a square matrix A are the roots of the characteristic equation of A.
- eigenvector(characteristic vector) - a vector solution x≠0 of the vector equation Ax=λx
- spectrum - the set of the eigenvalues
- spectral radius - the largest of the absolute values of the eigenvalues
- eigenspace - a vector space formed by the set of all eigenvectors corresponding to an eigenvalue, together with 0
- eigenvalue problem - the problem of determining the eigenvalues and eigenvectors of a matrix
- characteristic determinant(characteristic polynomial) - D(λ)=det(A-λI)
- characteristic equation - D(λ)=0
- algebraic multiplicity - the order Mλ of an eigenvalue λ as a root of the characteristic polynomial
- geometric multiplicity - the number mλ of linearly independent eigenvectors corresponding to λ
- defect - the difference Δλ=Mλ-mλ
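The multiplicity terms can be seen on a small defective matrix (a textbook-style example, chosen for illustration):

```python
import numpy as np

# λ = 3 is a double root of the characteristic equation,
# so its algebraic multiplicity is Mλ = 2.
A = np.array([[3., 1.],
              [0., 3.]])
lam, V = np.linalg.eig(A)
assert np.allclose(lam, [3., 3.])

# Geometric multiplicity mλ = dim of the eigenspace = n - rank(A - λI).
# Here mλ = 1, so the defect is Δλ = Mλ - mλ = 2 - 1 = 1.
m_geo = 2 - np.linalg.matrix_rank(A - 3 * np.eye(2))
assert m_geo == 1
```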
- symmetric matrix - square matrix whose transpose equals the matrix : A^T=A
- skew-symmetric matrix - square matrix whose transpose equals the minus matrix : A^T=-A
- orthogonal matrix - a square matrix whose transpose equals its inverse : A^T=A^-1
- orthogonal transformation - y=Ax with A an orthogonal matrix. orthogonal transformation preserves the value of inner product of vectors and also the length or norm of a vector
- similarity transformation - Â=P^-1AP, where P is a nonsingular nxn matrix; Â is said to be similar to A and has the same eigenvalues as A
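Both of the last two properties, norm preservation under an orthogonal transformation and eigenvalue preservation under a similarity transformation, can be checked numerically (the rotation angle and matrices are arbitrary examples):

```python
import numpy as np

# Orthogonal matrix (a plane rotation): Q^T = Q^-1.
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(Q.T @ Q, np.eye(2))

# The orthogonal transformation y = Qx preserves the norm of x.
x = np.array([3., 4.])
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))

# A similarity transformation keeps the eigenvalues of A.
A = np.array([[2., 1.],
              [0., 5.]])
P = np.array([[1., 1.],
              [0., 1.]])
Ahat = np.linalg.inv(P) @ A @ P
assert np.allclose(np.sort(np.linalg.eigvals(Ahat)),
                   np.sort(np.linalg.eigvals(A)))
```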