Vector
A mathematical object that has both magnitude ($|\vec{v}|$) and direction in space.
Components
The individual elements of a vector, e.g., $(v_1, v_2, \ldots, v_n)$.
Vector Magnitude (Norm)
Length of a vector, denoted as $|\vec{v}|$ or $\|\vec{v}\|$
Unit Vector
Vector with magnitude of 1: $|\vec{u}| = 1$
Zero Vector (Null Vector)
A vector where all components are zero: $\vec{0}$ or $\mathbf{0}$
Column Vector
An $n \times 1$ matrix (vertical array of numbers): $$\vec{v} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix}$$
Row Vector
A $1 \times n$ matrix (horizontal array of numbers): $$\vec{v} = \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix}$$
Vector Addition
For vectors $u,v$ in $\mathbb{R}^n$: $(u + v)_i = u_i + v_i$
Scalar Multiplication
For scalar $c$ and vector $v$: $(cv)_i = cv_i$
Linear Combination
A vector $\vec{v}$ is a linear combination of vectors $\{\vec{v_1}, \vec{v_2}, ..., \vec{v_n}\}$ if it can be written as $\vec{v} = c_1\vec{v_1} + c_2\vec{v_2} + ... + c_n\vec{v_n}$ for some scalars $c_1, c_2, ..., c_n$
Dot Product
For vectors $u,v$ in $\mathbb{R}^n$: $u\cdot v = \sum_{i=1}^n u_iv_i$
Cross Product
For $u,v$ in $\mathbb{R}^3$: $u\times v = \|u\|\|v\|\sin\theta\, \mathbf{n}$, where $\theta$ is the angle between $u$ and $v$ and $\mathbf{n}$ is the unit vector perpendicular to both, oriented by the right-hand rule
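A quick numerical check of both products; NumPy is an illustrative choice here, not something this glossary prescribes:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# Dot product: sum of componentwise products
print(np.dot(u, v))    # 1*4 + 2*5 + 3*6 = 32.0

# Cross product (R^3 only): perpendicular to both inputs
w = np.cross(u, v)
print(w)               # [-3.  6. -3.]
assert np.isclose(np.dot(w, u), 0.0) and np.isclose(np.dot(w, v), 0.0)
```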
Vector Projection
Orthogonal projection of vector $\vec{v}$ onto vector $\vec{u}$ is the vector component of $\vec{v}$ parallel to $\vec{u}$, given by: $\text{proj}_{\vec{u}}\vec{v} = \frac{\vec{v} \cdot \vec{u}}{|\vec{u}|^2}\vec{u}$
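A minimal sketch of the projection formula in NumPy (an assumed, not source-specified, tool); the residual $\vec{v} - \text{proj}_{\vec{u}}\vec{v}$ should be orthogonal to $\vec{u}$:

```python
import numpy as np

def project(v, u):
    """Orthogonal projection of v onto u: (v . u / ||u||^2) u."""
    return (np.dot(v, u) / np.dot(u, u)) * u

v = np.array([3.0, 4.0])
u = np.array([1.0, 0.0])
p = project(v, u)                        # array([3., 0.])
assert np.isclose(np.dot(v - p, u), 0)   # residual is orthogonal to u
```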
Linearly Independent Vectors
A set of vectors $\{\vec{v_1}, \vec{v_2}, ..., \vec{v_n}\}$ is linearly independent if the equation $c_1\vec{v_1} + c_2\vec{v_2} + ... + c_n\vec{v_n} = \vec{0}$ is satisfied only when all $c_i = 0$
Linearly Dependent Vectors
A set of vectors $\{\vec{v_1}, \vec{v_2}, ..., \vec{v_n}\}$ is linearly dependent if there exist scalars $c_1, c_2, ..., c_n$, not all zero, such that $c_1\vec{v_1} + c_2\vec{v_2} + ... + c_n\vec{v_n} = \vec{0}$
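Both definitions above can be tested numerically: stack the vectors as columns of a matrix and compare its rank with the number of vectors. A small sketch, assuming NumPy:

```python
import numpy as np

def is_independent(vectors):
    """Vectors are linearly independent iff the matrix whose
    columns are those vectors has rank equal to their count."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

print(is_independent([np.array([1, 0]), np.array([0, 1])]))  # True
print(is_independent([np.array([1, 2]), np.array([2, 4])]))  # False: (2,4) = 2*(1,2)
```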
Vector Space
A set $V$ with vectors $\vec{u}, \vec{v} \in V$ and scalars $c$ is a vector space if it's closed under addition ($\vec{u} + \vec{v} \in V$) and scalar multiplication ($c\vec{v} \in V$), and satisfies the vector space axioms
Vector Subspace
A nonempty subset $W$ of a vector space $V$ is a subspace if it's closed under addition and scalar multiplication: for all $\vec{u}, \vec{v} \in W$ and scalar $c$, both $\vec{u} + \vec{v} \in W$ and $c\vec{v} \in W$
Span
The span of vectors $\{\vec{v_1}, \vec{v_2}, ..., \vec{v_n}\}$ is the set of all their linear combinations: $\text{span}\{\vec{v_1}, \vec{v_2}, ..., \vec{v_n}\} = \{c_1\vec{v_1} + c_2\vec{v_2} + ... + c_n\vec{v_n} | c_i \in \mathbb{R}\}$
Basis
A basis of a vector space $V$ is a linearly independent set of vectors that spans $V$. For any vector $\vec{v} \in V$, there exists a unique representation $\vec{v} = c_1\vec{v_1} + c_2\vec{v_2} + ... + c_n\vec{v_n}$
Dimension
The dimension of a vector space $V$ is the number of vectors in any basis of $V$, denoted $\dim(V)$
Orthogonal Vectors
Two vectors $\vec{u}$ and $\vec{v}$ are orthogonal if their dot product is zero: $\vec{u} \cdot \vec{v} = 0$
Orthonormal Vectors
A set of vectors is orthonormal if they are orthogonal to each other and each has unit length: $\vec{u_i} \cdot \vec{u_j} = \delta_{ij}$ where $\delta_{ij}$ is the Kronecker delta
Gram-Schmidt Process
Produces orthonormal basis $\{u_1,\ldots,u_n\}$ from linearly independent vectors $\{v_1,\ldots,v_n\}$
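A sketch of the classical Gram-Schmidt iteration in NumPy (an illustrative choice): each vector has its components along the already-built orthonormal vectors subtracted off, and the remainder is normalized.

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt on a list of linearly independent vectors."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        for u in basis:
            w = w - np.dot(w, u) * u   # remove the component along u
        basis.append(w / np.linalg.norm(w))
    return basis

u1, u2 = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                       np.array([1.0, 0.0, 1.0])])
assert np.isclose(np.dot(u1, u2), 0.0)      # orthogonal
assert np.isclose(np.linalg.norm(u1), 1.0)  # unit length
```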
Direction Cosines
For a vector $v$ in $\mathbb{R}^3$, the cosines of the angles it makes with the coordinate axes: $\cos\alpha = \frac{v_x}{\|v\|}, \cos\beta = \frac{v_y}{\|v\|}, \cos\gamma = \frac{v_z}{\|v\|}$
Linear Transformation
Function $T: V \to W$ satisfying $T(au + bv) = aT(u) + bT(v)$ for all vectors $u, v$ and scalars $a, b$
Matrix Representation
For a transformation $T$ and basis $\mathcal{B} = \{v_1,\ldots,v_n\}$, the matrix whose columns are the images of the basis vectors: $[T]_{\mathcal{B}} = [T(v_1)\cdots T(v_n)]$
Gradient Vector
For scalar function $f(x_1,\ldots,x_n)$: $\nabla f = \left(\frac{\partial f}{\partial x_1},\ldots,\frac{\partial f}{\partial x_n}\right)$
Position Vector
Vector $\vec{r} = (x,y,z)$ from origin to point $P(x,y,z)$
Matrix
A rectangular array of elements $a_{ij}$ in $m$ rows and $n$ columns
Row Matrix
Matrix of size $1 \times n$ (single row)
Column Matrix
Matrix of size $m \times 1$ (single column)
Square Matrix
A [matrix](!/linear-algebra/definitions#matrix) with an equal number of rows and columns, often associated with special properties like determinants and eigenvalues.
Zero Matrix
A [matrix](!/linear-algebra/definitions#matrix) whose elements are all equal to zero. Also called a **null matrix**.
Main Diagonal
In a [square matrix](!/linear-algebra/definitions#square_matrix), the main diagonal (or principal diagonal, or leading diagonal) consists of elements where row index equals column index.
Triangular Matrix
A square matrix where all the elements either above or below the [main diagonal](!/linear-algebra/definitions#main_diagonal) are zero.
Upper Triangular Matrix
A [square matrix](!/linear-algebra/definitions#square_matrix) with zeros below the [main diagonal](!/linear-algebra/definitions#main_diagonal). All elements $a_{ij}=0$ where $i > j$.
Lower Triangular Matrix
A square [matrix](!/linear-algebra/definitions#matrix) with zeros above the [main diagonal](!/linear-algebra/definitions#main_diagonal). All elements $a_{ij}=0$ where $i < j$.
Identity Matrix
A square matrix with 1s on the [main diagonal](!/linear-algebra/definitions#main_diagonal) and 0s elsewhere.
Anti-symmetric (Skew-symmetric) Matrix
A square matrix that equals the negative of its transpose: $A = -A^T$. All diagonal elements must be zero.
Diagonal Matrix
A square matrix whose off-diagonal elements are all zero; non-zero elements may appear only on the [main diagonal](!/linear-algebra/definitions#main_diagonal).
Symmetric Matrix
A matrix equal to its transpose, $A = A^T$.
Transposition
An operation that flips a matrix over its main diagonal, switching rows and columns: $(A^T)_{ij} = A_{ji}$
Matrix Addition
For matrices $A,B$ of the same size, $(A+B)_{ij} = A_{ij} + B_{ij}$
Scalar Addition (Matrix)
For scalar $c$ and matrix $A$, $(c+A)_{ij} = c + A_{ij}$
Scalar Multiplication (Matrix)
For scalar $c$ and matrix $A$, $(cA)_{ij} = cA_{ij}$
Matrix Multiplication
For matrices $A_{m\times n}, B_{n\times p}$, $(AB)_{ij} = \sum_{k=1}^n a_{ik}b_{kj}$
Determinant
A scalar value associated with a square matrix $A$, denoted $\det(A)$ or $|A|$; it is non-zero if and only if $A$ is invertible
Inverse Matrix
For square matrix $A$, its inverse $A^{-1}$ satisfies $AA^{-1} = A^{-1}A = I$
Rank
Number of linearly independent rows/columns, denoted $\text{rank}(A)$
Trace
Sum of main diagonal elements: $\text{tr}(A) = \sum_{i=1}^n a_{ii}$
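The four quantities above (determinant, inverse, rank, trace) can all be computed numerically; a small sketch assuming NumPy:

```python
import numpy as np

A = np.array([[4.0, 7.0], [2.0, 6.0]])

print(np.linalg.det(A))          # 4*6 - 7*2 = 10.0
print(np.linalg.inv(A))          # [[ 0.6, -0.7], [-0.2,  0.4]]
print(np.linalg.matrix_rank(A))  # 2 (rows are independent)
print(np.trace(A))               # 4 + 6 = 10.0

# Inverse check: A @ A^{-1} = I
assert np.allclose(A @ np.linalg.inv(A), np.eye(2))
```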
Echelon Form
Matrix where each row's leading non-zero entry (pivot) is to the right of pivots in rows above
Reduced Row Echelon Form
Row echelon form where each pivot is 1 and is the only non-zero entry in its column
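For exact (symbolic) row reduction, SymPy's `Matrix.rref()` returns the reduced form together with the pivot columns; this is an illustrative tool choice, not one the glossary specifies:

```python
from sympy import Matrix

M = Matrix([[1, 2, -1],
            [2, 4,  0],
            [3, 6,  1]])

rref_M, pivot_cols = M.rref()
print(rref_M)      # Matrix([[1, 2, 0], [0, 0, 1], [0, 0, 0]])
print(pivot_cols)  # (0, 2): pivots sit in columns 0 and 2
```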
Elementary Matrix
Matrix obtained by applying a single elementary row operation to the identity matrix
Orthogonal Matrix
Square matrix $A$ where $A^T = A^{-1}$, equivalently $AA^T = A^TA = I$
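A rotation matrix is a standard example; a quick check of the defining property (and the length preservation it implies), assuming NumPy:

```python
import numpy as np

theta = np.pi / 4
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert np.allclose(Q.T @ Q, np.eye(2))      # Q^T Q = I
assert np.allclose(Q.T, np.linalg.inv(Q))   # so Q^T is the inverse

v = np.array([3.0, 4.0])
assert np.isclose(np.linalg.norm(Q @ v), np.linalg.norm(v))  # lengths preserved
```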
Scalar Matrix
A [square matrix](!/linear-algebra/definitions#square_matrix) where all elements are zero except those on the [main diagonal](!/linear-algebra/definitions#main_diagonal), which are all equal to the same constant: $A = cI$.
Adjoint
For a square matrix $A$, the adjoint (or adjugate), $\text{adj}(A)$, is the transpose of the cofactor matrix
Matrix Size
Matrix with $m$ rows and $n$ columns denoted as $A_{m\times n}$ or $A \in \mathbb{R}^{m\times n}$
Eigenvalues
Scalar $\lambda$ satisfying $Av = \lambda v$ for nonzero vector $v$ (eigenvector)
Eigenvectors
Nonzero vector $v$ satisfying $Av = \lambda v$ for eigenvalue $\lambda$
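A sketch verifying $Av = \lambda v$ for each eigenpair, assuming NumPy; note that `np.linalg.eig` returns the eigenvectors as the *columns* of its second output:

```python
import numpy as np

A = np.array([[2.0, 0.0], [0.0, 3.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)

for lam, v in zip(eigenvalues, eigenvectors.T):  # iterate over columns
    assert np.allclose(A @ v, lam * v)           # A v = lambda v
print(eigenvalues)                               # [2. 3.]
```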
Singular Matrix
Square matrix with $\det(A) = 0$; equivalently, a square matrix that has no inverse
Augmented Matrix
Matrix $[A|b]$ representing system $Ax = b$
LU Decomposition
Matrix $A = LU$ where $L$ is lower triangular and $U$ is upper triangular
QR Decomposition
Matrix $A = QR$ where $Q$ is orthogonal ($Q^TQ = I$) and $R$ is upper triangular
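Both factorizations are one call away in common numerical libraries; a sketch assuming NumPy and SciPy. One caveat: SciPy's `lu` uses partial pivoting, so it actually returns the permuted variant $A = PLU$:

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[4.0, 3.0], [6.0, 3.0]])

# LU with partial pivoting: A = P @ L @ U
P, L, U = lu(A)
assert np.allclose(P @ L @ U, A)

# QR: Q has orthonormal columns, R is upper triangular
Q, R = np.linalg.qr(A)
assert np.allclose(Q @ R, A)
assert np.allclose(Q.T @ Q, np.eye(2))
```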
Positive Definite Matrix
Symmetric matrix $A$ where $x^TAx > 0$ for all nonzero $x$
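A practical test: Cholesky factorization of a symmetric matrix succeeds exactly when the matrix is positive definite. A sketch assuming NumPy:

```python
import numpy as np

def is_positive_definite(A):
    """Cholesky succeeds iff the symmetric matrix A is positive definite."""
    try:
        np.linalg.cholesky(A)
        return True
    except np.linalg.LinAlgError:
        return False

print(is_positive_definite(np.array([[2.0, 1.0], [1.0, 2.0]])))  # True  (eigenvalues 1, 3)
print(is_positive_definite(np.array([[1.0, 2.0], [2.0, 1.0]])))  # False (eigenvalue -1)
```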
Diagonalization
Matrix $A = PDP^{-1}$ where $D$ is the diagonal matrix of eigenvalues and the columns of $P$ are the corresponding eigenvectors
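A sketch of the reconstruction, assuming NumPy; the example matrix has distinct eigenvalues (5 and 2), so it is diagonalizable:

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])
eigenvalues, P = np.linalg.eig(A)  # columns of P are eigenvectors
D = np.diag(eigenvalues)

# Reconstruct A = P D P^{-1}
assert np.allclose(P @ D @ np.linalg.inv(P), A)
```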
Block Matrix
Matrix partitioned into submatrices $A_{ij}$
Sparse Matrix
Matrix with mostly zero elements, typically $O(n)$ nonzero elements