Introduction
Every square matrix A is associated with a number called its determinant, denoted by Δ, det(A), or |A|.
Note: Only square matrices have determinants.
Types of Determinants and How to Evaluate Them
First order determinant
If A = [a], then det (A) = |A| = a
Second order determinant
If A = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} , then
det (A) = |A| = a_{11}a_{22} – a_{21}a_{12}
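For a quick numerical illustration (not part of the original notes), here is a minimal Python sketch, assuming numpy is available; the matrix entries are arbitrary sample values.

```python
import numpy as np

# 2 x 2 determinant: |A| = a11*a22 - a21*a12 (sample entries)
A = np.array([[3., 5.],
              [1., 2.]])
print(A[0, 0] * A[1, 1] - A[1, 0] * A[0, 1])   # 1.0, by the formula
print(round(np.linalg.det(A), 6))              # 1.0, numpy cross-check
```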
Third order determinant
If A = \begin{pmatrix} a_{11} & a_{12} & a_{13}\\ a_{21} & a_{22} & a_{23}\\ a_{31} & a_{32} & a_{33} \end{pmatrix}, then
det (A) = |A| = a_{11}(a_{22}a_{33} – a_{32}a_{23}) – a_{12}(a_{21}a_{33} – a_{31}a_{23}) + a_{13}(a_{21}a_{32} – a_{22}a_{31})
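The first-row expansion above translates directly into code. Below is a minimal Python sketch (numpy is assumed only as an independent cross-check; the helper name det3 and the sample matrix are illustrative).

```python
import numpy as np

# |A| = a11(a22*a33 - a32*a23) - a12(a21*a33 - a31*a23) + a13(a21*a32 - a22*a31)
def det3(a):
    return (a[0][0] * (a[1][1] * a[2][2] - a[2][1] * a[1][2])
            - a[0][1] * (a[1][0] * a[2][2] - a[2][0] * a[1][2])
            + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]
print(det3(A))                                # -3
print(round(np.linalg.det(np.array(A)), 6))   # -3.0, numpy cross-check
```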
Evaluation of Determinant of Square Matrix of order 3 by Sarrus Rule
If A = \begin{pmatrix} a_{11} & a_{12} & a_{13}\\ a_{21} & a_{22} & a_{23}\\ a_{31} & a_{32} & a_{33} \end{pmatrix}, then the determinant can be evaluated by adjoining the first two columns of the matrix on its right and forming products along the diagonals: the three products parallel to the main diagonal are added, and the three products parallel to the other diagonal are subtracted.
Δ = a_{11}a_{22}a_{33} + a_{12}a_{23}a_{31} + a_{13}a_{21}a_{32} – a_{13}a_{22}a_{31} – a_{11}a_{23}a_{32} – a_{12}a_{21}a_{33}.
Note: This method doesn’t work for determinants of order greater than 3.
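As a sketch of the rule (the function name sarrus and the sample entries are illustrative, and numpy is assumed only for the cross-check):

```python
import numpy as np

# Sarrus rule: add the three products parallel to the main diagonal,
# subtract the three products parallel to the other diagonal.
def sarrus(a):
    return (a[0][0]*a[1][1]*a[2][2] + a[0][1]*a[1][2]*a[2][0] + a[0][2]*a[1][0]*a[2][1]
            - a[0][2]*a[1][1]*a[2][0] - a[0][0]*a[1][2]*a[2][1] - a[0][1]*a[1][0]*a[2][2])

A = [[2, -1, 3],
     [0, 4, 1],
     [5, 2, -2]]
print(sarrus(A))                               # -85
print(round(np.linalg.det(np.array(A)), 6))    # -85.0, numpy cross-check
```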
Properties of Determinants
(i) The value of the determinant remains unchanged, if rows are changed into columns and columns are changed into rows.
{\begin{vmatrix} a_{11} & a_{12} & a_{13}\\ a_{21} & a_{22} & a_{23}\\ a_{31} & a_{32} & a_{33} \end{vmatrix} } = {\begin{vmatrix} a_{11} & a_{21} & a_{31}\\ a_{12} & a_{22} & a_{32}\\ a_{13} & a_{23} & a_{33} \end{vmatrix} }
(ii) The interchange of any two rows (or columns) of the determinant changes its sign.
{ \begin{vmatrix} a_{11} & a_{12} & a_{13}\\ a_{21} & a_{22} & a_{23}\\ a_{31} & a_{32} & a_{33} \end{vmatrix} } = -{ \begin{vmatrix} a_{21} & a_{22} & a_{23}\\ a_{11} & a_{12} & a_{13}\\ a_{31} & a_{32} & a_{33} \end{vmatrix} }
(iii) If all the elements of a row (or column) are zero, then the determinant is zero.
\begin{vmatrix} 0 & a_{12} & a_{13}\\ 0 & a_{22} & a_{23}\\ 0 & a_{32} & a_{33} \end{vmatrix} = 0
(iv) If any two rows (or columns) of a determinant are identical, then its value is zero.
\begin{vmatrix} k & k & k\\ k & k & k\\ a_{31} & a_{32} & a_{33} \end{vmatrix} = 0
(v) If all the elements of a row (or column) of a determinant are multiplied by a non-zero constant, then the determinant gets multiplied by the same constant.
k{ \begin{vmatrix} a_{11} & a_{12} & a_{13}\\ a_{21} & a_{22} & a_{23}\\ a_{31} & a_{32} & a_{33} \end{vmatrix} } = { \begin{vmatrix} ka_{11} & ka_{12} & ka_{13}\\ a_{21} & a_{22} & a_{23}\\ a_{31} & a_{32} & a_{33} \end{vmatrix} }
(vi) If each element of a row (or column) of a determinant is the sum of two or more terms, then the determinant can be expressed as the sum of two or more determinants.
{ \begin{vmatrix} a_{11} + k & a_{12}+ k & a_{13}+ k\\ a_{21} & a_{22} & a_{23}\\ a_{31} & a_{32} & a_{33} \end{vmatrix} } = { \begin{vmatrix} a_{11} & a_{12} & a_{13}\\ a_{21} & a_{22} & a_{23}\\ a_{31} & a_{32} & a_{33} \end{vmatrix} +\begin{vmatrix} k & k & k\\ a_{21} & a_{22} & a_{23}\\ a_{31} & a_{32} & a_{33} \end{vmatrix} }
(vii) If a multiple of the elements of any row (or column) of a determinant is added to the corresponding elements of another row (or column), then the value of the determinant remains unchanged.
\begin{vmatrix} a_{11} & a_{12} & a_{13}\\ a_{21} & a_{22} & a_{23}\\ a_{31} & a_{32} & a_{33} \end{vmatrix} ={\begin{vmatrix} a_{11}+ka_{21} & a_{12}+ka_{22} & a_{13}+ka_{23}\\ a_{21} & a_{22} & a_{23}\\ a_{31} & a_{32} & a_{33} \end{vmatrix}}
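Properties (ii), (v), and (vii) can be checked numerically. A minimal sketch, assuming numpy and an arbitrary sample matrix:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [0., 4., 5.],
              [1., 0., 6.]])
det = np.linalg.det

# (ii) swapping two rows changes the sign of the determinant
B = A[[1, 0, 2], :]
print(np.isclose(det(B), -det(A)))        # True

# (v) multiplying one row by k multiplies the determinant by k
C = A.copy()
C[0] *= 7
print(np.isclose(det(C), 7 * det(A)))     # True

# (vii) adding k times one row to another leaves the determinant unchanged
D = A.copy()
D[0] += 3 * A[1]
print(np.isclose(det(D), det(A)))         # True
```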
Minors and Cofactors
The minor of an element a_{ij} of a determinant is the determinant obtained by deleting the ith row and the jth column in which the element a_{ij} lies. The minor of a_{ij} is denoted by M_{ij}.
Remark: The minor of an element of a determinant of order n(n ≥ 2) is a determinant of order n – 1.
The cofactor of an element a_{ij}, denoted by A_{ij}, is defined by A_{ij} = (–1)^{i+j} M_{ij}, where M_{ij} is the minor of a_{ij}.
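A minimal sketch of these definitions in Python (numpy assumed; the helper names minor and cofactor, the 0-based indexing, and the sample matrix are illustrative):

```python
import numpy as np

def minor(A, i, j):
    """Determinant of the submatrix left after deleting row i and column j."""
    sub = np.delete(np.delete(A, i, axis=0), j, axis=1)
    return np.linalg.det(sub)

def cofactor(A, i, j):
    """A_ij = (-1)^(i+j) * M_ij (i, j are 0-based here)."""
    return (-1) ** (i + j) * minor(A, i, j)

A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 10.]])
print(round(minor(A, 0, 0), 6))     # M_11 = 5*10 - 8*6 = 2
print(round(cofactor(A, 0, 1), 6))  # A_12 = -(4*10 - 7*6) = 2
```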
Area of a Triangle
In earlier classes, we have studied that the area of a triangle whose vertices are (x1 , y1 ), (x2 , y2 ) and (x3 , y3 ), is given by the expression
Area = ½ [x1 (y2 – y3 ) + x2 (y3 –y1 ) + x3 (y1 –y2 )].
Now this expression can be written in the form of a determinant as
½{\begin{vmatrix} x_1 & y_1 & 1\\ x_2 & y_2 & 1\\ x_3 & y_3 & 1 \end{vmatrix} }
Remarks
- Since area is a positive quantity, we always take the absolute value of the determinant above.
- If the area is given, use both positive and negative values of the determinant for calculation.
- The area of the triangle formed by three collinear points is zero.
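A minimal sketch of the area formula (numpy assumed; the helper name triangle_area and the sample points are illustrative):

```python
import numpy as np

def triangle_area(p1, p2, p3):
    """Area = (1/2) |det([[x1, y1, 1], [x2, y2, 1], [x3, y3, 1]])|."""
    M = np.array([[p1[0], p1[1], 1.0],
                  [p2[0], p2[1], 1.0],
                  [p3[0], p3[1], 1.0]])
    return abs(np.linalg.det(M)) / 2.0

print(round(triangle_area((0, 0), (4, 0), (0, 3)), 6))  # 6.0
print(round(triangle_area((1, 1), (2, 2), (3, 3)), 6))  # 0.0 -> collinear points
```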
Adjoint and Inverse of a Matrix
Adjoint of a matrix: The adjoint of a square matrix A = [a_{ij}]_{n × n} is defined as the transpose of the matrix [A_{ij}]_{n × n}, where A_{ij} is the cofactor of the element a_{ij}. The adjoint of the matrix A is denoted by adj A.
Let A = \begin{pmatrix} a_{11} & a_{12} & a_{13}\\ a_{21} & a_{22} & a_{23}\\ a_{31} & a_{32} & a_{33} \end{pmatrix}
Then adj A = Transpose of \begin{pmatrix} A_{11} & A_{12} & A_{13}\\ A_{21} & A_{22} & A_{23}\\ A_{31} & A_{32} & A_{33} \end{pmatrix}
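A minimal sketch that builds adj A as the transpose of the cofactor matrix (numpy assumed; the helper name adjoint and the sample matrix are illustrative):

```python
import numpy as np

def adjoint(A):
    """adj A = transpose of the cofactor matrix [A_ij]."""
    n = A.shape[0]
    cof = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            sub = np.delete(np.delete(A, i, axis=0), j, axis=1)
            cof[i, j] = (-1) ** (i + j) * np.linalg.det(sub)
    return cof.T

A = np.array([[1., 2., 3.],
              [0., 1., 4.],
              [5., 6., 0.]])
print(np.round(adjoint(A)))
# [[-24.  18.   5.]
#  [ 20. -15.  -4.]
#  [ -5.   4.   1.]]
```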
Properties of the adjoint of a matrix
If A and B are two non-singular matrices of the same order n, then
- A (adj A) = (adj A) A = |A| I
- adj (A^T) = (adj A)^T
- adj (AB) = (adj B)(adj A)
- adj (adj A) = |A|^{n-2} A
- |adj A| = |A|^{n-1}
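These properties can be spot-checked numerically. In the sketch below (numpy assumed, sample matrix arbitrary), adj A is recovered as |A| A^{-1}, which is valid because the sample matrix is non-singular:

```python
import numpy as np

A = np.array([[3., 1., 2.],
              [0., 2., 1.],
              [1., 0., 4.]])
n = A.shape[0]
detA = np.linalg.det(A)                 # 21, so A is non-singular
adjA = detA * np.linalg.inv(A)          # adj A = |A| A^{-1}

print(np.allclose(A @ adjA, detA * np.eye(n)))            # A (adj A) = |A| I
print(np.isclose(np.linalg.det(adjA), detA ** (n - 1)))   # |adj A| = |A|^{n-1}
adj_adjA = np.linalg.det(adjA) * np.linalg.inv(adjA)
print(np.allclose(adj_adjA, detA ** (n - 2) * A))         # adj(adj A) = |A|^{n-2} A
```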
Note :
- Adjoint of a diagonal matrix is a diagonal matrix.
- Adjoint of a triangular matrix is a triangular matrix.
- Adjoint of a symmetric matrix is a symmetric matrix.
Inverse of a Matrix: A^{-1} = {1\over |A|}adj\ A, defined only when A is non-singular, i.e. |A| ≠ 0.
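A minimal sketch of the formula (numpy assumed; the helper name inverse_via_adjoint and the sample matrix are illustrative), cross-checked against numpy's own inverse:

```python
import numpy as np

def inverse_via_adjoint(A):
    """A^{-1} = (1/|A|) adj A, defined only when |A| != 0."""
    detA = np.linalg.det(A)
    if np.isclose(detA, 0.0):
        raise ValueError("singular matrix: no inverse")
    n = A.shape[0]
    cof = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            sub = np.delete(np.delete(A, i, axis=0), j, axis=1)
            cof[i, j] = (-1) ** (i + j) * np.linalg.det(sub)
    return cof.T / detA                 # (adj A) / |A|

A = np.array([[3., 1., 2.],
              [0., 2., 1.],
              [1., 0., 4.]])
print(np.allclose(inverse_via_adjoint(A), np.linalg.inv(A)))  # True
```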
Properties of Inverse Matrix
Let A and B be two square matrices of the same order n. Then,
- (A^{-1})^{-1} = A
- (AB)^{-1} = B^{-1}A^{-1}
- (A^T)^{-1} = (A^{-1})^T
- |A^{-1}| = |A|^{-1}
- A A^{-1} = A^{-1} A = I
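These identities can also be verified numerically. A minimal sketch, assuming numpy and two randomly generated non-singular sample matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((3, 3)) + 3 * np.eye(3)   # diagonally dominant, hence non-singular
B = rng.random((3, 3)) + 3 * np.eye(3)
inv, det = np.linalg.inv, np.linalg.det

print(np.allclose(inv(inv(A)), A))                  # (A^{-1})^{-1} = A
print(np.allclose(inv(A @ B), inv(B) @ inv(A)))     # (AB)^{-1} = B^{-1} A^{-1}
print(np.allclose(inv(A.T), inv(A).T))              # (A^T)^{-1} = (A^{-1})^T
print(np.isclose(det(inv(A)), det(A) ** -1))        # |A^{-1}| = |A|^{-1}
print(np.allclose(A @ inv(A), np.eye(3)))           # A A^{-1} = I
```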
Singular and Non-Singular Matrix
Singular Matrix: A square matrix A is said to be singular if |A| = 0
Non-Singular Matrix: A square matrix A is said to be non-singular if |A| ≠ 0
- Theorem: If A and B are nonsingular matrices of the same order, then AB and BA are also nonsingular matrices of the same order.
- Theorem: The determinant of the product of matrices is equal to the product of their respective determinants, that is, |AB| = |A| |B|, where A and B are square matrices of the same order.
- Theorem: A square matrix A is invertible if and only if A is a nonsingular matrix.
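A quick numerical check of the product rule for determinants (numpy assumed, sample matrices arbitrary):

```python
import numpy as np

A = np.array([[2., 1.], [3., 4.]])     # |A| = 5
B = np.array([[0., 1.], [2., 5.]])     # |B| = -2

# |AB| = |A| |B|, so the product of non-singular matrices is non-singular
print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))  # True
print(round(np.linalg.det(A @ B), 6))  # -10.0
```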
Remark
In general, if A is a square matrix of order n, then |adj A| = |A|^{n-1}
Applications of Determinants and Matrices
- Consistent system: A system of equations is said to be consistent if its solution (one or more) exists.
- Inconsistent system: A system of equations is said to be inconsistent if its solution does not exist.
Solution of a system of linear equations using the inverse of a matrix
Let us express the system of linear equations as matrix equations and solve them using the inverse of the coefficient matrix.
Consider the system of equations
a1 x + b1 y + c1 z = d1
a2 x + b2 y + c2 z = d2
a3 x + b3 y + c3 z = d3
Let A = \begin{pmatrix} a_{1} & b_{1} & c_{1}\\ a_{2} & b_{2} & c_{2}\\ a_{3} & b_{3} & c_{3} \end{pmatrix}; X = \begin{pmatrix} x \\ y \\ z \end{pmatrix} and B = \begin{pmatrix} d_1 \\ d_2 \\ d_3 \end{pmatrix}
Case I – If A is a nonsingular matrix, then its inverse exists. Now
AX = B
or, A–1 (AX) = A–1 B (premultiplying by A–1)
or, (A–1A) X = A–1 B (by the associative property)
or, IX = A–1 B [∵ A–1A = I]
or, X = A–1 B
This matrix equation provides a unique solution for the given system of equations, since the inverse of a matrix is unique. This method of solving a system of equations is known as the Matrix Method.
Case II – If A is a singular matrix, then |A| = 0.
In this case, we calculate (adj A) B.
- If (adj A) B ≠ O (O being the zero matrix), then the solution does not exist and the system of equations is called inconsistent.
- If (adj A) B = O, then the system may be either consistent or inconsistent according as it has either infinitely many solutions or no solution.
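A minimal sketch of Case I of the Matrix Method (numpy assumed; the system and its coefficients are illustrative sample values):

```python
import numpy as np

# Sample system:  x + 2y + 3z = 6,  2x - y + z = 2,  3x + 4z = 7
A = np.array([[1., 2., 3.],
              [2., -1., 1.],
              [3., 0., 4.]])
B = np.array([6., 2., 7.])

detA = np.linalg.det(A)
if np.isclose(detA, 0.0):
    print("|A| = 0: examine (adj A) B to decide consistency (Case II)")
else:
    X = np.linalg.inv(A) @ B        # Case I: X = A^{-1} B
    print(np.round(X, 6))           # [1. 1. 1.]
```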
Solution of a system of linear equations using Cramer's Rule
Consider the system of equations
a1 x + b1 y + c1 z = d1
a2 x + b2 y + c2 z = d2
a3 x + b3 y + c3 z = d3
Then the solution of the system of equations is
x = {Δ_x \over Δ};\ y = {Δ_y \over Δ};\ z = {Δ_z \over Δ}
Where, Δ = {\begin{vmatrix} a_{1} & b_{1} & c_{1}\\ a_{2} & b_{2} & c_{2}\\ a_{3} & b_{3} & c_{3} \end{vmatrix} }
Δx = {\begin{vmatrix} d_{1} & b_{1} & c_{1}\\ d_{2} & b_{2} & c_{2}\\ d_{3} & b_{3} & c_{3} \end{vmatrix} }
Δy = {\begin{vmatrix} a_{1} & d_{1} & c_{1}\\ a_{2} & d_{2} & c_{2}\\ a_{3} & d_{3} & c_{3} \end{vmatrix} }
Δz = {\begin{vmatrix} a_{1} & b_{1} & d_{1}\\ a_{2} & b_{2} & d_{2}\\ a_{3} & b_{3} & d_{3} \end{vmatrix} }
- If Δ ≠ 0, then the system of equations is consistent and has a unique solution.
- If Δ = 0 and at least one of the determinants Δx, Δy, Δz is non-zero, then the given system is inconsistent, i.e. it has no solution.
- If Δ = 0 and Δx = Δy = Δz = 0, then the system is consistent, with infinitely many solutions.
- If Δ ≠ 0 and Δx = Δy = Δz = 0, then the system has only the trivial solution (x = y = z = 0).
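A minimal sketch of Cramer's Rule on the same sample system as above (numpy assumed; the helper name cramer is illustrative):

```python
import numpy as np

def cramer(A, d):
    """x_i = Delta_i / Delta, where Delta_i replaces the i-th column of A by d."""
    delta = np.linalg.det(A)
    if np.isclose(delta, 0.0):
        raise ValueError("Delta = 0: no unique solution")
    sol = []
    for i in range(A.shape[1]):
        Ai = A.copy()
        Ai[:, i] = d                 # replace the i-th column with the constants
        sol.append(np.linalg.det(Ai) / delta)
    return np.array(sol)

A = np.array([[1., 2., 3.],
              [2., -1., 1.],
              [3., 0., 4.]])
d = np.array([6., 2., 7.])
print(np.round(cramer(A, d), 6))     # [1. 1. 1.]
```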