Inverse of a 3×3 Matrix Calculator
Enter the entries of matrix A (3×3), then compute its inverse A⁻¹.
Step-by-step method
- Write the matrix A.
- Use the 3×3 determinant formula.
- Substitute values to compute det( A ).
- Compute the cofactor matrix C and the adjugate adj( A ) = Cᵀ.
- Compute A⁻¹ = ( 1 / det( A ) ) · adj( A ).
Determinant formula
For a 3×3 matrix

A =
[ a  b  c ]
[ d  e  f ]
[ g  h  i ]

cofactor expansion along the first row gives

det( A ) = a( ei − fh ) − b( di − fg ) + c( dh − eg )
Inverse formula
A⁻¹ = ( 1 / det( A ) ) · adj( A ), where adj( A ) = Cᵀ and Cij is the ( i, j ) cofactor of A:

A⁻¹ = ( 1 / det( A ) ) ·
[ C11  C21  C31 ]
[ C12  C22  C32 ]
[ C13  C23  C33 ]
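The five steps above can be sketched in pure Python. This is a minimal illustration of the adjugate method, not the calculator's own implementation; the names `det2` and `inverse3` are chosen here for clarity.

```python
def det2(a, b, c, d):
    # Determinant of the 2x2 matrix [[a, b], [c, d]].
    return a * d - b * c

def inverse3(A):
    # Cofactor C[i][j]: (-1)^(i+j) times the minor obtained by
    # deleting row i and column j of A.
    C = [[0] * 3 for _ in range(3)]
    for i in range(3):
        for j in range(3):
            rows = [r for r in range(3) if r != i]
            cols = [c for c in range(3) if c != j]
            minor = det2(A[rows[0]][cols[0]], A[rows[0]][cols[1]],
                         A[rows[1]][cols[0]], A[rows[1]][cols[1]])
            C[i][j] = (-1) ** (i + j) * minor
    # Expand det(A) along the first row using the cofactors.
    det = sum(A[0][j] * C[0][j] for j in range(3))
    if det == 0:
        raise ValueError("matrix is singular; no inverse exists")
    # adj(A) is C transposed; A^-1 = adj(A) / det(A).
    return [[C[j][i] / det for j in range(3)] for i in range(3)]
```

Calling `inverse3` on the matrix from Example 1 below reproduces the final answer shown there.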
Example 1: 3×3 matrix inverse
Step 1 - Write the matrix A.
In this problem: We start with the given matrix A.
[ 1  2  3 ]
[ 0  1  4 ]
[ 5  6  0 ]
Step 2 - Use the 3×3 determinant formula.
In this problem: This is a standard cofactor expansion formula for a 3×3 matrix.
[ a  b  c ]
[ d  e  f ]
[ g  h  i ]

det( A ) = a( ei − fh ) − b( di − fg ) + c( dh − eg )
Step 3 - Substitute values to compute det( A ).
In this problem: det( A ) = 1·( 1·0 − 4·6 ) − 2·( 0·0 − 4·5 ) + 3·( 0·6 − 1·5 ) = −24 + 40 − 15 = 1.
Step 4 - Compute the cofactor matrix C and the adjugate adj( A ) = Cᵀ.
In this problem: Compute cofactors, then transpose to get adj( A ).

Cofactor matrix C:
[ -24   20   -5 ]
[  18  -15    4 ]
[   5   -4    1 ]

Adjugate adj( A ) = Cᵀ:
[ -24   18    5 ]
[  20  -15   -4 ]
[  -5    4    1 ]
Step 5 - Compute A⁻¹ = ( 1 / det( A ) ) · adj( A ).
In this problem: Multiply adj( A ) by ( 1 / det( A ) ) = 1, so A⁻¹ = adj( A ).

[ -24   18    5 ]
[  20  -15   -4 ]
[  -5    4    1 ]
Final answer:
A⁻¹ =
[ -24   18    5 ]
[  20  -15   -4 ]
[  -5    4    1 ]
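The answer to Example 1 can be checked by multiplying A by the claimed inverse; the product should be the 3×3 identity. This is a quick standalone check (the helper `matmul3` is defined here for illustration):

```python
# Check Example 1: A times the claimed inverse should give the identity.
A = [[1, 2, 3], [0, 1, 4], [5, 6, 0]]
Ainv = [[-24, 18, 5], [20, -15, -4], [-5, 4, 1]]

def matmul3(X, Y):
    # Standard 3x3 matrix product.
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

print(matmul3(A, Ainv))  # -> [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```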
Example 2: 3×3 matrix inverse
Step 1 - Write the matrix A.
In this problem: We start with the given matrix A.
[ 2  0  1 ]
[ 1  1  0 ]
[ 3  2  1 ]
Step 2 - Use the 3×3 determinant formula.
In this problem: This is a standard cofactor expansion formula for a 3×3 matrix.
[ a  b  c ]
[ d  e  f ]
[ g  h  i ]

det( A ) = a( ei − fh ) − b( di − fg ) + c( dh − eg )
Step 3 - Substitute values to compute det( A ).
In this problem: det( A ) = 2·( 1·1 − 0·2 ) − 0·( 1·1 − 0·3 ) + 1·( 1·2 − 1·3 ) = 2 − 0 − 1 = 1.
Step 4 - Compute the cofactor matrix C and the adjugate adj( A ) = Cᵀ.
In this problem: Compute cofactors, then transpose to get adj( A ).

Cofactor matrix C:
[  1  -1  -1 ]
[  2  -1  -4 ]
[ -1   1   2 ]

Adjugate adj( A ) = Cᵀ:
[  1   2  -1 ]
[ -1  -1   1 ]
[ -1  -4   2 ]
Step 5 - Compute A⁻¹ = ( 1 / det( A ) ) · adj( A ).
In this problem: Multiply adj( A ) by ( 1 / det( A ) ) = 1, so A⁻¹ = adj( A ).

[  1   2  -1 ]
[ -1  -1   1 ]
[ -1  -4   2 ]
Final answer:
A⁻¹ =
[  1   2  -1 ]
[ -1  -1   1 ]
[ -1  -4   2 ]
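Example 2 can be re-derived step by step in a short script: compute each cofactor from its 2×2 minor, transpose to get the adjugate, and expand along the first row for the determinant. This is an illustrative sketch; the helper name `cofactor` is chosen here, not taken from the calculator.

```python
# Recompute the Example 2 cofactors: C[i][j] = (-1)^(i+j) * minor(i, j).
A = [[2, 0, 1], [1, 1, 0], [3, 2, 1]]

def cofactor(A, i, j):
    # Minor: determinant of the 2x2 matrix left after deleting row i, column j.
    r = [x for x in range(3) if x != i]
    c = [y for y in range(3) if y != j]
    minor = A[r[0]][c[0]] * A[r[1]][c[1]] - A[r[0]][c[1]] * A[r[1]][c[0]]
    return (-1) ** (i + j) * minor

C = [[cofactor(A, i, j) for j in range(3)] for i in range(3)]
adj = [[C[j][i] for j in range(3)] for i in range(3)]  # transpose of C
det = sum(A[0][j] * C[0][j] for j in range(3))        # expansion along row 1

print(det)  # -> 1
print(adj)  # -> [[1, 2, -1], [-1, -1, 1], [-1, -4, 2]]
```

Since det( A ) = 1, the adjugate printed here is exactly the final answer above.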