Question
Sun April 18, 2010 By: Chandan Tuli

RESPECTED SIR

Expert Reply
Thu April 22, 2010

Dear Student,

The solution to your problem is as follows -

Let A and B be two non-zero (non-null) matrices.

By the hypothesis of the question, AB=0

Now, assume that A is non-singular.

=> A⁻¹ exists.

Therefore, multiply both sides of the equation on the left by A⁻¹:

=> A⁻¹AB = A⁻¹·0 = 0

=> IB = 0 (since A⁻¹A = I)

=> B = 0

But this contradicts the fact that B is a non-zero matrix. Hence A cannot be non-singular, i.e. A must be singular.

Similarly, assume B to be non-singular.

=> B⁻¹ exists.

Multiply both sides on the right by B⁻¹:

=> ABB⁻¹ = 0·B⁻¹ = 0

=> AI = 0 (since BB⁻¹ = I)

=> A = 0

Again, this contradicts the fact that A is a non-zero matrix. Therefore, B cannot be non-singular, i.e. B must be singular.
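The result above can be checked with a concrete pair of matrices. The 2×2 matrices A and B below are illustrative choices (not part of the original question): both have determinant 0, so both are singular, yet their product AB is the zero matrix.

```python
def matmul2(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def det2(X):
    """Determinant of a 2x2 matrix."""
    return X[0][0] * X[1][1] - X[0][1] * X[1][0]

# Example (hypothetical) matrices: each is non-zero but singular.
A = [[1, 1], [1, 1]]      # det(A) = 0
B = [[1, 1], [-1, -1]]    # det(B) = 0

print(det2(A), det2(B))   # 0 0
print(matmul2(A, B))      # [[0, 0], [0, 0]]  -- AB is the zero matrix
```

This shows the conclusion is consistent: AB = 0 is possible with both A and B non-zero, precisely because neither is invertible.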

Regards, Topperlearning.
