Matrix Multiplication Rules


1. Linearity

Matrix multiplication is linear in both the vector and the matrix arguments.

\[A(\mathbf{u} + \mathbf{v}) = A\mathbf{u} + A\mathbf{v}, \qquad (A + B)\mathbf{v} = A\mathbf{v} + B\mathbf{v}\]

These identities express how matrix–vector multiplication distributes over vector addition (left identity) and over matrix addition (right identity).
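Both identities can be verified numerically; the following is a quick NumPy sketch using arbitrary random matrices and vectors (sizes chosen here for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
u = rng.standard_normal(3)
v = rng.standard_normal(3)

# Linearity in the vector argument: A(u + v) = Au + Av
assert np.allclose(A @ (u + v), A @ u + A @ v)

# Linearity in the matrix argument: (A + B)v = Av + Bv
assert np.allclose((A + B) @ v, A @ v + B @ v)
```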


2. Scalar Commutativity

Scalars commute freely with matrices and vectors:

\[c(A\mathbf{v}) = (cA)\mathbf{v} = A(c\mathbf{v})\]

This property allows scalar constants (such as \(\sin\theta\) or \(\cos\theta\)) to be moved before or after a matrix operation without changing the result.
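A numerical check of the three equivalent placements of the scalar (a NumPy sketch; the matrix, vector, and scalar are arbitrary examples):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
v = rng.standard_normal(3)
c = np.sin(0.7)  # any scalar, e.g. sin(theta)

# c(Av) = (cA)v = A(cv)
assert np.allclose(c * (A @ v), (c * A) @ v)
assert np.allclose((c * A) @ v, A @ (c * v))
```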


3. Associativity

Matrix multiplication is associative when the dimensions are compatible:

\[A(B\mathbf{v}) = (AB)\mathbf{v}\]

The grouping of the products does not matter, but the left-to-right order of the factors must not change.
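The two groupings give the same vector, but not the same cost: evaluating \(A(B\mathbf{v})\) uses two matrix–vector products, which is cheaper than forming the full product \(AB\) first. A NumPy sketch with arbitrary random inputs:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
v = rng.standard_normal(4)

# Same result either way; A @ (B @ v) avoids forming A @ B,
# which matters when the matrices are large.
assert np.allclose(A @ (B @ v), (A @ B) @ v)
```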


4. Non-Commutativity

In general, matrix multiplication is not commutative:

\[AB \neq BA\]

Only under special conditions does \(AB = BA\) hold; for example, when \(A\) and \(B\) are both diagonal, or more generally when they are diagonalizable in the same basis.
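A small concrete demonstration (the specific matrices below are illustrative choices): a pair of \(2 \times 2\) matrices that fail to commute, and a pair of diagonal matrices that do.

```python
import numpy as np

# Two matrices that do not commute:
A = np.array([[0., 1.],
              [0., 0.]])
B = np.array([[0., 0.],
              [1., 0.]])
assert not np.allclose(A @ B, B @ A)  # AB != BA in general

# Diagonal matrices always commute with each other:
D1 = np.diag([1., 2.])
D2 = np.diag([3., 4.])
assert np.allclose(D1 @ D2, D2 @ D1)
```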


5. Distributivity

Matrix multiplication distributes over addition:

\[A(B + C) = AB + AC, \qquad (B + C)A = BA + CA\]
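Both the left and right distributive laws can be checked numerically; a NumPy sketch with arbitrary random square matrices:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
C = rng.standard_normal((3, 3))

assert np.allclose(A @ (B + C), A @ B + A @ C)  # left distributivity
assert np.allclose((B + C) @ A, B @ A + C @ A)  # right distributivity
```

Note that because multiplication is not commutative, the left and right laws are genuinely distinct statements.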