Sunday, March 11, 2012

Matrix-Vector Multiplication

I'm finishing up my spring break this week (I know - not much of a "spring" break, but you take what you can get) and I decided to start going through the MIT OpenCourseWare course on Linear Algebra. I want to be a graphics guy when I get out of school and I haven't taken a single linear algebra course yet! Shame on me.

Gilbert Strang is a great lecturer. He teaches linear algebra in such a way that even the small points he makes make you go "aha!" I'd never had matrix-vector multiplication explained to me; it was just something I memorized and took on faith. In his lectures, Strang really drives home the importance of this idea of "linear combinations."

Linear combinations are simple. If you have a bunch of vectors and a scalar coefficient for each vector, the linear combination is just the sum you get by multiplying each vector by its scalar and adding the results together.
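
If it helps to see it as code, here's a quick sketch in Python (vectors are just plain lists, and the function name is something I'm making up for illustration):

    def linear_combination(vectors, coefficients):
        # Scale each vector by its coefficient, then add all the scaled vectors together.
        result = [0] * len(vectors[0])
        for vector, coefficient in zip(vectors, coefficients):
            for i, component in enumerate(vector):
                result[i] += coefficient * component
        return result

    # 2*(1, 0, 0) + 3*(0, 1, 0) + 4*(0, 0, 1) = (2, 3, 4)
    print(linear_combination([[1, 0, 0], [0, 1, 0], [0, 0, 1]], [2, 3, 4]))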

As it turns out, that's exactly what matrix-vector multiplication is, too. Each column of your matrix is one of the vectors and each component of your vector is one of the coefficients (the first component of the vector is the coefficient for the first column, the second component is the coefficient for the second column, and so on).

The typical way to think of matrix-vector multiplication is to treat your vector as a one-column matrix and do the standard "take a row from the first matrix, take a column from the second matrix, compute the dot product" matrix multiplication.
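
In code, that row-dot-column recipe looks something like this (again, just a sketch with plain lists rather than anything from a real math library):

    def dot(u, v):
        # Dot product: multiply matching components and add them up.
        return sum(a * b for a, b in zip(u, v))

    def matvec_by_rows(matrix, vector):
        # The textbook recipe: dot each row of the matrix with the vector.
        return [dot(row, vector) for row in matrix]

    # The matrix has rows (1, 2) and (3, 4); the vector is (5, 6).
    print(matvec_by_rows([[1, 2], [3, 4]], [5, 6]))  # prints [17, 39]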

With the linear combination concept, matrix-vector multiplication becomes much more intuitive (for me at least) because it's just a normal linear combination:
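
If the matrix A has columns a1, a2, ..., an and the vector x has components x1, x2, ..., xn, then

    A*x = x1*a1 + x2*a2 + ... + xn*an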

If you work this out, you'll see this is equivalent to the dot-product technique.
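
Here's the same multiplication done both ways, reusing the little sketch functions from above, just to see the two answers agree:

    # Same matrix as before, but written down column by column: (1, 3) and (2, 4).
    columns = [[1, 3], [2, 4]]
    x = [5, 6]

    print(linear_combination(columns, x))        # prints [17, 39]
    print(matvec_by_rows([[1, 2], [3, 4]], x))   # prints [17, 39] -- same answer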

The linear combination view of matrix-vector multiplication makes a lot of sense if you imagine the matrix to be a rotation matrix, where each column is an axis of a basis/coordinate system, and the vector to be a point in space. If you visualize vector addition using the "head-to-tail" concept, you can almost see why multiplying a point by a rotation matrix works!
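
To make that concrete, here's one more small sketch in 2D, where the columns of the rotation matrix are literally the rotated x and y axes and rotating a point is just a linear combination of them (function names are again made up):

    import math

    def rotated_axes(angle):
        # Columns of a 2D rotation matrix: where the x and y axes land after rotating by `angle` radians.
        x_axis = [math.cos(angle), math.sin(angle)]
        y_axis = [-math.sin(angle), math.cos(angle)]
        return x_axis, y_axis

    def rotate_point(point, angle):
        # Head-to-tail: (point x) copies of the new x axis plus (point y) copies of the new y axis.
        x_axis, y_axis = rotated_axes(angle)
        return [point[0] * x_axis[0] + point[1] * y_axis[0],
                point[0] * x_axis[1] + point[1] * y_axis[1]]

    # Rotating the point (1, 0) by 90 degrees lands it on (0, 1), give or take floating point error.
    print(rotate_point([1, 0], math.pi / 2))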

2 comments:

  1. You are a smart guy. :) It took me a long time to figure out (in a deeper sense) how matrix multiplication works. I tried to reinvent bump mapping, and accidentally derived a series of linear interpolations which turned out to be equivalent to the matrix multiplication above. That was one of those rare "lightbulb" moments when everything comes together and makes sense. :)

    Having a solid understanding of vector math (and quaternion math) is probably the most important thing you can learn as far as graphics programming goes, IMHO.

  2. Thank you for the kind compliment :). Don't you love lightbulb moments? I've found that in just the last couple of months, having this new intuition has made life a heck of a lot easier on several occasions.

    Quaternion math is a whole other beast I haven't even attempted yet, beyond knowing how to use a Quaternion math class. Something on the to-do list that I should tackle soon, though!
