Yeah, yeah, I know you expect matrix jokes from me. What if I told you I have no jokes on that? So, just take the blue pill and get into serious stuff like ...
Cutting to the chase
I personally see matrix multiplication as the guy who sells stolen phones on the street corner. I mean, you get stuff at a lower price, but it can break in two days and you can get busted by the cops. Or not. I really need to find better metaphors...
Matrix multiplication is a well-known method. People have written quite good articles on it before, but I think you'll grasp it more easily just by looking over some problems. If you are already familiar with the topic, you can skip to the last two problems.
Getting high fast
Now I need to find better subtitles... However, first of all, to understand logarithmic matrix exponentiation, you have to understand logarithmic exponentiation.
Basically we have to compute $x^n$, assuming a multiplication takes $O(1)$ time. Taken straightforwardly, we get $x^n = x \cdot x \cdot \ldots \cdot x$, so $O(N)$. Let's try to reduce it step by step. Take $x^n = x^2 \cdot x^2 \cdot \ldots$, and multiply by an extra $x$ if $n$ is odd. This works fine and the constant is cut in half. Right... Similarly we can go to $x^n = x^{\sqrt{n}} \cdot x^{\sqrt{n}} \cdot \ldots$, which gets us to $O(\sqrt{N})$. This line of reasoning stops here.
To get it faster you simply have to observe that $x^n = x^{n/2} \cdot x^{n/2}$ for $n$ even and $x^n = x^{\lfloor n/2 \rfloor} \cdot x^{\lfloor n/2 \rfloor} \cdot x$ for $n$ odd. The first two terms are the same and the third is a constant, so we really only need to compute $x^{n/2}$ once. And $x^{n/4}$ once. And so on. Hence the $O(\log N)$ complexity.
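As a minimal sketch of that recursion (in C++; the name `fast_pow` and the modulus are my own choices, added just to keep the numbers from overflowing), it could look like this:

```cpp
#include <cstdint>

// Computes x^n in O(log n) multiplications.
// The modulus is only here so the result fits in 64 bits;
// the shape of the recursion is what matters.
const uint64_t MOD = 1000000007ULL;

uint64_t fast_pow(uint64_t x, uint64_t n) {
    if (n == 0) return 1;                 // x^0 = 1
    uint64_t half = fast_pow(x, n / 2);   // compute x^(n/2) only once
    uint64_t result = half * half % MOD;  // x^(n/2) * x^(n/2)
    if (n % 2 == 1)                       // extra factor of x for odd n
        result = result * x % MOD;
    return result;
}
```

Each call halves $n$, so there are about $\log_2 n$ calls in total.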
Now, notice that we did not specify that $x$ is an integer, or even a number. The same rules hold for other mathematical structures with an associative multiplication, such as matrices.
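Just to illustrate that point: nothing in the function above cares what the multiplication actually does, as long as it is associative and has an identity element. A hedged, generic sketch (the names `power` and `identity` are mine, not from any library) could be:

```cpp
#include <cstdint>

// Generic exponentiation by squaring: works for any type T that has an
// associative operator* and a known identity element.
template <typename T>
T power(T x, uint64_t n, T identity) {
    T result = identity;
    while (n > 0) {
        if (n & 1) result = result * x;   // pick up this power of x
        x = x * x;                        // square: x, x^2, x^4, ...
        n >>= 1;
    }
    return result;
}
// Plug in a matrix type whose operator* is matrix multiplication and pass
// the identity matrix as 'identity', and you get matrix exponentiation.
```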
If you're sayin' Y U NO REMEMBER MATRIX, then let me refresh your maths knowledge. You don't really need to know much about matrices to put recurrences in matrix multiplication form. Multiplying square matrices is straightforward. Given a matrix
$M = \begin{bmatrix} x_{11} & x_{12} \\ x_{21} & x_{22} \end{bmatrix}$