SoCoder -> Article Home -> Advanced Techniques

Created : 21 September 2013
Edited : 21 September 2013

Maths 101 - Episode 4: Linear Transformations

Stretch, Shear and Rotate - you can do these to sheep; now learn how to do them to vectors!

Most of what we've covered so far in this series has been mathematical groundwork, with a few not-that-exciting practical applications. In this tutorial, we will cover Linear Transformations, which are practical, easy to visualise, and let you do fun deformations of geometric shapes.

This tutorial relies heavily on vectors and matrices, so if you haven't already read those articles, do that now. You may wish to keep those open in another tab for reference, since we'll be using the techniques taught in there quite a bit! An understanding of trigonometric functions will also come in handy when we get on to rotation, but is less important.

For simplicity's sake, this tutorial will deal only with transformations in two dimensions; however, the technique can easily be extended to three (or more!) dimensions. Unless otherwise stated, all angles are in degrees.

What is a Linear Transformation?

A 'transformation' is a function applied to a vector or a collection of vectors (i.e. a shape) that produces a different vector. A 'linear transformation' is a particular type of transformation that transforms the vector space in a uniform ('linear') way and has no effect on the origin.

So stretching, shearing and rotating a vector about the origin are all linear transformations, because they do not move the origin. Translating (moving) a vector, however, is not a linear transformation, because it would move the origin. If you're still following the terrible 'joke' in the description of this article, then you are correct to conclude that this means we cannot make our sheep go left. Moving on...

How do we perform a Linear Transformation?

To perform a linear transformation on an n-dimensional vector, we use an n by n square matrix. All linear transformations can be represented by such a matrix, and all matrices represent some linear transformation. Since we are dealing with 2D vectors in this tutorial, we will be using 2x2 matrices for our transformations.

How do we use this matrix? Well, first, we represent our vector to transform as a 2x1 column matrix. We then pre-multiply this column matrix by our transformation matrix (remember, the order in which you multiply matrices affects whether or not the multiplication is possible, and the result) to give a new 2x1 column matrix representing the result vector.
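This process can be sketched in code (a minimal illustration in Python rather than PlayMyCode; the helper names here are my own, not from the article):

```python
def apply(m, v):
    """Pre-multiply the column vector v = (x, y) by the 2x2 matrix m = [[a, b], [c, d]]."""
    (a, b), (c, d) = m
    x, y = v
    return (a * x + b * y, c * x + d * y)

# Applying the identity matrix leaves the vector unchanged.
print(apply([[1, 0], [0, 1]], (3, 4)))  # prints (3, 4)
```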

I'll now describe some common linear transformations and their matrices, with examples to show how this process works. But rather than simple pictures, these examples will be interactive! PlayMyCode lets you apply transformation matrices to graphical objects, be it images or the entire drawing canvas. So, for each of the three main types of transformations I describe, I'll provide a little demo of that transformation applied to a picture of a tape dispenser on my desk, so you can appreciate how they work. Bear in mind that the orientation of a computer's drawing canvas is different to that of a graph, though, so transformations may appear different between them!

Doing Nothing - The Identity Transformation

The simplest of all linear transformations is to do nothing, and this is achieved simply by using the identity matrix:

  ( 1 0 )
  ( 0 1 )

Why this works is obvious - the key property of the identity matrix is that it does nothing during matrix multiplication, so will not change our vector. You will notice that in the transformations we discuss next, doing nothing (stretching by a factor of 1, shearing by 0, or rotating by 0 degrees) are equivalent to the identity matrix, as you would expect.


Stretching

To stretch a vector, we need to decide a stretch factor in both the x and y dimensions. Once we have these values, which we shall call a and b respectively, we can write a stretch transformation matrix as follows:

  ( a 0 )
  ( 0 b )

To see why this works, consider its application on a vector (x, y):

  ( a 0 ) ( x )   ( ax )
  ( 0 b ) ( y ) = ( by )

...which works out as you'd expect: the x component is scaled by a, and the y component by b.

[Interactive PlayMyCode demo: Left/Right to adjust the X stretch factor; Up/Down to adjust the Y stretch factor]
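The stretch matrix can be sketched in code like so (Python rather than PlayMyCode; the helper names are my own):

```python
def stretch(a, b):
    """Matrix that stretches by factor a in x and factor b in y."""
    return [[a, 0], [0, b]]

def apply(m, v):
    """Pre-multiply the column vector v = (x, y) by the 2x2 matrix m."""
    x, y = v
    return (m[0][0] * x + m[0][1] * y, m[1][0] * x + m[1][1] * y)

# Doubling in x and tripling in y:
print(apply(stretch(2, 3), (4, 5)))  # prints (8, 15)
```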


Shearing

A shear transformation is one that 'slants' the vector horizontally or vertically by some factor. The amount that a vector is moved depends on its component in the direction perpendicular to the shear direction. For a horizontal shear of factor a, the transformation matrix is:

  ( 1 a )
  ( 0 1 )

(for a vertical shear, the a goes in the bottom-left corner instead)

When applied to a vector (x, y), we get:

  ( 1 a ) ( x )   ( x + ay )
  ( 0 1 ) ( y ) = (    y   )

[Interactive PlayMyCode demo: Left/Right to adjust the shear factor; Space to toggle horizontal/vertical shear]
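Both shear directions can be sketched in code (Python; hypothetical helper names):

```python
def shear_x(a):
    """Horizontal shear: x' = x + a*y, y' = y."""
    return [[1, a], [0, 1]]

def shear_y(a):
    """Vertical shear: x' = x, y' = y + a*x."""
    return [[1, 0], [a, 1]]

def apply(m, v):
    """Pre-multiply the column vector v = (x, y) by the 2x2 matrix m."""
    x, y = v
    return (m[0][0] * x + m[0][1] * y, m[1][0] * x + m[1][1] * y)

# A point high above the x-axis is pushed far sideways by a horizontal shear...
print(apply(shear_x(2), (1, 3)))  # prints (7, 3)
# ...while a point on the x-axis does not move at all.
print(apply(shear_x(2), (1, 0)))  # prints (1, 0)
```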


Rotating

To rotate a vector anticlockwise about the origin by theta degrees, we can use the following transformation matrix:

  ( cos theta  -sin theta )
  ( sin theta   cos theta )

When used on a vector, this expands out to:

  ( cos theta  -sin theta ) ( x )   ( x cos theta - y sin theta )
  ( sin theta   cos theta ) ( y ) = ( x sin theta + y cos theta )

As you can probably appreciate, this method of rotation is much easier than trying to convert your vector to Polar form, fiddle with the angle, then bring it back to Cartesian!

[Interactive PlayMyCode demo: Left/Right to adjust the rotation]
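The rotation matrix can be sketched in code (Python; the helper names are my own):

```python
import math

def rotate(degrees):
    """Anticlockwise rotation matrix for the given angle in degrees."""
    t = math.radians(degrees)
    return [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]

def apply(m, v):
    """Pre-multiply the column vector v = (x, y) by the 2x2 matrix m."""
    x, y = v
    return (m[0][0] * x + m[0][1] * y, m[1][0] * x + m[1][1] * y)

# Rotating the unit x vector by 90 degrees should land on the unit y vector,
# up to floating-point error.
x, y = apply(rotate(90), (1, 0))
print(round(x, 9), round(y, 9))  # prints 0.0 1.0
```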

Combining Transformations

You now know how to make and apply a simple linear transformation using a matrix. We can also mix them together into more complicated transformations!

If we apply a single transformation to a vector, we get a new vector. It seems logical, therefore, that applying a second transformation is the same as pre-multiplying this new vector by the second transformation matrix, and hence, by the associativity property of matrix multiplication, we can combine the two into a single transformation matrix.

For instance, say we want to scale up the vector (2, 3) by a factor of 2, then rotate it by 45 degrees. If we did the steps separately, we would have:

  ( 2 0 ) ( 2 )   ( 4 )
  ( 0 2 ) ( 3 ) = ( 6 )

  ( cos 45  -sin 45 ) ( 4 )   ( -√2 )
  ( sin 45   cos 45 ) ( 6 ) = ( 5√2 )   (approximately (-1.414, 7.071))

However, with the associativity rule, we can chain the matrices together:

  ( cos 45  -sin 45 ) ( 2 0 ) ( 2 )
  ( sin 45   cos 45 ) ( 0 2 ) ( 3 )

...and then multiply the transformation matrices together to give the entire transformation as a single matrix:

  ( 2 cos 45  -2 sin 45 )
  ( 2 sin 45   2 cos 45 )

This gives the same result as before, as we'd hope:

  ( 2 cos 45  -2 sin 45 ) ( 2 )   ( -√2 )
  ( 2 sin 45   2 cos 45 ) ( 3 ) = ( 5√2 )
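This worked example can be checked in code (a Python sketch; helper names are my own):

```python
import math

def matmul(m, n):
    """2x2 matrix product m*n; as a transformation, n is applied first."""
    return [[sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply(m, v):
    """Pre-multiply the column vector v = (x, y) by the 2x2 matrix m."""
    x, y = v
    return (m[0][0] * x + m[0][1] * y, m[1][0] * x + m[1][1] * y)

t = math.radians(45)
rotation = [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]
scale = [[2, 0], [0, 2]]

# Step by step: scale first, then rotate.
stepwise = apply(rotation, apply(scale, (2, 3)))
# Combined: multiply the matrices (note the order!), then apply once.
combined = apply(matmul(rotation, scale), (2, 3))

print(stepwise)  # both are (-sqrt(2), 5*sqrt(2)), approximately (-1.414, 7.071)
print(combined)
```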

It is important to remember here that matrix multiplication is not commutative, so the order you apply the matrices in matters. For instance, a shear followed by a rotation will give a different result to a rotation followed by a shear.

The fact we can multiply transformation matrices together in this way gives rise to other geometrically-obvious facts, such as that multiplying a rotation matrix for a degrees by one for b degrees will give a rotation matrix for a+b degrees.
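That rotation fact is easy to verify numerically (a Python sketch):

```python
import math

def rotate(degrees):
    """Anticlockwise rotation matrix for the given angle in degrees."""
    t = math.radians(degrees)
    return [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]

def matmul(m, n):
    """2x2 matrix product m*n; as a transformation, n is applied first."""
    return [[sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Rotating by 30 degrees and then by 40 is the same as rotating by 70.
lhs = matmul(rotate(40), rotate(30))
rhs = rotate(70)
ok = all(abs(lhs[i][j] - rhs[i][j]) < 1e-9 for i in range(2) for j in range(2))
print(ok)  # prints True
```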

The Determinant and Inverses

Since our transformation matrices are square, they have a determinant, and in fact this determinant gives us a very useful property: if you apply a transformation matrix to a shape, then the area of the resultant shape is equal to the area of the original shape multiplied by the determinant of the transformation matrix!

This immediately tells us some stuff we already know:
  • Stretches change the area of shapes by the X factor multiplied by the Y factor.
  • Shears and rotations do not change the area of shapes.

The results for stretches and shears can be very easily deduced by calculating their determinants. But what about rotations? Geometrically it is obvious that they do not modify the area, but how can we show this mathematically? Well...

  det = cos(theta) × cos(theta) - (-sin(theta)) × sin(theta) = cos^2(theta) + sin^2(theta) = 1

You're probably wondering how I immediately jumped from a bunch of trig to the number 1. Well, it turns out that this is what is known as an 'identity' - an equation that is always true regardless of the values of its variables. A rigorous proof of this is beyond the scope of this tutorial, but to convince yourself this is true in a geometric way, think about how Pythagoras' Theorem interacts with our trig-triangle from the trigonometry article.
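The determinants of all three transformation types are quick to check in code (a Python sketch):

```python
import math

def det(m):
    """Determinant of a 2x2 matrix [[a, b], [c, d]]: ad - bc."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

# Stretch by 2 in x and 3 in y: area scales by 6.
assert det([[2, 0], [0, 3]]) == 6
# Shear: area unchanged.
assert det([[1, 5], [0, 1]]) == 1
# Rotation: cos^2 + sin^2 = 1, so area unchanged (up to float error).
t = math.radians(37)
rotation = [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]
print(abs(det(rotation) - 1) < 1e-9)  # prints True
```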

The determinant-area property also gives us another interesting fact: transformation matrices with determinant zero destroy the shape and cannot be reversed. Such a transformation will crush the shape to a line, and the zero matrix will crush it all the way down to a single point (the origin)!

A transformation matrix with a non-zero determinant will have an inverse matrix, and by definition multiplying them in either order will give the identity matrix, which we have established does nothing to vectors it is applied to. It follows that the inverse matrix of a transformation matrix undoes the effects of the original transformation, and furthermore, since transformation matrices with zero determinant have no inverse, they cannot be undone.
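The inverse-undoes-the-transformation property can be sketched in code using the standard 2x2 inverse formula (Python; helper names are my own):

```python
def det(m):
    """Determinant of a 2x2 matrix [[a, b], [c, d]]: ad - bc."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def inverse(m):
    """Inverse of a 2x2 matrix; fails when the determinant is zero."""
    d = det(m)
    if d == 0:
        raise ValueError("determinant is zero: this transformation cannot be undone")
    return [[m[1][1] / d, -m[0][1] / d],
            [-m[1][0] / d, m[0][0] / d]]

def apply(m, v):
    """Pre-multiply the column vector v = (x, y) by the 2x2 matrix m."""
    x, y = v
    return (m[0][0] * x + m[0][1] * y, m[1][0] * x + m[1][1] * y)

shear = [[1, 2], [0, 1]]
sheared = apply(shear, (3, 4))            # (11, 4)
restored = apply(inverse(shear), sheared)
print(restored)                           # back to (3.0, 4.0)
```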


Summary

  • A 'linear transformation' uniformly alters the vector space, without affecting the origin.
  • All linear transformations have a square matrix associated with them, and vice versa.
  • The identity matrix performs no transformation.
  • There are general forms for the three common linear transformations: stretch, shear and rotate.
  • Transformation matrices can be chained together with matrix multiplication to give more complicated transformations.
  • The determinant of a transformation gives the area scale factor of that transformation.
  • Transformations with zero determinant crush all vectors onto the same line or point, and cannot be undone.

That's a lot to take in, but linear transformations are quite useful. In particular, it is much easier to rotate vectors using a rotation matrix than it is to use straight trigonometry - the matrix needs much less code and is easier to code correctly. It is particularly useful for things like rotating the 'camera' view on a world.

Next time, we will look at intersecting two lines in a single equation, using the magic of vector maths!


