## Wednesday, October 17, 2012

### Scalar

If you're reading this then it's already too late: you've probably come across scalars and vectors and pondered their differences. The pair of terms is used in many contexts, from mathematics to the operations performed by a CPU. I think the names go back as far as matrix mathematics, and - if you'll listen - I'll tell you why...

Matrices can be "multiplied" in two ways:

1. two matrices are multiplied and the result is an arrangement of dot products;
2. one matrix and one scalar are multiplied and every value in the matrix is scaled by the scalar.
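The two kinds of "multiplication" above can be sketched in a few lines of plain Python (no libraries assumed):

```python
def mat_mul(a, b):
    """Matrix * matrix: each result entry is the dot product of a
    row of a with a column of b."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

def scalar_mul(s, a):
    """Scalar * matrix: every entry is scaled by s."""
    return [[s * x for x in row] for row in a]

m = [[1, 2], [3, 4]]
print(mat_mul(m, [[0, 1], [1, 0]]))  # [[2, 1], [4, 3]]
print(scalar_mul(2, m))              # [[2, 4], [6, 8]]
```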

Now, every time an algorithm needs to scale a collection of values by some number, should I name that variable "factor" or "scalar"?

Which would you choose?

## Tuesday, October 09, 2012

### Rectilinear

In my quest for a better panorama builder, I stumbled upon a few odd facts about cameras. Mine (the Canon G1X), for example, has a rectilinear lens. Or an almost rectilinear one. What this means is that every pixel step left or right (or indeed up and down) from the center of a photograph represents a constant number of degrees. I've not been able to measure the angle of view too closely, but I can report that it's slightly over 60 degrees (left to right) judging by my initial results. On a photograph 4352 pixels wide this is about 0.0138 degrees per pixel. (I've assumed the same scale factor is present on the vertical, but this is not necessary - just an assumption.)
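As a back-of-envelope check of that figure, assuming a roughly 60-degree horizontal angle of view spread evenly over 4352 pixels:

```python
# Rough degrees-per-pixel estimate under the constant-angle-per-pixel
# assumption described above.
angle_of_view = 60.0   # degrees, approximate left-to-right measurement
width = 4352           # pixels across the photograph
deg_per_pixel = angle_of_view / width
print(round(deg_per_pixel, 4))  # 0.0138
```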

Now, imagine you're standing in the middle of a massive sphere, pointing the camera outwards. If you took a photograph, each line of pixels (up/down and left/right) would represent a curved line segment of a great circle on that sphere. And if a pixel is defined by a row and column (or X and Y value) then the position of the light source of that pixel will be the intersection of those two great circles.

Mathematically it's relatively simple to calculate the intersection point of two great circles (they always intersect at two antipodal points; we keep one and throw the other away - an optimisation) and thus the position in space of the pixel's light source. We could map two or more photographs of the same scene (taken from the same place, but at different angles) onto our sphere (from the inside) and, with the right manipulations, we'd be able to build a 360 degree * 180 degree panoramic view (i.e. the full sphere).
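A minimal sketch of that intersection: represent each great circle by the unit normal of its plane through the sphere's center; the two antipodal intersection points are then the normalised cross product of the normals and its negation.

```python
import math

def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def normalize(v):
    """Scale a 3-vector to unit length."""
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def great_circle_intersection(n1, n2):
    """Return one of the two antipodal intersection points of the
    great circles whose plane normals are n1 and n2; the other is
    its negation."""
    return normalize(cross(n1, n2))

# The equator (normal = z axis) meets a meridian (normal = y axis)
# at a point on the x axis (or its antipode):
print(great_circle_intersection((0, 0, 1), (0, 1, 0)))  # (-1.0, 0.0, 0.0)
```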

And what are the right manipulations? They involve matrices, or more specifically, singular value decompositions. If every photograph shares two overlapping points with another photograph (or photographs), then we can test our inner-sphere projection by comparing the angle between a point-pair in one photograph with the angle between the same point-pair in another. The angles need to be equal, or else something's likely wrong with our assumption about the lens projection.

Given two point-pairs, it's simply a case of using Procrustes analysis (or actually, just a sub-solution of it) to determine a rotation that can be applied to all pixels of one image to align it perfectly with the other image. Once the aligned images wrap around the entire sphere, you might find that the scale factor computed when calibrating the lens was slightly out. So... why bother calibrating in the first place if you already know the lens is rectilinear! Readjust the scale factor and realign the images as necessary. Then fill the remainder of the sphere with your photographs. Simples!
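The "sub-solution of Procrustes analysis" mentioned above is the orthogonal Procrustes problem, which an SVD solves directly (the Kabsch method). A sketch, assuming matching unit direction vectors from two photographs are stacked as the rows of `a` and `b` (NumPy and the example point-pairs are my own illustration, not the post's data):

```python
import numpy as np

def best_rotation(a, b):
    """Rotation R (3x3) minimising the distance between R @ a[i]
    and b[i] over all matching rows, via SVD (Kabsch method)."""
    u, _, vt = np.linalg.svd(b.T @ a)
    d = np.sign(np.linalg.det(u @ vt))   # guard against reflections
    return u @ np.diag([1.0, 1.0, d]) @ vt

# Two point-pairs on the unit sphere; the second "photograph" is the
# first rotated 90 degrees about the z axis:
a = np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
b = np.array([[0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
R = best_rotation(a, b)
print(np.round(R @ a[0], 3))  # maps a[0] onto b[0] = (0, 1, 0)
```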

## Saturday, March 24, 2012

### West-East

Nothing too technical here: I noticed (when viewing the current night sky in Cape Town) that West and East on the map were swapped around, but North and South were still oriented as I'd expect. I puzzled for a few seconds, then held the laptop up to the sky. Eureka! When you're lying on your back (outside, on the grass - an unfamiliar concept to anyone in the UK) staring at the stars with your head and toes aligned North to South, West is on your right and East is on your left. All of a sardine the map makes sense...