January 28th, 2011

The last two weeks have been most vexatious: it seems as if everything has gone wrong. First I had to completely rewrite my spherical trig equations. The crucial value is the expected velocity of a Leonid at any given point. If I know where in the sky a Leonid appears, I can calculate from the geometry exactly how fast and at what angle it should move. This expected velocity is essential to the recognition of Leonids. The original system calculated it by looking at the nearest stars and determining the position of the point in question (which I’ll call “P” henceforth) by calculating spherical triangles from P to each of those stars. This method had worked well for a long time, but it broke down with camera AR50F. Three factors screwed everything up.
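To give you a feel for the core calculation -- and this is only a bare-bones sketch with an assumed slant range to the meteor, not the real system with its camera-frame grid and overlapping triangles -- a Leonid appears to move along the great circle leading away from the shower’s radiant, and its apparent angular speed depends on how far it sits from the radiant and how far away it is from the camera:

```python
import math

def expected_motion(ra_p, dec_p, ra_r, dec_r, v_kms=71.0, range_km=200.0):
    """Bare-bones sketch: a Leonid seen at point P appears to move along the
    great circle away from the radiant R.  Inputs are RA/Dec in degrees;
    v_kms is the Leonid entry speed (about 71 km/s) and range_km is an
    assumed slant range from the camera to the meteor.
    Returns (distance from radiant, direction of motion as a position angle
    east of north, apparent angular speed), in degrees and degrees/second."""
    a1, d1 = math.radians(ra_p), math.radians(dec_p)
    a2, d2 = math.radians(ra_r), math.radians(dec_r)

    # Angular distance D from P to the radiant (spherical law of cosines).
    cos_d = (math.sin(d1) * math.sin(d2) +
             math.cos(d1) * math.cos(d2) * math.cos(a2 - a1))
    dist = math.acos(max(-1.0, min(1.0, cos_d)))

    # Position angle of the radiant as seen from P; the meteor moves the
    # opposite way along that great circle.
    pa_to_radiant = math.atan2(
        math.sin(a2 - a1) * math.cos(d2),
        math.cos(d1) * math.sin(d2) -
        math.sin(d1) * math.cos(d2) * math.cos(a2 - a1))
    pa_motion = (math.degrees(pa_to_radiant) + 180.0) % 360.0

    # Apparent angular speed: the transverse part of the velocity is
    # v*sin(D); divide by the slant range to get an angular rate.
    omega = math.degrees(v_kms * math.sin(dist) / range_km)
    return math.degrees(dist), pa_motion, omega
```

Everything this sketch sweeps under the rug is where the real work lies: the slant range changes with the meteor’s elevation, and every one of these angles has to be carried into the camera frame’s own grid.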


First, camera AR50F was the worst of the four cameras; it had lots of noise and less contrast than the others. This made stars harder to identify against the background noise. Here’s a sample frame to give you an idea of just how noisy the images are:


As you can see, the stars in this frame are pretty difficult to make out. If you really must know, beta Cephei is about a quarter of the way in from the left edge and about a third of the way down from the top. The brightest star, which is about an eighth of the way in from the left edge and about a third of the way up from the bottom, is alpha Cephei. Ah, now you recognize the stars -- right? ;-)
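Incidentally, the usual trick for pulling stars out of a frame this noisy is to average a run of frames -- the random noise beats down while the stars stay put -- and then look for pixels that stand well above the background. A bare-bones sketch (assuming a stable stretch of video and 2-D grayscale frames; the threshold is purely illustrative):

```python
import numpy as np

def find_star_candidates(frames, n_sigma=5.0):
    """Average a run of frames so the random noise averages down while the
    (nearly stationary) stars reinforce, then flag pixels that stand well
    above the background.  `frames` is a sequence of 2-D grayscale frames
    from a stable stretch of video; n_sigma is an illustrative threshold."""
    stack = np.mean(np.asarray(frames, dtype=np.float64), axis=0)
    background = np.median(stack)
    noise = np.std(stack)
    # A pixel is a star candidate if it rises n_sigma above the background.
    mask = stack > background + n_sigma * noise
    ys, xs = np.nonzero(mask)
    return list(zip(xs.tolist(), ys.tolist()))
```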

Here’s another frame, this one from a different camera. Yes, it’s still noisy, but you can make out the stars much more easily:



Second problem: some jerk kept moving around the drape shielding the camera from the lights inside the aircraft cabin. Whenever this happened, a sudden glare of light blinded the camera, causing it to lose track of the stars, which then wreaked havoc with my calculations. With other cameras I had simply written little custom code snippets that would skip over the bad sections of video. But this stretch of video had lots of short periods of glare; I would have had to write dozens of little custom code snippets, and they’d end up skipping lots of good frames unnecessarily. So I set to work writing code that would identify such cases and skip each frame individually. This proved to be an effective solution -- but it cost me five days.
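The gist of the test is simple enough -- this is only a sketch with made-up thresholds, not the pickier code I actually ended up with -- track a running baseline of overall frame brightness and flag any frame that jumps well above it:

```python
import numpy as np

def flag_glare_frames(frames, window=30, jump=3.0):
    """Sketch of automatic glare rejection: keep a running baseline of mean
    frame brightness and flag any frame whose brightness jumps well above it.
    `window` is how many recent clean frames feed the baseline and `jump` is
    how many standard deviations above it counts as glare; both values are
    illustrative, not tuned."""
    skip = []
    recent = []                        # mean brightness of recent clean frames
    for i, frame in enumerate(frames):
        level = float(np.mean(frame))
        if len(recent) >= 5:
            baseline = np.mean(recent)
            spread = np.std(recent) or 1.0
            if level > baseline + jump * spread:
                skip.append(i)         # glare: skip this frame, keep the baseline
                continue
        recent.append(level)
        recent = recent[-window:]      # remember only the last `window` clean frames
    return skip
```

The advantage over the hand-written snippets is obvious: the skips are per-frame, so the good frames sandwiched between moments of glare survive.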

A third problem arose from the fact that the camera was pointed due north, just underneath the Pole Star. This meant that the calculations of azimuth and right ascension, both crucial steps in the spherical trig work, had to flip from positive to negative or vice versa. You will recall that trig functions are many-to-one: a given sine or cosine value can come from two different angles. For example, the sine of 30º is 0.50 -- but the sine of 150º is also 0.50. Spherical trig calculations never yield angles directly: they yield only sine or cosine values. So if my equation yields a sine of 0.50, is the answer 30º or 150º? There’s no way to know just from the basic spherical triangle. You must bring in a second spherical triangle to resolve the ambiguity. These things compound on each other, leading to a mess of overlapping spherical triangles. My system uses six different points in the sky:

C: the center of the video frame
V: the point in the sky 90º straight up (in the frame’s grid) from C. It’s always off-screen, obviously.
N: the north celestial pole
Z: the zenith
P: the point being analyzed
R: the radiant of the Leonid shower

These six points generate ten possible spherical triangles, and I use eight of them. There are 18 different spherical trig equations that must be calculated for each point, and one slip with any of them ruins everything. The worst problem is the “triangle flip” arising from the aforementioned ambiguity of the equations. That’s why I have to use so many spherical triangles. Keeping all those triangles straight in your head is a most confusing task. After much trial and error, I eventually worked out a scheme that handles all possible arrangements of the six points -- I think. So far it has worked properly in some very messy situations. But it cost me a lot of time -- I’ve lost track of how many days this task consumed.
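To make the ambiguity concrete without dragging you through all eight triangles: the law-of-cosines form of an equation hands you only the cosine of the unknown angle, and the law-of-sines form hands you only the sine; either one alone leaves two possible answers, but computing both and feeding them to atan2 nails down the quadrant -- which is, in effect, the extra information a second triangle supplies. Here’s the idea applied to a single bearing between two sky points (an illustration of the principle, not my actual bookkeeping):

```python
import math

def bearing_two_ways(ra1, dec1, ra2, dec2):
    """The bearing (position angle) from sky point 1 to sky point 2,
    measured from north through east, with inputs in degrees.  Returns
    (acos-only answer, atan2 answer) in degrees: the first cannot tell
    east of north from west of north; the second can.  Assumes the points
    are distinct and point 1 is not at a celestial pole."""
    a1, d1 = math.radians(ra1), math.radians(dec1)
    a2, d2 = math.radians(ra2), math.radians(dec2)

    # Angular separation between the two points.
    cos_sep = (math.sin(d1) * math.sin(d2) +
               math.cos(d1) * math.cos(d2) * math.cos(a2 - a1))
    sep = math.acos(max(-1.0, min(1.0, cos_sep)))

    # Law of cosines gives only cos(bearing): acos folds the sky in half.
    cos_b = ((math.sin(d2) - math.sin(d1) * math.cos(sep)) /
             (math.cos(d1) * math.sin(sep)))
    ambiguous = math.degrees(math.acos(max(-1.0, min(1.0, cos_b))))

    # Law of sines gives sin(bearing); together, atan2 picks the quadrant.
    sin_b = math.sin(a2 - a1) * math.cos(d2) / math.sin(sep)
    resolved = math.degrees(math.atan2(sin_b, cos_b)) % 360.0

    return ambiguous, resolved
```

For a point west of north, the acos answer comes back as its east-of-north mirror image, which is exactly the kind of flip that poisons everything downstream.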

The last problem arises when the plane turns. The stars in the camera frame start moving, and they can move quite rapidly: up to two pixels per frame. If the image were stable, my algorithms could keep up with the motion, but when you add the large amount of noise in the system, you get a fuzzy, wiggly image that is also moving up to two pixels per frame. It doesn’t happen often, but even with only a 1% chance of the algorithm losing the star on any given frame, it’s bound to lose the star once every three or four seconds. When it does lose the star, it’s harder to catch again, because the motion of the plane is also messing up the positions of the other stars. The whole system collapses in such cases. I am sorely tempted to simply skip the time during which the plane is turning. Unfortunately, it turns at 0217, just 12 minutes after the peak of activity, so a lot of data would be lost in that one minute. However, I am at my wit’s end with this problem and I really must get moving, so unless I fix it today, I’ll cut that data.
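For what it’s worth, the tracking itself is conceptually simple -- here is a bare-bones version of the idea (predict each star’s next position from its recent drift and search a small box around the prediction), though the real tracker has to be far more paranoid than this:

```python
import numpy as np

def track_star(frames, start_xy, box=4):
    """Bare-bones star tracker: follow the brightest pixel in a small box
    around the star's predicted position, where the prediction is simply
    the last position plus the last frame-to-frame drift.  `frames` are 2-D
    grayscale arrays, `start_xy` is the star's (x, y) in the first frame,
    and `box` is an illustrative search half-width in pixels."""
    positions = [start_xy]
    drift = (0.0, 0.0)                    # most recent frame-to-frame motion
    for frame in frames[1:]:
        px = positions[-1][0] + drift[0]  # predicted x
        py = positions[-1][1] + drift[1]  # predicted y
        x0, x1 = int(px) - box, int(px) + box + 1
        y0, y1 = int(py) - box, int(py) + box + 1
        window = frame[y0:y1, x0:x1]
        dy, dx = np.unravel_index(np.argmax(window), window.shape)
        found = (x0 + int(dx), y0 + int(dy))
        drift = (found[0] - positions[-1][0], found[1] - positions[-1][1])
        positions.append(found)
    return positions
```

The failure mode is plain to see: in frames this noisy, the brightest pixel in the search box is occasionally a noise spike rather than the star, and when the stars are also sliding two pixels per frame, that roughly one-percent miss rate is enough to wreck the whole chain.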

Here’s a pretty picture: it shows four Leonids in the same frame. This is, obviously, an extremely rare event -- even with the high rate of Leonids, I calculate the probability of catching four in a single frame to be about one in a million.



Of course, my database includes about a million frames, so this isn’t quite as improbable as it sounds.
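For the skeptical, the arithmetic: at one-in-a-million per frame across about a million frames, the expected number of four-Leonid frames is right around one, and the chance of catching at least one somewhere in the database is roughly 63%.

```python
p = 1e-6           # my estimate: odds of four Leonids in any single frame
n = 1_000_000      # rough number of frames in the database

expected = n * p                     # about 1.0 four-Leonid frames expected
p_at_least_one = 1 - (1 - p) ** n    # about 0.632
print(expected, p_at_least_one)
```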