

The video digitiser I use is almost entirely software based. The hardware is little more than a high-speed video frame 'sampler' which buffers a field from the video input for the computer to decode. This data is then interpreted by software which can generate images from it in up to 16 million colours.

By and large it works quite well. It cannot do real-time video because the system bandwidth just isn't there. Remember, even with a processor running at over 200MHz, the data is being squeezed down an eight-bit expansion card bus running at between 2MHz and 8MHz (depending on the cycle type). This is nothing like PCI bus-mastering on one of those Hauppauge cards. This is more like trying to get a Hauppauge card working on the ISA bus!

As I only have one field buffer fitted, I can only grab 'half' the frame, so every line in the image is doubled.
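That doubling can be sketched in a few lines. This is just an illustration in Python (the real program is BBC BASIC and ARM assembler), with an image treated as a simple list of pixel rows; the function name is mine, not the program's:

```python
def field_to_frame(field):
    """Expand a single video field into a full-height frame by
    writing each field line twice (line doubling)."""
    frame = []
    for row in field:
        frame.append(list(row))  # top copy of the doubled pair
        frame.append(list(row))  # bottom copy of the doubled pair
    return frame

# A two-line field becomes a four-line frame:
assert field_to_frame([[1, 2], [3, 4]]) == [[1, 2], [1, 2], [3, 4], [3, 4]]
```

The important consequence for everything below is that each logical 'line' of the picture occupies two identical rows of the frame.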

It is worth pointing out that the hardware is over a decade old, has been in several machines, and has been hit by mains across the Video In (ooops!). So it isn't the greatest bit of kit, but it marches on.

Thing is, the software can be quite easy to 'fool' with a bad input source. You get artefacts ranging from magenta or green stripes (where the chrominance is decoded incorrectly) to lines that are shifted sideways (where the start-of-line sync is decoded incorrectly).
In the screenshot, the upper image is the original digitised frame. You can very clearly see the sync misses, and also the incorrect colourisations.


I figured that, despite such corruption, there is no reason why useful data cannot be extracted from the messy input. It is probably not something that could be done automatically, but it certainly is something that can be done manually.

The first plan of action was to come up with a name. I'm a weird one in that I find it quite difficult to begin working on a project until I have a name for it. Maybe it's just some peculiar kind of anthropomorphism? I dunno...
The next step was to decide on a language. For me, hackability was paramount, which meant only one choice: BBC BASIC. I could have the program running in four mouse clicks, in seconds. No bother with compiling and linking. And, compared with C, BASIC offers decent built-in error handling, and a built-in assembler.

So I cobbled up a program to display the picture on-screen, and to allow me to use the left/right cursors to shift a line around and move the line choice up and down. It sucked. Bigtime.

So I rewrote it. Only, this time my priorities were slightly different. I decided that I would code the framework of the application in BASIC, but the core would be written in assembler.

I included stuff like 'black out line', and the expected shifting. But one thing I just couldn't get my head around was the colour problem. It should be possible to regenerate the line because the luminance seems to be present; it's just the colourisation that is messy. After a while of head scratching, I hit on a method that was a lot less bloody work! Each 'line' is doubled, right? So why not make the top row of our line a copy of the line above, and the bottom row of our line a copy of the line below?
I tried it, and it was magic. All those little quibbles went away and the problems were mostly cured.
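The trick can be sketched like so. Again this is an illustrative Python sketch, not the real BBC BASIC/assembler code; it assumes the doubled frame layout above, where logical line n occupies rows 2n and 2n+1:

```python
def repair_doubled_line(frame, line):
    """Repair a corrupt doubled line in place: the top row of the pair
    becomes a copy of the row above it (the bottom of the previous
    line), and the bottom row a copy of the row below it (the top of
    the next line)."""
    top, bottom = 2 * line, 2 * line + 1
    frame[top] = list(frame[top - 1])
    frame[bottom] = list(frame[bottom + 1])

# Line 1 of this three-line frame is garbage (9s); patch it from its
# neighbours:
frame = [[0], [0], [9], [9], [2], [2]]
repair_doubled_line(frame, 1)
assert frame == [[0], [0], [0], [2], [2], [2]]
```

Because each neighbouring line carries its own (good) luminance and chrominance, copying whole rows sidesteps any attempt to re-decode the colour.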
I bodged the routine so it could start and end at defined points. This might seem a little odd until you realise that it is very important to the line-shift routine, so we know what to fill the missing space with.
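A bounded shift of this sort might look like the following. This is my own minimal Python sketch of the idea, not the program's actual routine; the name and parameters (`offset`, `fill`) are assumptions for illustration:

```python
def shift_row(row, offset, fill=0):
    """Shift a row of pixels right (positive offset) or left (negative),
    filling the vacated pixels with `fill`; those pixels are the
    'missing space' that a later pass can patch from a neighbour."""
    n = len(row)
    out = [fill] * n
    for x in range(n):
        src = x - offset
        if 0 <= src < n:
            out[x] = row[src]
    return out

assert shift_row([1, 2, 3, 4], 1) == [0, 1, 2, 3]        # shifted right
assert shift_row([1, 2, 3, 4], -2, fill=9) == [3, 4, 9, 9]  # shifted left
```

Knowing exactly where the shifted region starts and ends is what tells you which pixels were vacated, and hence which ones need filling in.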

You can see a snippet of ARM code in the blue/yellow window. Yes, I write with my editor set to show me yellow text with a blue background - I'm using that colour setup right now to write this HTML. I cannot get on with black on white... or, worse, white on black. Yuck.
The ARM processor is simply a delight to code. Take, for example, the line:

    LDR     R5, [R10, R12, LSL #2]
This means: take the word at the address R10 + (R12 × 4) and stuff it into R5.
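In other words, it is a word-array index done in one instruction: base register plus index register shifted left two bits. Mirrored in Python (the addresses and values here are made up for illustration):

```python
# Fake word-addressed memory: eight words starting at 0x8000.
memory = {0x8000 + 4 * i: i * 10 for i in range(8)}

def ldr_scaled(base, index):
    """Load the word at base + (index << 2), as
    LDR R5,[R10,R12,LSL #2] does with base in R10 and index in R12."""
    return memory[base + (index << 2)]

assert ldr_scaled(0x8000, 3) == 30  # word 3 of the array at 0x8000
```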
When you compare that with the powerful LDMFD, which can restore user-defined registers (updating the stack pointer as applicable), pop a return address straight into PC, update the flags, and cause a return-to-caller in the one instruction (oh, and it can be executed conditionally!), giving an ass-kickingly simple way to exit from subroutines, you start to see why I love the ARM processor.

You've probably also noticed my backdrop piccy. :-)


Click here to download PiccyHack v0.01

Several assumptions are made:

It may work in other screen modes. Probably okay in 16 million colours; I'm not sure about the palette in 256 colours. That applies both to the mode used for viewing the image, and to the image itself.

Please note, at this moment in time I don't have any impetus to do much with this program, as the code performs as I need it to. I'm just releasing it in case others have the same sort of problems...

Return to "My Stepmother Is An Alien" index
HTML and text © 2002 Richard Murray