Tuesday, 11 May 2010

Battlezone with lasers

This is a project I've had simmering on the back burner for a while. It's still at the early stages, but I thought it might be fun to keep track of each step here.

A few months back I got a 20kpps laser scanner galvo set off eBay, with the intention of making my own laser projector and a vision of using it to play some old vector arcade games... particularly my old fave, Atari Battlezone. The arcade game bit seemed pretty easy: you can play BZ on the open source MAME emulator, so I thought I could hook into its vector display emulation.
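To give a flavour of the sort of hook I have in mind, here is a rough C sketch of a capture layer that MAME's vector code could call each time it plots a point. The names and the fixed-size buffer are just my placeholders, not anything from MAME's actual API.

    /* Rough sketch of a capture layer between MAME's vector module and the
       projector. laser_add_point() would be called wherever MAME adds a
       vector point; the names and fixed-size buffer are placeholders, not
       MAME's actual API. */

    #define MAX_POINTS 4096

    typedef struct {
        int x, y;       /* screen coordinates from the vector module */
        int intensity;  /* 0 = blanked move, >0 = beam on            */
    } laser_point;

    static laser_point frame_buf[MAX_POINTS];
    static int frame_len = 0;

    void laser_add_point(int x, int y, int intensity)
    {
        if (frame_len < MAX_POINTS) {
            frame_buf[frame_len].x = x;
            frame_buf[frame_len].y = y;
            frame_buf[frame_len].intensity = intensity;
            frame_len++;
        }
    }

    /* Called once per emulated frame: hand the list to whatever pushes it
       out over the COM port, then start collecting the next frame. */
    void laser_end_frame(void)
    {
        /* send_frame(frame_buf, frame_len);  -- the USB/serial side */
        frame_len = 0;
    }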

I found the asynchronous UART on an Arduino board was not quite fast enough to cope with the data... it was dropping bits all over the place, so I started looking at a USB connection to a PIC18F2455. As a SourceBoost C user I was not able to find any easy-to-understand USB CDC (Communication Device Class, a.k.a. virtual serial port) implementations for the PIC, so I decided to make my own, leaning heavily on sample code I found online.
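A quick back-of-the-envelope sum shows why the UART was struggling. If each point is two 12-bit coordinates packed into 4 bytes (my guess at a packet format, nothing official), then keeping the galvos fed at their rated 20k points per second needs far more than the serial link was managing reliably:

    /* Back-of-the-envelope sum for the serial bandwidth. The 4-byte point
       packet (two 12-bit coordinates) is an assumption, not a real format. */
    #include <stdio.h>

    int main(void)
    {
        const long points_per_sec = 20000;     /* the galvos' rated speed    */
        const int  bytes_per_point = 4;        /* assumed 2 x 12-bit coords  */
        const int  bits_per_byte_on_wire = 10; /* 8N1: start + 8 data + stop */

        long baud_needed = points_per_sec * bytes_per_point * bits_per_byte_on_wire;

        /* Prints 800000 -- a long way past what the Arduino's serial link
           was coping with. */
        printf("need about %ld baud\n", baud_needed);
        return 0;
    }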

Well, I finally got to the point where my PIC would connect via USB, show up as a COM port and be easy to access from a Windows program. Then I hooked up a 12-bit SPI dual DAC, connected it to the galvo setup and tried the first random hacking into MAME's vector module.
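For reference, here is roughly what talking to the DAC looks like. This assumes an MCP4922-style part (the control bits may differ on other dual 12-bit chips), and spi_write_byte() and the chip-select helpers stand in for the real PIC routines:

    /* Sketch of driving a dual 12-bit SPI DAC, assuming an MCP4922-style
       part. Each channel takes a 16-bit frame: 4 control bits then the
       12-bit sample. */

    void spi_write_byte(unsigned char b);   /* placeholder: PIC SPI transmit  */
    void dac_cs_low(void);                  /* placeholder: chip-select lines */
    void dac_cs_high(void);

    void dac_write(unsigned char channel, unsigned int value)
    {
        unsigned int frame;

        value &= 0x0FFF;                /* keep 12 bits                         */

        frame  = channel ? 0x8000 : 0;  /* bit 15: 0 = DAC A (X), 1 = DAC B (Y) */
        frame |= 0x3000;                /* bits 13-12: gain 1x, output active   */
        frame |= value;                 /* bits 11-0: the sample                */

        dac_cs_low();
        spi_write_byte(frame >> 8);     /* high byte first                      */
        spi_write_byte(frame & 0xFF);
        dac_cs_high();
    }

    /* One galvo point is then just two writes: */
    void point_out(unsigned int x, unsigned int y)
    {
        dac_write(0, x);                /* X axis on DAC A */
        dac_write(1, y);                /* Y axis on DAC B */
    }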

I didn't expect it to work first time, and it didn't! But my impatient hacking did produce some interesting squiggles at about 2 fps. I needed a long exposure photograph to actually make sense of it, but eventually I recognised a couple of parts of the display and got quite excited that the concept was proved!

The coordinate handling is obviously messed up and the image is wrapping over on itself multiple times; there is also no attempt at blanking yet, so there are stray lines all over. The big job will be to find some way to optimise the render list, to stop throwing the galvos all over the place and improve on the 2 fps refresh!
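The rough plan for the optimiser is a greedy nearest-neighbour reorder of the line list: at each step, draw whichever remaining segment starts closest to where the beam currently sits, which should cut down on blanked travel. An untested sketch of the idea:

    /* Rough sketch of the render-list optimisation: greedily pick, at each
       step, the not-yet-drawn segment whose start point is nearest to where
       the beam currently is. Untested; the segment struct is a placeholder. */

    typedef struct {
        int x0, y0, x1, y1;   /* line segment endpoints */
    } segment;

    static long dist_sq(int ax, int ay, int bx, int by)
    {
        long dx = ax - bx;
        long dy = ay - by;
        return dx * dx + dy * dy;
    }

    /* Reorder 'list' in place so each segment starts near where the
       previous one finished. */
    void optimise_order(segment *list, int count)
    {
        int i, j;
        int cur_x = 0, cur_y = 0;      /* beam parked at the origin to start */

        for (i = 0; i < count; i++) {
            int best = i;
            long best_d = dist_sq(cur_x, cur_y, list[i].x0, list[i].y0);

            for (j = i + 1; j < count; j++) {
                long d = dist_sq(cur_x, cur_y, list[j].x0, list[j].y0);
                if (d < best_d) {
                    best_d = d;
                    best = j;
                }
            }

            /* swap the chosen segment into position i */
            segment tmp = list[i];
            list[i] = list[best];
            list[best] = tmp;

            cur_x = list[i].x1;        /* beam finishes at this segment's end */
            cur_y = list[i].y1;
        }
    }

A smarter version would also try drawing a segment in reverse when its far end is closer, but even this greedy pass should stop the worst of the beam flailing about.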

As you can see I have a long way to go!

Here is the plot showing the bits I recognised:


Here is an actual MAME screen showing what it should look like:


If things improve I will post an update!

Monday, 3 May 2010

Motion detection to MIDI with PureData

Another experiment with PureData: the webcam image is passed through a pix_movement object and a pix_blob turns the result into two MIDI note streams, which are sent into Reason via MIDI Yoke. The image shown is the output of pix_movement (the difference between frames).



I put the PD patch at http://sites.google.com/site/skriyl/Home/pd-projects (motion noise.pd).



The patch outputs on MIDI channels 1 and 2. I used MIDI Yoke and PD's MIDI output, then piped this into Propellerhead's Reason, where you can use the "Advanced MIDI" preferences to set up MIDI bus A and then lock channels 1 and 2 to specific instruments in the rack. I used an NN-XT with a glockenspiel patch and an NN19 with a strings patch.
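For anyone curious what actually travels over the virtual MIDI cable, the patch is just sending standard note-on/note-off messages on channels 1 and 2. Here is a small C sketch of the raw bytes; the note numbers and velocities are made up for illustration, the real ones come from the blob tracking:

    /* Sketch of the raw MIDI messages going over the virtual cable on
       channels 1 and 2: a status byte (0x90 note-on / 0x80 note-off plus
       the 0-based channel number), then note and velocity. send_midi_byte()
       is a stand-in for whatever transport sits underneath. */

    void send_midi_byte(unsigned char b);   /* placeholder transport */

    void note_on(unsigned char channel, unsigned char note, unsigned char velocity)
    {
        send_midi_byte(0x90 | (channel & 0x0F));
        send_midi_byte(note & 0x7F);
        send_midi_byte(velocity & 0x7F);
    }

    void note_off(unsigned char channel, unsigned char note)
    {
        send_midi_byte(0x80 | (channel & 0x0F));
        send_midi_byte(note & 0x7F);
        send_midi_byte(0x40);               /* a middling release velocity */
    }

    /* e.g. one note to each instrument: channel 1 (index 0) goes to the
       NN-XT glockenspiel, channel 2 (index 1) to the NN19 strings. */
    void example(void)
    {
        note_on(0, 60, 100);   /* middle C            */
        note_on(1, 48, 80);    /* the C an octave down */
    }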

Sunday, 2 May 2010

First play with PureData

I was thinking about making a MIDI "harp" and thought of trying it with a webcam, using PureData to analyse the images and generate MIDI/sounds. This is the first time I have played with PD and I don't really know what I am doing with it yet... you'll see I am not quite there with the MIDI harp yet, but I am impressed with how quickly you can get something fun working in PD.



Here is the PD patch, which I got to by hacking about with one of the GEM tutorial patches.