It's written in Java and outputs the data via a couple of common multi-touch formats so it can be used by game engines such as Unity3D.
The hard part is calibrating the cheap and nasty camera equipment, since we're using whatever we can find around the house; but as this is a prototype, image quality isn't a big concern.
Today we're converting an old MAME cocktail arcade cabinet to use as the table. We'll fit a relatively inexpensive projector (around 200 UKP) underneath to project the game onto the underside of the tabletop.
The little paper Bullseye targets are glued under objects (LEGO works fine, but you can use model tanks, Battletech or Star Wars figures, etc.). The web cam under the table sees the shadow of the target, and the Bullseye software sends the filtered camera image to a shader on the GPU. The shader scans the image for the geometry of the targets and writes the results to a small texture: each target gets an ID, a position and a rotation value. Fingers and other shadows are tracked as objects with no ID or rotation. The rotation is worked out from the shapes on the outer circle, while the inner circle contains a binary representation of the target ID.
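As a rough sketch of how an ID might be read from the inner circle, assuming each segment of the ring is sampled as one bit, most significant first — the segment layout and bit order here are our own assumptions for illustration, not Bullseye's actual encoding:

```java
public class FiducialDecoder {

    // Hypothetical decode: each inner-circle segment is sampled as one
    // bit (dark = 1, light = 0) and the bits are packed MSB-first.
    static int decodeId(boolean[] segments) {
        int id = 0;
        for (boolean bit : segments) {
            id = (id << 1) | (bit ? 1 : 0);
        }
        return id;
    }

    public static void main(String[] args) {
        // Six segments sampled around the ring: 101001 binary = 41
        boolean[] segments = {true, false, true, false, false, true};
        System.out.println(decodeId(segments)); // prints 41
    }
}
```

With six segments you get 64 distinct IDs, which lines up with tracking "over 64 objects"; a real decoder would also need to find a start marker on the ring so the bit sequence is read from a consistent segment regardless of the target's rotation.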
Tracking over 64 objects at 10 fps using a $10 camera works reasonably well so long as the calibration is good. As you increase the camera resolution and frame rate you can track more objects. One of the tables already built by the Bullseye developers is fast enough to run a virtual air-hockey game, using some very expensive cameras.
For our prototype we're going to try to put together a simple Apache helicopter arming table and send that data into Combat-Helo. Failing that, a simple snake or sound game. As time is running out we need to get a move on. More updates as we go.
Information on the TUIO protocol and multi-touch displays can be found at http://www.tuio.org/
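For reference, TUIO 1.1 describes each tagged object with a /tuio/2Dobj "set" message carrying a session ID, a class (marker) ID, a normalised position and a rotation angle, plus velocity and acceleration fields omitted here. A minimal sketch of formatting those leading fields — real TUIO is sent as OSC bundles over UDP, not plain strings, so this only shows the message shape:

```java
import java.util.Locale;

public class TuioSketch {

    // Formats the leading fields of a TUIO 1.1 /tuio/2Dobj "set" message:
    // session ID, class ID, normalised x/y (0..1), rotation angle in radians.
    // The trailing velocity/acceleration fields of the real message are omitted.
    static String objSet(long sessionId, int classId, float x, float y, float angle) {
        return String.format(Locale.ROOT, "/tuio/2Dobj set %d %d %.3f %.3f %.3f",
                sessionId, classId, x, y, angle);
    }

    public static void main(String[] args) {
        // A hypothetical target with marker ID 42 near the table centre
        System.out.println(objSet(1, 42, 0.25f, 0.75f, 1.571f));
        // prints /tuio/2Dobj set 1 42 0.250 0.750 1.571
    }
}
```

Untagged blobs (fingers, stray shadows) go out as /tuio/2Dcur messages instead, which carry no class ID or angle — matching the no-ID, no-rotation tracking described above.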