Our touch table is coming together beautifully. The physical interface is up and running: it uses infrared sensors to detect user touch and movement. A camera picks up the FTIR data, and Community Core Vision performs blob detection to track each blob, i.e. each finger. The data is sent in TUIO format, then parsed and analyzed for gesture recognition.
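To give a feel for what the parsing-and-gesture stage looks like, here is a minimal sketch (not our actual code) of tracking blobs and recognizing a simple swipe. It assumes the TUIO `/tuio/2Dcur` messages have already been decoded into `(session_id, x, y)` tuples with normalized 0..1 coordinates; the class and function names are just illustrative.

```python
# Illustrative sketch: track TUIO-style cursor updates and detect a
# simple horizontal-swipe gesture. Assumes touch data has already been
# parsed from TUIO /tuio/2Dcur messages into (session_id, x, y) tuples
# with coordinates normalized to the 0..1 range.

class BlobTracker:
    """Keeps the path of positions for each tracked blob (finger)."""

    def __init__(self):
        self.paths = {}  # session_id -> list of (x, y) positions

    def update(self, session_id, x, y):
        """Record a new position; CCV keeps session IDs stable per blob."""
        self.paths.setdefault(session_id, []).append((x, y))

    def remove(self, session_id):
        """Blob lifted; return its full path for gesture analysis."""
        return self.paths.pop(session_id, [])


def is_horizontal_swipe(path, min_dx=0.3, max_dy=0.1):
    """Crude gesture test: mostly-horizontal travel past a threshold."""
    if len(path) < 2:
        return False
    (x0, y0), (x1, y1) = path[0], path[-1]
    return abs(x1 - x0) >= min_dx and abs(y1 - y0) <= max_dy


tracker = BlobTracker()
for i in range(5):
    tracker.update(1, 0.1 + i * 0.15, 0.5)  # finger moving right
path = tracker.remove(1)
print(is_horizontal_swipe(path))  # → True
```

In practice the thresholds would be tuned to the table's size and the camera's frame rate, and a real gesture engine would also look at velocity and multi-finger combinations.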
Still to come: music, refined images, multiple game levels and environments, and variable play.