EQUIPMENT: I have a Kinect from my project last year, a Nyko Zoom lens which I picked up from Gamers Hub last week (S$40), and I just updated my trusty old Macbook Pro's OS to 10.6. Sherwin has a projector, and we improvised a temporary setup with the furniture, a big white cloth for a projection screen, and lots of duct tape...
Installing the Nyko Zoom Lens for Xbox 360 Kinect
Since it was a small space, I thought I would try the Nyko Zoom lens to see if it would help for installations in tiny spaces. When installing it over the Kinect, you will hear a loud click as the lens snaps into place. I took before and after screenshots so you can see the difference in view.
Without Nyko Zoom Lens on (as seen via flkinect demo example)
With Nyko Zoom Lens on (as seen via flkinect demo example)
According to the documentation, this is what the Nyko Zoom can do.
I noticed that the corners are not visible with the zoom, but I have always had problems with those areas (especially for blob detection), so maybe I won't really miss those spots. Another thing is that some people on the internet seem to be complaining that removing the zoom device will scratch the Kinect's plastic body. That seems a bit unfair, since the product actually does come with a BIG BRIGHT YELLOW INSTRUCTION SHEET warning you to apply the protective sticker to prevent exactly this from happening. I had no issue clipping and unclipping the zoom lens, to be honest, so the possibility of scratching the Kinect is a non-issue to me.
My verdict is that the zoom lens provides the comfort of knowing you could reduce the range if you really had to, but I find that the data returned is noticeably more "noisy", so I wouldn't use it for an installation that relies on "blob" data.
Installation is a breeze. Why did I panic or struggle the last time around? I suppose it's all about pretending that you know what is going on and then slowly figuring it out along the way.
Download the OpenKinect wrapper v0.9c for Mac OS X. Install it, then open up Terminal and run as3-server as follows:
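A minimal sketch of the launch - the actual path is an assumption here and depends on where you unpacked the v0.9c wrapper:

```shell
# Launch the as3kinect server (path assumed - adjust to your install)
cd /path/to/openkinect-as3-wrapper
./as3-server
```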
When you see "### Wait client", it's ready.
Screenshot of the demo package on the as3kinect site.
My housemate tries the blob detection - success!
as3-server Isochronous Transfer Errors
And then, of course, it had to happen. I spoke too soon. On my Macbook Pro (OS X 10.6), it keeps dropping packets - something I never experienced with OpenKinect on Windows 7 (via Parallels). Not sure what to do about this. Playing with the depth settings causes it to stop communicating completely... At this point I didn't feel confident that this would stand up to the rigours of being used in an installation, so I scrapped it.
Tried downloading TUIOkinect. TUIO is a popular protocol used for blob detection, fiducial detection, and gestures, and it works brilliantly with many things such as Flash, Processing, Java, etc. I wonder why I hadn't looked at this earlier!
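For the curious: TUIO rides on OSC (Open Sound Control), so every "cursor" update is just a small binary OSC message over UDP. A minimal sketch of that wire format (the helper names and sample values here are my own, not from any library):

```python
# Sketch of the OSC wire format that TUIO messages use:
# a NUL-padded address string, a ",<typetags>" string, then
# big-endian arguments (i = int32, f = float32, s = string).
import struct

def osc_pad(b):
    """Pad bytes with NULs to a multiple of 4, per the OSC spec."""
    return b + b"\x00" * (4 - len(b) % 4)

def encode_message(address, *args):
    """Encode one OSC message (int, float and str args only)."""
    tags, data = ",", b""
    for a in args:
        if isinstance(a, int):
            tags += "i"; data += struct.pack(">i", a)
        elif isinstance(a, float):
            tags += "f"; data += struct.pack(">f", a)
        else:
            tags += "s"; data += osc_pad(a.encode())
    return osc_pad(address.encode()) + osc_pad(tags.encode()) + data

def decode_message(packet):
    """Decode an OSC message back into (address, [args])."""
    def read_string(pos):
        end = packet.index(b"\x00", pos)
        return packet[pos:end].decode(), end + (4 - end % 4)
    address, pos = read_string(0)
    tags, pos = read_string(pos)
    args = []
    for t in tags[1:]:
        if t == "i":
            args.append(struct.unpack_from(">i", packet, pos)[0]); pos += 4
        elif t == "f":
            args.append(struct.unpack_from(">f", packet, pos)[0]); pos += 4
        elif t == "s":
            s, pos = read_string(pos)
            args.append(s)
    return address, args

# A TUIO 1.1 "set" message for one 2D cursor:
# session id, x, y, x-velocity, y-velocity, motion acceleration.
pkt = encode_message("/tuio/2Dcur", "set", 12, 0.25, 0.5, 0.0, 0.0, 0.0)
print(decode_message(pkt))
```

This is only the single-message case; real TUIO senders wrap "alive", "set" and "fseq" messages together in an OSC bundle, but the per-message layout is the same.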
You can use udp-flashlc-bridge to send TUIO messages to Flash.
DISCOVERED THAT IT WORKS OUT OF THE BOX WITH OTHER TUIO CLIENT APPLICATIONS! Now, this is a winner. I opened up my \\ application from my solo show in 2010, which also used TUIO, and the blob detection worked immediately. The people working on TUIO are doing something right here; it is seriously awesome.
Here, the gestures demo shows what it can do. After tweaking depth, I can use it to move squares around, rotate them, and make them larger or smaller.
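The scale-and-rotate part of such a demo comes down to simple geometry: track the same pair of cursors over two frames, and the change in the distance between them gives you scale while the change in their angle gives you rotation. A sketch of that math (function names are my own, not from TUIOkinect):

```python
# Two-point "pinch" gesture math: derive scale and rotation
# from the same pair of tracked cursors at two moments in time.
import math

def pinch_transform(p1_old, p2_old, p1_new, p2_new):
    """Return (scale, rotation_in_radians) implied by two moving points."""
    vx_old, vy_old = p2_old[0] - p1_old[0], p2_old[1] - p1_old[1]
    vx_new, vy_new = p2_new[0] - p1_new[0], p2_new[1] - p1_new[1]
    # Ratio of distances between the cursor pair = scale factor.
    scale = math.hypot(vx_new, vy_new) / math.hypot(vx_old, vy_old)
    # Change in the pair's angle = rotation.
    rotation = math.atan2(vy_new, vx_new) - math.atan2(vy_old, vx_old)
    return scale, rotation

# The cursors move twice as far apart and the pair turns 90 degrees:
s, r = pinch_transform((0, 0), (1, 0), (0, 0), (0, 2))
print(s, math.degrees(r))  # 2.0 90.0
```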
I was going to build a Tesla coil simulation and had scraped together something on the Flash side, but it was impossible for me to mash it together with TUIOkinect in the time I had, so I ended up just projecting this gestures demo, which seemed to entertain people quite a bit - or at least induced them to exercise for a few minutes by trying to push a square from one end of the screen to the other.
CALIBRATION: I ended up using flkinect in between to adjust tilt and other motor functions. If the Kinect is accidentally disconnected, TUIOkinect and Flash will crash! This happened when people tripped over the cabling, which had to be run across the room. I also found myself constantly recalibrating TUIOkinect when people approached it from a different angle!
LOGISTICS: The next issue is that if you want to build Kinect things, your projector cables need to be a lot longer, because the computer and Kinect have to sit at the front of the room and the projector needs to be mounted high enough that people's heads do not get in the way. We were forced into a slightly awkward - or should I say "cosy" - arrangement of equipment because of cable length. In my previous installation at the MBS ArtScience Museum, a few dozen metres of VGA cable connected the projector to the front of the room, running all the way up the wall to the ceiling of the exhibition hall, across the entire space to the back, and to a large metal projector rig hanging a good few metres from the ceiling.
PROOF-OF-CONCEPT: I guess the point of this is that a few years ago I wouldn't have imagined it would be possible for me to control or put together something like this on my own, but this basic demo can easily be put together within one afternoon by someone with no formal technical background. If I can put this together in one day, then clearly, given sufficient time and motivation, anyone could build something like the interface right out of Minority Report, or something like Microsoft Surface - entirely from scratch, powered by a few smart open-source sockets and libraries. The next question would then be: but why would you want to do that...?
Thanks to Kathleen and Kent for putting together the show, the other artists for bringing their works, and Sherwin for so graciously having us invade his place until the wee hours of the morning (including that remarkably focused artist talk that inexplicably went on until 3am). Photos I took at The $100 Exhibition can be seen here.