Fun with Nintendo 3DS and Kinect
First, an amazing video showing off the augmented-reality prowess of the Nintendo 3DS (found via Kotaku).
At SXSWi last week, I bumped into a friend who is studying at Carnegie Mellon’s Human-Computer Interaction Institute. That friend introduced me to another friend at HCII, who introduced me to another friend at HCII, and now I’m thick with these geniuses! Anyway, one of these new friends, Nisha Kurani, sent me this note this morning, regarding the program’s mind-bending Kinect-hacking projects:
As we discussed, the Kinect hack was a class project that lasted a little over two weeks. Our course, “Special Topics in Interactive Art & Computational Design,” taught by Golan Levin with teaching assistant Dan Wilcox (you may know him from the Titty Tracker Kinect hack), is an interdisciplinary class with students from Art, Design, Architecture, Computer Science, Human-Computer Interaction, and Robotics. Our professor passed out Kinects to the entire class and asked us to work in teams to push the limits of our creativity while exploring the many uses of its depth camera. The unique mix of Golan’s exemplary experience, Dan’s mad skills, and innovative students who are eager to learn has resulted in several really cool projects that you should definitely check out: http://golancourses.net/2011spring/projects/project-3-interaction/.
Here are a few of my favorite student projects:
“Boo, a Super Mario ghost character, appears when the Kinect senses a person’s body. He’s shy and always stays behind the person he is following, while floating gently and laughing maniacally. The size of Boo is depth-dependent as well.”
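If you’re curious how that depth-dependent sizing might work, here’s a rough Python sketch. The function name, depth range, and scale factors are my own guesses, not the students’ actual code:

```python
# Hypothetical sketch of Boo's depth-dependent sizing: the closer the
# tracked person is to the Kinect, the larger the ghost sprite is drawn.
# The near/far range (in millimeters) and scale bounds are illustrative.

def boo_scale(depth_mm, near=800, far=4000, min_scale=0.3, max_scale=1.5):
    """Map a Kinect depth reading (mm) to a sprite scale factor."""
    # Clamp to the sensor's useful range, then interpolate linearly:
    # a nearer person means a bigger Boo.
    d = max(near, min(far, depth_mm))
    t = (far - d) / (far - near)  # 1.0 at the near limit, 0.0 at the far limit
    return min_scale + t * (max_scale - min_scale)
```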
“Alex and Ray created a particle system that exhibits flocking and swarming behaviors when the user is moving, and flocks to the participant’s depth field when they’re standing still. The resulting simulation ebbs and flows between the recognizable and the abstract.”
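The two modes they describe — free swarming while you move, flocking onto your silhouette when you stand still — could be sketched per particle like this (a toy version with made-up parameters, not Alex and Ray’s code):

```python
# Illustrative two-mode particle update: swarm with random jitter while the
# user moves; steer toward a point sampled from the body's depth field when
# the user is still. The steering and damping constants are placeholders.

import random

def step_particle(pos, vel, target, still, steer=0.05, damping=0.9, jitter=0.5):
    """Advance one particle by one frame; returns (new_pos, new_vel)."""
    if still:
        # Flock mode: spring-like pull toward the sampled depth-field point,
        # damped so particles settle onto the silhouette instead of orbiting.
        vel = (damping * (vel[0] + steer * (target[0] - pos[0])),
               damping * (vel[1] + steer * (target[1] - pos[1])))
    else:
        # Swarm mode: the user is moving, so particles drift abstractly.
        vel = (vel[0] + random.uniform(-jitter, jitter),
               vel[1] + random.uniform(-jitter, jitter))
    return (pos[0] + vel[0], pos[1] + vel[1]), vel
```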
“roboScan is a 3D modeler + scanner that uses a Kinect mounted on an ABB 4400 robot arm. Motions planned in Robot Studio and Robot Master control the robot as well as the 3D position of the camera. The Kinect depth data is then used to produce an accurate model of the environment.”
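The core math here is back-projecting each depth pixel into 3D and then moving it into the world frame using the camera pose the robot reports. Here’s a hedged sketch of that step; the intrinsics and pose values are placeholders, not the project’s calibration:

```python
# Back-project a Kinect depth pixel to camera-space XYZ with the pinhole
# model, then transform into the world frame using the robot arm's known
# camera pose. fx/fy/cx/cy and the identity pose are placeholder values.

def depth_pixel_to_world(u, v, depth_m, fx=580.0, fy=580.0, cx=320.0, cy=240.0,
                         rotation=((1, 0, 0), (0, 1, 0), (0, 0, 1)),
                         translation=(0.0, 0.0, 0.0)):
    # Pinhole back-projection: pixel coordinates + depth -> camera-space point.
    xc = (u - cx) * depth_m / fx
    yc = (v - cy) * depth_m / fy
    cam = (xc, yc, depth_m)
    # Apply the camera pose (rotation then translation) to reach world space.
    return tuple(sum(rotation[i][j] * cam[j] for j in range(3)) + translation[i]
                 for i in range(3))
```

Scanning is then just accumulating these world-space points from many arm positions into one point cloud.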
“A Kinect-controlled DMX spotlight automatically tracks a person within a space. The spotlight follows any people it sees, and jumps back and forth between them.”
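At its simplest, tracking someone with a DMX fixture is just mapping their position in the Kinect frame to the light’s 8-bit pan/tilt channels. A minimal sketch, with assumed frame dimensions and a direct linear mapping rather than whatever calibration the students actually used:

```python
# Map a tracked position in the Kinect image (default 640x480 frame) to
# 8-bit DMX pan/tilt channel values for a moving-head spotlight.

def to_dmx(x, y, width=640, height=480):
    """Convert pixel coordinates to DMX pan/tilt values in 0-255."""
    pan = round(255 * x / (width - 1))
    tilt = round(255 * y / (height - 1))
    # DMX channel values must fit in one unsigned byte.
    return max(0, min(255, pan)), max(0, min(255, tilt))
```

Jumping “back and forth” between people would then just mean periodically swapping which tracked body feeds this mapping.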
Check out all the Kinect-hacking projects here.