Monday, February 17, 2014

Post 25: Hope for the Kinect at Last!

   BIG NEWS: Corey loaned Mike a Kinect for the weekend, and he got it to work and do pretty much what I was asking for in my emails to them last week:

Hi Guys,
   I am playing with the idea of a set that could be projected onto at roughly a 30-degree angle WITH A SINGLE PROJECTOR. I made a cardboard model to test it:


    And it works. My 5K projector could cover a 20-ft-wide stage in the dark, floor and walls.
    It can be mapped with an image like this, which requires precise projector positioning and accurate alignment:


     Or I can use MadMapper with an image like this, and map each panel separately, which makes it a lot easier:
     

 Either way, this is the way it ends up looking on the model:


   As you can see, the image projected on the "dancers" is mostly of the floor (which could be interesting, as long as they stay at the front of the stage).
   Now, WOULD IT NOT BE NICE if I could use the Kinect's infrared camera to map BOTH the shape and the movement of the dancers, and replace their image within the projected image, on the fly, with either a solid color or an entirely different image? That would be like having several precisely fitted, infinitely adjustable projector/spotlights following them around. I expect some kind of delay due to processing, but the movements could be kept slow.
   I can dream it up, but I honestly don't know how to do it (yet), and I won't have time to learn the Processing language and the Kinect Hacks in the book I bought until after Light Dreams.
   So, do you see any way you could figure it out?
   Keeping you on your toes… Let me know what you think.
   Thanks.
JJ

  This is a neat idea.  I think it can be done without a lot of work, but there will definitely be spillover from the dancers to the ground/walls (both from latency and registration errors).
   I'll run a couple of tests to see what the issues look like; maybe we can come up with a way that they either aren't very apparent, or otherwise work into the overall projection.
Corey Shum
Technical Director, Enabling Technology Laboratory
School of Engineering, Dept of Mechanical Engineering
University of Alabama at Birmingham — Knowledge that will change your world

Corey,
  I am more concerned about the latency than the registration errors, but that in itself could create interesting effects, with the dancer sometimes getting ahead of her projected image.
  Because of the way the IR sensor in the Kinect works (a projected dot pattern reflected back toward the IR camera), the limited density of the dots, and the fact that surfaces at almost 90 degrees to the camera (like the edges of a body) don't reflect the dots back, there is some raggedness around the edges. That can be minimized with a flat, distant backdrop, and it could be smoothed and feathered to some extent.
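  From my reading of the Kinect book, the feathering could be prototyped in Processing along these lines. This is only a sketch of the idea, untested; it assumes the SimpleOpenNI library, the names are mine, and the blur amount would need tuning by eye:

    import SimpleOpenNI.*;

    SimpleOpenNI kinect;
    PImage silhouette;  // white where the Kinect sees a person

    void setup() {
      size(640, 480);
      kinect = new SimpleOpenNI(this);
      kinect.enableDepth();
      kinect.enableRGB();
      kinect.enableUser();                       // person tracking
      kinect.alternativeViewPointDepthToImage(); // align depth pixels with RGB pixels
      silhouette = createImage(640, 480, RGB);
    }

    void draw() {
      kinect.update();
      int[] userMap = kinect.userMap();  // 0 = background, >0 = a person

      // Hard-edged black/white mask of the person
      silhouette.loadPixels();
      for (int i = 0; i < userMap.length; i++) {
        silhouette.pixels[i] = (userMap[i] > 0) ? color(255) : color(0);
      }
      silhouette.updatePixels();

      // The feathering itself: blurring the mask turns the
      // ragged edge into a soft alpha ramp
      silhouette.filter(BLUR, 3);

      PImage person = kinect.rgbImage().get();  // copy, so we can mask it
      person.mask(silhouette);                  // mask brightness becomes alpha

      background(0);
      image(person, 0, 0);  // feathered cutout over black
    }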
  Microsoft is also going to come out with a new, much improved, higher-definition Kinect with skeletal tracking, which is also supposed to act like a green screen and key out the background:
  That would be fabulous, and I hope it comes out soon and can be hacked for the Mac…
  Thanks for your help. Keep me posted.
JJ
PS: When we get this one figured out, I would like to find a way to harness the Kinect's amazing rotating 3D point-cloud imaging capability:


      The best software I have found on the Mac to visualize the images is cocoaKinect:
                      http://fernlightning.com/doku.php?id=randd:kinect
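
       And to play with the cloud in code rather than just a viewer, a minimal rotating point-cloud sketch in Processing could look like this (again my own untested sketch, assuming the SimpleOpenNI library):

    import SimpleOpenNI.*;

    SimpleOpenNI kinect;

    void setup() {
      size(800, 600, P3D);
      kinect = new SimpleOpenNI(this);
      kinect.enableDepth();
    }

    void draw() {
      background(0);
      kinect.update();

      // Park the cloud in front of the camera and spin it slowly
      translate(width/2, height/2, -1000);
      rotateY(radians(frameCount / 2.0));

      stroke(255);
      PVector[] cloud = kinect.depthMapRealWorld();  // one 3D point (in mm) per depth pixel
      for (int i = 0; i < cloud.length; i += 10) {   // skip points to keep the frame rate up
        PVector p = cloud[i];
        // Flip y and z from Kinect coordinates (y up, z away)
        // to Processing screen coordinates (y down, z toward viewer)
        point(p.x, -p.y, -p.z);
      }
    }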


       Synapse does a pretty good job of recognizing the skeleton:
                               http://synapsekinect.tumblr.com/post/6610177302/synapse
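
       Synapse talks to other programs over OSC, so a Processing sketch can read the joints with the oscP5 library. From my reading of the Synapse page, you have to keep re-sending a "poll" message every couple of seconds or it stops streaming; the ports and address patterns below come from their docs as I understand them, so treat them as things to double-check (untested sketch):

    import oscP5.*;
    import netP5.*;

    OscP5 osc;
    NetAddress synapse;

    void setup() {
      osc = new OscP5(this, 12345);                  // Synapse sends joint data here
      synapse = new NetAddress("127.0.0.1", 12346);  // and listens for poll messages here
    }

    void draw() {
      // Re-request right-hand tracking about once a second,
      // otherwise Synapse stops sending after a few seconds
      if (frameCount == 1 || frameCount % 60 == 0) {
        OscMessage poll = new OscMessage("/righthand_trackjointpos");
        poll.add(1);  // 1 = body-relative coordinates, per the Synapse docs
        osc.send(poll, synapse);
      }
    }

    void oscEvent(OscMessage msg) {
      if (msg.checkAddrPattern("/righthand_pos_body")) {
        float x = msg.get(0).floatValue();
        float y = msg.get(1).floatValue();
        float z = msg.get(2).floatValue();
        println("right hand: " + x + ", " + y + ", " + z);
      }
    }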


JJ,
   Corey was kind enough to loan me a Kinect over the weekend. I played around with it and was able to create a proof-of-concept application. I like your idea. It lets you do a green-screen effect without the green screen. I wrote some code that scans through the depth data from the device. For each pixel that belongs to the player, I replaced it with the color information from the RGB camera on the device. All other pixels were replaced with a background image. I also went ahead and added the particle effect we were talking about last time. If you look at the screenshot, you can see that the particles are being emitted from my hand:


    The latency wasn't nearly as big of an issue as I thought it would be. But I wasn't moving that fast; a dancer may see a little more. The only major issue I ran into happened when the Kinect lost me and had to reacquire. Sometimes when it reacquires a body, it stops sending depth and color information from the device. The tracking still worked, though, so in the application you could still see the particles, but the image of me disappeared.
    BTW, we can add any 3D objects to the scene, not just particles. Here is a scene with me holding the ghost and PacMan models from my game, one in each hand.
   Thanks,

 Mike
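
   To make sure I understood Mike's trick, I wrote out the gist of it in Processing, the language I will eventually have to learn. This is only my sketch of the idea, assuming the SimpleOpenNI library; Mike's actual code (and his particle system) is certainly more involved, and the backdrop file name is made up:

    import SimpleOpenNI.*;

    SimpleOpenNI kinect;
    PImage backdrop;  // the replacement background, any 640x480 image

    void setup() {
      size(640, 480);
      kinect = new SimpleOpenNI(this);
      kinect.enableDepth();
      kinect.enableRGB();
      kinect.enableUser();                       // person + skeleton tracking
      kinect.alternativeViewPointDepthToImage(); // line depth pixels up with RGB pixels
      backdrop = loadImage("stage.jpg");         // hypothetical file name
      backdrop.resize(640, 480);
    }

    // SimpleOpenNI calls this when someone steps into view
    void onNewUser(SimpleOpenNI curContext, int userId) {
      curContext.startTrackingSkeleton(userId);
    }

    void draw() {
      kinect.update();
      int[] userMap = kinect.userMap();  // 0 = background, >0 = a person
      PImage rgb = kinect.rgbImage();

      // The green screen without a green screen: person pixels keep
      // the live camera color, everything else comes from the backdrop
      loadPixels();
      rgb.loadPixels();
      backdrop.loadPixels();
      for (int i = 0; i < userMap.length; i++) {
        pixels[i] = (userMap[i] > 0) ? rgb.pixels[i] : backdrop.pixels[i];
      }
      updatePixels();

      // Stand-in for the particles / 3D models: draw a marker
      // at the tracked right hand of each skeleton
      for (int userId : kinect.getUsers()) {
        if (!kinect.isTrackingSkeleton(userId)) continue;
        PVector hand = new PVector();
        kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_HAND, hand);
        PVector screenPos = new PVector();
        kinect.convertRealWorldToProjective(hand, screenPos);
        fill(255, 200, 0);
        noStroke();
        ellipse(screenPos.x, screenPos.y, 40, 40);
      }
    }

   If the depth-to-RGB alignment is off, the registration errors Corey mentioned would show up as a colored fringe around the dancer.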




Good Morning my Friends,
  Whaooooo!… Look at that! Super cool! That is some great news to wake up to at the start of a new week.
  Mike, you are a Godsend. You are the quiet man who delivers… And quickly!
  The images look great by the way, and I think you really need to do something with the Kinect either for PacMan, or another interactive game.
  My big problem is that I use a Mac, so can it work for me? I actually did manage to move an object one time with Synapse and Quartz Composer, but now I keep getting an error message.
  In any case, great job Mike, as always.
  Thanks a bunch to both of you.
JJ
