12 April 2008
I recently spent a few hours in Second Life attending Caroline’s grand re-opening party. This was the longest stretch of time (about three hours) I had been in-world in quite a while. Apart from the occasional customer service issue that requires me to jump in-world, I don’t visit SL much anymore. I’ll have more thoughts in a post marking my five-year rez-day in a couple of weeks.
At the party everyone was dancing, and there was a central “dance ball” that anyone could touch to animate his or her avatar. The dances were nothing to write home about—I recognized many bits and pieces ripped straight from old Poser 4 (I think it was 4) stock animation and combined with other “found” animation. I had uploaded test animations during the 1.4 Preview back in June 2004, if that tells you anything. Other individuals had a variety of different dance animations, some of which I really liked.
While dancing together and listening to the same music and DJ is great fun, the only difference between last week’s experience and the parties we had in 2003 was the scripted dance animation. Don’t get me wrong: I think animation is great, and SL 1.4 was probably one of the most exciting releases to date. And while pre-scripted animation has the advantage of leaving everyone free to type and chat, I really longed for more direct interaction.
As chance would have it, I was alerted to two bits of info last week that seemed to offer a possible solution to my desire for more direct human-avatar interaction. The first was a tweet from Lordfly about a project started back in 2006 by a dev team at LL (Cube Linden, Aura Linden, and Ventrella Linden) called Avatar Puppeteering. Please do check out some of the videos on the site for a working example of puppeteering in action. The project certainly showed a lot of promise—that is, before it was put on indefinite hold so the team members could work on “viewer stability, bug fixing, and performance” issues. Tateru over at Massively did a little digging and found that Ventrella (and, yes, he was responsible for flexi-prims) left LL last year. Her conclusion is that the project has suffered perma-death.
Which is unfortunate, because Mitch Kapor, LL’s Chairman, seems to have taken an interest in human-avatar interaction himself. According to this article, Kapor and developer Philippe Bossut have been developing a hands-free, camera-based interface for Second Life. You can visit Kapor’s site to view a demonstration.
Given these projects—and the success of accelerometer-based interaction on the Nintendo Wii and Apple iPhone and camera-based interaction like Sony’s EyeToy—some form of more advanced human-avatar interaction is coming. Will it come from Linden Lab? I wouldn’t get my hopes up.