For anyone who recalls my earlier post regarding the very cool Reactable interface (reLink), this will be immediately appreciated.
The above video shows a real game being played on a real table and recreated, in real time, inside a virtual world (though I suspect this is turn-based play). As the real hands move the paper cutouts with their identifying marks, a webcam picks up the image, which is then processed and sent into Second Life. You can see the video feed of the hands streamed inside SL and, of course, the virtual submarines shooting torpedoes at each other.
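I don't know exactly what their capture pipeline is, but if it's reacTIVision-style fiducial tracking (the same family of tech as the Reactable), the camera side publishes each marker's position and rotation as TUIO messages over OSC, and the listening end might look roughly like this in Python (a minimal sketch using the python-osc package; the handler is just my guess at where you'd hook in whatever talks to SL):

```python
# pip install python-osc
# reacTIVision-style trackers publish fiducials as TUIO messages over OSC/UDP,
# by default on port 3333.
from pythonosc import dispatcher, osc_server

def on_object(address, *args):
    # TUIO "2Dobj" profile: "set" messages carry session id, fiducial id,
    # normalised x, y and rotation angle (plus velocities we ignore here).
    if args and args[0] == "set":
        session_id, fiducial_id, x, y, angle = args[1:6]
        print(f"marker {fiducial_id}: x={x:.3f} y={y:.3f} angle={angle:.2f}")
        # ...this is where you'd hand the pose off to whatever feeds SL.

disp = dispatcher.Dispatcher()
disp.map("/tuio/2Dobj", on_object)

osc_server.BlockingOSCUDPServer(("0.0.0.0", 3333), disp).serve_forever()
```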
Nice little mixed-reality effort.
Wonder what we’ll see next. Maybe someone will put those markers on snails and people can organize virtual races. Gamblers will love that.
Dibs on “Escargot Downs”.
Hell yes. That’s a new way to play, and I think I really like it. I’m glad to see so many of your posts (esp. lately) are pointing out how arbitrary the distinction is between virtual and actual. Gimme “augmented”. Gimme reactability.
Regarding “distinctions”, what I’ve not posted about is something going on in the background, which fell out of just deciding to “play” along. I’m thinking of it in terms of Hirshberg’s “creative abrasion” (reference – Link), only the other person has been unaware.
It’s something about which I’ll probably write at some point in the future.
I was actually there when they were making the video at SLUK07 last Saturday. The system has a big TV camera set up to point downwards at the “board”; it isn’t exactly turn-based in that each player has six “subs” (triangles) to place, and can do so at any time, as long as they only use one hand.
Once all of the triangles have been placed – they are just bits of paper with a design on the back, which is presumably what the image capture uses to work out what and where they are – the data was uploaded to SL and the subs began to appear. I’m not quite sure what the relationship was between when a triangle was placed and when a torpedo was fired – I’m not actually sure that the placement time had any effect. The hits and misses were calculated purely in SL, it seems, and the winner declared.
There were people playing it for hours and hours… I lost my only game, mind you.
That’s what I was wondering, Ordinal. Watching the vid it’s hard to know just how real-time it is (which is why I chose snails) and what’s happening when. I suppose more info will fall out soon enough, but either way it’s a pretty neat mashup.
The bit where they place the triangles is real-time when you’re watching it in-world (as there’s a direct feed) but, in the video, they have cut some of it, as well as the delay between the end of the game and the subs appearing.
It isn’t _much_ slower than that though, to be fair. Each game only takes a few seconds, and the time between the end and the subs is only seconds. Whatever system it uses is certainly quick. The biggest wait is actually for the torpedoes to move and hit their targets, or not.
Judging from the t-shirts the collaborators apart from Babbage were wearing, I think they are BBC types – there are a lot of smart, experimental people in some of the BBC units these days.
Thanks for the additional info.
Now the real question: Do you think we can have multi-reality snail races in (near) real time?
You bet your slimy rear-end we could.
Excellent. Shells will be fun to model. I do, however, wish SL had shader effects with vertex deformation. It might make the undulating body easier to model.
I should ask Qarl about that sometime.
SLorpedo uses a single snapshot of reactivision data at the end of the game to determine the positions and orientations of the subs. The results are then determined according to the rules of the IceHouse game Torpedo: first all the small subs fire and sink large subs, then the large subs fire and sink small subs. Players score 2 points for each large sub afloat at the end of the game and 1 point for each small sub.
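Roughly, the end-of-game resolution boils down to something like this (a Python sketch; the straight-line hit test and HIT_RADIUS are only illustrative assumptions – it's the firing order and scoring that follow the Torpedo rules above):

```python
import math
from dataclasses import dataclass

@dataclass
class Sub:
    player: str
    x: float        # normalised board position from the fiducial tracker
    y: float
    angle: float    # heading in radians
    large: bool     # True = large sub (worth 2 points), False = small (1 point)
    afloat: bool = True

HIT_RADIUS = 0.05   # assumed: how close a torpedo path must pass to count as a hit

def hits(shooter: Sub, target: Sub) -> bool:
    # Assumed geometry: a torpedo runs in a straight line from the shooter
    # along its heading and hits anything within HIT_RADIUS of that ray.
    dx, dy = target.x - shooter.x, target.y - shooter.y
    hx, hy = math.cos(shooter.angle), math.sin(shooter.angle)
    if dx * hx + dy * hy <= 0:
        return False                      # target is behind the shooter
    return abs(dx * hy - dy * hx) <= HIT_RADIUS

def resolve(subs: list[Sub]) -> dict[str, int]:
    # Phase 1: all small subs fire; enemy large subs on a torpedo path sink.
    for shooter in [s for s in subs if s.afloat and not s.large]:
        for target in subs:
            if (target.afloat and target.large and
                    target.player != shooter.player and hits(shooter, target)):
                target.afloat = False
    # Phase 2: the surviving large subs fire; enemy small subs sink.
    for shooter in [s for s in subs if s.afloat and s.large]:
        for target in subs:
            if (target.afloat and not target.large and
                    target.player != shooter.player and hits(shooter, target)):
                target.afloat = False
    # Scoring: 2 points per large sub still afloat, 1 per small sub.
    scores: dict[str, int] = {}
    for s in subs:
        if s.afloat:
            scores[s.player] = scores.get(s.player, 0) + (2 if s.large else 1)
    return scores
```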
The RL portion of the game is played in real time and then the SL portion of the game is simulated afterwards. The RL/SL temporal split is by design, as we didn’t know how responsive the path from RL to SL through reactivision would be; however, I think you could get down to sub-1s latencies via HTTP, and latencies on the order of 100s of ms if you modified the SL viewer to read data from reactivision and modify prims in-world.
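For the HTTP route, the simplest shape is probably the tracking side serving the latest marker snapshot and an in-world script polling it with llHTTPRequest. A rough Python sketch of the serving end (the JSON format here is just illustrative, not what SLorpedo actually sends):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical snapshot: whatever is reading reactivision keeps this updated
# with the latest fiducial id -> (x, y, angle).
latest: dict[int, tuple[float, float, float]] = {}

class SnapshotHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the current snapshot as JSON; an in-world LSL script polling
        # this URL with llHTTPRequest can then move or rez the matching prims.
        body = json.dumps({str(k): v for k, v in latest.items()}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), SnapshotHandler).serve_forever()
```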
There are lots of other potential reactivision/SL applications.
Thanks for the follow-up, Jim. I’ll be looking into setting up snail farms and breeding for speed in the near future.
Just to clarify the make-up of the Supernova team – couple of Lindens, couple of independent developers, few IBM folks. I wasn’t at SL:UK, but the BBC t-shirts were probably hangovers from HackDay the previous weekend. I’m wearing my BBC Backstage t-shirt right now :-)
Got a few fuzzy pics of the real world action on my phone:
Link
Hopefully the video Paul shot will be on YouTube as well soon.
As Jim mentions above, there are a few ways we could make SLorpedo a little more real time. It might be entertaining if the model subs appeared before the video feed showed the piece being placed!
(The BBC t-shirt was from Hack London the week before :)
Thanks for the additional information on the team and the link to the pics (I like the projected image one). Will keep an eye out for any additional video.