Monday, October 26, 2009

AR, VR, MAR, MVR, ISMAR and other acronyms


ISMAR 2009 was a mixed bag: some awesome talks and demonstrations, but on the other hand very few vendor kiosks, with disappointing demos that looked more like 2001. Yes, some AR headgear was almost tolerable, but with poor image quality (basically little low-res TVs hung in front of your eyes). Although working prototypes of retinal projection already exist, none was on display, and that is the technology I believe will be both less intrusive (almost ubiquitous) and at the same time totally invasive, with a direct path to your brain.

I think all head-worn displays suffer from an extreme case of risk aversion, incorporating, oh so slowly, sensors and devices that are already mainstream, like accelerometers and magnetic compasses. They need a serious garage shock! Anybody there?

The conferences, on the other hand, were fabulous. It was hard to choose which one to attend, but we stayed with the Arts & Humanities chapter most of the time, except when some very well-known presenters gave their demos, like Pattie Maes from the MIT Media Lab, who demonstrated "SixthSense," a device developed by Pranav Mistry that we had already seen in a TED video, or the HIT Lab guys from New Zealand, with whom we had very interesting conversations about their various tools and projects. We also found out that they are already working very closely with UF's department of Engineering.

Augmented Reality (AR) Joiner series, Waterfall - Outdoors from Augmented Stories on Vimeo.


And talking about projects, I was really impressed by Helen Papagiannis from York University in Toronto. She presented work utilizing marker tracking with a custom library created at their Augmented Reality Lab in the Department of Film. This was the first time I have seen truly creative and original work that goes beyond the technology and is in the process of becoming a new language, a new form that she might not even be aware of. I am sure we will hear more from her in the future. The video above is only a small taste. In the shorter term, she agreed to give a virtual presentation to our class; we just need to coordinate times! Very exciting.


The other high point for me was the presence of Natasha Tsakos, whose work we have also seen in class. She too agreed to visit us online. It was nice to see her after almost seven years, grown from a theater student into a full-fledged international performer and media star. We had a chance to talk about the ethics of technology and our responsibility as technoartists in a world and time that will be crucial to our survival as a species, not to mention the other life forms we are decimating as we boldly go where no one has gone before.

Our presentation of the virtual alien, controlled in real time from Digital Worlds, was a smashing success. Eyes and mouths open, plenty of smiles and excitement. To be honest, I myself was surprised by the enthusiastic response from such a techie and sophisticated audience. That made me realize that we are in the right place at the right time...! Using an open-source game engine running code by Anton Yudi, plus my 3D character and environment, we were able to advance a few steps towards physical-virtual avatar control. There is still a long way to go before we are completely untethered.

Jarrel Pair, the organizer who introduced us, said we had taken a great risk by presenting something live, in real time and involving so many variables. In reality it was nerve-wracking! Internet connectivity via Ethernet, which we needed because the wireless was so spotty, was nonexistent until about four minutes before our group began (we were three presenters on a panel). But once we started, everything went absolutely smoothly, even beyond our expectations.

Pictures coming soon...

1 comment:

  1. Sounds like great fun, I wish I had been able to see it (not the stressful part about the ethernet, but the demo).
