Augmentation

Through the looking glass, one subsystem at a time.

In case you hadn't noticed, there has been a lot of press in the past week about the rumor that Google is working on a pair of HUD glasses. I don't doubt it, but when I asked several Googlers about this at CES, I invariably got a sarcastic reply along the lines of "yup, and we've also got a space elevator coming out later this year." Then again, I've never spoken to anybody from Google X. I did hear an account of Sergey Brin spending a nice chunk of time at the Vuzix booth at the show, so HUD glasses are clearly on their radar, if nothing else.

It's worth noting that Google's scenario reportedly involves image analysis performed using cloud resources. This is part of what I envisioned after hearing Microsoft's Blaise Aguera y Arcas introduce Read/Write World at ARE. While I haven't heard anything about it since, I wouldn't be surprised if it pops back up this year. What I think will happen is that a wearable system will periodically upload an image to a server, which will use existing photographic resources to generate a precise homography pinning down the location of the camera at the time the image was taken. The GPS metadata attached to the image will provide the coarse location fix needed to select a relatively small dataset against which to compare the image. Moment-to-moment tracking will be done with a hybrid vision- and sensor-based solution, but at least in the first generation of such systems, and in environments that don't provide a reference marker, I expect cloud-based analysis to be part of generating the ground truth against which they track.
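To make that concrete, here's a rough sketch of how that server-side step might look. This is purely my own illustration in Python with OpenCV; none of the function names or the reference-image database come from Google or Microsoft, and a production system would almost certainly refine this into a full camera pose rather than stopping at a homography.

```python
# Hypothetical sketch of the cloud-side localization step described above.
# Assumes a database of georeferenced reference images; all names are made up.
import cv2
import numpy as np

def nearby_references(db, lat, lon, radius_m=100):
    """Coarse filter: keep only reference images whose GPS tag is within radius_m."""
    return [ref for ref in db if ref.distance_to(lat, lon) < radius_m]

def localize(frame_gray, references):
    """Estimate a homography between the uploaded frame and the best-matching reference."""
    orb = cv2.ORB_create(2000)
    kp_f, des_f = orb.detectAndCompute(frame_gray, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    best = None
    for ref in references:
        kp_r, des_r = orb.detectAndCompute(ref.image, None)
        if des_r is None or des_f is None:
            continue
        matches = sorted(matcher.match(des_f, des_r), key=lambda m: m.distance)[:200]
        if len(matches) < 10:
            continue
        src = np.float32([kp_f[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([kp_r[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        score = int(inliers.sum()) if inliers is not None else 0
        if H is not None and (best is None or score > best[1]):
            best = (H, score, ref)
    return best  # homography, inlier count, and the reference it maps into
```

Whatever this step returns would serve as the ground truth I mentioned: the on-device vision/IMU tracker would then track relative to it between uploads.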

Let’s do a little recap of some of the most notable HUD glasses options these days:

Vuzix STAR1200 – I got to try these out at ARE back in June of last year and was quite impressed. I've since picked up a pair and love them, with some caveats. Because they use backlit LCD micro-displays rather than an emissive technology like OLED, you don't get perfect transparency in areas where the signal source is sending black: if the glasses are on and you're feeding them a blank black screen, you still see a slight difference between the display area and your peripheral vision. Also, the field of view (FOV) of the display area could definitely stand to be a little larger. The STAR1200 is intended primarily as a research and development device, and is priced accordingly at $5,000. It comes with a plethora of connectors for different types of video sources, including mobile devices such as the iPhone. The STAR1200 is also the only pair of HUD glasses I know of that comes with a video camera. The HD camera it originally shipped with was a bit bulky, but Vuzix just started shipping units with a second, much smaller camera that can be swapped in. The glasses also ship with an inertial orientation tracking module. Vuzix recently licensed Nokia's near-eye optics portfolio and will be using its holographic waveguide technology in upcoming products priced for the consumer market.

Lumus Optical DK-32 – I finally got to try out a Lumus product at CES, and was quite impressed. I've spoken with people who have tried them in the past, and based on my experience it looks like they've made some advances. The FOV was considerably wider than that of the Vuzix glasses, and both contrast and brightness seemed marginally superior. That said, you can't buy a display glasses product from Lumus as an individual today, and they are very selective about whom they'll sell R&D models to: unless you're an established consumer electronics OEM you can't buy them at all, and even if you could get Lumus to agree to sell you a pair, it would set you back $15k. I've heard that part of the issue is the complexity of their optics manufacturing process. As I was several years ago, I'm still looking forward to seeing a manufacturer turn the Lumus tech into a consumer product.

Seiko Epson Moverio BT-100 – I'm rather ashamed that I didn't know about these before heading to CES, and so didn't get to hunt them down and try them. I love that they come with a host device running Android. I can't, however, find mention of any sort of video input jack, and it's a shame if Epson has artificially limited the potential of these ¥59,980 ($772) display glasses. Also, with a frame that size, I'm genuinely surprised they didn't pack a camera in there. I'm looking forward to getting a chance to try these out.

Brother Airscouter – Announced back in 2008, Brother’s Airscouter device has found its way into an NEC wearable computer package intended for industrial applications.

I don't mean to come off as a fanboy, but I like Vuzix a lot. This is primarily because they manage to get head-mounted displays and heads-up displays into the hands of customers despite the fact that this has consistently been a niche market. I have to admire that kind of dedication to pushing for the future that we were promised. I also love that they are specifically addressing the needs of augmented reality researchers. It will be interesting to see how these rumors about Google affect the companies that have been pushing this technology forward for such a long time. I'm hoping that it will help broaden and legitimize the entire market for display glasses, which have long been on the receiving end of trivializing jokes on the tech blogs and in their comment threads.

Written by bzerk

February 13, 2012 at 8:42 am

ISMAR

After having my cab get rear-ended on the way to JFK, and then sitting on the runway for half an hour in a plane full of crying, whining kids, I'm finally in the air on the way to Orlando for ISMAR. Unfortunately, I'm traveling sans a mature demo: I wasn't able to get a built set of my hardware sent to Seac02 in time for them to integrate it. Actually, that's because I got a bit distracted by my new job, for which I was out at Ames the week before last, assisting with a set of tests in the Vertical Motion Simulator. I know ISMAR is a big deal, but one doesn't get many chances to play with that kind of hardware.

[Photo: the Vertical Motion Simulator test team]

Anyhow, I tried to use my free time to work on the project, but things just didn’t really come together without being in my lab at home.

So I’m off to ISMAR without my project in the shape that I’d intended, but I’m actually thinking that that’s just as well. What I do have is a press badge, and rather than trying to impress the guys with the big brains with my little DIY VR project, I’m going to try to learn and see as much as I can this week, and blog about it every chance I get.

As always, if you want to hear the latest and greatest news in the field, head over to Ori Inbar's blog at www.gamesalfresco.com. The big news right now is that the private API code for accessing the iPhone camera frame buffer is now being freely distributed, and Ori and company are the ones giving it out!

Starting tomorrow, you might want to begin checking back here daily to find out what I’m seeing at ISMAR, and what I make of it all. And if you’re at the conference, drop me an email if you’re interested in meeting up or have something you want me to see.

And if you want to try your code on an ION-powered netbook, I’ll be driving my VR920 and CamAR with my shiny new HP Mini 311 ;-)

Written by bzerk

October 18, 2009 at 9:16 am

Zerkin Glove!

Check out the awesome vid that Ori Inbar shot and edited for me!

Written by bzerk

September 22, 2009 at 10:40 am

Augmented Reality Roundup (some of the exciting stuff from the last few months)

Okay. Time for a long overdue update.

First off: a teensy bit of self-promotion. I know the demo is still VR, not AR, but give me a little more time on that. In the meantime, here's another vid of me and my glove, this time at Notacon (to which the always awesome Matt Joyce dragged me as an auxiliary driver of Nick Farr's car), where Jeri Ellsworth and George Sanger of the Fat Man and Circuit Girl Show (some of the sweetest, nicest people ever!) awesomely invited me up on stage to show it off, which I then bumblingly did :-D This is from a couple of months ago, so the software is as it was in the first video. It's a little long (I told you it was a bumbling demo! cut me some slack!), so you may want to read the rest of the post first, before the overwhelming lethargy sets in ;-)

The software is actually coming along, though it’s still self-contained. There are a few cool new features that’ll be ready to show soon.

So, now I’d like to do a little recap of some of the many interesting developments on the AR scene since my last post. So that’s what I’ll do ;-)

I’ll follow this post later (when? I have no idea. don’t hold your breath or plan your week around it. =P) with one exploring the implications of the makers of Wikitude and Layar opening up their APIs and looking for user-generated content. It’s a big deal. In the meantime, read this article in the NY Times… but first read the rest of my post. :-D

Also, I haven't posted since the introduction of the iPhone 3GS, which contains a compass that, as I think everyone already knows, will enable optically referenceless AR apps on the iPhone (with the registration accuracy issues that that entails). Here's something you may not have seen yet: Acrossair has used the engine from their Nearest Tube app to write a Nearest Subway app for NYC.
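For anyone wondering what "optically referenceless" means in practice, here's a toy sketch of the GPS + compass math these apps boil down to. It's my own illustration, not anybody's actual engine, and real apps also fold in pitch and roll from the accelerometer; the point is just how directly heading error becomes on-screen registration error.

```python
# Toy illustration of sensor-only ("optically referenceless") AR registration.
# Not taken from any real app; just the basic GPS + compass math.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def screen_x(poi_bearing, device_heading, fov_deg=55.0, screen_w=320):
    """Horizontal pixel position of a POI label, given the camera's compass heading."""
    offset = (poi_bearing - device_heading + 180.0) % 360.0 - 180.0  # wrap to -180..180
    if abs(offset) > fov_deg / 2:
        return None  # outside the camera's field of view
    return int(screen_w / 2 + (offset / (fov_deg / 2)) * (screen_w / 2))

# A 5-degree compass error at a 55-degree FOV shifts a label by roughly 29 pixels
# on a 320-pixel-wide screen -- hence the registration issues mentioned above.
```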

Wikitude, Layar, and Zagat's NRu apps are all coming to the iPhone 3GS. What can't yet come to the iPhone, however, is full-speed optical AR, because Apple still hasn't released a public API for direct access to the video feed from the iPhone's camera. Ori Inbar, creator of the Games Alfresco blog (the definitive AR blog, in my opinion and in those of many others), has written a beseeching Open Letter to Apple, gathering the signatures of almost all of the major players in the field.

So… some of the most exciting demos I've seen in the past few months, from a technical perspective, are…

Georg Klein's iPhone 3G port of his PTAM algorithm:

My understanding of his previous demos was that they utilized a stereo pair. Seeing this kind of markerless environment-tracking on a single-camera device, and a mobile one at that, is extremely exciting.

The second one that REALLY got me psyched was Haque Design+Research's Pachube + Arduino + AR demo (they are the initiators and principal developers of Pachube).

I've seen a number of incredibly cool things done with Pachube, starting with Bill Ward's Real-world control panel for Second Life. Anyhow, my mind is pretty well blown by the fusion with AR. It's also worth checking out the Sketchup/Pachube integration video on Haque's YouTube channel.

And the third would have to be (drumroll, please)…
Eminem?!?!


Yup. This neat bit of AR marketing for Eminem’s new album, Relapse, sets itself apart from other, more “run-of-the-mill” FLARToolkit marketing (run-of-the-mill AR? in what reality am I living?! uhm… yeah… I know… an increasingly augmented one =P), by featuring creative user interaction. This is, so far as I know, the first AR app which allows one to paint a texture onto a marker-placed model.

The fourth is equally surprising, is one of the coolest things I’ve ever seen (specifically because it is useful, and not particularly trying to be cool), and is from the United States Postal Service.

The USPS Priority Mail Virtual Box Simulator (Would one simulate a virtual box? I think one simulates a box, or generates a virtual box… why would you simulate something that’s already virtual? =P) is by far the most practical FLARToolkit consumer-facing AR app I’ve seen to date. Kudos to those involved.

Next up: Aaron Meyers and Jeff Crouse, both heavily involved in the OpenFrameworks community and the interactive art world in general, created the very cool game they call The World Series of 'Tubing. (Presumably a nod to the World Series of Poker; Aaron and company were wearing card dealers' visors when I saw the project demoed at Eyebeam's MIXER party last month.) Here's a RocketBoom interview with them.

Marco Tempest‘s AR card trick is superb, and just plain awesome. He’s also just a really nice guy.


Marco’s project was worked on by Zach Lieberman and Theo Watson, the creators and curators of the OpenFrameworks project.

I was lucky enough to see Marco perform his trick live at the OpenFrameworks Knitting Circle, held at Parsons. Aaron Meyers also demoed his kickass work on the World Series of 'Tubing engine. I haven't watched the Rocketboom interview yet, but the coolest feature of the 'Tubing game, the ability to shuttle back and forth through the video clips by rotating the markers, wasn't actually shared with the participants in the incarnation presented that night. I tried to use it to get an edge when I played, but ended up botching it. It turned out that each player had to tilt their cards towards the other player to accelerate playback, so the directions are reversed: since I was on the right side of the stage, I would have had to, unintuitively, tip my card to the left to accelerate my clips.
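Just to make that control scheme concrete, here's my own toy guess at what a tilt-to-shuttle mapping like that might look like. I haven't seen their code; the angles, dead zone, and rates below are all made up.

```python
# My own toy guess at a tilt-to-shuttle mapping like the one described above;
# not the actual World Series of 'Tubing code.
def playback_rate(marker_roll_deg, player_side, dead_zone_deg=10.0, max_rate=4.0):
    """Map a marker's roll angle to a video playback rate.

    player_side: 'left' or 'right' side of the stage. Tilting *toward the other
    player* speeds playback up, which is why the sign flips per side.
    Negative roll is assumed to mean a tilt to the left.
    """
    sign = 1.0 if player_side == 'left' else -1.0
    tilt = sign * marker_roll_deg
    if abs(tilt) < dead_zone_deg:
        return 1.0                                 # near-flat card: normal speed
    excess = min(abs(tilt) - dead_zone_deg, 45.0) / 45.0
    rate = 1.0 + excess * (max_rate - 1.0)
    return rate if tilt > 0 else 1.0 / rate        # tilt the other way to slow down

# e.g. playback_rate(-30.0, 'right') -> roughly 2.3x speed for the right-hand player
```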

Also awesome and of note is the Georgia Tech/SCAD collaboration, ARhrrrr.

I can't really argue with blowing up zombies with Skittles. I'd love to see something like this implemented using PTAM, so that you could play on any surface with trackable features. I also have some gameplay and conceptual quibbles with ARhrrrr, but it is a technically very impressive piece of work.

Those are the ones I checked out in which I saw something fundamentally new, or something I thought constituted a breakthrough in technology or implementation. Here are some of the other projects that have appeared since I last wrote:

There have been animated AR posters for the new Star Trek movie and the upcoming movie Gamer, Blink-182 videos stuck on Doritos bags, etc. Meh. Passive playback crap. I guess it brings something new to the table for somebody who doesn't know how to rotate a 3D model with a mouse, but I'd rather watch a music video on an iPod than on a simulated screen on a bag on a screen in front of which I'm holding the bag. Dumb. It kinda reminds me of the Aqua Teen Hunger Force episode where Meatwad wins tickets to the Super Bowl and a holographic anthropomorphic corn chip in a sombrero serenades him with the news… except here you need to hold it up to your webcam. Whatevs.

Anyhow, hop over to Games Alfresco for comprehensive coverage of the AR field.

Finally, I’ll leave you with TAT. Don’t watch this video if you don’t want to have to pick your jaw up off of the floor. Enjoy :-)

Written by bzerk

July 14, 2009 at 2:04 pm

First public vid of one of my gloves

There are lots of exciting things going on with marker-based AR right now. I’ll get back to covering them soon, after I’ve worked out a few kinks in my own development plan :-)

In the meantime, here’s a little look at part of what I’m working on.

Written by bzerk

April 10, 2009 at 6:48 pm

Joke No. 2

Written by bzerk

April 1, 2009 at 6:23 pm

Oh Frabjous Day!

On few occasions have I been so happy to be wrong. I’m usually the first to admit it when I am, but I can’t usually claim honestly that it makes me happy to have made a mistake.

Over the weekend, Robert A. Rice, Jr., one of whose blog posts actually prompted me to start this blog (see my first post), stated in a comment thread on one of my previous posts that he had firsthand experience with the AV920 Wrap prototype, and that it is, in fact, a true optical see-through HMD. Well, always the skeptic, and generally not one to take another's word for it without being able to see and read their face, I called up Vuzix. They didn't have Press Relations or General Inquiry options on their PBX, so I wrote them an email instead. Email being the easily put-off medium that it is, I received no response to my inquiry regarding the true nature of the AV920 Wrap's display technology.

So… I finally called back yesterday and figured that Vuzix is a small enough company that I could probably speak to pretty much anybody there. To cut myself off before this post gets any MORE long-winded: The AV920 Wrap is, according to its manufacturer, a true optical see-through HMD. That is very exciting news. Now, I don’t think that they’d tell me this if it weren’t true, but I’ll truly believe it when I’ve had the chance to look through a pair for myself. I just don’t understand why they wouldn’t state this explicitly on their web site or in their literature, unless maybe it was a strategic decision to lure their potential competitors into a false sense of security.

People (myself among them) have been bitching about the lack of an optical see-through HMD, or really a HUD, on the consumer market. If we’re actually going to see one this Fall, I can wait (somewhat) patiently. We’ve waited this long, and the AV920 Wrap was only just announced this past CES. If another CES rolls around and it isn’t on the market, then we have a problem. But Vuzix has delivered everything they’ve promised up until this point, and I adore my VR920 HMD, so I’m more than happy to give them the benefit of the doubt.

Written by bzerk

March 25, 2009 at 1:19 pm

LOL

I just rewatched World Builder on a whim and noticed that she’s in room 2048. Nice power of two, and the year tagged by Intel CTO Justin Rattner as a possible estimate for Kurzweil’s Singularity.

Written by bzerk

March 10, 2009 at 5:17 pm

Good reading.

Yup, that’s pretty much what I was gettin’ at. Damn the questions, let’s build it already.

http://blogs.discovery.com/good_idea/2009/02/augmented-reality.html

Still writing my next real post. Other stuff to do.

Also, IDG-650 gyros are on sale to the public now. Self-zeroing! I'm conflicted about putting in the effort to design a board around the IC, though, after hand-soldering SMD multiplexers only to have SparkFun put them on a breakout two weeks later. Progress is being made, for those keeping track.
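For anyone following along at home, here's a minimal sketch of the kind of bias-zeroing and integration you end up doing in software with a rate gyro like this. It's my own generic illustration, not the IDG-650's on-chip auto-zero feature and not my actual glove code.

```python
# Minimal, generic rate-gyro handling sketch -- my own illustration. Assumes you
# can read an angular rate in degrees/second from somewhere (ADC, serial, etc.).
class GyroAxis:
    def __init__(self):
        self.bias = 0.0    # deg/s offset measured while the sensor is held still
        self.angle = 0.0   # integrated orientation estimate, in degrees

    def zero(self, rate_samples):
        """Call with a batch of readings taken while the device is at rest."""
        self.bias = sum(rate_samples) / len(rate_samples)

    def update(self, rate_deg_s, dt):
        """Integrate one bias-corrected rate reading over dt seconds."""
        self.angle += (rate_deg_s - self.bias) * dt
        return self.angle

# Even after zeroing, the integrated angle drifts over time, which is why these
# gyros get fused with other sensors (or vision) rather than trusted on their own.
```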

Also… the Lumus site doesn't really show anything to convince you they aren't vaporware, but they look real enough to me in this vid.

Now when can we have one, for crying out loud? Seeing through cameras sucks! Not that I don’t want the NY company to get the market (oh my goodness, I do… and I mean Vuzix), but between Lumus and Microvision…

Written by bzerk

February 28, 2009 at 4:57 am

Y’all grok?

Not to be greedy, but a comment or two today would be nice. It’s boring musing to an empty room. =)

Written by bzerk

February 16, 2009 at 7:44 am
