Full Version: Augmented Reality
Dumpshock Forums > Discussion > Shadowrun
ShadowPavement
Neeto

suoq
If I shrunk those individual photos down to where they would fit on my phone, that would be extremely difficult to read.

And why just two constellations in the sky?

Why does only one of the screenshots have twitter?

I've run a bunch of apps like this. Either he's changing the settings for what information is displayed incredibly fast (based on the dude walking), or it's all just Photoshop. My bet is it's all just Photoshop. Deciding which constellations to show based on a consistent sky color just isn't worth coding, and given that clouds are white, showing constellations in white wouldn't be my personal choice. I'd at least give them a good solid outline or drop shadow.
WiFu
That's pretty much the basics of what I imagined AR to be.
Check this one out (the forum wouldn't allow me to hyperlink, but here is the URL):

http://www.datadrivenconsulting.com/2010/0...d-hyperreality/
Rand
Snug: Augmented Reality isn't seen on your phone; it is seen by your eyes, through whatever device you are using (cybereyes, goggles, glasses, or contacts).

I also think the backgrounds on the icons should be more clear (as in see-through).
suoq
This thread has a RL (Real Life) tag. It's not about 2072. It's about 2010. And the article specifically states "Smart Phone Applications added layers of information".

There are plenty of AR smartphone apps. For example, I use Layar, Google Places, and Google Sky Map; Layar is probably the most "Shadowrun" of the three. I should give Wikitude a test run, but I don't really have a use for it.

Slower forms of AR (i.e. fast but not real-time) that I use are Barcode Scanner and Google Goggles. They take photos from the cell phone camera and look up relevant information; the image processing is currently too slow to be done in real time. The other tools just overlay information on the camera image based on the phone's internal sensors, such as the GPS, accelerometer, and orientation sensors ( http://appinventor.googlelabs.com/learn/re...ts/sensors.html )
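The sensor-driven overlay approach suoq describes can be sketched in a few lines: the app never "sees" the scene at all; it just computes the compass bearing from the phone's GPS position to a point of interest, compares it to the compass heading, and maps the angular offset onto the screen. The function names below are hypothetical, not from Layar or any real app — this is a minimal illustration of the technique, assuming a pinhole-style linear mapping across the camera's horizontal field of view.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def overlay_x(poi_bearing, heading, fov_deg, screen_width):
    """Map a POI's bearing to a horizontal pixel position on the camera view,
    or return None if the POI lies outside the camera's field of view."""
    # Signed angle between the POI and the camera's centerline, wrapped to (-180, 180]
    offset = (poi_bearing - heading + 180.0) % 360.0 - 180.0
    if abs(offset) > fov_deg / 2.0:
        return None  # not visible in the current frame
    # Linear mapping: -fov/2 -> left edge, +fov/2 -> right edge
    return (offset / fov_deg + 0.5) * screen_width

# A POI due east of the device, with the camera also pointed east (heading 90),
# lands in the middle of a 640-pixel-wide view.
b = bearing_deg(0.0, 0.0, 0.0, 1.0)          # roughly 90 degrees
x = overlay_x(b, 90.0, 60.0, 640)            # roughly 320 (screen center)
```

A real app would repeat this for every nearby POI on each sensor update and add a vertical offset from the accelerometer's pitch reading, but the core idea is just this bearing-minus-heading arithmetic.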
Rand
QUOTE (suoq @ Aug 26 2010, 07:00 PM) *
This thread has a RL (Real Life) tag. It's not about 2072. It's about 2010. And the article specifically states "Smart Phone Applications added layers of information".

Oops. Dur. Missed that part. Then I agree: the screens on phones are too small to use for AR in that manner (except in some very specific and limited ways, maybe). Heck, I don't like my laptop to be less than 13", so that I have a big enough screen to not have to scroll as much. (Damn, getting old sucks.)
hobgoblin
I just wish Android would get support for USB-provided sensors, as there are some glasses out there that could then provide SR-style AR.

Still, there is a guy who mounted a BeagleBoard inside a CD case and added a mutilated Myvu. A similar setup (3G dongle and all) could provide SR-style AR. The freaking thing would even run off AAs.
kjones
This is sexy as hell.
Draco18s
QUOTE (suoq @ Aug 26 2010, 04:54 PM) *
I've run a bunch of apps like this. Either he's changing the settings for what information is displayed incredibly fast (based on the dude walking), or it's all just Photoshop. My bet is it's all just Photoshop. Deciding which constellations to show based on a consistent sky color just isn't worth coding, and given that clouds are white, showing constellations in white wouldn't be my personal choice. I'd at least give them a good solid outline or drop shadow.


It's obviously a Photoshop job. He only showed any given type of information once across the collage.
Dumpshock Forums © 2001-2012