From the nation that brought you http://www.sankakucomplex.com/2009/01/20/cyber-figure-aris-molest-a-maid-voiced-by-yukana/ as a commercial product comes http://www.sankakucomplex.com/2009/04/12/3d-virtual-mikumiku-dancing/, a step towards the glorious AR-enhanced future. The impressive thing? This appears to be an amateur application of the technologies, given the use of community favourite virtual idol http://en.wikipedia.org/wiki/Vocaloid#Hatsune_Miku (widely used in amateur music production for vocal elements).
Edit: Please alert me if you can't see the videos. Niconicodouga normally requires a signup, but I had heard that Sancon had an agreement to allow hotlinking.
Videos are visible. Slow loading, but visible.
Having watched the whole first video, I know exactly what they're doing. It's not really all that hard, either.
It all has to do with the motion capture markers placed in the scene.
Thing is, I could do that without the need for the motion capture markers: http://en.wikipedia.org/wiki/Nuke_(software) (a Weapon of Mass Creation) has a method of finding those same data points automatically. A data point here is a still point in the scene; since the points themselves don't move, any apparent movement of them relative to each other must come from the camera, so the camera's motion can be calculated from them. That's the technique typically used to "undo" a shaky camera hand, and it's the key to having real-time AR. Nuke can find the points (as well as track moving objects), but it isn't fast. Tracking a seagull across the footage and replacing it with blank sky, for example, takes about... 3 minutes per 30 seconds of film.
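To make that concrete: once you have the still points' positions in two consecutive frames, the camera's apparent motion (pan, rotation, zoom) drops out of a simple least-squares fit. Here's a toy sketch in Python of that one step, using complex numbers to represent 2D points. To be clear, this is my own illustration of the principle, not Nuke's actual tracker; the hard (and slow) part Nuke does is finding and matching the points in the first place, which this skips entirely.

```python
def estimate_camera_motion(prev_pts, curr_pts):
    """Fit the 2D similarity transform mapping still scene points from the
    previous frame onto the current frame: curr ~= a * prev + b, where
    points are treated as complex numbers, a encodes rotation + zoom
    (angle and magnitude of a), and b encodes the pan (translation).
    Closed-form least-squares solution, so it's exact for noise-free input
    and a best fit otherwise."""
    p = [complex(x, y) for x, y in prev_pts]
    q = [complex(x, y) for x, y in curr_pts]
    p_mean = sum(p) / len(p)
    q_mean = sum(q) / len(q)
    # Rotation + scale: complex linear regression on the centred points.
    num = sum((qi - q_mean) * (pi - p_mean).conjugate() for pi, qi in zip(p, q))
    den = sum(abs(pi - p_mean) ** 2 for pi in p)
    a = num / den
    # Translation: whatever is left over after rotating/scaling the centroid.
    b = q_mean - a * p_mean
    return a, b

def stabilize(point, a, b):
    """Undo the estimated camera motion for one point ("shaky cam" removal)."""
    z = complex(*point) if isinstance(point, tuple) else point
    w = (z - b) / a
    return (w.real, w.imag)
```

With four tracked corners and a known camera wobble applied to them, `estimate_camera_motion` recovers the wobble exactly, and `stabilize` maps the points back to where they started. Inverting `a` and `b` the other way round is the AR case: it tells you where to draw the virtual object so it appears pinned to the scene.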