Dubbed Mobicast, the system requires two sets of software: one for the phone and one for the server receiving the footage. When two or more phones in the same place capture the same scene, the software synchronizes their clocks so the framing lines up correctly. Image-recognition technology on the server then figures out how the footage should physically mesh, using features of the landscape or scene to recognize parts of the images that match. It then blends the images to create a wider, more detailed view of the scene, sort of like Photosynth for video (but without the 3-D, for now).
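A minimal sketch of the matching-and-blending idea, using NumPy: the server looks for the offset where two frames agree best (a crude stand-in for real feature matching), then feathers the overlap to produce one wider frame. The function names and the brute-force search are illustrative assumptions, not Mobicast's actual implementation.

```python
import numpy as np

def find_overlap(left, right, max_shift):
    """Brute-force search for the number of columns where the right
    frame best matches the right edge of the left frame (a toy stand-in
    for the server's feature matching)."""
    best_shift, best_err = 0, float("inf")
    for shift in range(1, max_shift + 1):
        err = np.mean((left[:, -shift:] - right[:, :shift]) ** 2)
        if err < best_err:
            best_shift, best_err = shift, err
    return best_shift

def stitch(left, right, overlap):
    """Linearly feather the overlapping columns, then concatenate
    into a single wider frame."""
    alpha = np.linspace(1.0, 0.0, overlap)  # weight given to the left frame
    blended = alpha * left[:, -overlap:] + (1 - alpha) * right[:, :overlap]
    return np.hstack([left[:, :-overlap], blended, right[:, overlap:]])

# Two "camera views" of a wider scene that overlap by 3 columns.
scene = np.arange(40, dtype=float).reshape(4, 10)
left, right = scene[:, :6], scene[:, 3:]

overlap = find_overlap(left, right, max_shift=5)
wide = stitch(left, right, overlap)
print(overlap)      # 3
print(wide.shape)   # (4, 10) -- the full wide scene is recovered
```

Real stitchers match distinctive keypoints rather than raw pixels, but the shape of the problem is the same: estimate how the views align, then blend the seam.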
The coolest part, of course, is that Mobicast can do all this in real time, so an event can be captured and broadcast live to the Web by several cameras at once. Users also receive feedback on their phones showing stills of the stitched video with their contributions highlighted, helping them see how to reposition themselves for the best contribution.
Before the system goes public, there are some issues to sort out, such as how to tell whether several phones in the same vicinity are filming the same scene (GPS?). Until then, all we can do is keep on filming and dream of the day that celeb scandals break in full 360-degree 3-D.
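If GPS is indeed the answer, the proximity test could be as simple as a great-circle distance check between two phones' fixes. The `same_scene` helper and the 100-metre radius below are purely hypothetical assumptions for illustration; the haversine formula itself is standard.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def same_scene(phone_a, phone_b, radius_m=100):
    """Hypothetical check: treat two phones as filming the same scene
    if their GPS fixes fall within radius_m of each other."""
    return haversine_m(*phone_a, *phone_b) <= radius_m

# Two phones roughly 50 m apart (illustrative coordinates near Sydney).
a = (-33.8568, 151.2153)
b = (-33.8572, 151.2156)
print(same_scene(a, b))                    # True
print(same_scene(a, (-33.9000, 151.3000)))  # False: several km away
```

GPS alone would still group phones pointed at different things, so a real system would likely combine location with the server's image matching to confirm a shared scene.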
Popular Science has been a leading source of science, technology and gadget news since 1872. With up-to-the-minute space news and insightful commentary on new innovations and concept cars, if it's new or future technology, you'll find it at popsci.com.au.
WW Media - Popular Science © 2010