How iPhone 4 Could Change Augmented Reality
June 28, 2010
When Apple announced iOS 4.0 earlier this year, some additions to the SDK (software development kit) caught the attention of augmented reality (AR) developers – specifically, open access to the phone’s camera APIs. But with the introduction of the new iPhone 4 hardware this past Monday, the possibilities for AR on the popular smartphone have skyrocketed. Today I had the opportunity to chat about the device’s impact on AR with Stefan Misslinger, lead iPhone developer for metaio, one of the leading AR companies and makers of the mobile AR browser junaio.
(source: Read Write Web)
Enhanced Image Tracking
For some AR experiences, image recognition and tracking are essential to a quality customer experience. The application needs to be able to analyze the data being taken in from the camera in order to properly overlay 3D objects into space, but previously this functionality wasn’t available for iPhone developers. When Apple announced iOS 4.0, the inclusion of access to live camera data in the iPhone SDK provided AR developers with the ability to bring image tracking to the iPhone.
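The per-frame loop described above can be sketched in a few lines. This is a minimal illustration of the idea – grab a camera frame, detect a known marker in it, and use the detected pose to place a virtual object – not the actual iPhone SDK or metaio API; `detect_marker` and the dictionary-based "frame" are stand-ins for a real detector and camera buffer.

```python
# Hypothetical sketch of an AR image-tracking loop. A real implementation
# would analyze raw pixel data; here a frame is a dict that may carry a
# pre-detected marker pose, purely for illustration.

def detect_marker(frame):
    """Toy detector: return the marker's pose if present, else None."""
    return frame.get("marker")  # e.g. {"x": 10, "y": 20} or None

def augment(frame):
    """Overlay a virtual 3D object only when a marker is found in the frame."""
    pose = detect_marker(frame)
    if pose is None:
        return None  # no marker visible: nothing to overlay
    return {"object": "3d_model", "position": (pose["x"], pose["y"])}
```

Before iOS 4.0, the missing piece for iPhone developers was the input to this loop: there was no sanctioned way to read live camera frames at all.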
Other mobile operating systems, like Android, have had the capability to use image tracking for a while now, but the iPhone had been closed off from raw camera data. Last week, metaio introduced an image tracking application for Android, known as junaio Glue, and now with iOS 4.0, it will be able to bring this same functionality to the iPhone. The company also provides its own mobile AR SDK that allows developers to build apps leveraging metaio’s technology, and Misslinger says metaio will include image tracking in the iPhone version of its SDK in the next few weeks.
New and Improved Cameras
But it’s not just the software that will make image tracking on the iPhone easier and more available; the forthcoming iPhone 4 also includes a 5 megapixel camera capable of recording 720p video. The higher resolution of images captured by the phone makes the tracking of AR markers and image-based triggers much easier, but, as Misslinger points out, it comes with a catch.
Image tracking for AR requires that visual data be analyzed 30 times per second, and using a full-resolution image could slow down this process, making the tracking laggy or less accurate. To avoid this, Misslinger says the use of the full resolution will likely be decided on a case-by-case basis. For close-up experiences, a lower-resolution image should suffice, but for tracking markers that are larger or farther away, the full power of the camera may be required.
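The arithmetic behind this trade-off is simple to sketch: at 30 analyses per second, each frame gets roughly 33 ms of processing time, so resolution has to be chosen to fit that budget. The function below illustrates the case-by-case choice Misslinger describes; the 2-meter threshold and the specific resolutions are illustrative assumptions, not values from metaio.

```python
# Sketch of the resolution/frame-rate trade-off for AR image tracking.

def frame_budget_ms(analyses_per_second=30):
    """Time available to analyze one camera frame, in milliseconds."""
    return 1000.0 / analyses_per_second

def choose_resolution(marker_distance_m, full_res=(1280, 720), low_res=(480, 360)):
    """Pick a capture resolution for tracking.

    Distant (or very large) markers cover few pixels and need the full
    sensor resolution to stay detectable; nearby markers can be tracked
    at a lower resolution, keeping analysis inside the per-frame budget.
    The 2 m cutoff is a made-up example threshold.
    """
    return full_res if marker_distance_m > 2.0 else low_res
```

For example, `frame_budget_ms()` returns about 33.3 ms – the ceiling that full-resolution analysis risks blowing through.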
The iPhone 4 also includes a second, front-facing camera, and everyone seems excited to see how Apple’s vision for the future of video communication will play out. For augmented reality, however, the front-facing camera opens up an entirely new realm of possibilities on the mobile device. AR experiences traditionally developed for desktop webcams, like virtual mirrors that let users try on sunglasses and clothes, can now be experienced from a handheld device.
Gyroscopic Motion Sensing
One of the biggest surprises at the iPhone 4 announcement was the addition of a gyroscope to the device’s arsenal of sensors. As Steve Jobs demonstrated, the gyroscope allows the device to sense its own rotation in 3D space – as the user turns, the phone recognizes the motion from the measured angular velocity, complementing the accelerometer’s sense of orientation relative to gravity. Most people immediately thought of augmented reality when this feature was introduced, and Misslinger says this is an obvious tool that AR developers will quickly adopt.
Certainly the gyroscope will help apps stabilize their results and track a user’s movement, but there are additional uses for the gyro for AR. At the moment, if a user is using image tracking technology, quick movements of the device can cause the image to blur – interrupting the tracking of a marker or image. Misslinger says he is excited to use the gyroscope to help bridge the gap between when image tracking software loses and regains its capture on an image or marker. With the gyroscope working in tandem with image tracking technology, brief interruptions in tracking due to blurred images could be eliminated.
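The bridging idea can be sketched as simple dead reckoning: while visual tracking is lost to motion blur, integrate the gyroscope’s angular-velocity samples to keep estimating the camera’s orientation, then hand control back to image tracking once it reacquires the marker. This is a one-axis simplification with made-up names, not metaio’s implementation – a real system would fuse three axes and correct for gyro drift.

```python
# Hedged sketch: carry the camera's orientation across a tracking gap
# by integrating gyroscope angular-rate samples (simple dead reckoning).

def bridge_orientation(last_tracked_angle, gyro_rates, dt):
    """Estimate orientation during a visual-tracking dropout.

    last_tracked_angle: last angle (radians) known from image tracking.
    gyro_rates: angular-velocity samples (rad/s) recorded during the gap.
    dt: time between samples (seconds).
    """
    angle = last_tracked_angle
    for rate in gyro_rates:
        angle += rate * dt  # integrate rate over one timestep
    return angle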
Faster Processor & High-Resolution Display
Call it what you will, but Apple’s “Retina Display” on the iPhone 4 packs a serious high-res punch, and that bodes well for augmented reality. AR’s success, in some ways, relies on its ability to create a seamless merging of real and virtual worlds, and with a better display comes better graphics. Misslinger points out that the enhanced display will allow for the inclusion of better 3D models in mobile AR experiences – greatly enhancing the overall user experience.
Additionally, faster processor speeds on the device will allow these larger models to run much more smoothly than before. Apple’s home-brewed A4 processor will not only let AR apps render 3D models faster and at a higher level of quality, but will also help analyze camera data at speeds closer to real time.
iPhone 4 vs. The World
Apple is famous for admitting that it may not be first to ship seemingly basic functionality (like copy/paste and multitasking), but the company aims to do it as seamlessly and efficiently as possible. In the case of image tracking, Apple wasn’t first to the game, but given the ease with which developers can use the SDK’s APIs, the iPhone could soon establish itself as the leading platform for mobile AR. Add the phone’s hardware additions and the large number of devices the company is likely to sell, and you’ve got fertile soil from which AR can blossom.
(source: Read Write Web)
OK, OK… nothing really futuristic or anything to do with augmented reality here, but I just had to share it because it’s hilarious.
Episode 1 “iPhone 4 vs Evo”
CHECK EPISODE 2 HERE >>>
Episode 2 “Evo vs iPhone 4”