Where 2.0 Online – Jeffrey Powers on iPhone vision

Jeffrey Powers, Occipital – Computer Vision and the iPhone Camera

All about getting the iPhone to do the things Iron Man's helmet can do: recognize faces and structures and show objects in augmented-reality fashion, all using the camera. His company created RedLaser, a vision-based barcode scanner.

Getting started: first you need to interface with the camera. UIImagePickerController is that interface. You need to check that a camera is present, because the iPod touch doesn't have one.
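A minimal sketch of that check, inside a view controller that also acts as the picker's delegate (pre-ARC, 2009-era memory management):

    #import <UIKit/UIKit.h>

    // Present the camera only if the device actually has one (e.g. not an iPod touch).
    if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        UIImagePickerController *picker = [[UIImagePickerController alloc] init];
        picker.sourceType = UIImagePickerControllerSourceTypeCamera;
        picker.delegate = self;   // self adopts UIImagePickerControllerDelegate and UINavigationControllerDelegate
        [self presentModalViewController:picker animated:YES];
        [picker release];         // manual retain/release, pre-ARC
    }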

Snapture – pinch to zoom on the live camera preview. Uses the picker's cameraViewTransform property to apply the transform.
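Roughly, a zoom like that can be applied by scaling the preview. This is a sketch, not Snapture's actual code; the zoom factor is just an illustrative value, and cameraViewTransform requires iPhone OS 3.1 or later:

    // Scale the live preview; a factor greater than 1.0 zooms in
    // (the value would normally come from a pinch gesture).
    CGFloat zoomFactor = 2.0;
    picker.cameraViewTransform = CGAffineTransformMakeScale(zoomFactor, zoomFactor);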

RedLaser – can handle blurry barcodes. That matters because the camera doesn't have autofocus until the 3GS. Custom overlay – puts UI on top of the live camera image. Rapid screenshot capture – currently requires an unpublished function. Image processing – works on grayscale data and cleans up the image. How do you access raw pixels in code? Use a CGDataProvider to read the pixels.
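A sketch of that pixel-access path, assuming you already have a UIImage in hand; the helper name and the packed-bitmap assumption are mine, not from the talk:

    #import <UIKit/UIKit.h>

    // Log one pixel's first byte from a UIImage (hypothetical helper).
    void LogPixel(UIImage *image, size_t x, size_t y) {
        CGImageRef cgImage = image.CGImage;
        CFDataRef pixelData = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));
        const UInt8 *pixels = CFDataGetBytePtr(pixelData);

        size_t bytesPerRow   = CGImageGetBytesPerRow(cgImage);
        size_t bytesPerPixel = CGImageGetBitsPerPixel(cgImage) / 8;

        // Channel order (RGBA, BGRA, ...) depends on the image's bitmap info.
        UInt8 firstComponent = pixels[y * bytesPerRow + x * bytesPerPixel];
        NSLog(@"pixel (%lu, %lu) first component: %u",
              (unsigned long)x, (unsigned long)y, firstComponent);

        CFRelease(pixelData);  // CGDataProviderCopyData follows the Create/Copy rule
    }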

Future of iPhone computer vision – still can’t access video frames. Can’t show objects on top of screen shots for vision-based augmented reality. Mobile computer vision will eventually become a part of most apps we already use today.

Where 2.0 Online – Alasdair Allan

I’m participating in O’Reilly’s Where 2.0 Online conference – fall 2009 – with an emphasis on iPhone sensors.

First up is Alasdair Allan – http://www.dailyack.com/ – author of a book on iPhone programming. His topic: the sensors in your iPhone.

GPS (Core Location) – an abstraction layer in front of different positioning methods: cell towers (12 km, falling to 1–3 km), Skyhook Wireless Wi-Fi positioning (approx. 100 m), and GPS (approx. 40 m). You have to check whether location services are enabled first. The iPhone Simulator always reports the location of 1 Infinite Loop in Cupertino.

Distance filter – can be set so updates only arrive after the location changes by a given distance, so you don’t get so many update messages.

Can set desired accuracy using locationManager.desiredAccuracy

Delegate methods – the location manager’s delegate gets a message when the location changes, carrying both the new location and the old location.
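Putting those pieces together, a minimal Core Location setup might look like this (a sketch using the 2009-era delegate callback; the distance filter and accuracy constant are just example choices, and locationManager is assumed to be an instance variable):

    #import <CoreLocation/CoreLocation.h>

    // Assumes self adopts CLLocationManagerDelegate.
    - (void)startLocationUpdates {
        locationManager = [[CLLocationManager alloc] init];
        if (!locationManager.locationServicesEnabled) return;   // check first

        locationManager.delegate = self;
        locationManager.distanceFilter = 100.0;                  // metres moved before the next update
        locationManager.desiredAccuracy = kCLLocationAccuracyNearestTenMeters;
        [locationManager startUpdatingLocation];
    }

    // Delegate method: called with both the new fix and the previous one.
    - (void)locationManager:(CLLocationManager *)manager
        didUpdateToLocation:(CLLocation *)newLocation
               fromLocation:(CLLocation *)oldLocation {
        NSLog(@"Moved from %@ to %@ (accuracy ±%.0f m)",
              oldLocation, newLocation, newLocation.horizontalAccuracy);
    }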

Accelerometer – measures linear acceleration of the device, so you can derive roll and pitch but not yaw (except on the iPhone 3GS, if you combine it with the magnetometer). x relates to roll, y to pitch; a reading of 1.0 is approximately 1 g. z tells you whether the device is face up or face down; 0.0 means that axis is edge-on to the ground.

In your view controller, grab the shared UIAccelerometer instance and set its delegate to start receiving readings. You can set the update interval (e.g. 0.1 s) and calculate orientation in radians from the readings.
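A sketch of that setup, assuming a view controller that adopts UIAccelerometerDelegate; the atan2 roll/pitch calculation is a common approximation rather than anything specific from the talk:

    #import <UIKit/UIKit.h>

    // Assumes self adopts UIAccelerometerDelegate.
    - (void)startAccelerometer {
        UIAccelerometer *accelerometer = [UIAccelerometer sharedAccelerometer];
        accelerometer.updateInterval = 0.1;   // seconds between updates
        accelerometer.delegate = self;
    }

    // Called at the interval set above; values are in units of g.
    - (void)accelerometer:(UIAccelerometer *)accelerometer
            didAccelerate:(UIAcceleration *)acceleration {
        double roll  = atan2(acceleration.x, acceleration.z);   // radians
        double pitch = atan2(acceleration.y, acceleration.z);   // radians
        NSLog(@"roll %.2f rad, pitch %.2f rad", roll, pitch);
    }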

Magnetometer (digital compass). By combining heading info (yaw) with roll and pitch, you can determine the device’s orientation in real time. Only the 3GS has this, so it’s important to check whether heading info is available in Core Location with locationManager.headingAvailable.

The magnetometer is not amazingly sensitive – accuracy within 5 degrees is good enough for most purposes. Check that the new heading’s accuracy is >0 (a negative value means the reading is invalid). It returns the magnetic heading, not the true heading; if location services are enabled, you can also get the true heading.

Heavily affected by local magnetic fields.
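A sketch of the heading side of Core Location, reusing the location manager from the earlier example and applying the accuracy check mentioned above:

    // Start heading updates only where the hardware supports them (iPhone 3GS).
    if (locationManager.headingAvailable) {
        [locationManager startUpdatingHeading];
    }

    // Delegate method: filter out invalid readings before using them.
    - (void)locationManager:(CLLocationManager *)manager
           didUpdateHeading:(CLHeading *)newHeading {
        if (newHeading.headingAccuracy < 0) return;   // negative accuracy means invalid
        NSLog(@"magnetic %.0f, true %.0f",
              newHeading.magneticHeading, newHeading.trueHeading);
    }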

Camera – you can have the user take a picture and grab it.
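With the UIImagePickerController shown in the earlier session notes, grabbing the picture comes down to the delegate callback (a sketch; assumes the presenting view controller is the delegate):

    // Called when the user takes (or picks) a picture.
    - (void)imagePickerController:(UIImagePickerController *)picker
    didFinishPickingMediaWithInfo:(NSDictionary *)info {
        UIImage *photo = [info objectForKey:UIImagePickerControllerOriginalImage];
        // ... hand the image to your own processing code ...
        [self dismissModalViewControllerAnimated:YES];
    }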

Proximity sensor – turns the device’s screen on and off when you make a call; an infrared LED near the earpiece measures reflection. Accessed through the UIDevice object. The sensor has about a 3.5 cm range.
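Access from code goes through UIDevice; a sketch, assuming iPhone OS 3.0 or later and hardware that actually has the sensor:

    #import <UIKit/UIKit.h>

    // Turn on proximity monitoring and watch for state changes.
    - (void)startProximityMonitoring {
        UIDevice *device = [UIDevice currentDevice];
        device.proximityMonitoringEnabled = YES;   // stays NO if there is no sensor

        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(proximityChanged:)
                                                     name:UIDeviceProximityStateDidChangeNotification
                                                   object:device];
    }

    // proximityState is YES when something is close to the sensor (screen goes dark).
    - (void)proximityChanged:(NSNotification *)note {
        NSLog(@"proximity: %d", [UIDevice currentDevice].proximityState);
    }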

PhoneGap is an open source framework for building web apps that become native apps on iPhone and Android. http://phonegap.com/

Alasdair recommends the iSimulate app from Vimov.com for simulating accelerometer events in the iPhone SDK – http://vimov.com/isimulate/sdk

Run the static analyzer (in Xcode on Snow Leopard) to check your code before shipping to Apple – because they will.