Where 2.0 Online – Jeffrey Powers on iPhone vision

Jeffrey Powers, Occipital – Computer Vision and the iPhone Camera

All about getting the iPhone to do the things Iron Man's helmet can do: recognize faces and structures and show objects in augmented-reality fashion, using the camera. They created RedLaser, a vision-based barcode scanner.

Getting started: first you need to interface with the camera. UIImagePickerController is the interface to the camera. Need to check for a camera, because the iPod Touch doesn't have one.
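A minimal sketch of that check (my own class name; the picker delegate methods are omitted):

    #import <UIKit/UIKit.h>

    @interface CameraViewController : UIViewController
        <UIImagePickerControllerDelegate, UINavigationControllerDelegate>
    @end

    @implementation CameraViewController

    - (void)showCamera {
        // The iPod Touch (and the simulator) has no camera, so check first.
        if (![UIImagePickerController isSourceTypeAvailable:
                  UIImagePickerControllerSourceTypeCamera]) {
            NSLog(@"No camera on this device");
            return;
        }
        UIImagePickerController *picker = [[UIImagePickerController alloc] init];
        picker.sourceType = UIImagePickerControllerSourceTypeCamera;
        picker.delegate = self;
        [self presentModalViewController:picker animated:YES]; // 2009-era API
        [picker release];                                      // pre-ARC memory management
    }

    @end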

Snapture – pinch to zoom on the live image capture. Uses the cameraViewTransform property to do the transform.
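Snapture's own implementation isn't public; a rough sketch of the same idea using the public cameraViewTransform property (the zoomFactor would come from a hypothetical pinch handler, and the picker is the one from the sketch above):

    // Scale the live camera preview to approximate a pinch zoom.
    - (void)applyZoom:(CGFloat)zoomFactor toPicker:(UIImagePickerController *)picker {
        picker.cameraViewTransform = CGAffineTransformMakeScale(zoomFactor, zoomFactor);
    }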

RedLaser – can handle blurry barcodes. That matters because the camera doesn't have autofocus until the 3GS. Custom overlay – puts UI on top of the live camera image. Rapid screenshot capture – currently requires an unpublished function. Image processing – works in grayscale; cleans up images. How do you access raw pixels in code? Use a data provider to read the pixels.
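Reading raw pixels through the CGImage's data provider looks roughly like this (a sketch, not RedLaser's code):

    #import <UIKit/UIKit.h>

    // Copy the raw pixel bytes out of a UIImage via its CGImage's data provider.
    // The byte layout depends on how the image was created, so check bytes-per-row
    // and bits-per-pixel before indexing into the buffer.
    void DumpFirstPixel(UIImage *image) {
        CGImageRef cgImage = image.CGImage;
        CFDataRef data = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));
        const UInt8 *pixels = CFDataGetBytePtr(data);

        NSLog(@"bytesPerRow=%zu bitsPerPixel=%zu firstByte=%u",
              CGImageGetBytesPerRow(cgImage),
              CGImageGetBitsPerPixel(cgImage),
              (unsigned)pixels[0]);

        CFRelease(data);  // CGDataProviderCopyData follows the Create/Copy rule
    }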

Future of iPhone computer vision – you still can't access video frames, and you can't show objects on top of screenshots for vision-based augmented reality. Mobile computer vision will eventually become part of most apps we already use today.

Where 2.0 Online – More on iPhone sensors

Derek Smith (SimpleGeo) – augmented reality SDK for the iPhone

3 important technologies – camera, location, and compass. With data from location and the compass you can plot objects. You can calculate the bearing and distance of an object from the device; the device sits at the origin of the graph. That's the first stage. The viewport (what the device can see) is the second stage. The third is sizing the objects according to distance. Implemented in OpenGL ES for you. Most of the UI framework doesn't gel with OpenGL, so you have to get creative. If you work in 2D you have to implement your own pipeline, but you can use the standard UI framework.
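The SDK wasn't shown in code, but the bearing/distance step it describes is standard geodesy; a sketch using the haversine distance and initial-bearing formulas (not SimpleGeo's actual code):

    #include <math.h>

    // Inputs in degrees; distance in metres, bearing in degrees clockwise from north.
    static double DegToRad(double deg) { return deg * M_PI / 180.0; }

    double DistanceMeters(double lat1, double lon1, double lat2, double lon2) {
        double phi1 = DegToRad(lat1), phi2 = DegToRad(lat2);
        double dPhi = DegToRad(lat2 - lat1), dLambda = DegToRad(lon2 - lon1);
        double a = sin(dPhi / 2) * sin(dPhi / 2) +
                   cos(phi1) * cos(phi2) * sin(dLambda / 2) * sin(dLambda / 2);
        return 6371000.0 * 2.0 * atan2(sqrt(a), sqrt(1.0 - a));  // mean Earth radius
    }

    double BearingDegrees(double lat1, double lon1, double lat2, double lon2) {
        double phi1 = DegToRad(lat1), phi2 = DegToRad(lat2);
        double dLambda = DegToRad(lon2 - lon1);
        double y = sin(dLambda) * cos(phi2);
        double x = cos(phi1) * sin(phi2) - sin(phi1) * cos(phi2) * cos(dLambda);
        return fmod(atan2(y, x) * 180.0 / M_PI + 360.0, 360.0);
    }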

This was not a very together presentation, but the SDK looks like it will be very interesting when it gets released.

Nicola Radacher – Mobilizy

Example of the Wikitude client.

GPS signal – accuracy can be bad due to city density or fog. What can you do to improve it? One option is image recognition: take a picture, send it to a server, compare it to data in a database, and correct the user's location. You need a lot of data for any big city.

What to do if there's no compass – calculate heading from changes in GPS position. Doesn't need user input, but it's inaccurate. Alternatively, ask the user to help – tell them to point the phone north, for example, or point it at the sun (not good in Seattle!). More accurate than GPS (perhaps), but still not great.

Alok Deshpande (loopt) – CoreLocation in Practice

Nice abstraction built on several technologies. Shields you a little bit from having to worry about which technologies are available. It's a subscription model: you specify the accuracy and how often you want to be updated, and you're then sent location events with location info. What accuracy do you need? How frequently do you need to be notified of changes? The tradeoff is response time and battery use vs. accuracy. Example: Where's my car? The simplest way to start is with the MapKit framework instead of CoreLocation; it supports showing a user's location. To do anything more substantial you need to use CoreLocation itself. You probably want to set the user's location to as accurate as possible with continuous updates (as they're walking to the car).
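A minimal sketch of that "Where's my car?" setup (class and method names are mine, not Loopt's): MapKit's built-in user dot for the simple case, plus a CLLocationManager subscribed for continuous best-accuracy updates.

    #import <CoreLocation/CoreLocation.h>
    #import <MapKit/MapKit.h>

    @interface CarFinder : NSObject <CLLocationManagerDelegate> {
        CLLocationManager *locationManager;
    }
    - (void)startTrackingWithMap:(MKMapView *)mapView;
    @end

    @implementation CarFinder

    - (void)startTrackingWithMap:(MKMapView *)mapView {
        mapView.showsUserLocation = YES;            // MapKit-only option: show the blue dot

        locationManager = [[CLLocationManager alloc] init];
        locationManager.delegate = self;
        locationManager.desiredAccuracy = kCLLocationAccuracyBest;  // accuracy vs. battery
        locationManager.distanceFilter  = kCLDistanceFilterNone;    // report every change
        [locationManager startUpdatingLocation];    // events arrive via the delegate
    }

    @end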

Nick from Skyhook Wireless

CellID, WiFi, and GPS. Skyhook uses WiFi to calculate location. Available on many platforms.

Cell – universal coverage, 150-700 m accuracy, 1-2 sec time to fix, low power
WiFi – urban indoor/outdoor, 20-40 m accuracy, 1-5 sec time to fix, low power
GPS – outdoor/limited indoor, 10 m accuracy, 1-60 sec time to fix, medium power

Typical GPS receivers need -140 dBm or better. Most cannot decode below -145 dBm, or -155 dBm with assistance. -140 dBm = 10^-14 mW (dBm is decibels relative to 1 mW, so P = 10^(-140/10) mW).

WiFi positioning – scan for signals, trilaterate to determine location. On the iPhone it reports lat/long to CoreLocation.

They drive around collecting WiFi signal fingerprints, then calculate AP positions by reverse trilateration.

Martin Roth (Reality Jockey) – Augmented Audio – a new musical world (the mic as sensor)

http://rjdj.me/

What is RjDj? A reactive music player. Reactive music? It changes with your environment and actions.

Uses Pure Data – a visual signal-flow programming language – to do the input processing.

The iPhone has a number of audio frameworks. Media Player gives you access to the iPhone's music library. The AV Foundation framework gets you up and running quickly. The Audio Toolbox framework plays audio with synchronization capabilities, gives access to streams, converts formats, etc.

The Audio Unit framework lets you use audio processing plug-ins.

OpenAL framework – meant for games.
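RjDj's actual processing happens in Pure Data, but as a sketch of treating the mic as a sensor with the AV Foundation framework: an AVAudioRecorder with metering enabled, recording to /dev/null purely to read input levels (audio session setup omitted).

    #import <AVFoundation/AVFoundation.h>
    #import <AudioToolbox/AudioToolbox.h>   // for kAudioFormatLinearPCM

    // Start a recorder whose only job is to expose microphone level metering.
    AVAudioRecorder *StartLevelMeter(void) {
        NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
            [NSNumber numberWithFloat:44100.0],             AVSampleRateKey,
            [NSNumber numberWithInt:1],                     AVNumberOfChannelsKey,
            nil];
        AVAudioRecorder *recorder = [[AVAudioRecorder alloc]
            initWithURL:[NSURL fileURLWithPath:@"/dev/null"]
               settings:settings
                  error:NULL];
        recorder.meteringEnabled = YES;
        [recorder record];
        return recorder;
    }

    // Call periodically (e.g. from a timer) to read the current input level in dB.
    float CurrentInputLevel(AVAudioRecorder *recorder) {
        [recorder updateMeters];
        return [recorder averagePowerForChannel:0];
    }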

Where 2.0 Online – Alasdair Allan

I'm participating in O'Reilly's Where 2.0 Online conference – fall 2009 – with an emphasis on iPhone sensors.

First up is Alasdair Allan – http://www.dailyack.com/ – author of a book on iPhone programming. Topic: the sensors in your iPhone.

GPS (Core Location) – an abstraction layer in front of different methods. Abstracts cell towers (12 km, falling to 1-3 km), Skyhook Wireless (approx. 100 m), and GPS (approx. 40 m). Have to check if location services are enabled first. The iPhone simulator will always report the location of 1 Infinite Loop in Cupertino.

Distance filter – can set it to update based on the distance moved, so you don't get so many update messages.

Can set desired accuracy using locationManager.desiredAccuracy

Delegate methods: locationManager gets messages when the location changes – with the new location and the old location.
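Putting the distance filter, desired accuracy, and delegate messages together – a minimal sketch using the 2009-era delegate method (the filter and accuracy values are arbitrary examples):

    #import <CoreLocation/CoreLocation.h>

    @interface LocationWatcher : NSObject <CLLocationManagerDelegate> {
        CLLocationManager *locationManager;
    }
    - (void)start;
    @end

    @implementation LocationWatcher

    - (void)start {
        locationManager = [[CLLocationManager alloc] init];
        locationManager.delegate = self;
        locationManager.distanceFilter  = 10.0;   // metres moved before another update
        locationManager.desiredAccuracy = kCLLocationAccuracyNearestTenMeters;
        [locationManager startUpdatingLocation];
    }

    // Delivers both the new and the old location (2009-era API).
    - (void)locationManager:(CLLocationManager *)manager
        didUpdateToLocation:(CLLocation *)newLocation
               fromLocation:(CLLocation *)oldLocation {
        NSLog(@"moved from %@ to %@ (+/- %.0f m)",
              oldLocation, newLocation, newLocation.horizontalAccuracy);
    }

    @end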

Accelerometer – measures linear acceleration of the device: roll and pitch, not yaw (except on the iPhone 3GS, if you combine it with the magnetometer). x = roll, y = pitch; 1.0 = approx. 1 gravity. z = front-side up or front-side down; 0.0 = edge-on to the ground.

Declare a UIAccelerometer instance in the view controller class. Start the accelerometer. Can set the update frequency (e.g. 0.1 sec) – can calculate orientation from the readings, in radians.
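A sketch with the 2009-era UIAccelerometer API – shared instance, update interval, delegate callback. The roll/pitch formulas are one common convention for deriving angles in radians from the gravity vector, not necessarily the exact calculation from the talk:

    #import <UIKit/UIKit.h>
    #include <math.h>

    @interface TiltViewController : UIViewController <UIAccelerometerDelegate>
    @end

    @implementation TiltViewController

    - (void)viewDidAppear:(BOOL)animated {
        [super viewDidAppear:animated];
        UIAccelerometer *accelerometer = [UIAccelerometer sharedAccelerometer];
        accelerometer.updateInterval = 0.1;   // seconds between updates
        accelerometer.delegate = self;
    }

    - (void)accelerometer:(UIAccelerometer *)accelerometer
            didAccelerate:(UIAcceleration *)acceleration {
        double roll  = atan2(acceleration.y, acceleration.z);
        double pitch = atan2(-acceleration.x,
                             sqrt(acceleration.y * acceleration.y +
                                  acceleration.z * acceleration.z));
        NSLog(@"roll=%.2f rad  pitch=%.2f rad", roll, pitch);
    }

    @end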

Magnetometer (digital compass). Combining heading info (yaw) with roll and pitch, you can determine orientation in real time. Only the 3GS has this, so it's important to check whether heading info is available in Core Location with locationManager.headingAvailable.

The magnetometer is not amazingly sensitive – 5 degrees is good enough for most purposes. Check that the new heading is > 0. Returns the magnetic heading, not true heading. If location services are enabled, then you can also get true heading.

Heavily affected by local magnetic fields.
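A sketch of heading updates along those lines (2009-era instance properties; the validity checks follow the "heading > 0" advice above):

    #import <CoreLocation/CoreLocation.h>

    @interface CompassWatcher : NSObject <CLLocationManagerDelegate> {
        CLLocationManager *locationManager;
    }
    - (void)start;
    @end

    @implementation CompassWatcher

    - (void)start {
        locationManager = [[CLLocationManager alloc] init];
        if (!locationManager.headingAvailable) {   // only the 3GS has a magnetometer
            NSLog(@"No magnetometer on this device");
            return;
        }
        locationManager.delegate = self;
        locationManager.headingFilter = 5.0;       // degrees; good enough for most purposes
        [locationManager startUpdatingHeading];
    }

    - (void)locationManager:(CLLocationManager *)manager
           didUpdateHeading:(CLHeading *)newHeading {
        if (newHeading.headingAccuracy < 0) return;   // invalid reading
        if (newHeading.trueHeading >= 0) {            // true heading needs location services
            NSLog(@"true heading: %.0f degrees", newHeading.trueHeading);
        } else {
            NSLog(@"magnetic heading: %.0f degrees", newHeading.magneticHeading);
        }
    }

    @end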

Camera – you can have the user take a picture and grab it.

Proximity sensor – turns the device's screen on and off when you make a call – an infrared LED near the earpiece measures reflection. Accessed through the UIDevice object. The sensor has about a 3.5 cm range.
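A sketch of reading it from code through UIDevice:

    #import <UIKit/UIKit.h>

    @interface ProximityWatcher : NSObject
    - (void)start;
    @end

    @implementation ProximityWatcher

    - (void)start {
        UIDevice *device = [UIDevice currentDevice];
        device.proximityMonitoringEnabled = YES;   // stays NO if the device has no sensor
        if (device.proximityMonitoringEnabled) {
            [[NSNotificationCenter defaultCenter]
                addObserver:self
                   selector:@selector(proximityChanged:)
                       name:UIDeviceProximityStateDidChangeNotification
                     object:device];
        }
    }

    - (void)proximityChanged:(NSNotification *)note {
        BOOL near = [UIDevice currentDevice].proximityState;
        NSLog(@"proximity: %@", near ? @"near" : @"far");
    }

    @end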

PhoneGap is an open source framework for building web apps that become native apps on iPhone and Android. http://phonegap.com/

Alasdair recommends the iSimulate SDK from Vimov to simulate accelerometer events in the iPhone Simulator – http://vimov.com/isimulate/sdk

Run the static analyzer (in Xcode on Snow Leopard) to check your code before shipping to Apple – because they will.