Derek Smith (SimpleGeo) – augmented reality SDK for the iPhone
Three important technologies – camera, location, and compass. With data from location and compass you can plot objects: calculate the bearing and distance of an object from the device, with the device at the origin of the graph (see the sketch below). That's the first stage. The viewport (what the device can see) is the second stage. The third is sizing the objects according to distance. Implemented in OpenGL ES for you. Most of the UI framework doesn't gel with OpenGL, so you have to get creative. If you work in 2D you have to implement your own pipeline, but you can use the standard UI framework.
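A minimal sketch of that first stage, assuming CoreLocation coordinates for the device and a point of interest (the function name and structure are illustrative, not from the SDK):

```swift
import CoreLocation

// Given the device location and a point of interest, compute the distance
// (metres) and the initial bearing (degrees from true north). With the device
// at the origin, these polar coordinates are what an AR view would then
// project into the camera viewport and size by distance.
func distanceAndBearing(from device: CLLocation, to poi: CLLocation) -> (distance: CLLocationDistance, bearing: Double) {
    // CoreLocation provides great-circle distance directly.
    let distance = device.distance(from: poi)

    // Initial bearing via the standard forward-azimuth formula.
    let lat1 = device.coordinate.latitude * .pi / 180
    let lat2 = poi.coordinate.latitude * .pi / 180
    let dLon = (poi.coordinate.longitude - device.coordinate.longitude) * .pi / 180

    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    let bearing = (atan2(y, x) * 180 / .pi + 360).truncatingRemainder(dividingBy: 360)

    return (distance, bearing)
}
```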
This was not a very together presentation, but the SDK looks like it will be very interesting when it gets released.
Nicola Radacher (Mobilizy)
Example of the Wikitude client.
GPS signal – accuracy can be bad due to city density or fog. What can you do to improve it? One way is image recognition: take a picture, send it to a server, compare it against a database, and correct the user's location. You need a lot of data for any big city.
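A purely hypothetical sketch of that client-side flow – the endpoint, request format, and response fields are invented for illustration; the talk did not describe an API:

```swift
import UIKit
import CoreLocation

// Upload a photo to a (hypothetical) recognition service and receive a
// corrected position back.
func correctLocation(with photo: UIImage, completion: @escaping (CLLocationCoordinate2D?) -> Void) {
    guard let jpeg = photo.jpegData(compressionQuality: 0.7),
          let url = URL(string: "https://example.com/recognize") else {
        completion(nil)
        return
    }
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("image/jpeg", forHTTPHeaderField: "Content-Type")

    URLSession.shared.uploadTask(with: request, from: jpeg) { data, _, _ in
        // Assume the server matched the image and answered with {"lat": ..., "lon": ...}.
        guard let data = data,
              let json = try? JSONSerialization.jsonObject(with: data) as? [String: Double],
              let lat = json["lat"], let lon = json["lon"] else {
            completion(nil)
            return
        }
        completion(CLLocationCoordinate2D(latitude: lat, longitude: lon))
    }.resume()
}
```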
What to do if there's no compass – calculate the heading from changes in the GPS position over time. Doesn't need user feedback, but it's inaccurate. Alternatively, ask the user to help – tell them to adjust the phone to point north, for example, or point it at the sun (not good in Seattle!). More accurate than GPS (perhaps), but still not great.
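A minimal sketch of the GPS-based fallback: CoreLocation already derives a course (direction of travel, in degrees from true north) from successive fixes, so on a device without a magnetometer it can stand in for a heading while the user is moving. The class name is illustrative, not from the talk.

```swift
import CoreLocation

final class HeadingFallback: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let fix = locations.last else { return }
        // course is negative when it cannot be determined, e.g. the user is standing still.
        if fix.course >= 0 {
            print("Estimated heading from GPS movement: \(fix.course)°")
        }
    }
}
```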
Alok Deshpande (loopt) – CoreLocation in Practice
Nice abstraction built on several technologies. Shields you a little bit from having to worry about which technologies are available. It's a subscription model: you specify the accuracy you need and how often you want to be updated, and you're then sent location events with the location info. What accuracy do you need? How frequently do you need to be notified of changes? The tradeoff is response time and battery use vs. accuracy. Example: Where's my car? The simplest way to start is with the MapKit framework instead of CoreLocation; it supports showing the user's location. To do anything more substantial you need to use CoreLocation itself. You probably want to request the most accurate location possible with continuous updates (as they're walking back to the car).
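A sketch of the walk-back phase of that "Where's my car?" example, assuming the car's position was saved earlier. The class and property names are illustrative, not from the talk.

```swift
import CoreLocation
import MapKit

final class CarFinder: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private let carLocation: CLLocation

    init(carLocation: CLLocation) {
        self.carLocation = carLocation
        super.init()
        manager.delegate = self
        // The tradeoff from the talk: best accuracy costs battery, so only
        // request it while the user is actively walking back to the car.
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.distanceFilter = kCLDistanceFilterNone   // continuous updates
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let here = locations.last else { return }
        let remaining = here.distance(from: carLocation)
        print("Car is \(Int(remaining)) m away")
        if remaining < 20 { manager.stopUpdatingLocation() }   // close enough; save power
    }
}

// For the simpler MapKit-only start, showing the user's position is one property:
// let mapView = MKMapView()
// mapView.showsUserLocation = true
```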
Nick from Skyhook Wireless
CellID, WiFi, and GPS. Skyhook uses WiFi to calculate location. Available on many platforms.
Cell – universal coverage, 150–700 m accuracy, 1–2 s time to fix, low power
WiFi – urban indoor/outdoor, 20–40 m accuracy, 1–5 s time to fix, low power
GPS – outdoor/limited indoor, 10 m accuracy, 1–60 s time to fix, medium power
Typical GPS receivers need -140 dBm or better. Most cannot decode below -145 dBm, or -155 dBm with assistance. -140 dBm = 10^(-14) mW (power in mW = 10^(dBm/10)).
WiFi positioning – scan for signals, trilaterate to determine location. On the iPhone it reports lat/long to CoreLocation.
They drive around collecting WiFi signal fingerprints, then calculate each AP's position by reverse trilateration.
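An illustrative simplification of the lookup step, not Skyhook's actual (proprietary) algorithm: given nearby access points with surveyed positions and the signal strengths just observed, estimate the device position as a signal-weighted centroid, so stronger (presumably closer) APs pull the estimate toward them.

```swift
import CoreLocation

struct ObservedAP {
    let position: CLLocationCoordinate2D   // from the survey database
    let rssi: Double                       // observed signal strength in dBm
}

func estimatePosition(from aps: [ObservedAP]) -> CLLocationCoordinate2D? {
    guard !aps.isEmpty else { return nil }

    var latSum = 0.0, lonSum = 0.0, weightSum = 0.0
    for ap in aps {
        // Convert dBm to a linear weight, so e.g. -40 dBm far outweighs -90 dBm.
        let weight = pow(10, ap.rssi / 10)
        latSum += ap.position.latitude * weight
        lonSum += ap.position.longitude * weight
        weightSum += weight
    }
    return CLLocationCoordinate2D(latitude: latSum / weightSum,
                                  longitude: lonSum / weightSum)
}
```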
Martin Roth (Reality Jockey) – Augmented Audio – A new musical world (the mic as sensor)
http://rjdj.me/
What is RjDj? A reactive music player. Reactive music? It changes with your environment and actions.
Uses PureData, a visual signal-flow programming language, to do the input processing.
The iPhone has a number of audio frameworks. Media Player gives you access to the iPhone's library. The AV Foundation framework gets you up and running (see the sketch after this list). The Audio Toolbox framework plays audio with synchronization capabilities, accesses streams, converts formats, etc.
Audio Unit framework uses audio processing plugins
OpenAL framework – meant for games.
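A minimal sketch of the "gets you up and running" option mentioned above: AVFoundation's AVAudioPlayer playing a bundled file. The file name is illustrative.

```swift
import AVFoundation

func playSample() {
    guard let url = Bundle.main.url(forResource: "sample", withExtension: "m4a") else { return }
    do {
        let player = try AVAudioPlayer(contentsOf: url)
        player.prepareToPlay()
        player.play()
        // In real code, keep a strong reference to `player` (e.g. a property),
        // or playback stops as soon as it is deallocated.
    } catch {
        print("Could not create player: \(error)")
    }
}
```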