NY Times on Maps, 'Augmented Reality' and LBS in Mobile

John Markoff, a terrifically thoughtful writer on technology for the NY Times, has written a sweeping piece on mobile mapping, navigation, LBS and augmented reality. It touches on most of the issues and possibilities around "local mobile search," used broadly to mean finding things nearby on a mobile device.

Most fascinating to me is the notion of "augmented reality," which already exists to varying degrees in Japan and, to a lesser extent, in the Android compass feature. This is a very strong future direction for mobile mapping and LBS. Imagine holding up your camera outside a restaurant and seeing all the reviews for that restaurant. That's a fairly mundane example. Or imagine doing the same at a store and seeing which brands are on sale inside.
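To make the idea a bit more concrete, here's a minimal sketch of the geometry behind that kind of overlay: given the phone's GPS position and compass heading, an AR app mainly has to figure out which nearby places fall inside the camera's field of view and draw their content on top of the video. The place names, coordinates and field-of-view numbers below are made up for illustration; a real app would pull them from a local-search API and the device's sensors.

```kotlin
import kotlin.math.*

// A nearby place; in practice its reviews/ratings would come from a local-search API.
data class Place(val name: String, val lat: Double, val lon: Double)

// Initial bearing (degrees clockwise from north) from the viewer to a place,
// using the standard great-circle bearing formula.
fun bearingTo(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
    val phi1 = Math.toRadians(lat1)
    val phi2 = Math.toRadians(lat2)
    val dLon = Math.toRadians(lon2 - lon1)
    val y = sin(dLon) * cos(phi2)
    val x = cos(phi1) * sin(phi2) - sin(phi1) * cos(phi2) * cos(dLon)
    return (Math.toDegrees(atan2(y, x)) + 360.0) % 360.0
}

// Places whose bearing falls inside the camera's horizontal field of view,
// centered on the compass heading -- these are the ones to overlay on screen.
fun visiblePlaces(lat: Double, lon: Double, heading: Double, fovDegrees: Double,
                  places: List<Place>): List<Place> =
    places.filter { p ->
        val diff = abs(bearingTo(lat, lon, p.lat, p.lon) - heading)
        min(diff, 360.0 - diff) <= fovDegrees / 2.0
    }

fun main() {
    // Hypothetical viewer in San Francisco, facing roughly northeast with a
    // 60-degree camera field of view; coordinates are illustrative only.
    val places = listOf(
        Place("Cafe Example", 37.7765, -122.4172),
        Place("Example Books", 37.7740, -122.4250)
    )
    val inView = visiblePlaces(37.7749, -122.4194, 45.0, 60.0, places)
    inView.forEach { println("Overlay on screen: ${it.name}") }  // attach reviews here
}
```

The bearing math is essentially what the Android compass already does; the "augmented" part is just attaching content (reviews, sale information) to each place that lands in view.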

What still needs to happen along these lines is reconciling the realities of privacy and user behavior with the long-held LBS "Starbucks coupon" fantasy. Regardless, such use cases either exist already or are not far away.

I've long been fascinated by the "point and search" functionality that already exists from GeoVector, NeoMedia, SnapTell, Mobot and others. There's the keyboard/keypad as a query-entry mechanism and, more recently, voice search (e.g., Vlingo, Tellme, Google, Nuance). But the camera on a smartphone represents another input system that may prove equally powerful, if not more so. ShopSavvy on Android, along with other bar code and QR code readers, is starting to gain attention and some adoption in the US.
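Here is a rough sketch of what "camera as query input" looks like in code: the app decodes a bar code (or QR code) from a camera frame and then hands the result to a search backend exactly as it would a typed or spoken query, adding the phone's location so the results are local. The decodeBarcode() function and the example.com endpoint are placeholders standing in for a real decoder (something like ZXing) and a real product/local-search API, not actual implementations.

```kotlin
import java.net.URLEncoder

// Stand-in for a real bar code decoder (e.g., ZXing). A real implementation
// would scan the camera frame and return the UPC/EAN digits or QR payload.
fun decodeBarcode(frame: ByteArray): String? {
    return "012345678905"  // hypothetical decoded UPC, for illustration only
}

// Treat the decoded code like any other query: URL-encode it and scope it to
// the user's location so the results are local. The endpoint is made up.
fun buildProductQuery(code: String, lat: Double, lon: Double): String {
    val q = URLEncoder.encode(code, "UTF-8")
    return "https://example.com/products?upc=$q&lat=$lat&lon=$lon"
}

fun main() {
    val frame = ByteArray(0)                   // stand-in for a captured camera frame
    val code = decodeBarcode(frame) ?: return  // nothing decoded, nothing to search
    println(buildProductQuery(code, 37.7749, -122.4194))
}
```

The point is that the scanned code is just another query string; everything downstream (local availability, price comparison, reviews) is the same plumbing the keyboard and voice interfaces already use.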

I could be mistaken about the emergence of augmented reality and/or camera-phone-based search functionality in the US, but I don't think I am. It's really just a question of time.