The Exceptional iPhone Siri


Siri, or something similar, was expected to become the norm for interfacing with smartphones and, in doing so, to define the future of local search (and everything else). It seems to have succeeded on the everything-else front, just not on local search. The Siri natural language interface is an interaction metaphor that can supplant the need for typing and provide a hands-free way to interact with small devices when typing is dangerous (i.e., while driving) or awkward (i.e., almost all the time).

It works incredibly well. As John Gruber noted: “I wouldn’t say I can’t live without Siri. But I can say that I don’t want to.” It is that good.

It is hands down the best way to speedily create and send text messages, whether you are driving or sitting still. It is the best way to get detailed driving directions in the iPhone Google Maps app. It is the best way to search the web, whether you prefer Google, Yahoo or Bing. In fact it even fixes what was so miserably wrong with voice search in the Google app.

Its ability to understand what you want and what you are saying is uncanny, even with background noise. I am a convert, and while I will most definitely use it while driving, it may very well become my preferred interface for many other things as well.

It truly is a harbinger of a new level of functionality for interacting with your phone (and any small device for that matter). I won’t leave home without it.

EXCEPT FOR LOCAL SEARCH.

Siri can either interact with other apps or answer some things directly. For example, you can say “Text Aaron I will be late picking you up,” at which point it interprets your instructions, performs a voice-to-text translation, double-checks its accuracy with you, understands that you want to text and then sends the note via the iMessage app. With some data types it will just answer you inside the Siri environment. That is the design for interaction with both Wolfram Alpha and Yelp.
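To make that two-mode design concrete, here is a minimal sketch in Python of the kind of intent dispatch it implies. The patterns, intent names and fallback are purely illustrative assumptions on my part, not Apple’s actual implementation.

```python
import re

# Hypothetical intent patterns -- illustrative only, not Siri's real grammar.
INTENT_PATTERNS = [
    # App hand-off: "Text Aaron I will be late picking you up" -> iMessage
    (re.compile(r"^text (?P<contact>\w+) (?P<body>.+)$", re.I), "send_message"),
    # Inline answers, the Wolfram Alpha / Yelp style of interaction
    (re.compile(r"^(?P<question>(what|who|where|how)\b.+)$", re.I), "answer_inline"),
]

def dispatch(transcript: str) -> dict:
    """Map a voice-to-text transcript to an intent plus its arguments."""
    for pattern, intent in INTENT_PATTERNS:
        match = pattern.match(transcript.strip())
        if match:
            return {"intent": intent, "args": match.groupdict()}
    # Nothing matched: fall back to a plain web search.
    return {"intent": "web_search", "args": {"query": transcript}}

print(dispatch("Text Aaron I will be late picking you up"))
# -> {'intent': 'send_message',
#     'args': {'contact': 'Aaron', 'body': 'I will be late picking you up'}}
```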

Danny Sullivan noted yesterday that when searching for local businesses, Siri accurately provides a list from Yelp but then doesn’t allow you to call the location, look at reviews or even get more details. For whatever reason, Apple and Yelp have decided to limit the local search functionality in such a way that it is essentially useless, forcing the user to a different data source for the information.

Having marveled at Siri’s capability, you can easily imagine saying to it: “Make a reservation at the Ho-Ste-Geh restaurant for 2,” “Read me the reviews for the Rennas” or even “Add the Robins Nest’s contact details to my address book.” But the local search capability doesn’t do any of that.

There are alternatives for a Siri user to get local information. You just need to use the web search functionality of Google, Bing or Yahoo (use one or all three) by saying “Look on the web for a nearby restaurant” or “Google breakfast restaurants.” On the plus side, it no longer takes the six touches that Google voice search required to make a hands-free call; when doing a local recovery search on Google, it now takes just one touch after the voice interaction to complete the call. And Siri does a significantly better job of getting the search right the first time than Google voice search ever did. You wonder where Siri has been hiding.

But in limiting the built-in local search, Apple and Yelp are missing a chance to change user behaviors. In not changing user behaviors from the get-go, they may miss the opportunity to break the habit later on. Natural language voice search on the smartphone is a long game, and the one million iPhones sold so far are just a drop in the bucket of the market. The real game is yet to come.

There is every reason to believe, seeing what else Siri can do, that increased local search functionality will arrive. But regardless of whether this was Yelp’s choice or Apple’s, from where I sit, this is an opportunity lost to win a battle in a long war.

The iPhone Siri & Changes in Local Search


Apple’s announcement of the new iPhone 4S and iOS 5 brought the expected news of tight hardware and software integration of its Siri natural speech technology. Siri appears to be integrated at the highest level of the user interface.

Mobile search on the iPhone has been broken. Google offers voice search, but it is by no means a hands-free affair and takes far too much screen interaction to produce a result. Google’s product has been quirky and buggy on the iPhone and crashes frequently. Since the discontinuation of GOOG-411, the 411 service on the iPhone has been only marginally functional due to weak data sources, even for the simple task of call completion.

Hopefully, tight integration of voice with the phone will change all of that. The voice recognition system will read back texts, allow you to schedule appointments AND do local search.

How does it do with local search? And where will it get its data?

The answer to the first question is that voice search in the existing Siri app is far better than Google’s voice product or the built-in 411. That is a step in the right direction, although clearing the bar set by Google on the iPhone is not that difficult. Siri does a good job on the recognition side, has an easy-to-use interface that should only improve with tighter integration, and returns reasonable results.

Interestingly, the current Siri app pulls data from a wide range of sources to answer your questions. That is true of business listing data as well. Depending on the local search, it might show results from Yelp, Yahoo, CityGrid, Localeze or BooRah. I presume it uses even more sources than I have so far discovered, and it appears to be agnostic as to where it gets its data. Siri also seems to mix and match sources when necessary.
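That mix-and-match behavior suggests a source-agnostic aggregation pattern. The Python sketch below shows one plausible shape for it; the provider functions are stand-ins I made up, and only the source names come from what I observed, not from any real API.

```python
# Provider stubs standing in for Yelp, Yahoo, CityGrid, Localeze and BooRah.
# The real services have their own APIs; these fakes only illustrate the
# source-agnostic, mix-and-match pattern described above.
def yelp(query):
    return [{"name": "Rennas", "phone": "716-555-0101", "source": "Yelp"}]

def citygrid(query):
    return [{"name": "Rennas", "phone": "716-555-0101", "source": "CityGrid"},
            {"name": "Robins Nest", "phone": "716-555-0102", "source": "CityGrid"}]

PROVIDERS = [yelp, citygrid]  # yahoo, localeze, boorah would plug in the same way

def local_search(query):
    """Query every provider and dedupe listings by (name, phone)."""
    seen, merged = set(), []
    for provider in PROVIDERS:
        for listing in provider(query):
            key = (listing["name"], listing["phone"])
            if key not in seen:  # first source to answer a listing wins
                seen.add(key)
                merged.append(listing)
    return merged

print(local_search("restaurants near me"))
# -> Rennas (from Yelp) and Robins Nest (from CityGrid), duplicate dropped
```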

Ultimately, local will be almost 100% about mobile. That battle, at least for now, is Google vs. Apple. In local data, that appears to mean Google vs. everybody else (that Apple partners with). Does Siri solve the mobile search problem?

It is in the very early stages, and user behaviors are not yet defined. Testing and time will tell if it offers enough substance to be fluidly integrated into everyday phone activity and local search. Here’s hoping.

Google Chrome OS Providing Click-to-Call Functionality

Google Chrome OS, the company’s cloud-based operating system currently being tested, provides click-to-call functionality on all Google blended local search results, as well as on phone numbers visible in general search results or on web pages. The call is connected over the Google VoIP network. The functionality is available in Google apps on smartphones but not in any other current desktop environment (unless you are using the buggy Skype plug-in).

The click-to-call functionality is “turned off” if the user does not have a Google Voice account or is logged in as a guest.

As long as the number is formatted with parentheses, e.g. (716) 204-1297, the click-to-call functionality is available across the web and not just on Google products.
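To illustrate, here is a small Python sketch of that kind of format-based detection: a regex that matches the (716) 204-1297 style and wraps matches in tel: links. The actual pattern Chrome OS uses is not documented here, so treat this as an assumption about the general approach, not Google’s code.

```python
import re

# Matches only the "(716) 204-1297" format mentioned above; the real
# Chrome OS matcher is undocumented here and is surely more permissive.
PHONE = re.compile(r"\((\d{3})\)\s*(\d{3})-(\d{4})")

def linkify(html: str) -> str:
    """Wrap recognized phone numbers in tel: links for click-to-call."""
    return PHONE.sub(
        lambda m: f'<a href="tel:+1{m.group(1)}{m.group(2)}{m.group(3)}">{m.group(0)}</a>',
        html,
    )

print(linkify("Call us at (716) 204-1297 for details."))
# -> Call us at <a href="tel:+17162041297">(716) 204-1297</a> for details.
```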

[Screenshot: the original Chrome OS full screen shot showing click-to-call]

Several months ago, Google was testing an enhanced Google Places Dashboard that included tracking of calls made from mobile devices using Google Maps for Mobile.

Given the lack of an industry-wide call tracking standard for local, it appears that Google will be going it alone in providing call tracking information from the software, services and devices it controls.