Thoughts on Google Search for iOS, Siri, and The Next Interface

I recently installed the new Google Search app on my (still great) iPhone 4. I have decided to wait as long as possible to upgrade to an iPhone 5 (no immediate compelling need) and, therefore, still have no devices with Siri. I saw many on the Internets raving about how good and fast the speech recognition in Google Search was. In fact, everyone was saying it was better than Siri. For me, the fact that I could at least get a sense of the usefulness of such a feature on my aging-too-soon iPhone 4 was all I really needed to give it a shot.

What can I say other than the fact that it is fast? Like, really fast. Much faster than Siri. In fact, I would describe it in a way that I have yet to see Siri described: usably fast.

Now, to be fair, Google had a big head start over Apple on the sort of data mining required to deliver such amazing results in voice recognition. I mean, after all, that is Siri’s promise: that the more people use it, the more data Apple can capture on the server side, and the better both the recognition and the results will get. But Google has been doing the same thing for years, though not in such an obvious way.

Remember Google 411? I do. Before Siri, or even smartphones as we have come to know them today, it was the best way to get sports scores, restaurant info, weather, and all sorts of other smart data simply by calling a number and asking a question.

How about Google Voice? Do you use it? I do. Been using it for years (since before Google owned it, when it was called GrandCentral). I love that when people leave me a voicemail, it transcribes the message and sends it to me via email.

See where this is going? Yep, that’s a big part of why Google Search is so much better and faster than Siri. Google has been doing this voice recognition data mining through a host of other services since years before Apple even stepped into the game.

Another thought: seeing this in action, I imagined what things would be like if Google Search, like Siri, could also interact with other apps such as Reminders and Calendar.app. Or if Siri were to become just as good (which is inevitable but will take some time). It immediately became clear to me: this is the next interface.

In other words, what if, when we slid to unlock, we were met not with rows and pages of icons but with Siri? What if our primary interaction with such devices was not touch, but voice? What would that look like? What would that feel like?

I don’t have answers to any of those questions, but I bet Apple is thinking them through in a lab somewhere right now. Feel free to file this away for future claim chowder.
