Hey Siri, OK Google, Is my train on time?

Emma and I were running late the other morning, the clock ticking towards 7:41 and adding a bit of rush to the morning commute. We took separate cars since we were coming home at different times. As I drove towards the station, the question I really wanted answered was “is the train on time?”. With two people in the car, one could have looked it up, but that wasn’t the case.

With current voice-activated features I would need to say something very descriptive like “is the 7:41 from Hackbridge station to City Thameslink on time?”*. That is not the language I speak; it’s computer-speak, not human language. Thinking human means answering the question I would ask the person next to me in the car.
*A train-times command isn’t available yet, but it would need to be phrased something like this so the computer could interpret it more easily.

This is a perfect scenario for an intelligent personal assistant, which already has access to the data points it needs to answer my question (a rough sketch follows the list):

  • The current time
  • My location using GPS
  • Train times using National Rail’s API
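
Here’s a minimal sketch in Python of how those three data points could come together. The Departure shape, the station codes and the stubbed departure board are my own assumptions for illustration; a real implementation would read the live board from National Rail’s API.

```python
from dataclasses import dataclass
from datetime import time

@dataclass
class Departure:
    scheduled: time   # timetabled departure time
    expected: time    # live estimate from the operator
    destination: str  # station code, e.g. "CTK" for City Thameslink

def next_departure(board, after, destination):
    """First timetabled train to `destination` at or after `after`."""
    candidates = [d for d in board
                  if d.destination == destination and d.scheduled >= after]
    return min(candidates, key=lambda d: d.scheduled, default=None)

def on_time_answer(board, now, destination):
    train = next_departure(board, now, destination)
    if train is None:
        return "No more trains to that destination today."
    if train.expected <= train.scheduled:
        return f"Yes, the {train.scheduled:%H:%M} is on time."
    return (f"No, the {train.scheduled:%H:%M} is running late, "
            f"expected at {train.expected:%H:%M}.")

# Stubbed departure board standing in for a live API response:
board = [Departure(time(7, 41), time(7, 44), "CTK")]
print(on_time_answer(board, now=time(7, 35), destination="CTK"))
# -> No, the 07:41 is running late, expected at 07:44.
```

Notice the one input the sketch can’t derive from those three data points: my usual destination, which is exactly the habit problem below.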

The data that is currently difficult for a Siri-like assistant to get is my habits: which train I normally take, from which station, to my usual destination. To my knowledge, no intelligent personal assistant exposes this today. However, it could guess based on how frequently I visit a location at a certain time of day (a rough sketch of that guess follows) or, less technically, I could provide it manually as a setting.
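
As a sketch of that frequency-based guess, assuming a trip log the assistant could accumulate from location history (the log and its shape are entirely made up for illustration), it could simply pick the journey I make most often at a given hour:

```python
from collections import Counter

def usual_journey(trip_log, hour):
    """Most frequent (origin, destination) pair seen at this hour of day."""
    slot = [(origin, dest) for (h, origin, dest) in trip_log if h == hour]
    return Counter(slot).most_common(1)[0][0] if slot else None

# Made-up location history the assistant could have accumulated:
log = [
    (7, "Hackbridge", "City Thameslink"),
    (7, "Hackbridge", "City Thameslink"),
    (7, "Hackbridge", "City Thameslink"),
    (10, "Hackbridge", "Wimbledon"),
]
print(usual_journey(log, hour=7))
# -> ('Hackbridge', 'City Thameslink')
```

A real version would probably bucket by weekday as well, and fall back to asking me when the history is too thin to guess confidently.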

Intelligent personal assistants are becoming standard in mobile devices, with the likes of Siri, Google Now, Microsoft’s Cortana and Amazon Echo all competing for our voices. A recently added feature activates the assistant when you say a phrase like “Hey Siri” or “OK Google”.

Combine this with knowledge of my daily routine, and I expect “Hey Siri, is my train on time?” is not far away. Apple, let me know when this is ready.

iPhone keyboards designed for diverse human needs

Whilst taking the train home the other day, I spotted someone making interesting gestures on their iPhone screen. I was intrigued; it looked like they were using a drawing app inside a text message. Upon closer inspection, I saw that they were writing Chinese characters in a blank space at the bottom of the screen, which the phone then auto-recognised, presenting a number of characters to select from. I found this fascinating.

Apple definitely took a “thinking human” approach to their keyboards and designed with globally diverse human needs in mind. You can choose from keyboard layouts for Chinese and other languages in your settings. They could easily have offered a single keyboard with Chinese characters mapped to it and left it at that, but instead they thought about (and most likely observed) how people message each other and created a handwriting-focused one as well. Great design.

Apple included this early in iOS, according to this article on MacRumors: “Apple Includes Chinese Handwriting Recognition in iPhone 2.0 Beta”.

Mac OS X Help Menu – The way help should be

Using the help feature in Mac OS X applications is a thing of beauty. It’s inspired me to write this post because it saves me time and helps me learn applications better. When I search for a feature that lives in the menu bar, its menu location is automatically expanded and displayed. This does two extremely helpful things for me:

  1. Shows me where to find it – training me on the application by showing me, instead of making me read and follow a bunch of text instructions with screenshots (if I’m lucky)
  2. Lets me run the command immediately – no need to navigate through the menus following instructions; I just hit ‘enter’ and it’s done! (a toy sketch of this follows the list)
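
As a toy model of those two behaviours, assuming a flat command index (the menu structure and commands are made up for illustration, not how macOS actually implements it), a help search can both show where a command lives and run it:

```python
# Hypothetical menu index: command name -> (menu path, action to run)
MENU = {
    "Export as PDF": (["File", "Export"], lambda: print("Exporting…")),
    "Toggle Sidebar": (["View"], lambda: print("Sidebar toggled")),
}

def help_search(query):
    for name, (path, action) in MENU.items():
        if query.lower() in name.lower():
            # 1. Show where to find it: the expanded menu location.
            print(" > ".join(path + [name]))
            # 2. Run it immediately, as if the user hit 'enter'.
            action()

help_search("export")
# -> File > Export > Export as PDF
# -> Exporting…
```

The nice design choice is that the search result is the command itself, not a page of documentation about the command.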

This is application help designed perfectly for humans.

[Screenshot: searching the Help menu expands the menu location of the matching command]