Emma and I were running late the other morning, the clock ticking towards 7:41 and adding a bit of rush to the morning commute. We took separate cars since we were coming home at different times. As I drove towards the station, the question I really wanted answered was “is the train on time?”. With two people in the car, one could have looked it up, but that wasn’t the case.
With current voice-activated features I would need to say something very descriptive like “is the 7:41 from Hackbridge station to City Thameslink on time?”*. That is not the language I speak; it’s computer-speak, not human language. Thinking human would mean answering the question the way I would ask it of the person next to me in the car.
*A train times command is not available yet, but it would need to be something like this so the computer could interpret it more easily.
This is a perfect scenario for an intelligent personal assistant, and it already has access to the data points it needs to answer my question:
- The current time
- My location using GPS
- Train times using National Rail’s API
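To make the idea concrete, here is a rough sketch of how those three data points could be stitched together. The station lookup and the departures feed are stand-ins (the real National Rail data feed requires registration and has its own schema), so treat the function names and data shapes as illustrative assumptions, not a real API.

```python
from datetime import datetime, time

def nearest_station(lat, lon):
    """Stand-in for a GPS-to-station lookup; hard-coded for this sketch."""
    return "Hackbridge"

def live_departures(station):
    """Stand-in for a live departures feed: (scheduled, expected) pairs."""
    return [(time(7, 41), time(7, 43)), (time(7, 56), time(7, 56))]

def is_my_train_on_time(lat, lon, now):
    station = nearest_station(lat, lon)
    # Pick the next scheduled departure after the current time.
    for scheduled, expected in live_departures(station):
        if scheduled >= now.time():
            delay = (datetime.combine(now.date(), expected)
                     - datetime.combine(now.date(), scheduled))
            mins = int(delay.total_seconds() // 60)
            status = "on time" if mins == 0 else f"running {mins} minutes late"
            return f"The {scheduled:%H:%M} from {station} is {status}"
    return f"No more departures from {station} this morning"

print(is_my_train_on_time(51.3784, -0.1525, datetime(2015, 3, 2, 7, 35)))
```

The assistant already holds the first two inputs (clock and GPS); only the departures feed is an external call.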
The data that is currently difficult for a Siri-like assistant to get is my habits: which train I normally take, from which station, to my usual destination. To my knowledge, no intelligent personal assistant offers this today. However, it could guess based on how frequently I visit a location at a certain time of day or, less technically, I could provide it manually as a setting.
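The frequency-based guess could be as simple as counting which station the phone is near at each hour of the day and picking the most common one. A toy sketch, with a made-up visit log:

```python
from collections import Counter

# Hypothetical log of (station, hour-of-day) sightings from GPS.
visit_log = [
    ("Hackbridge", 7), ("Hackbridge", 7), ("Hackbridge", 7),
    ("City Thameslink", 9),
    ("Sutton", 7),  # a one-off detour that frequency counting ignores
]

def usual_station(log, hour):
    """Most frequently seen station at the given hour, or None."""
    counts = Counter(station for station, h in log if h == hour)
    return counts.most_common(1)[0][0] if counts else None

print(usual_station(visit_log, 7))
```

A real assistant would want more signal (day of week, destination pairing), but the principle is just this: habits are the mode of your location history.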
Intelligent personal assistants are becoming standard in mobile devices, with the likes of Siri, Google Now, Microsoft’s Cortana and Amazon Echo all competing for our voices. A recently added feature activates the assistant when you say a phrase like “Hey Siri” or “OK Google”.
With this, combined with knowledge of my daily routine, I expect “Hey Siri, is my train on time?” will not be far away. Apple, let me know when it’s ready.