A sign of great technology is that it does what you want seamlessly, without getting in your way. If a particular technology is truly great, you'll sometimes forget you're even relying on electronic parts to do your bidding, just as you don't put much thought into the head of a hammer when driving a nail into a plank of wood. That's the aim with digital assistants, and for Alexa in particular, Amazon is teaching it to know which skill to use even when the user doesn't know which one is most appropriate.
This is a new feature for developers called CanFulfillIntentRequest, currently being tested in beta. It's all about making interactions with Alexa feel more natural, like talking to a real person. Using CanFulfillIntentRequest, developers can have Alexa query multiple skills to fulfill an end-user's request, even when that request is vague.
"For example, if a customer asks, 'Alexa, where is the best surfing today near Santa Barbara?' Alexa can use CanFulfillIntentRequest to ask surfing skills whether they can understand and fulfill the request. A surfing skill with a database of California beaches might be able to both understand and fulfill the request, while one with a database of Hawaiian beaches might only be able to understand it. Based on these responses, Alexa would invoke the skill with the database of California beaches for the customer," Amazon explains.
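Under the hood, each queried skill answers with a structured reply rather than actually handling the request. A minimal sketch of what the two surfing skills in Amazon's example might return, assuming the general shape of a CanFulfillIntentRequest response (the slot name and the distinction between the skills' databases are illustrative, not taken from Amazon's documentation):

```python
# Hedged sketch: the reply bodies two skills might send back when Alexa
# asks whether they can handle a request. Here "YES" means the skill can
# both understand and fulfill the request, while "MAYBE" signals it
# understands the request but may not be able to fulfill it.

def can_fulfill_response(overall, slot_understand, slot_fulfill):
    """Build the reply to a CanFulfillIntentRequest (illustrative shape)."""
    return {
        "version": "1.0",
        "response": {
            "canFulfillIntent": {
                "canFulfill": overall,
                "slots": {
                    "Location": {  # hypothetical slot for the beach name
                        "canUnderstand": slot_understand,
                        "canFulfill": slot_fulfill,
                    }
                },
            }
        },
    }

# Skill with a database of California beaches: understands and fulfills.
california_reply = can_fulfill_response("YES", "YES", "YES")

# Skill with a database of Hawaiian beaches: understands the request,
# but has no data to fulfill it.
hawaii_reply = can_fulfill_response("MAYBE", "YES", "NO")
```

Given replies like these, Alexa would route the customer's request to the California skill.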
CanFulfillIntentRequest is rooted in machine learning, as are many modern technological conveniences. The machine learning model allows Alexa to sift through skills, find the best one, and apply it, all without the end-user having to call out a particular skill by name. The only caveat is that it isn't built into Alexa in a way that simply works with everything; developers have to implement the function by way of the Alexa Developer Console or the Alexa Skills Kit Command Line Interface (ASK CLI).
Developers who want to implement this neat addition can find more details in the quick start guide.