MindMeld Understands Human Conversation, Could Put Siri To Shame

They’re calling it “anticipatory computing,” and it’s a big step forward in the way we’ll interact with our computing systems. Or at least, that’s what Expect Labs will be working toward with the $2.4 million in investor funding it received today from Google Ventures and Greylock Partners.

MindMeld, which was unveiled last month, is reminiscent of tools like Siri, but it goes far beyond anything Siri can do today, and Expect Labs wants it available on every computing platform you own, from your tablet to your car’s slick new smart features to Google Glass.

Currently, the app brings up relevant information and links while you’re video chatting (with up to eight people); as the conversation changes and evolves, so does the information at your fingertips. You can click those links and poke around without leaving the app or the video chat, and a sharing panel lets you drag and drop items to share with whoever you’re chatting with.

To get a sense of how this anticipatory computing works, here’s a visualization of the type of searching MindMeld does when it hears you say something:

[Visualization: MindMeld’s continuous, conversation-driven search]

Compare that to the traditional query-and-list-of-results paradigm we’re all so familiar with:

[Visualization: the traditional query/results search model]
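
To make the contrast concrete, here’s a rough sketch of the two models in Python. The `search_web()` helper and the `AnticipatorySearch` class are hypothetical stand-ins, not MindMeld’s actual API; the point is only that one model waits for an explicit query while the other keeps refreshing results as the conversation moves.

```python
# Hypothetical sketch of the two search models; none of this is
# MindMeld's real implementation or API.
from collections import deque

def search_web(query):
    """Stand-in for any search backend; returns a list of result links."""
    return [f"https://example.com/results?q={query.replace(' ', '+')}"]

# Traditional model: one explicit query in, one static list of results out.
def traditional_search(query):
    return search_web(query)

# Anticipatory model: keep a rolling window of recent utterances and
# refresh the result panel every time the conversation moves on.
class AnticipatorySearch:
    def __init__(self, window_size=5):
        self.window = deque(maxlen=window_size)  # last N things said

    def hear(self, utterance):
        self.window.append(utterance)
        # Naive stand-in for topic extraction: reuse the recent words.
        topic = " ".join(self.window)
        return search_web(topic)  # results update without being asked

panel = AnticipatorySearch()
for line in ["we should visit Lisbon", "flights are cheap in May"]:
    print(panel.hear(line))  # links shift as the conversation evolves
```

In the traditional model nothing happens until you type; in the anticipatory one, every new utterance quietly reshapes what’s on screen.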

MindMeld signals an intriguing step forward in voice-assisted computing. There’s no word yet on whether you can frustrate it with a string of non sequiturs, though.