While phenomenal strides have been made in artificial intelligence (AI) over the last few years, the opportunity to redefine the relationship between humans and machines is only now being explored.
At an IBM Watson Developer conference this week, IBM unveiled Project Intu, an effort to create new forms of AI that can proactively interact with humans across multiple dimensions.
Rob High, IBM vice president and CTO for IBM Watson, says most of the AI technologies developed thus far interact with humans along a single dimension. The goal of Project Intu is to create AI that engages humans across multiple dimensions at once. For example, a robot concierge should be able to not only give directions but also point to where the person needs to go.
AI systems will also be able to observe human behavior in ways that enable them to offer helpful suggestions. They can even listen to a conversation and provide additional relevant insights or, conversely, determine that now is not the best time to interrupt someone.
“We’re now talking about AI on multiple dimensions,” says High. “This goes way beyond simply using speech and microphone.”
IBM is positioning Project Intu as a mechanism for developers to create applications that deliver these kinds of cognitive experiences. Via Project Intu, developers will be able to invoke a broad range of application programming interfaces (APIs) from a wide variety of platforms.
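Project Intu itself is only just being announced, so any example of its SDK would be speculative. As a rough illustration of the kind of Watson service call an Intu application would orchestrate, the sketch below posts text to the Watson Text to Speech REST service, the sort of building block a robot concierge might use to answer a guest aloud. The credentials are placeholders, and the endpoint and voice name reflect the service as it was documented at the time; treat all of them as assumptions rather than production values.

```python
# Minimal sketch (not Project Intu itself): invoking one Watson REST API,
# the Text to Speech service, to turn a text reply into spoken audio.
# USERNAME/PASSWORD are placeholder credentials; real values come from
# an IBM Bluemix service instance.
import requests

WATSON_TTS_URL = "https://stream.watsonplatform.net/text-to-speech/api/v1/synthesize"
USERNAME = "your-service-username"  # placeholder credential
PASSWORD = "your-service-password"  # placeholder credential

def speak(text, out_path="reply.wav"):
    """Synthesize `text` to a WAV file via the Watson Text to Speech API."""
    response = requests.post(
        WATSON_TTS_URL,
        auth=(USERNAME, PASSWORD),               # the service used HTTP Basic auth
        headers={"Content-Type": "application/json", "Accept": "audio/wav"},
        params={"voice": "en-US_AllisonVoice"},  # assumed stock voice name
        json={"text": text},
    )
    response.raise_for_status()
    with open(out_path, "wb") as f:
        f.write(response.content)  # raw WAV audio returned by the service
    return out_path

if __name__ == "__main__":
    speak("The conference room is down the hall, on your left.")
```

In an Intu-style application, a call like this would be one of several coordinated services, combined with gesture, vision, and dialog APIs so the device can point, observe, and speak rather than only respond to a microphone.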
High notes that IBM is already working with Hilton Worldwide on a proof-of-concept that infuses AI into a robot concierge known as “Connie.”
AI as most people interact with it today may not yet live up to the imaginations of science fiction writers. But it should be clear to everyone by now that the day when it finally does is much closer at hand than most people ever thought possible.