By Adam Wilkins, Thought Leader
If you haven’t heard about cognitive computing, you soon will. It’s where computers start to figure out what we want, even when we don’t know what we’re looking for.
Consider a Google search for a moment. It does an impressive job of finding content on the web, in part because web designers are now trained to tag each page with identifying information – a practice known as SEO (Search Engine Optimization). Without these tags, Google and other search engines would not be nearly as impressive at identifying which of the 1,055,152,105 websites (as of July 19, 2016) to display for you.
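To make this concrete, the identifying information SEO adds is often just a few lines of markup in a page’s HTML head. The site and keywords below are invented purely for illustration:

```html
<head>
  <!-- The title and meta tags below are what search engines read
       when deciding whether this page matches a query. -->
  <title>Acme Widgets – Industrial Widget Supplier</title>
  <meta name="description" content="Acme Widgets supplies industrial widgets worldwide.">
  <meta name="keywords" content="widgets, industrial widgets, widget supplier">
</head>
```

A few deliberate lines like these are all it takes to make a page findable – which is exactly what most content inside a company never gets.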
Now within a company, we aren’t just looking for websites. We are looking for other “pages”: emails, tweets, office documents, policies, procedures, journals, blogs, research and so on (“unstructured content”). This amounts to trillions of pages of content with no “web designers” to tag or index them. Without such tagging, searching becomes difficult at best, and in many cases impossible.
Companies have long realized this and have tried various methods of indexing and tagging their “unstructured content”, with varying results. But with the volume of content doubling every few years, the systems in place can’t cope with the growing demand and complexity.
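The kind of indexing these systems attempt can be illustrated with a toy inverted index, the basic structure behind most keyword search. The document IDs and text below are made up, and real enterprise search engines are vastly more sophisticated:

```python
# A minimal sketch of keyword indexing over "unstructured content".
# Document IDs and text are invented for illustration.
from collections import defaultdict

documents = {
    "memo-001": "quarterly revenue policy update",
    "email-042": "revenue forecast for the quarterly board meeting",
    "blog-007": "new research on search technology",
}

# Build an inverted index: each word maps to the set of documents containing it.
index = defaultdict(set)
for doc_id, text in documents.items():
    for word in text.lower().split():
        index[word].add(doc_id)

def search(word: str) -> set:
    """Return the IDs of documents containing the given word."""
    return index.get(word.lower(), set())

print(sorted(search("quarterly")))  # ['email-042', 'memo-001']
```

The catch is visible even in this toy: the index only knows exact words, not meaning or context – which is precisely the gap the next generation of systems aims to close.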
Enter cognitive systems. These systems understand how humans communicate with each other, the idiosyncrasies of our language and its context. They can “read” massive amounts of content and form what amounts to an educated guess about what we need, without any tagging. When imagining such things, people immediately think of Apple’s Siri, but Siri is merely what would be termed a “shallow” natural language system, a hint of what we can expect in the future. For example, ask Siri “what is 4×4” and the answer would be a precise “16”, but the accurate answer depends on context: it could just as well be a 4×4 truck or a piece of timber. Cognitive systems recognize local context and have the ability to learn.
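The difference between a shallow system and a context-aware one can be sketched in a few lines. This is a deliberately simplistic toy, not how Siri or any cognitive system actually works, and the context labels are invented:

```python
# Toy contrast between "shallow" and context-aware interpretation
# of the ambiguous query "what is 4x4". Context labels are invented.

def shallow_answer(query: str) -> str:
    """A shallow system always treats '4x4' as arithmetic."""
    if "4x4" in query:
        return "16"
    return "I don't know"

def contextual_answer(query: str, context: str) -> str:
    """A (very simplified) context-aware system picks an
    interpretation based on what the user has been doing."""
    if "4x4" in query:
        if context == "math homework":
            return "16"
        if context == "car shopping":
            return "a four-wheel-drive vehicle"
        if context == "lumber yard":
            return "a 4x4 timber post"
    return "I don't know"

print(shallow_answer("what is 4x4"))                     # 16
print(contextual_answer("what is 4x4", "car shopping"))  # a four-wheel-drive vehicle
```

Real cognitive systems don’t rely on hand-written rules like these, of course; the point is only that the same question has different correct answers depending on context, and learning that mapping at scale is the hard part.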
Watson from IBM is perhaps the most advanced version of such a system. It was initially put to task against the best Jeopardy contestants of all time to see if it could quickly and accurately understand the questions and win. It did, and IBM is now putting Watson to work “reading” mountains of medical and financial information in an attempt to save lives and make better investment decisions. In the not-too-distant future, this type of technology will be able to make informed decisions for you based on information you would never have considered.
Yet it’s not just about finding answers to questions. It works both ways: give it a result, and it will formulate a hypothesis on how you ended up there. I wonder if anyone has punched “42” into Watson yet?
This technology will continue to evolve over the next decade, and it will eventually arrive on your handheld device. No, it will not become self-aware and take over any time soon, despite movie portrayals like The Terminator, The Matrix, 2001: A Space Odyssey and many others. And if you want to start a project in this space, IBM may just have some money for you.