Watson In The Real World
September 25, 2017 Dan Burger
The fundamental value of IBM’s Watson technology and the ability to put it to work effectively in IBM i shops is still being sorted out. Certainly, technical challenges exist, but that’s not unique to new technology like cognitive computing. IT by nature is challenging. Learning to do more with technology that’s been around for years also has its challenges. Acquiring knowledge, skills and best practices should be, and generally is, an ongoing habit.
Sharing what we know about solving challenges helps everyone in the community. So it is with Watson and IBM i. Who is making it work and who is making it work effectively?
References and examples are trickling out. Earlier this month, at an event called the “IBM i Driveway to Watson,” the integration between Watson, the Bluemix cloud, and IBM i took a few more steps forward, with guidelines for building and implementing applications that go a long way toward removing the mystery of Watson and Bluemix.
There is value in learning from what others are doing. Knowing that IBM i and Watson integrations are being done is encouraging; it certainly beats wondering whether anyone is getting value out of cognitive computing. But learning how IBM i shops are doing it really moves the needle on the value meter. That’s when fundamental concepts, tools, and skills are applied to working projects. It’s a little bit like having a magician explain how he does his tricks, but this is no sleight of hand.
Alex Roytman, CEO at Profound Logic, has been involved with multiple Watson-IBM i integration projects with real-world capabilities. He presented them to attendees at the “Driveway” event. Two of the examples are available as YouTube videos, and the third will be available in approximately one week.
The first session highlighted Watson’s facial recognition capabilities, which were connected to an RPG application to verify employees as they clock in and out. (Maybe you’ve heard of situations where one employee clocks in another employee who is running late.)
In the video, Roytman explains, and Watson demonstrates, the capability to distinguish one employee from another, as well as what happens when the camera used for clocking in and out does not provide enough information for Watson to authorize the process.
The video also explains how the application was built, including the pieces created with Watson APIs and the HTML5 code used to capture images. The identification feature is especially interesting: it grabs a frame from the video camera used for employee clock-in and clock-out, and compares that frame with what Watson knows from previously uploaded photos it has been trained to identify. Roytman’s explanation includes the use of a tool called the Watson Visual Recognition Training Tool.
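As a rough sketch of what the server side of that comparison might look like, here is a minimal Node.js example against the watson-developer-cloud package of that era. The credentials, file name, classifier ID, and threshold are placeholders, not details taken from Roytman’s demo.

    // Minimal sketch: compare one captured frame against a custom
    // classifier trained on employee photos (the kind of classifier the
    // Watson Visual Recognition Training Tool produces).
    const fs = require('fs');
    const VisualRecognitionV3 =
      require('watson-developer-cloud/visual-recognition/v3');

    const visualRecognition = new VisualRecognitionV3({
      api_key: 'YOUR_BLUEMIX_API_KEY', // from the Bluemix service credentials
      version_date: '2016-05-20'
    });

    visualRecognition.classify({
      // Frame grabbed from the HTML5 camera feed and saved as a JPEG
      images_file: fs.createReadStream('./clockin-frame.jpg'),
      classifier_ids: ['employees_2017'], // hypothetical custom classifier
      threshold: 0.6                      // drop low-confidence matches
    }, function(err, response) {
      if (err) { return console.error(err); }
      const classifiers = response.images[0].classifiers;
      const classes = classifiers.length > 0 ? classifiers[0].classes : [];
      if (classes.length === 0) {
        // The failure path: not enough for Watson to authorize the punch
        console.log('Unable to verify employee; clock-in denied.');
      } else {
        console.log('Verified as ' + classes[0].class +
                    ' (score ' + classes[0].score + ')');
      }
    });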
Here’s the link to access the video: https://www.youtube.com/watch?v=BqoLkuQ6zvQ&feature=youtu.be
After you’ve seen what it takes to build a facial recognition application, a second video covers how Watson’s natural language API can be used to create a keyword search.
Video #2 shows how this was done using Node.js (actually Profound.js) to integrate with Watson’s natural language API. The instructional video begins with logging into IBM Bluemix and installing the Watson developer cloud package, then shows how to use the keyword feature by bringing in the API. The instruction includes setting up parameters for the Watson API, including what information Watson is expected to return.
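As an illustration of that call pattern, here is a minimal sketch using the same watson-developer-cloud npm package. The credentials, sample text, and the five-keyword limit are placeholders rather than details from the video.

    // Minimal sketch: ask Watson's Natural Language Understanding API
    // for keywords, with parameters describing what should come back.
    const NaturalLanguageUnderstandingV1 =
      require('watson-developer-cloud/natural-language-understanding/v1.js');

    const nlu = new NaturalLanguageUnderstandingV1({
      username: 'YOUR_BLUEMIX_USERNAME', // from the Bluemix service credentials
      password: 'YOUR_BLUEMIX_PASSWORD',
      version_date: '2017-02-27'
    });

    nlu.analyze({
      text: 'IBM i shops are putting Watson to work on real applications.',
      features: {
        keywords: { limit: 5 } // tell Watson to return the top five keywords
      }
    }, function(err, response) {
      if (err) { return console.error(err); }
      response.keywords.forEach(function(kw) {
        console.log(kw.text + ' (relevance ' + kw.relevance + ')');
      });
    });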
Additional topics include defining database tables for the searchable keywords, creating a proxy program, and building the keyword list.
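The video’s exact schema isn’t spelled out here, but the storage step might look something like this sketch, which assumes the open source idb-connector Db2 driver for IBM i. The table name and column layout are illustrative only, and call signatures can vary by driver version.

    // Hypothetical keyword table and insert, via idb-connector.
    const db = require('idb-connector');

    const conn = new db.dbconn();
    conn.conn('*LOCAL'); // connect to the local IBM i database
    const stmt = new db.dbstmt(conn);

    // One-time setup (illustrative schema):
    //   CREATE TABLE MYLIB.KEYWORDS
    //     (DOC_ID INTEGER, KEYWORD VARCHAR(100), RELEVANCE DECIMAL(4,3))
    const insert = "INSERT INTO MYLIB.KEYWORDS (DOC_ID, KEYWORD, RELEVANCE) " +
                   "VALUES (1, 'cognitive computing', 0.912)";

    stmt.exec(insert, function(result, err) {
      if (err) { console.error(err); }
      stmt.close();
      conn.disconn();
      conn.close();
    });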
One other real-world example Roytman discussed at the “Driveway” event was deploying Watson’s image recognition API to classify accidents in an insurance application. An instructional video for this is expected to be posted within a week. You’ll find it on the Profound Logic video page.
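Until that video is posted, the likely shape of the training step can be sketched from the Visual Recognition API itself. The classifier name, class names, and zip files of labeled example photos below are all hypothetical, not details from Roytman’s project.

    // Minimal sketch: train a custom classifier for accident photos.
    const fs = require('fs');
    const VisualRecognitionV3 =
      require('watson-developer-cloud/visual-recognition/v3');

    const visualRecognition = new VisualRecognitionV3({
      api_key: 'YOUR_BLUEMIX_API_KEY',
      version_date: '2016-05-20'
    });

    visualRecognition.createClassifier({
      name: 'accident-types',
      collision_positive_examples: fs.createReadStream('./collision.zip'),
      hail_positive_examples: fs.createReadStream('./hail.zip'),
      negative_examples: fs.createReadStream('./undamaged.zip')
    }, function(err, classifier) {
      if (err) { return console.error(err); }
      // The returned classifier_id is what the insurance application
      // would pass to classify() to categorize new accident photos.
      console.log('Training started: ' + classifier.classifier_id);
    });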
In addition to Roytman’s sessions at the event, there were other real-world examples, demonstrations, and hands-on lab sessions explaining how Watson can be implemented on IBM i using open source technologies as well as RPG and SQL.
“Some of these examples were IBM i examples and some were not,” Roytman said. “There were examples for analyzing weather data, labor statistics, news articles, imagery of roofs to determine damage and automate claims processing, language translation, analyzing and categorizing incoming email, and more. Alison Butterill [product offering manager for IBM i] did an entire session on real-world case studies for Watson, IoT, and cloud.”
“What really stood out for me was how many different ways there are to connect to Watson,” said Charles Guarino, one of the event attendees. “Scott Forstie, Jesse Gorzinski and Paul Tuohy each demonstrated very different ways to interact, in all cases using data from IBM i. There was a lot of discussion as well on IBM Bluemix and using that both independently and with Watson.”
“Overall, it was a positive event,” Roytman said. “But there were some gripes and room for improvement, too. For example, Paul Tuohy, during his session, pointed out that Watson documentation can be greatly improved. Some of the API docs were not up to date, and it was necessary to grab the latest code from GitHub to figure out how to implement things right. He also pointed out that performance for using RPG/SQL with Watson was a little slow, because the SQL feature that calls out to Web Services uses Java (rather than being a native implementation) – so, it may not be great for high-volume tasks.”
Roytman expects IBM to improve in these areas, noting that the Watson, Bluemix and IBM i integration is still very new.
RELATED STORIES
IBM i, Watson & Bluemix: The REST Of The Story
Talking Modernization With Profound Logic
Profound Logic Taps Node.js and COBOL For New Directions
Profound Survey Adds To ‘Why i Matters’ Discussion
Profound Logic, ARCAD Partnership Targets Modernization Projects