Profound Logic Explores AI Paths for IBM i
August 23, 2023 Alex Woodie
The burgeoning world of artificial intelligence may seem far away to IBM i shops, but in fact the technology is quite accessible to users of the platform. One vendor that’s helping to guide IBM i users through the growing AI jungle is Profound Logic.
Profound Logic is best known as an application modernization vendor, and one that provides tools for developing Web and mobile interfaces for IBM i applications, as well as working with APIs on the platform.
In a series of YouTube videos, Profound Logic co-founder and CEO Alex Roytman provides a rundown on the latest developments in the world of AI and how IBM i users can leverage them. The videos make it clear that the company’s focus on Web-based modernization isn’t changing; what is changing is what it actually means to modernize with the latest the Web has to offer.
“We’ve been in the business of futurizing our customers’ technology stacks for many, many years and traditionally that meant improving the user interface, improving the application code, integrating applications with APIs,” Roytman said. “But lately we’ve been exploring — quite deeply I might say — the use of artificial intelligence to extend your applications. And what we’re finding is that the possibilities there are endless.”
Roytman explores three main avenues for utilizing AI with IBM i, and the first is the use of so-called AI “copilots,” or programmer utilities.
AI Co-Pilots
OpenAI put large language models on the map when it released ChatGPT in November 2022. But LLMs built on the revolutionary Transformer architecture from Google (it’s always Google) were already several years old by then.
OpenAI’s partner, Microsoft, pioneered the use of the term copilot with the launch of GitHub Copilot in June 2021. When used in an IDE like VS Code, the product helps developers by documenting code, completing lines of code, or even writing whole blocks of code.
“It’s really an amazing productivity tool, something that works best with languages such as JavaScript and Python,” Roytman said. “If you haven’t explored GitHub Copilot just yet, I highly recommend that you try it.”
IBM i developers can certainly use GitHub Copilot to develop IBM i applications in open source languages. They can also build AI capabilities directly into their applications, according to Roytman, who provided a demo of a copilot-type application that helps an IBM i operator navigate the operating system and eventually find the right commands to launch a full backup.
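Roytman’s demo was built with Profound Logic’s own tooling, but the general pattern is easy to sketch. The snippet below is a hypothetical illustration, not Profound Logic’s code: it asks a hosted LLM, via the OpenAI Node.js SDK, to suggest the CL command for a task an operator describes. The model name, prompt wording, and suggestCommand() helper are assumptions.

```typescript
// Hypothetical sketch of an in-application "copilot" for IBM i operators.
// Uses the OpenAI Node.js SDK (v4); the model name, prompt wording, and
// suggestCommand() helper are assumptions for illustration only.
import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function suggestCommand(task: string): Promise<string> {
  const response = await client.chat.completions.create({
    model: "gpt-4",
    messages: [
      {
        role: "system",
        content:
          "You are an IBM i operations assistant. Reply with the CL command " +
          "that accomplishes the user's task and a one-line explanation.",
      },
      { role: "user", content: task },
    ],
  });
  return response.choices[0].message.content ?? "";
}

// Example: the kind of request from Roytman's demo.
suggestCommand("Launch a full backup of the entire system").then(console.log);
```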
AI Plug-Ins
The second AI-related development trend is the use of plug-ins, which are a way for third-party applications to be directly integrated with an LLM such as ChatGPT. In fact, OpenAI provides an array of plug-ins to folks who subscribe to ChatGPT Plus.
“There are many publicly available plugins that are accessible to you within the Plus account. But what about developing your own plugins, such as the ones that would connect AI to your IBM i applications?” Roytman said. “The secret to developing your own plugins is to provide a documented API that the artificial intelligence model can consume. At Profound Logic, we offer a fully managed, low-code API solution that produces the appropriate API, ready to be consumed by the artificial intelligence model.”
Profound Logic’s API tool, dubbed Profound API, can function as the bridge for building plug-ins that connect AI and IBM i. In his demo, Roytman showed how he designed an API to perform a simple lookup against the IBM i database and exposed that API to ChatGPT.
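The demo itself runs on Profound API, but the underlying requirement, a documented HTTP endpoint the model can call, can be sketched with any web framework. The example below is a rough, hypothetical stand-in using Express; the route, field names, and sample data are not Profound API’s actual output, and a real ChatGPT plugin would also serve an OpenAPI spec and manifest describing the endpoint.

```typescript
// Hypothetical sketch of the kind of documented lookup endpoint a ChatGPT
// plugin could call. The route and data are illustrative; a production
// version would query the Db2 for i database rather than an in-memory array.
import express from "express";

const app = express();

// Stand-in for a simple IBM i database lookup.
const customers = [
  { id: "1001", name: "Acme Corp", city: "Chicago" },
  { id: "1002", name: "Globex", city: "Dallas" },
];

// GET /customers/:id -- the documented operation the model learns to call.
app.get("/customers/:id", (req, res) => {
  const match = customers.find((c) => c.id === req.params.id);
  if (!match) {
    res.status(404).json({ error: "Customer not found" });
    return;
  }
  res.json(match);
});

app.listen(3000, () => console.log("Lookup API listening on port 3000"));
```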
After registering the IBM i API with the ChatGPT plugin store, Roytman was able to use the API from the ChatGPT interface. What’s worth noting is how ChatGPT figured out on its own how to get the data that Roytman later requested.
“All of this happens 100% automatically,” Roytman said. “It is not directly programmed. Instead, it’s using its own language and its own reasoning capabilities to figure out how to get the information that I’m asking for.”
AI Chatbots
The third product category that Roytman explored was AI chatbots. Called conversational interfaces by their enthusiastic supporters, these LLM-powered interfaces are not your grandfather’s chatbots, and in some cases can be indistinguishable from humans.
In his video, Roytman demoed Aleck, a Slack bot built on OpenAI’s GPT-4 model. Aleck has been trained on thousands of pages of documentation on Profound Logic products, including public and private information, Roytman said.
“At any point in time, he has the ability to go and search our docs for the relevant information using semantic search,” he said. “In other words, he takes the meaning into account rather than searching for literal words.”
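Aleck’s internals aren’t public, so the following is only a generic sketch of how semantic search typically works: the question and the documentation chunks are converted to embedding vectors, and chunks are ranked by how close their vectors are to the question’s vector rather than by keyword matching. The embedding model and helper functions are assumptions.

```typescript
// Generic semantic-search sketch: embed the query and doc chunks, then rank
// chunks by cosine similarity. Not Aleck's actual code; the embedding model
// and helpers are assumptions.
import OpenAI from "openai";

const client = new OpenAI();

function cosine(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

async function rankChunks(query: string, chunks: string[], topK = 3) {
  const res = await client.embeddings.create({
    model: "text-embedding-ada-002",
    input: [query, ...chunks],
  });
  const [queryVec, ...chunkVecs] = res.data.map((d) => d.embedding);
  return chunks
    .map((text, i) => ({ text, score: cosine(queryVec, chunkVecs[i]) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, topK);
}
```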
Roytman showed how one can “prompt” Aleck to get a specific piece of information, such as the different ways that one can read a database table using Profound JS, the company’s low-code tool for developing JavaScript and Node.js solutions on IBM i and other platforms.
You can prompt GPT-4 with up to about 8,000 tokens, or roughly 6,000 words, Roytman said. But sometimes, that’s just not enough. One way to get around that limitation is to pre-cache the results of previous searches as vectors, using a database that supports vector embeddings, according to Roytman. In Aleck’s case, Postgres and its vector search extension fit the bill.
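Here is a minimal sketch of that lookup step, assuming the documentation chunks and their embeddings have already been loaded into a Postgres table with the pgvector extension enabled. The table and column names are made up for illustration.

```typescript
// Sketch of a pgvector nearest-neighbor lookup from Node.js. Assumes a table
// such as: CREATE TABLE doc_chunks (content text, embedding vector(1536));
// populated ahead of time with embeddings of the documentation.
import { Client } from "pg";

async function findRelevantChunks(queryEmbedding: number[], topK = 5) {
  const db = new Client(); // connection settings come from PG* env variables
  await db.connect();
  try {
    // pgvector's <=> operator is cosine distance; smaller means more similar.
    const { rows } = await db.query(
      `SELECT content
         FROM doc_chunks
        ORDER BY embedding <=> $1::vector
        LIMIT $2`,
      [JSON.stringify(queryEmbedding), topK]
    );
    return rows.map((r) => r.content as string);
  } finally {
    await db.end();
  }
}
```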
“Once we fetch relevant semantic data via SQL, we simply pass the relevant data and only the relevant data in the system message to the API,” Roytman said. “And there you go, my friend Aleck, the smart Aleck assistant, magically elicits the perfect response to all of your Profound questions. Who knew that a combination of GPT-4, API wizardry, and a sprinkle of SQL could bring so much wisdom to our fingertips?”
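Putting the last step together, the retrieved chunks become the system message for a single GPT-4 call. Again, this is only a sketch of the general technique Roytman describes, not the bot’s actual prompt or code; the wording and function names are assumptions.

```typescript
// Sketch of answering a question from retrieved documentation only. The
// prompt wording and function names are assumptions for illustration.
import OpenAI from "openai";

const client = new OpenAI();

async function answerFromDocs(question: string, relevantChunks: string[]) {
  const response = await client.chat.completions.create({
    model: "gpt-4",
    messages: [
      {
        role: "system",
        content:
          "Answer questions about Profound Logic products using only the " +
          "documentation excerpts below.\n\n" +
          relevantChunks.join("\n---\n"),
      },
      { role: "user", content: question },
    ],
  });
  return response.choices[0].message.content ?? "";
}
```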
The IBM i server may not be at the cutting edge of AI development, or used for training large language models. But that doesn’t mean this technology doesn’t have a place in the midrange, as Roytman’s videos show.