Data Analytics And Business Intelligence: Too Little, Too Much, Too Late?
November 27, 2017 Dan Burger
There’s no slowing down the amount of data and the number of data sources. Drinking from that informational firehose is a success for a few, an in-progress experiment for others, and an incentive for many to find some other way to quench their analytical thirst. Improving the mining of structured data, to use a term that was popular before the big data firehose was turned on, continues to be a useful endeavor for many IBM i shops. And, by the way, that doesn’t make them dinosaurs stumbling down the path of extinction, unless you get your news from press releases.
Delivering the right data to the right people and controlling those outcomes while being mindful of data value remains priority one. That’s ahead of problem solving and making data analysis fast and easy, which are generally the positives that dominate the big data news flow. That’s not to discount the constraints of databases designed with 25-year-old technology, at a time when data silos were the preferred architecture. There are plenty of those circumstances to be found.
Gaining increasingly useful information has taken an evolutionary path, and most companies are happier there than on the path that requires great leaps and bounds. The tools and technology to extract better information and perform more useful analytics exist, but let’s not underestimate the complexities involved with data analytics. This is no can of corn with a new and improved can opener.
“When most people use the term ‘analytics,’ they assume you mean ‘big data’ and that tends to shift the conversation away from operational data and away from those who support IBM i,” says Bill Langston, director of marketing and channel development at New Generation Software, a business analytics provider in the IBM midrange market since 1982. “That’s unfortunate, because most companies, regardless of size, can still learn a lot more from their operational data.
“Today’s analytics or business intelligence and reporting software is pretty mature and it’s fairly easy to access and export data in the formats needed to perform a range of tasks. I think most small to midsize companies would do well to place more emphasis on developing operations analysts who understand the complexity of their operational databases, business processes, and analytics software. If you can eliminate the disconnect that frequently exists between those who know the data and those who want to analyze it, you can achieve great results no matter how small your staff or company. Maybe if we call that ‘operational analytics’ it will get more respect.”
That disconnect between those who know the data and those who want to analyze it is the Catch-22 for many analytics projects. To provide truly useful data in a format that’s accurate, fast, and easy to consume, it’s necessary to understand how the data is going to be used. There are complex queries that can tie a database in knots, and poor performance will attract complaints quicker than an office thermostat that can’t distinguish December from July. Overestimating what a database can deliver is not uncommon. Neither is underestimating the cost and the time required to accomplish complex analytics. Over the years, bad outcomes have frustrated some companies and added to the fears of others.
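To make the knot-tying concrete, consider a hypothetical Db2 for i sketch (the table and column names here are invented for illustration). Wrapping an indexed column in a function forces the database to evaluate every row; rewriting the predicate as a plain date range lets the query optimizer use an index instead:

    -- Hard on the database: the YEAR() function applied to ORDER_DATE
    -- defeats any index on that column, forcing a full table scan.
    SELECT REGION, SUM(AMOUNT)
      FROM SALESHIST
     WHERE YEAR(ORDER_DATE) = 2017
     GROUP BY REGION;

    -- Easier on the database: a range predicate on the bare column
    -- lets the optimizer use an index on ORDER_DATE.
    SELECT REGION, SUM(AMOUNT)
      FROM SALESHIST
     WHERE ORDER_DATE BETWEEN '2017-01-01' AND '2017-12-31'
     GROUP BY REGION;

The point is not this particular rewrite; it is that someone who understands both the data and the intended analysis needs to be in the loop before a report is turned loose on a production database.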
Pete Elliot has been an eyewitness to data warehousing projects, the data analytics option that he describes as a very arduous job of gathering data, curating data, filtering data, and building reports that fit ill-defined parameters at the executive level.
“We’re entering an era where new code will accelerate outcomes, provide data governance, and expand user capability so the end user controls the application in a self-service way,” he predicts. “It will give users more independence in the use of the data. Decentralizing data has been happening and it has a ways to go. But this is all coming to fruition. People are looking for a self-service model with ease of use that delivers the outcomes that are really needed.”
Elliot, who has decades of experience working with IBM midrange (AS/400, iSeries) shops and business partners, believes the time has come for IBM i shops accustomed to report generators that handle strictly structured data to recognize that analytics relies as much on front-end systems as it does on back-end systems. He also predicts that analytics tools will shift from being IT focused to becoming end user focused.
That doesn’t mean you won’t need technicians, but the trend will be toward analytics that can be used by anyone, which is another way of saying there won’t be a need for data scientists who control the analytics and pass results along to the executives. The executives will have the control and the capability to change questions and get answers to those questions.
That kind of user flexibility, including the capability to analyze unstructured as well as structured data, combined with modern data visualization and governance capabilities, will be what IBM i shops demand.
“When IBM brought out WebQuery as its modern replacement for Query/400, I thought the development would go further than it did,” Elliot notes. “It’s still not easy to use and it doesn’t surprise me that Query/400 is still popular as a reporting tool.”
RELATED STORIES
Infor Bolsters Cloud Analytics Play with Birst Buy
Mobile, Cloud, And Analytics Spark Growth For VAI