The Elephant in the Room at the Gartner BI Summit

Posted by Jon Millis

15.04.2015 10:00 AM

An attendee approached our booth two weeks ago at the Gartner Business Intelligence & Analytics Summit in Las Vegas, one of the biggest events of the year for analytics professionals across the board. “You guys are some of the only people addressing the two elephants in the room,” he said. Among the hundred or so vendors present, he first pointed across the aisle. “They’re solving the data cleansing problem.” He then turned and faced us. “You’re solving the analytics problem. Nobody else here is doing anything I haven’t seen a dozen times before.”

And to that man, we tip our hat. As usual, Gartner excelled in bringing together the brightest minds in infrastructure, software, and market analysis for its annual U.S. BI & Analytics Summit. Breakout sessions were packed to the brim with Fortune 500 companies, aggressive data-driven start-ups, and big data vendors, all trying to get a better understanding of where the market stands and where it’s headed. What’s needed? Who’s needed? What should we do, what could we do, and what are we reluctant to do but have to do? Gartner analysts and industry luminaries gave their guidance, and we listened.

What was conspicuously absent from all of these sessions was a focus on anything new. Big data is getting hammered everywhere for being nothing more than a fad, and the continuing complexity of running advanced analytics isn’t exactly helping its cause. Infrastructure is still difficult to maintain, data is difficult to access and blend, and real insights are difficult to glean from raw data. One of Gartner’s showcase events – a late addition due to popular demand – was a “BI bake-off” between Qlik, SAS BI, Tableau and TIBCO Spotfire, in which each vendor showed off different map and chart interfaces to examine homelessness vs. available shelters by U.S. state. The companies took turns presenting their software and confirmed what we already knew about each technology: 1) you can create charts, graphs and dashboards relatively seamlessly in every one of them, and 2) Tableau is strongest in just about every category, from data loading to ease of use and visual appeal.

But while visuals are nice, aren’t answers nicer? Yes, the purpose of the bake-off was to demonstrate each technology’s dashboarding and visualization capabilities. If we’re trying to evaluate the homelessness epidemic to see where there are shortfalls in emergency housing, a visualization tool makes a nice piece of equipment to have in our toolbox. But what if we want to solve problems that are a little more complex? Perhaps more difficult ones that can’t simply be “seen” or highlighted with a heat map? What if I were a U.S. Senate policy analyst or economist and wanted to know what actually caused homelessness in these areas, so I could address the heart of the issue? Or if I were a manufacturer of an intricate medical device: would it be more helpful to see that there’s a problem occurring in the manufacturing and assembly process, or would it be more helpful to have a fast, machine-generated answer informing me where the problem is happening and how to fix it?

[Image: The Analytics Problem]
“Oh, that little guy? I wouldn’t worry about that little guy.”

I don’t mean to belittle visualization tools (not too much, at least). There’s a time and a place for them. If Boston is being hammered with foot after foot of snow, and government workers need to figure out which counties have available capacity in their homeless shelters, a visualization tool can be a great means of identifying vacant space and information-sharing across the organization. But too often, visualization companies are trying to message (i.e., dip their toes into) problems that they’re just not equipped to solve. What’s causing engine failure? How do we maximize sales? What’s the best way to treat this patient based on her symptoms? BI tools, even the best of them like Tableau, may help “people see and understand their data” in certain instances. But as the problems get more complex, with more money and lives riding on fast answers, a cobbling together of databases and BI tools is not sufficient to produce critical answers. We need holistic data models that transparently explain how systems and businesses fundamentally “work”.

Customers are tired of talking about “analytics for the masses”. Everyone says it; nobody does it. We have to get to a place where machines themselves become smarter and take on the analytic tasks, at every step of the big data pipeline, that nobody wants to do or is equipped to do. We now have access to incredible computing power and data infrastructure. Let’s let machines do the computing, so we can do the understanding. Nutonian’s modeling capabilities – Automatic Input Transformation™, Human Interactive Modeling™, Evolutionary Nonlinear Inference™, Inverse Pattern Identification™ – make low-touch Machine MAPPing (Modeling, Analyzing, Predicting, Prescribing) possible. Upload your data to Eureqa, and let it ask the right, simple questions that guide you from raw data to answers on any data set.

That’s Nutonian’s vision for Machine Intelligence. Gartner, we love you. But if we’re going to make “big data” something real for the everyday person and make a dent in some of society’s most pressing problems like chronic homelessness, let’s talk about technology that’s capable of making it happen now, rather than one-dimensional legacy software that’s been around for decades.

Topics: Automatic Input Transformation (AIT), Eureqa, Evolutionary Nonlinear Inference (EVI), Gartner Business Intelligence & Analytics Summit, Human Interactive Modeling (HIM), Inverse Pattern Identification (IPI), Machine Intelligence, Machine MAPPing
