Blog

The Golden Age of Analytics

Posted by Guest Author

17.09.2015 10:30 AM

by Dan Woods

The supply chain of data in the modern world has evolved beyond careful curation in controlled data warehouses. A fundamental change to the analytic workflow is needed in order to make advanced analytics available to a mass audience.

Attaining the golden age of analytics requires the democratization of advanced analytics. We need systems that separate the ability to create data science analysis from the ability to consume it, allowing anyone to intuitively interact with the results. In this golden age, users shouldn’t need to know the difference between a decision tree and logistic regression, or debate the merits of R² versus MAE, in order to create personalized action plans for thousands or millions of products. The growing need for predictive models to uncover these hidden, data-driven business solutions will continue to outstrip the limited number of data scientists who can create them.

Nutonian’s machine intelligence leads the way to the golden age of analytics. Companies that adopt machine intelligence can automate the discovery of analytic models, bringing predictive modeling out of the shadows and into the light. With machine intelligence, creating complex, non-linear models is no longer a virtuoso activity but something the average business user can accomplish on their own to quickly generate viable business actions. As a data science productivity tool, machine intelligence also empowers already proficient data scientists to automate menial data tasks and extend their existing abilities.

How is this possible? Instead of simple, incremental improvements over existing, decades-old data science processes, machine intelligence combines the virtually unlimited computational power available today with a proprietary evolutionary search process to take a fresh approach to analytics. Hand in hand with machine intelligence, anyone can: 1) sift big data down to the right data, 2) generate completely new models to describe previously unknown systems, 3) optimize the complexity and application of a solution for the exact situation at hand, and 4) incorporate human expertise and creativity into the machine through interactive iteration – all within one user-friendly system.

Machine intelligence is key to unlocking the golden age of analytics, as it transforms predictive modeling into a company-wide application for developing optimal strategies and driving sustainable competitive advantages.

 


ABOUT THE AUTHOR

Dan Woods is CTO and founder of CITO Research. He has written more than 20 books about the strategic intersection of business and technology. Dan writes about data science, cloud computing, mobility, and IT management in articles, books, and blogs, as well as in his popular column on Forbes.com.

Topics: Eureqa, Machine Intelligence

Trading Necessitates Speed Along Every Step of the Data Pipeline

Posted by Jon Millis

10.06.2015 01:43 PM

We just returned from Terrapinn’s The Trading Show, a data-driven financial services conference that brings together thought leadership in quant, automated trading, exchange technology, big data and derivatives. With more than 1,000 attendees and 60 exhibitors gathering at the Navy Pier in Chicago, this year’s event was an excellent way not only to educate the market about using AI to scale data science initiatives, but also to learn about the most pressing needs faced by financial services companies.

The first day, Jay Schuren, our Field CTO, presented to an audience of 50 executives. His demo used publicly available data from Yahoo Finance – such as cash flow, valuation metrics and stock prices – to predict which NYSE companies were the most over- and undervalued compared to the rest of the market. To say the least, Jay’s discoveries, as well as the seamless and automated way in which he created his financial models, spurred heavy booth traffic for the rest of our trip.*
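Jay’s actual demo isn’t public, but for readers curious what that workflow looks like in practice, here is a minimal, hypothetical sketch of the general idea: regress market value on publicly available fundamentals and treat the residual as a rough over-/under-valuation signal. The file name and column names are made up, and the sketch uses ordinary scikit-learn rather than Eureqa.

```python
# Hypothetical sketch only (not Jay's actual Eureqa demo): regress market cap
# on fundamentals, then treat the residual as a rough mispricing signal.
# "nyse_fundamentals.csv" and the column names are made up for illustration.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.read_csv("nyse_fundamentals.csv")            # one row per NYSE ticker
features = ["free_cash_flow", "book_value", "trailing_eps", "revenue_growth"]
X, y = df[features], df["market_cap"]

baseline = LinearRegression().fit(X, y)               # crude "fair value" model
df["fair_value"] = baseline.predict(X)
df["mispricing"] = (df["market_cap"] - df["fair_value"]) / df["fair_value"]

# Largest positive residuals look overvalued; most negative look undervalued.
print(df.nlargest(5, "mispricing")[["ticker", "mispricing"]])
print(df.nsmallest(5, "mispricing")[["ticker", "mispricing"]])
```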

Finance is an interesting animal. Many industries have relatively straightforward applications for machine intelligence. Utility companies are often interested in daily demand forecasting. Manufacturing companies look to optimize processes and design new materials. Retailers want to determine the best locations to build new stores, while healthcare providers want to preemptively detect and treat diseases. But finance is a bit different.

Let’s take a timely analogy. As I was walking home last Friday, I saw probably half a dozen limos of Boston high-schoolers posing for photos and heading to prom. Most of our customers purchase Eureqa and just can’t help but gush to us about how excited they are to go to prom with us. Leading up to the big day, we show off our dance moves (give them a live demo) and take them out for a few dates (send them a free two-week trial), and by the end of our brief tryout, they’re bursting with energy and telling us all about their plans for the big dance with us. Trading firms, on the other hand, are the stunning mystery girls.** They’re smart, they’re confident, and you don’t think they should be shy, but when you ask them to prom, they shrug their shoulders indifferently and say, yeah, I guess that sounds cool. You raise an eyebrow, unsure if you just got a date or got slapped in the face with a frozen ham. But then she sees you drag racing around the neighborhood, and all of a sudden, you’re the biggest heartthrob on the planet. What in the world just happened?

In the trading world, everything is about speed. It’s not only about the speed at which a company can execute a trade (though there were plenty of vendors there offering to shave off fractions of a second to do this), but also about the time it takes for a firm to arrive at an answer about how its market works, whether that’s determining when a currency is undervalued, an asset is likely to significantly appreciate, or a large loan is too risky. Everything in the trading game revolves around timing. And everyone. Loves. Speed. Where Eureqa instantly became interesting to attendees was the automation from raw data to an accurate analytical/predictive model, a process that Eureqa consolidates and accelerates by orders of magnitude.

Most of the trading technology on display consisted of new hardware and software designed to incrementally improve time-to-execution. Milliseconds are important, but implementing a trading strategy that no one else has thought of or discovered could be game-changing. Nutonian will never compete with these other products and services directly. But we’re bringing more than one date to prom.

 

* Email us at contact@nutonian.com for a live demo of this particular application. We’d love to share our current use cases in financial services and explore how we might be a fit for others. 

** We’ll ignore the fact that, in reality, it seems like a “trading” prom would be about 95% guys. Woof.

Topics: Big data, Eureqa, Financial Services, Machine Intelligence, The Trading Show

Nutonian Takes its Shot at NHL Playoff Predictions

Posted by Jon Millis

16.04.2015 05:14 PM

The National Hockey League: where grown men skate after a small cylinder, whack it with sticks and beat each other to a pulp. God Bless America (and Canada). Hockey is popular for good reason. It’s fast, action-packed and highly skilled. It also produces a decent amount of freely available data for fans – and now software – to crunch and predict the winner of this year’s holy grail, the 2015 Stanley Cup. Here’s how our virtual data scientist, Eureqa, believes this year’s playoffs will shake out:

 

[Image: Nutonian’s predicted 2015 NHL playoff bracket]

We pulled team statistics (about 200 variables) from puckalytics.com and hockey-reference.com for the past five seasons to examine how regular season performance governs playoff performance. For the sake of this exercise, we excluded much potentially valuable data, such as team stats dating back more than five years and advanced metrics that are not publicly available. Our resulting analytical model had 77% accuracy for playoff picks and placed particularly high predictive power on a few variables (a rough sketch of this kind of workflow follows the list):

  • Team normalized Corsi (total shots taken) for 5-on-5 play
  • Late season record
  • Penalty kill %
  • Regular season head-to-head record
  • Regular season rating (“Simple Rating System”; takes into account average goal differential and strength of schedule)
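Eureqa builds its models automatically, and the exact model behind our bracket isn’t reproduced here. As a rough, hypothetical sketch of the kind of workflow described above (using scikit-learn in place of Eureqa, with made-up file and column names), the pipeline looks something like this:

```python
# Hypothetical sketch of the workflow above, not Eureqa's actual model.
# Assumes one row per playoff series, with regular-season differentials between
# the two teams; the file and column names are made up for illustration.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

series = pd.read_csv("nhl_playoff_series_2010_2014.csv")
features = [
    "corsi_5v5_diff",              # normalized 5-on-5 Corsi, team minus opponent
    "late_season_win_pct_diff",
    "penalty_kill_pct_diff",
    "head_to_head_win_pct",
    "srs_diff",                    # Simple Rating System differential
]
X, y = series[features], series["higher_seed_won"]   # 1 if the higher seed won

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)            # out-of-sample accuracy
print(f"cross-validated accuracy: {scores.mean():.2f}")
```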

The puck drops at 7pm EST tonight. We’ll see if, once again, regular season play is a strong predictor for achievement in the playoffs, or if this year was meant to be wild.

Topics: Big data, Eureqa, Machine Intelligence, NHL Playoffs

The Elephant in the Room at the Gartner BI Summit

Posted by Jon Millis

15.04.2015 10:00 AM

An attendee approached our booth two weeks ago at the Gartner Business Intelligence & Analytics Summit in Las Vegas, one of the biggest events of the year for analytics professionals across the board. “You guys are some of the only people addressing the two elephants in the room,” he said. Among the hundred or so vendors present, he first pointed across the aisle. “They’re solving the data cleansing problem.” He then turned and faced us. “You’re solving the analytics problem. Nobody else here is doing anything I haven’t seen a dozen times before.”

 

And to that man, we tip our hat. As usual, Gartner excelled in bringing together the brightest minds in infrastructure, software, and market analysis for its annual U.S. BI & Analytics Summit. Breakout sessions were packed to the brim with Fortune 500 companies, aggressive data-driven start-ups, and big data vendors, all trying to get a better understanding of where the market stands and where it’s going. What’s needed? Who’s needed? What should we do, what could we do, and what are we reluctant to do but have to do? Gartner analysts and industry luminaries gave their guidance, and we listened.

What was conspicuously absent from all of these sessions was a focus on anything new. Big data is getting hammered everywhere for being nothing more than a fad, and the continuing complexity of running advanced analytics isn’t exactly helping its cause. Infrastructure is still difficult to maintain, data is difficult to access and blend, and real insights are difficult to glean from raw data. One of Gartner’s showcase events – a late addition due to popular demand – was a “BI bake-off” among Qlik, SAS BI, Tableau and TIBCO Spotfire, in which each vendor showed off different map and chart interfaces to examine homelessness vs. available shelters by U.S. state. The companies took turns presenting their software and confirmed what we already knew about each technology: 1) you can create charts, graphs and dashboards relatively seamlessly in every one of them, and 2) Tableau is strongest in just about every category, from data loading to ease of use and visual appeal.

But while visuals are nice, aren’t answers nicer? Yes, the purpose of the bake-off was to demonstrate each technology’s dashboarding and visualization capabilities. If we’re trying to evaluate the homelessness epidemic to see where there are shortfalls in emergency housing, a visualization tool makes a nice piece of equipment to have in our toolbox. But what if we want to solve problems that are a little more complex? Perhaps more difficult ones that can’t simply be “seen” or highlighted with a heat map? What if I were a U.S. Senate policy analyst or economist and wanted to know what actually caused homelessness in these areas, so I could address the heart of the issue? Or if I were a manufacturer of an intricate medical device: would it be more helpful to see that there’s a problem occurring in the manufacturing and assembly process, or would it be more helpful to have a fast, machine-generated answer informing me where the problem is happening and how to fix it?

[Image: The Analytics Problem. Caption: “Oh, that little guy? I wouldn’t worry about that little guy.”]

I don’t mean to disparage visualization tools (not too much, at least). There’s a time and a place for them. If Boston is being hammered with foot after foot of snow, and government workers need to figure out which counties have available capacity in their homeless shelters, a visualization tool can be a great means of identifying vacant space and sharing information across the organization. But too often, visualization companies are trying to market themselves toward (i.e., dip their toes into) problems that they’re just not equipped to solve. What’s causing engine failure? How do we maximize sales? What’s the best way to treat this patient based on her symptoms? BI tools, even the best of them like Tableau, may help “people see and understand their data” in certain instances. But as the problems get more complex, with more money and lives riding on fast answers, a cobbling together of databases and BI tools is not sufficient to produce critical answers. We need holistic data models that transparently explain how systems and businesses fundamentally “work”.

Customers are tired of talking about “analytics for the masses”. Everyone says it, nobody does it. We have to transition to a place where machines themselves become smarter and are the ones delivering on analytic tasks that nobody wants, or is equipped, to do along every step of the big data pipeline. We now have access to incredible computing power and data infrastructure. Let’s let machines do the computing, so we can do the understanding. Nutonian’s modeling capabilities – Automatic Input Transformation™, Human Interactive Modeling™, Evolutionary Nonlinear Inference™, Inverse Pattern Identification™ – make low-touch Machine MAPPing (Modeling, Analyzing, Predicting, Prescribing) possible. Upload your data to Eureqa, and let it ask the right, simple questions to guide you from raw data to answers on any data set.

That’s Nutonian’s vision for Machine Intelligence. Gartner, we love you. But if we’re going to make “big data” something real for the everyday person and make a dent in some of society’s most pressing problems like chronic homelessness, let’s talk about technology that’s capable of making it happen now, rather than one-dimensional legacy software that’s been around for decades.

Topics: Automatic Input Transformation (AIT), Eureqa, Evolutionary Nonlinear Inference (EVI), Gartner Business Intelligence & Analytics Summit, Human Interactive Modeling (HIM), Inverse Pattern Identification (IPI), Machine Intelligence, Machine MAPPing

Do Investors Rely on Too Much Information or Not Enough?

Posted by Jon Millis

07.04.2015 10:00 AM

Information is beautiful. It helps us learn about the world around us. It helps us better understand people and subject matter. And in the professional world, it helps us make informed decisions that are likely to help us achieve specific goals.

In finance, those goals frequently come down to success in investing. But as financial reporter Maria Bartiromo pointed out in her recent blog post, this information age comes with a pivotal caveat: it’s remarkably difficult to mine, analyze and digest everything, because there’s a whole lot of it: “The onus is on the individual to not get lost in the noise and analyze and act on portions of it that they believe to be important.”

Investing is particularly tricky. In manufacturing, for example, you’re likely to collect a finite amount of data about a problem. Say you’re trying to improve the fatigue life of a material: you might collect information about applied load, temperature, oxygen content, cooling rate, etc. We know, or at least have an idea of, the variables in play. Investing, on the other hand, is intertwined with so many external factors – weather patterns, energy shocks, political instability, recent executive changes, and interest rates in developing countries – that deciphering what’s driving financial performance, and to what extent, is excruciatingly difficult. As a result, nearly all investment firms focus on financial and economic indicators and disregard other data; it’s simply too time-consuming to isolate its impact.

Yet sometimes this other data is exceptionally important. Firms are pretending to exist in a vacuum. Using a subset of purely financial metrics to predict future asset performance is the equivalent of Brian Fantana in “Anchorman” unveiling his Sex Panther cologne and boasting to his colleagues that “60% of the time, it works every time”. It just doesn’t make sense. Acting off of wildly incomplete information doesn’t have to be the norm for quantitative financial analysis. A disruptive trend like Machine Intelligence™, pioneered by Eureqa, enables users to throw any potentially influential variables into their models, and Eureqa will churn through the data to identify the most significant factors that drive financial performance and automatically build the most accurate models possible. Eureqa can generate incomparably complete predictive models, because including additional data sets is fast, simple, and improves accuracy without increasing complexity for the end user.
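Eureqa’s evolutionary model search is proprietary, but the general idea of throwing in every potentially influential variable and letting the machine rank the drivers can be sketched with an off-the-shelf method. The sketch below uses a random forest’s feature importances as a stand-in; the data file and column names are hypothetical.

```python
# Not Eureqa's evolutionary search -- a stand-in illustrating the idea of adding
# many candidate drivers and letting the machine rank their influence.
# The data file and column names are hypothetical, and all candidate columns
# are assumed to be numeric.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

df = pd.read_csv("asset_returns_with_external_factors.csv")
target = "next_quarter_return"
candidates = [c for c in df.columns if c != target]

model = RandomForestRegressor(n_estimators=500, random_state=0)
model.fit(df[candidates], df[target])

# Rank the candidate drivers by how much the model relied on them.
importance = pd.Series(model.feature_importances_, index=candidates)
print(importance.sort_values(ascending=False).head(10))
```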

As Ms. Bartiromo well knows, information is ubiquitous. But where we believe she’s wrong is in assuming the onus will continue to be on the individual to detect the signal from the noise and analyze and act on the information he or she believes to be important. For too long, we’ve been relying on individual pedaling to power financial analyses. It’s time for a hydroelectric dam.

Topics: Big data, Eureqa, Machine Intelligence
