
Analytics and AI: Humans and Machines are Good at Different Aspects of Prediction

Driven largely by growth in the IoT and the widespread use of the internet, social media and mobile devices to search, send texts and emails, and capture images and video, the amount of data we produce each day is startling.

Consequently, companies are turning to data analytics and AI technologies to help them make sense of all the data at their disposal, predict the future, and make informed decisions that drive enterprise performance.

Although the adoption of analytics and AI systems is extending into ever more mission-critical business processes, the implications of these emerging technologies for business strategy, management, talent and decision-making are still poorly understood.

For example, the single most common question in the AI debate is: “Will adoption of AI by businesses lead to massive human job cuts?”

Borrowing lessons from past technological advances: yes, certain jobs will be lost, and new ones will be created. However, machines are not taking over the world, nor are they eliminating the need for humans in the workplace.

Jobs will still exist, albeit different from the traditional roles many are accustomed to. Most of these new roles will require a different mix of education, training and experience.

For instance, nonroutine cognitive tasks demanding high levels of flexibility, creativity, critical thinking, problem-solving, leadership, and emotional intelligence do not yet lend themselves to wholesale automation.

Analytics and AI rely on data to make predictions

The more (and better) data machine learning algorithms are fed, the more they learn and the better their predictions become.

Because these applications search for patterns in data, any inaccuracies or biases in the training data will be reflected in subsequent analyses.

But how much data do you need? The variety, quality and quantity of input, training and feedback data required depends on how accurate the prediction or business outcome must be to be useful.

Training data is used to train the predictive algorithms to predict the target variable, while the feedback data is used to assess and improve the algorithm’s prediction performance.
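As a rough illustration, the split below is a minimal sketch using synthetic data and scikit-learn (neither of which comes from the original text): the model is fitted on the training portion, and its prediction performance is then assessed on a held-out feedback portion.

```python
# A minimal sketch (illustrative only) of separating training data from
# feedback data, assuming scikit-learn is available.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 3))                       # hypothetical predictor variables
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.3, size=1000)

# Training data teaches the algorithm to predict the target variable...
X_train, X_feedback, y_train, y_feedback = train_test_split(
    X, y, test_size=0.2, random_state=0
)
model = LinearRegression().fit(X_train, y_train)

# ...while feedback (hold-out) data assesses, and guides improvement of,
# its prediction performance.
print("MAE on feedback data:", mean_absolute_error(y_feedback, model.predict(X_feedback)))
```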

Undoubtedly, advanced analytics and AI systems are only as good as the data they are trained on. The data used to train these learning algorithms must be free of any noise or hidden biases.

You therefore need to understand how predictive technologies learn from data to perform sophisticated tasks such as customer lifetime value modeling and profitability forecasting.

This helps guide important decisions around the scale, scope and frequency of data acquisition. It’s about striking a balance between the benefits of more data and the cost of acquiring it.

Humans and machines both have shortcomings

In the context of prediction, humans and machines both have recognizable strengths and weaknesses.

Unless we identify and differentiate which tasks humans and machines are best suited for, all analytics and AI investments will come to naught.

For instance, when faced with complex information involving intricate interactions between different indicators, humans perform worse than machines. Heuristics and biases often get in the way of making accurate predictions.

Instead of weighing statistical properties and data-driven predictions, we often place more emphasis on salient information that is unavailable to the prediction systems.

And, most of the time, that information is misleading, hence the poor performance.

Although machines are better than humans at analyzing huge data sets with complex interactions among disparate variables, it is crucial to recognize the situations in which machines are poor at predicting the future.

The key to unlocking valuable insights from predictive analytics investments is, first and foremost, understanding the specific business question that the data needs to answer.

This dictates your analysis plan and the data collection approaches you will choose. Get the business question wrong, and you can expect the insights and recommendations from the analysis to be wrong as well.

Recall, with plentiful data, machine predictions can work well.

But, in situations where there is limited data to inform future decision making, machine predictions are relatively poor.

To quote Donald Rumsfeld, former US Secretary of Defense:

There are known knowns. These are things we know that we know. There are known unknowns. That is to say, there are things that we know we don’t know. But there are also unknown unknowns. There are things we don’t know we don’t know.


Thus, for known knowns, abundant data is readily available, and humans trust machines to do a better job than they can. Even so, the level of trust changes the moment we start talking about known unknowns and unknown unknowns.

In these situations, machine predictions are relatively poor because there is not enough data to feed into the prediction model.

Think of infrequent events (known unknowns) that occur once in a while, or something that has never happened before (unknown unknowns).

At least for infrequent events or happenings, humans are occasionally better at predicting with little data.

That is generally because we are good at comparison and prudent judgement: examining a new situation and identifying other, comparable settings whose lessons can be applied to it.

We are naturally wired to remember key pieces of information from the little data available or the limited associations we have had in the past.

Rather than being precise, our predictions come with a confidence range that acknowledges their uncertainty.

Faced with unknown unknowns, both humans and machines are relatively bad at predicting their arrival.

The simple truth is that we cannot predict truly new events from past data. Look no further than the current Brexit conundrum.

Nobody knew precisely what the unintended consequences of the UK leaving the EU would be. Leavers and Remainers alike speculated about what the benefits and disadvantages of leaving the EU might be.

Of course, nobody knows what will happen in the future but that doesn’t mean we can’t be prepared, even for the unknown unknowns.

In their book Prediction Machines: The Simple Economics of Artificial Intelligence, Ajay Agrawal, Joshua Gans, and Avi Goldfarb present an additional category of scenarios under which machines also fail to predict precisely – Unknown Knowns.

Per the trio:

Unknown knowns is when an association that appears to be strong in the past is the result of some unknown or unobserved factor that changes over time and makes predictions we thought we could make unreliable.


With unknown knowns, predictive tools appear to provide a very accurate answer, but that answer can be very incorrect, especially if the algorithms have little grasp of the decision process that created the data.

To support their point of view, the authors make reference to pricing and revenue analysis in the hotel industry, although the same viewpoint is applicable elsewhere.

In many industries, higher prices go hand in hand with higher sales, and vice versa.

For example, in the airline industry, airfares are low outside the peak season, and high during peak seasons (summer and festive) when travel demand is highest.

Presented with this data, and without an understanding that price movements are often a function of demand and supply factors, a simple prediction model might advocate raising airfares across routes in order to sell more empty seats and increase revenues. This is evidence of a causal inference problem.

But, a human being with a solid understanding of economics concepts will immediately call attention to the fact that increasing airfares is unlikely to increase flight ticket sales.

To the machine, this is an unknown known. But to a human with knowledge of pricing and profitability analysis, this is a known unknown or maybe even a known known provided the human is able to properly model the pricing decision.

Thus, to address such shortcomings, humans should work with machines to identify the right data and appropriate data analysis models that take into consideration seasonality and other demand and supply factors to better predict revenues at different prices.
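The toy example below is a sketch of that failure mode using synthetic data and scikit-learn (my illustration, not the authors'): a naive model that sees only price learns a misleadingly positive price-sales relationship, while a model that also sees the season recovers the true negative price effect.

```python
# A minimal, synthetic sketch of the "unknown known": price and ticket sales
# both rise in peak season, so a model trained on price alone learns a
# spurious positive relationship, while controlling for season recovers the
# true negative price effect. All numbers here are made up for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
n = 5000
peak = rng.integers(0, 2, size=n)                   # 1 = peak season, 0 = off-peak
price = 100 + 80 * peak + rng.normal(scale=10, size=n)
# True demand: falls with price, rises strongly in peak season.
sales = 500 - 2.0 * price + 400 * peak + rng.normal(scale=20, size=n)

naive = LinearRegression().fit(price.reshape(-1, 1), sales)
adjusted = LinearRegression().fit(np.column_stack([price, peak]), sales)

print("Naive price coefficient    :", round(naive.coef_[0], 2))     # misleadingly positive
print("Season-adjusted price coef :", round(adjusted.coef_[0], 2))  # close to -2.0
```

In practice, spotting and encoding confounders such as seasonality is exactly where human domain knowledge complements the machine.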

As data analytics and AI systems become more advanced and spread across industries, and up and down the value chain, companies that will progress further are those that are continually thinking of creative ways for machines to integrate and amplify human capabilities.

In contrast, those companies that are using technology simply to cut costs and displace humans will eventually stop making progress, and cease to exist.

Finance Analytics: Using The Right Data to Generate Actionable Insights

We are living in the information age. Data is everywhere and affects every aspect of our lives. As the use of data and analytics becomes pervasive, it will soon be uncommon to find an industry not capitalizing on the benefits they can provide.

CFOs are increasingly leveraging data and analytics tools to optimize operational and financial processes, and drive better decision-making within their organizations. Finance is no longer simply a steward of historical reporting and compliance.

Today, the expectation on finance is to engage with new technologies, collaborate with various business stakeholders to create value and act as a steward of enterprise performance.

That is, deliver the right decision support to the right people at the right time.

Delivering impactful data-driven insights does not happen on its own. You need to have the right tools, processes, talent and culture working well together.

Here are some of the questions you can start asking to kickstart you on this journey:

  • Does your organization have the necessary tools and capabilities to collect and analyze the data (structured and unstructured) and make sense of it all?
  • How robust are your data governance and management processes?
  • How do you define the structure of your finance analytics team? Are you focused heavily on traditional and technical skills or diversity?
  • How are people armed to make better decisions using the data, processes, and analytical methods available?
  • As a finance leader, are you promoting a culture that bases key decisions on gut feel versus data-driven insights?

Both intuitive and analytical decisions are important and can create significant value for the business.

However, I’d recommend against promoting a culture that embraces the randomness of intuitive thinking and avoids analytical thinking completely.

What are you doing with your data?

In a world where data is ubiquitous and plentiful, it can be overwhelming to separate the noise from what matters.

According to IBM, 80 percent of today’s data originates from previously untapped, unstructured information from the web such as social media channels, news feeds, emails, journals, blogs, images, sounds and videos.

And this unstructured data holds the important insights required for faster, more informed decisions. Take news feeds as an example. Breaking news on economic policy can help you reevaluate production and supply chain decisions, and adjust forecasts accordingly.

But, for fear of missing out, many businesses end up hoarding data with little to no analysis performed on it. You don’t need to collect and store every single type of data out there.

Instead, collect only the data that matters most and is key to unlocking answers to your critical business performance questions. The rest is noise.

That’s why it’s critical to regularly ask the question, “What are we doing with all the data we have?” This way you will also be able to identify any information gaps that need filling via new data sources.

For analytics to work, people need the foresight to ask the right questions

People play a critical role in an organization’s analytics efforts. Without insightful guidance on the right questions to answer and the right hypotheses to test, analytics efforts are foiled; they cannot direct themselves.

Asking the right questions is key to drawing the right conclusions. Let’s look at the steps involved in making better decisions using data and analytics for one of your product lines. After a strong start, you’re now starting to witness a decline in online revenue but are not sure why.

Before jumping into complex analytics:

  • First, identify the problem that the data needs to answer. You already know what happened. However, this is not enough. To address the issue, it’s important to develop a better understanding of the impacted segment and the potential reasons for the revenue decline. This is where you start asking questions such as: Why is revenue declining? When did the decline start? Where did it happen (country, region, city)? What might have caused it (potential drivers)? What actions can we take to address the decline?
  • Then come up with hypotheses about the issue to be addressed. A hypothesis describes a possible explanation, such as a driver or reason behind your business question. Many people make the mistake of jumping straight into data collection and analysis before forming a hypothesis to answer the critical business question. Going back to our revenue decline example, suppose you have identified several hypotheses and prioritized two to explain the revenue decline – a slowing economy and impending recession causing customers to tighten their purse strings, and a new website design with enhanced security features but also additional online checkout steps.
  • Based on these two hypotheses, you can then identify the exact data that should be collected and analyzed to prove or disprove each one (a minimal sketch of this step follows the list).
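As a rough sketch of what that data pull and comparison might look like in pandas, here is one way to test both hypotheses side by side. The file name, column names and the redesign cut-over date are illustrative assumptions, not details from the text.

```python
# A minimal pandas sketch of checking the two revenue-decline hypotheses.
import pandas as pd

sessions = pd.read_csv("web_sessions.csv", parse_dates=["date"])  # hypothetical extract
# Assumed columns: date, region, visits, orders, revenue

redesign_date = "2019-06-01"  # assumed launch date of the new checkout flow
sessions["period"] = (sessions["date"] >= redesign_date).map(
    {True: "after redesign", False: "before redesign"}
)

summary = (
    sessions.groupby(["period", "region"])
            .agg(visits=("visits", "sum"),
                 orders=("orders", "sum"),
                 revenue=("revenue", "sum"))
)
summary["conversion_rate"] = summary["orders"] / summary["visits"]
summary["avg_order_value"] = summary["revenue"] / summary["orders"]

# Hypothesis 2 (extra checkout steps) would show up as a drop in conversion rate
# after the redesign; hypothesis 1 (a slowing economy) as a broad fall in visits
# and average order value across all regions.
print(summary.sort_index())
```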

Irrespective of how advanced your analytical tools are, asking the wrong business performance questions results in flawed data being collected, and ultimately poor insights and recommendations. Hence the importance of diversity within the team.

Remember, no data or analytics model is perfect. Finding good data and making sure it’s cleaned and validated is fundamental but the organization also shouldn’t wait for perfection.

It’s easy to get fixated on perfection and fail to see the forest for the trees.

Building Trust in Your Data, Insights and Decisions

Over the years, enterprise adoption of advanced analytics and data visualization tools to generate insights that support quality business performance decisions has evolved considerably.

Nonetheless, not all investments in this arena are producing the desired outcomes.

Although significant sums of money, resources and time have been channeled towards data and analytics initiatives, the quality of the insights generated, and ultimately of the decisions made, is often far from convincing.

Some of the possible reasons for these failures include:

  • A sole focus on technology, and less on processes and people.
  • Misguided belief that the more data you have the better.
  • Lack of or poor organizational alignment and cultural resistance.

Delivering trusted insights is not a simple matter of investing in the right technology alone. Neither will simply collecting and storing large amounts of data lead to better decisions.

For instance, investing in state-of-the-art data analytics and visualization tools at the expense of data quality doesn’t automatically empower you to generate valuable insights that drive business performance.

Nor does having the right technology and quality data help if organizational alignment is poor and cultural resistance is intense.

So how can your organization use data and analytics to generate trusted insights, make smarter decisions and impact business performance?

Prior to generating any business insights, trends or activity patterns, it’s imperative to ensure that the performance data to be analyzed can be trusted. That is, the data must be fit for purpose, current, accurate, consistent and reliable.

With so much data and information available today, it’s increasingly difficult for decision makers to determine which data points are essential in understanding what drives the business and incorporate these into business decision making.

As a result, many organizations end up hoarding data and incurring unnecessary expenses in the process.

To ensure your data is fit for purpose, first and foremost, develop an understanding of the real business questions that the data needs to answer. By asking questions at the very beginning, you will be able to identify and define any problems or issues in their context.

Further, asking relevant questions will help you develop an improved understanding of the present factors, past events, or emerging opportunities driving the need for advanced analytics and data visualization investments in the first place.

Conversely, asking the wrong questions results in ineffectual resolutions to the wrong business problems.

Too often, people want to kick off their data analysis effort by collecting data first, rather than by asking the relevant questions, establishing the business need and formulating the analysis plan.

But, the quality of the insights and the value of the recommendations delivered are rooted in the quality of the data you start out with.

Data collection should only begin once the purpose and goals of the data analysis are clearly defined. Thus, the more and better questions you ask, the better your insights, recommendations, decisions and ultimately business performance will be.

Data accuracy, consistency and reliability

Ensuring that data used in analysis is accurate, consistent and reliable remains one of the common hurdles hampering analytics initiatives. One of the reasons for this is multiple data sources.

For example, suppose you want to analyze and evaluate the impact of a given marketing campaign on customer experience. In the past, you would have managed with data from your CRM system alone.

Today, in order to get a 360-degree view of your customers, analyzing CRM data alone is not sufficient. You also need to analyze data from your web analytics system, customer feedback surveys, data warehouses, email, call recordings, social media or production environments.

Thus, you need to bring these disparate data sources together to generate valuable, actionable insights.

Issues arise when the multiple data sources define key business performance terms or metrics (e.g. sales, conversion, margin growth) differently and there is no common definition shared across the business.

This lack of shared understanding of business performance has far reaching implications on your analysis and makes it futile to compare results.
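One lightweight way to enforce a common definition is to encode the agreed formulas once and reuse them across every source and report. The sketch below is illustrative only; the definitions shown are assumptions, not standards from the text.

```python
# A minimal sketch of a shared "metrics dictionary": each business metric is
# defined once, in one place, so results from different data sources are
# directly comparable. The definitions below are illustrative assumptions.

def conversion_rate(orders: int, visits: int) -> float:
    """Completed orders divided by visits (the single agreed definition)."""
    return orders / visits if visits else 0.0

def margin_growth(margin_now: float, margin_prior: float) -> float:
    """Period-on-period change in gross margin relative to the prior period."""
    return (margin_now - margin_prior) / margin_prior if margin_prior else 0.0

# Example: the CRM extract and the web analytics extract are both summarized
# with the same functions.
print(conversion_rate(orders=150, visits=6000))            # 0.025
print(margin_growth(margin_now=0.42, margin_prior=0.40))   # 0.05 (5% growth)
```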

Data collection, followed by data cleansing and validation, is therefore important. After you have collected the data to be used in the analysis per your analysis plan, it’s important to clean and validate the data to make it usable and accurate. Select a small data sample and compare it with what you expected.

Ensure the data types match your expectations. Remember the adage: garbage in, garbage out. Even if you have best-of-breed analytics tools to slice and dice information, with poor data quality practices your insights and decisions will remain unsubstantiated, erroneous and worthless.
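A few such checks can be automated. The sketch below assumes a pandas DataFrame and an expected schema that are purely illustrative.

```python
# A minimal validation sketch for a DataFrame `df` loaded per the analysis
# plan; the expected schema and column names are illustrative assumptions.
import pandas as pd

expected_dtypes = {"order_id": "int64", "order_date": "datetime64[ns]",
                   "region": "object", "revenue": "float64"}

def basic_checks(df: pd.DataFrame) -> None:
    # Data types and columns should match what the analysis expects.
    for col, dtype in expected_dtypes.items():
        assert col in df.columns, f"missing column: {col}"
        assert str(df[col].dtype) == dtype, f"{col}: expected {dtype}, got {df[col].dtype}"
    # Spot-check a small sample against expectations (e.g. no negative revenue).
    sample = df.sample(min(len(df), 20), random_state=0)
    assert (sample["revenue"] >= 0).all(), "unexpected negative revenue in sample"
    assert not df["order_id"].duplicated().any(), "duplicate order IDs found"
```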

For this reason and to ensure the consistency of the data used in your analysis, know and understand the source of every piece of your data, the age of the data, the components of the data, what is not included in the data, when and how often the data is refreshed.

Without data integrity, it is difficult to generate trusted insights and deliver recommendations that decision makers can confidently act on. Also, delivering trusted insights goes beyond merely attaching a colourful report in an email and hitting the send button.

Instead, the insights must be delivered in a digestible and actionable format. That is, communicated in a format that is more readily understood by decision makers in the organization. Poorly communicated insights can result in no action, negating the value of the insights in the first place. Thus, avoid getting bogged down in details.

Knowing is powerful, but acting based on knowledge is what drives smart decisions, creates true value and impacts business performance.

Converting Data Into Insights: The Right Technology Alone is No Guarantee of Success

The fact that technology is playing an increasingly significant role in executing many traditional finance tasks while at the same time generating greater insights that drive business performance is irrefutable.

However, even though many organizations are investing significantly in advanced data and analytics technologies, the majority of these investments are reportedly yielding disappointing returns.

This is often because they focus mostly on the implementation of new tools, and less on the processes and people. For instance, there is a widespread misconception that data analytics is a technology issue, and that success is about having the most data.

As a result, many organizations are ultimately spending large amounts of money on new technologies capable of mining and managing large datasets.

There is relatively little focus on generating actionable insights out of the data, and building a data-driven culture, which is the people aspect of analysis.

Data analytics is not purely a technology issue. Instead, it is a strategic business imperative that entails leveraging new technologies to analyze the data at your disposal, gain greater insight about your business, and guide effective strategy execution.

Furthermore, having the right data is more important than having the most data. Your organization might have the most data, but if you’re not analyzing that data and asking better questions to help you make better decisions, it counts for nothing.

Reconsider your approach to people, processes and technology

The success of any new technology greatly depends on the skills of the people using it. Additionally, you need to convince people to use the new technology or system, otherwise it will end up being another worthless investment.

Given digital transformation is here to stay, for any technology transformation to be a success, it’s imperative to have the right balance of people, IT and digital skills.

It’s not an issue of technology versus humans. It’s about striking the right balance between technology and people, with each doing what it does best. In other words, establishing the right combination of people and technology.

For example, advanced data analytics technologies are, by far, better than humans at analyzing complex data streams. They can easily identify trends and patterns in real-time.

But, generating useful insights from the data all depends on human ability to ask the right key performance questions, and identify specific questions that existing tools and techniques are currently not able to answer.

Rather than hastily acquiring new data and tools, start by reviewing current data analytics tools, systems and related applications. This helps identify existing capabilities including tangible opportunities for improvement.

It’s not about how much the business has invested or is willing to invest in data analytics capabilities; it’s about how the business is leveraging existing tools, data and insights it currently has to drive business growth, and where necessary, blending traditional BI tools with new emerging data analytics tools.

That’s why it’s critical to build a clear understanding of which new technologies will be most beneficial to your business.

Technology alone will not fix broken or outdated processes. Many businesses are spending significant amounts of money on new tools, only to find out later that it’s not the existing tool that is at fault but rather the process itself.

Take the data management process as an example, particularly at larger, more complex organizations. Achieving a single view of the truth is a recurring challenge, often because data is confined in silos, inconsistently defined or locked behind access controls.

A centralized data governance model is often the missing link. There are too many fragmented ERP systems, data is spread across the business, and it’s difficult to identify all data projects and how they are organized across the business.

In such cases, even if you acquire the right data analytics tools, the fact that you have a weak data governance model as well as a non-integrated systems infrastructure can act as a stumbling block to generating reliable and useful decision-support insights.

To fix the faltering data management process, you need to establish a data governance model that is flexible across the business and provides a centralized view of enterprise data – that is, the data the organization owns and how it is being managed and used across the business.

This is key to breaking down internal silos, achieving a single view of the truth, and building trust in your data to enable effective analytics and decision making.

Is your organization spending more time collecting and organizing data than analyzing it?

Have you put in place robust data governance and ownership processes required to achieve a single version of the truth?

Are you attracting and retaining the right data analytics talent and skills necessary to drive data exploration, experimentation and decision making?

More Data Doesn’t Always Lead to Better Decisions

Thanks to advancements in technology, our digital lives are producing expansive amounts of data on a daily basis.

In addition to the enormous amount of data produced each day, the diversity of data types and sources, and the speed with which data is generated, analyzed and reprocessed, have become increasingly unwieldy.

With more data continuously coming from social spheres, mobile devices, cameras, sensors and connected devices, purchase transactions and GPS signals to name a few, it does not look like the current data explosion will ebb soon.

Instead, investments in IoT and advanced analytics are expected to grow in the immediate future. Historically, investing in advanced data analytics to generate well-informed insights that effectively support decision making and drive business performance has been the preserve of big corporations.

And smaller businesses and organizations, as a result, have for some time embraced the flawed view that such investments are beyond their reach. It’s no surprise then that adoption has been at a snail’s pace.

Thanks to democratization of technology, new technologies are starting to get into the hands of smaller businesses and organizations. The solutions are now being packaged into simple, easy-to-deploy applications that most users without specialized training are able to operate.

Further, acquisition costs have fallen significantly, removing the upfront cost barrier that for years acted as a drag on many companies’ IT investments.

While the application of data management and advanced analytics tools is now foundational and becoming ubiquitous, growing into a successful data-driven organization is about getting the right data to the right person at the right time to make the right decision.

Distorted claims such as “data is the new oil” have, unfortunately, prompted some companies to embark on unfruitful data hoarding sprees. It is true that oil is a valuable commodity with plentiful uses. But it is also a scarce resource not widely available to everyone.

The true value of oil is unearthed after undergoing a refinement process. On the contrary, data is not scarce. It is widely available. Nonetheless, akin to oil, the true value of data is unlocked after we have processed and analyzed it to generate leading-edge insights.

It’s a waste of time and resources to just hoard data and not analyze it to get a better understanding of what has happened in the past and why it has happened. Such insights are crucial to predicting future business performance scenarios and exploiting opportunities.

More data doesn’t necessarily lead to better decisions. Better decisions emanate from having a profound ability to analyze useful data and make key observations that would have otherwise remained hidden.

Data is widely available; what is scarce is the ability to extract informed insights that support decision-making and propel the business forward.

To avoid data hoarding, it is necessary to first carry out a data profiling exercise, as this will help you establish whether any of your existing data can be easily used for other purposes. It also helps ascertain whether existing records are up to date and whether your information sources are still fit for purpose.
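A basic profiling pass can be done in a few lines of pandas. The sketch below, including the `last_updated` column it looks for, is a hedged illustration rather than a prescribed method.

```python
# A minimal data-profiling sketch: data types, missing values, distinct values
# and record freshness, to judge whether existing data is still fit for purpose.
import pandas as pd

def profile(df: pd.DataFrame, updated_col: str = "last_updated") -> pd.DataFrame:
    report = pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "missing_pct": (df.isna().mean() * 100).round(1),
        "unique_values": df.nunique(),
    })
    if updated_col in df.columns:
        # How stale are the records? (assumes a last-updated timestamp column)
        print("Most recent update:", pd.to_datetime(df[updated_col]).max())
    return report
```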

At any given time, data quality trumps data quantity. That is why it is important to get your data in one place where it can easily be accessed for analysis and produce a single version of the truth.

In the past, data was kept in separate systems that were unable to talk to each other, making it difficult to consolidate and analyze data to support faster decision making. Today, the price of computing and storage has plummeted, and those systems are increasingly being linked.

As a result, companies can now use data-mining techniques to sort through large data sets to identify patterns and establish relationships to solve problems through data analysis. If the data is of poor quality, insights generated from the analysis will also be of poor quality.

Let’s take customer transactional data as an example. In order to reveal hidden correlations or insights from the data, it’s advisable to analyze the information flow in real time: by the hour, by the day, by the week, by the month, over the past year and more. This lets you respond proactively to the ups and downs of dynamic business conditions.
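Summarizing a transaction stream at several time grains is straightforward with pandas; the file and column names below are hypothetical assumptions for illustration.

```python
# A minimal sketch of rolling up transactional data at several time grains.
import pandas as pd

tx = pd.read_csv("transactions.csv", parse_dates=["timestamp"]).set_index("timestamp")

hourly  = tx["amount"].resample("H").sum()   # by the hour
daily   = tx["amount"].resample("D").sum()   # by the day
weekly  = tx["amount"].resample("W").sum()   # by the week
monthly = tx["amount"].resample("M").sum()   # by the month

# Comparing the latest period with a trailing window flags ups and downs early,
# rather than discovering them months later in "dead data".
print(daily.tail(7))
print("Trailing 28-day average:", daily.rolling(28).mean().iloc[-1])
```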

Imagine what could happen if you waited months before analyzing the transactional data. By the time you did, your insights would be a product of “dead data”. Technology is no longer the inhibitor; culture and the lack of a leadership mandate are.

As data become more abundant, the main problem is no longer finding the information as such but giving business unit managers precise answers about business performance easily and quickly.

What matters most is data quality, what you do with the data you have collected and not how much you collect. Instead of making more hay, start looking for the needle in the haystack.
