
Analytics and AI: Humans and Machines are Good at Different Aspects of Prediction

Driven largely by growth in the IoT and the widespread use of the internet, social media and mobile devices to search, text, email and capture images and video, the amount of data we produce on a daily basis is startling.

Consequently, companies are turning to data analytics and AI technologies to help them make sense of all the data at their disposal, predict the future, and make informed decisions that drive enterprise performance.

Although analytics and AI systems are increasingly being adopted in mission-critical business processes, the implications of these emerging technologies for business strategy, management, talent and decision making are still poorly understood.

For example, the single most common question in the AI debate is: “Will adoption of AI by businesses lead to massive human job cuts?”

Judging by past technological advances, yes, certain jobs will be lost and new ones created. However, machines are not taking over the world, nor are they eliminating the need for humans in the workplace.

Jobs will still exist, albeit different from the traditional roles many are accustomed to. Most of these new roles will require new kinds of education, training and experience.

For instance, nonroutine cognitive tasks demanding high levels of flexibility, creativity, critical thinking, problem-solving, leadership, and emotional intelligence do not yet lend themselves to wholesale automation.

Analytics and AI rely on data to make predictions

The more, and better, data machine learning algorithms are fed, the more they learn and the better their predictions become.

Because these applications search for patterns in data, any inaccuracies or biases in the training data will be reflected in subsequent analyses.

But how much data do you need? The variety, quality and quantity of input, training and feedback data required depends on how accurate the prediction or business outcome must be to be useful.

Training data is used to train the predictive algorithms to predict the target variable, while the feedback data is used to assess and improve the algorithm’s prediction performance.
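As a minimal sketch of that split (synthetic data and scikit-learn are assumed here purely for illustration; the article names no tooling), the code below trains a model on one slice of data and uses a held-out "feedback" slice to assess its predictions:

```python
# Minimal sketch: training data fits the model, held-out "feedback"
# data measures how well it predicts the target variable.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)

# Training data teaches the algorithm to predict the target variable...
X_train, X_feedback, y_train, y_feedback = train_test_split(
    X, y, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# ...while feedback data assesses (and guides improvement of) those predictions.
print("accuracy:", accuracy_score(y_feedback, model.predict(X_feedback)))
```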

Undoubtedly, advanced analytics and AI systems are only as good as the data they are trained on. The data used to train these learning algorithms should be as free as possible of noise and hidden biases.

You therefore need to understand how predictive technologies learn from data to perform sophisticated tasks such as customer lifetime value modeling and profitability forecasting.

This helps guide important decisions around the scale, scope and frequency of data acquisition. It’s about striking a balance between the benefits of more data and the cost of acquiring it.
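To make "customer lifetime value modeling" concrete, here is one common simplified formula, the discounted geometric-series form; both the formula choice and the figures are illustrative, not taken from the article:

```python
# Simplified CLV: annual margin m, retention rate r, discount rate d.
# CLV = m * r / (1 + d - r)  (discounted geometric-series form)
def simple_clv(margin_per_year: float, retention_rate: float,
               discount_rate: float) -> float:
    return margin_per_year * retention_rate / (1 + discount_rate - retention_rate)

# e.g. $500 annual margin, 80% retention, 10% discount rate -> $1,333.33
print(f"${simple_clv(500, 0.80, 0.10):,.2f}")
```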

Humans and machines both have shortcomings

In the context of prediction, humans and machines both have recognizable strengths and weaknesses.

Unless we identify and differentiate which tasks humans and machines are best suited for, all analytics and AI investments will come to naught.

For instance, when faced with complex information involving intricate interactions among many indicators, humans perform worse than machines. Heuristics and biases often get in the way of making accurate predictions.

Instead of weighing statistical properties and data-driven predictions, we often place more emphasis on salient information that is unavailable to prediction systems.

And, most of the time, that information is misleading, hence the poor performance.

Although machines are better than humans at analyzing huge data sets with complex interactions among disparate variables, it is crucial to recognize the situations where machines are poor at predicting the future.

The key to unlocking valuable insights from predictive analytics investments is, first and foremost, understanding the precise business question that the data needs to answer.

This dictates your analysis plan and the data collection approaches you will choose. Get the business question wrong, and you can expect the insights and recommendations from the analysis to be wrong as well.

Recall, with plentiful data, machine predictions can work well.

But, in situations where there is limited data to inform future decision making, machine predictions are relatively poor.

To quote Donald Rumsfeld, former US Secretary of Defense:

There are known knowns. These are things we know that we know. There are known unknowns. That is to say, there are things that we know we don’t know. But there are also unknown unknowns. There are things we don’t know we don’t know.


Thus, for known knowns, abundant data is readily available, and humans trust machines to do a better job than they can. Even so, the level of trust changes the moment we start talking about known unknowns and unknown unknowns.

In these situations, machine predictions are relatively poor because there is not much data to feed into the prediction model.

Think of infrequent events (known unknowns) that occur once in a while, or something that has never happened before (unknown unknowns).

At least for infrequent events, humans are sometimes better at predicting with little data.

This is largely because we are good at analogy and prudent judgement: examining new situations and identifying comparable settings whose lessons can usefully be applied.

We are naturally wired to remember key pieces of information from the little data available or the limited associations we have had in the past.

Rather than being precise, our predictions come with a confidence range that signals their uncertainty.

Faced with unknown unknowns, both humans and machines are relatively bad at predicting their arrival.

The simple truth is that we cannot predict truly new events from past data. Look no further than the current Brexit conundrum.

Nobody knew precisely what the unintended consequences of the UK leaving the EU would be. Leavers and Remainers alike speculated about what the benefits and disadvantages of leaving the EU might be.

Of course, nobody knows what will happen in the future but that doesn’t mean we can’t be prepared, even for the unknown unknowns.

In their book Prediction Machines: The Simple Economics of Artificial Intelligence, Ajay Agrawal, Joshua Gans, and Avi Goldfarb present an additional category of scenarios under which machines also fail to predict precisely – Unknown Knowns.

Per the trio:

Unknown knowns is when an association that appears to be strong in the past is the result of some unknown or unobserved factor that changes over time and makes predictions we thought we could make unreliable.


With unknown knowns, predictive tools appear to provide a very accurate answer, but that answer can be very incorrect, especially if the algorithms have little grasp of the decision process that created the data.

To support their point of view, the authors make reference to pricing and revenue analysis in the hotel industry, although the same viewpoint is applicable elsewhere.

In many industries, higher prices are correlated with higher sales, and vice versa.

For example, in the airline industry, airfares are low outside the peak season, and high during peak seasons (summer and festive) when travel demand is highest.

Presented with this data, and without an understanding that price movements are often a function of demand and supply factors, a simple prediction model might recommend raising airfares across routes to sell more empty seats and increase revenues. This is a classic causal inference problem.
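A hypothetical simulation (illustrative numbers only, not from the book) makes the trap visible: when unobserved demand drives both price and sales, a naive model "learns" that higher prices mean more tickets sold, while controlling for demand recovers the true negative price effect:

```python
import numpy as np

rng = np.random.default_rng(42)
demand = rng.uniform(0, 1, 5000)                  # unobserved seasonal demand
price = 100 + 200 * demand + rng.normal(0, 10, 5000)
sales = 50 + 300 * demand - 0.2 * price + rng.normal(0, 5, 5000)

# Naive fit: sales as a function of price only (demand is hidden).
naive_slope = np.polyfit(price, sales, 1)[0]
print(f"naive slope: {naive_slope:+.2f} seats per $1")  # positive -> "raise prices!"

# Controlling for demand recovers the true (negative) price effect of -0.2.
X = np.column_stack([price, demand, np.ones_like(price)])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
print(f"price effect holding demand fixed: {coef[0]:+.2f}")
```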

But a human being with a solid grasp of economic concepts will immediately point out that increasing airfares is unlikely to increase flight ticket sales.

To the machine, this is an unknown known. But to a human with knowledge of pricing and profitability analysis, this is a known unknown or maybe even a known known provided the human is able to properly model the pricing decision.

Thus, to address such shortcomings, humans should work with machines to identify the right data and appropriate data analysis models that take into consideration seasonality and other demand and supply factors to better predict revenues at different prices.

As data analytics and AI systems become more advanced and spread across industries, and up and down the value chain, companies that will progress further are those that are continually thinking of creative ways for machines to integrate and amplify human capabilities.

In contrast, those companies that are using technology simply to cut costs and displace humans will eventually stop making progress, and cease to exist.

Building Trust in Your Data, Insights and Decisions

Over the years, enterprise adoption of advanced analytics and data visualization tools to generate insights that support business performance decisions has evolved considerably.

Nonetheless, not all investments in this arena are producing the desired outcomes.

Although significant sums of money, resources and time have been channeled towards data and analytics initiatives, the quality of the insights generated, and ultimately of the decisions made, is far from convincing.

Some of the possible reasons for explaining these failures include:

  • A sole focus on technology, and less on processes and people.
  • Misguided belief that the more data you have the better.
  • Lack of or poor organizational alignment and cultural resistance.

Delivering trusted insights is not a simple matter of investing in the right technology alone. Neither will simply collecting and storing large amounts of data lead to better decisions.

For instance, investing in state-of-the-art data analytics and visualization tools at the expense of data quality doesn’t automatically empower you to generate valuable insights that drive business performance.

Nor does having the right technology and quality data help if organizational alignment is poor and cultural resistance is intense.

So how can your organization use data and analytics to generate trusted insights, make smarter decisions and impact business performance?

Before generating any business insights, trends or activity patterns, it's imperative to ensure that the performance data to be analyzed can be trusted. That is, the data must be fit for purpose, current, accurate, consistent and reliable.

With so much data and information available today, it’s increasingly difficult for decision makers to determine which data points are essential in understanding what drives the business and incorporate these into business decision making.

As a result, many organizations end up hoarding data and incurring unnecessary expenses along the way.

To ensure your data is fit for purpose, first and foremost, develop an understanding of the real business questions that the data needs to answer. By asking questions at the very beginning, you will be able to identify and define any problems or issues in their context.

Further, asking relevant questions will help you develop an improved understanding of the present factors, past events, or emerging opportunities driving the need for advanced analytics and data visualization investments in the first place.

Conversely, asking the wrong questions results in ineffective solutions to the wrong business problems.

Too often, people kick off their data analysis effort by collecting data first, without asking the relevant questions, establishing the business need or formulating an analysis plan.

But, the quality of the insights and the value of the recommendations delivered are rooted in the quality of the data you start out with.

Data collection should only begin once the purpose and goals of the data analysis are clearly defined. Thus, the more and better questions you ask, the better your insights, recommendations, decisions and ultimately business performance will be.

Data accuracy, consistency and reliability

Ensuring that data used in analysis is accurate, consistent and reliable remains one of the common hurdles hampering analytics initiatives. One of the reasons for this is multiple data sources.

For example, suppose you want to evaluate the impact of a given marketing campaign on customer experience. In the past, you could have managed with data from your CRM system alone.

Today, to get a 360-degree view of your customers, analyzing CRM data alone is not sufficient. You also need to analyze data from your web analytics system, customer feedback surveys, data warehouses, email, call recordings, social media and production environments.

Thus, you need to bring these disparate data sources together to generate valuable, actionable insights.

Issues arise when the multiple data sources define key business performance terms or metrics (e.g. sales, conversion, margin growth) differently and there is no common definition shared across the business.

This lack of shared understanding of business performance has far-reaching implications for your analysis and makes comparing results futile.

Data collection, followed by data cleansing and validation, is therefore important. After you have collected the data per your analysis plan, clean and validate it to make it usable and accurate. Select a small data sample and compare it with what you expected.

Ensure the data types match your expectations. Remember the adage: garbage in, garbage out. Even with best-of-breed analytics tools to slice and dice information, poor data quality practices will leave your insights and decisions unsubstantiated, erroneous and worthless.
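A minimal sketch of such sanity checks (pandas, with a hypothetical file and column names) might look like this:

```python
# Validate types, ranges and completeness on a small sample
# before trusting the full dataset.
import pandas as pd

df = pd.read_csv("campaign_data.csv")          # assumed input file
sample = df.sample(n=min(100, len(df)), random_state=1)

checks = {
    "order_date parses as datetime":
        pd.to_datetime(sample["order_date"], errors="coerce").notna().all(),
    "sales is numeric and non-negative":
        pd.to_numeric(sample["sales"], errors="coerce").ge(0).all(),
    "customer_id has no missing values":
        sample["customer_id"].notna().all(),
}
for name, passed in checks.items():
    print(f"{'OK ' if passed else 'FAIL'} {name}")
```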

For this reason, and to ensure the consistency of the data used in your analysis, know and understand the source of every piece of data, its age, its components, what it does not include, and when and how often it is refreshed.

Without data integrity, it is difficult to generate trusted insights and deliver recommendations that decision makers can confidently act on. Also, delivering trusted insights goes beyond merely attaching a colourful report in an email and hitting the send button.

Instead, the insights must be delivered in a digestible and actionable format. That is, communicated in a format that is more readily understood by decision makers in the organization. Poorly communicated insights can result in no action, negating the value of the insights in the first place. Thus, avoid getting bogged down in details.

Knowing is powerful, but acting based on knowledge is what drives smart decisions, creates true value and impacts business performance.

Converting Data Into Insights: The Right Technology Alone is No Guarantee of Success

The fact that technology is playing an increasingly significant role in executing many traditional finance tasks while at the same time generating greater insights that drive business performance is irrefutable.

However, even though many organizations are investing significantly in advanced data and analytics technologies, the majority of these investments are reportedly yielding disappointing returns.

This is often because they focus mostly on implementing new tools, and less on processes and people. For instance, there is a widespread misconception that data analytics is a technology issue, and that success is about having the most data.

As a result, many organizations are ultimately spending large amounts of money on new technologies capable of mining and managing large datasets.

There is relatively little focus on generating actionable insights out of the data, and building a data-driven culture, which is the people aspect of analysis.

Data analytics is not purely a technology issue. Instead, it is a strategic business imperative that entails leveraging new technologies to analyze the data at your disposal, gain greater insight about your business, and guide effective strategy execution.

Furthermore, having the right data is more important than having the most data. Your organization might have the most data, but if you’re not analyzing that data and asking better questions to help you make better decisions, it counts for nothing.

Reconsider your approach to people, processes and technology

The success of any new technology greatly depends on the skills of the people using it. Additionally, you need to convince people to use the new technology or system, otherwise it will end up being another worthless investment.

Given digital transformation is here to stay, for any technology transformation to be a success, it’s imperative to have the right balance of people, IT and digital skills.

It’s not an issue of technology versus humans. It’s about striking the right balance between technology and people, with each doing what it does best. In other words, establishing the right combination of people and technology.

For example, advanced data analytics technologies are, by far, better than humans at analyzing complex data streams. They can easily identify trends and patterns in real-time.

But, generating useful insights from the data all depends on human ability to ask the right key performance questions, and identify specific questions that existing tools and techniques are currently not able to answer.

Rather than hastily acquiring new data and tools, start by reviewing current data analytics tools, systems and related applications. This helps identify existing capabilities including tangible opportunities for improvement.

It’s not about how much the business has invested or is willing to invest in data analytics capabilities; it’s about how the business is leveraging existing tools, data and insights it currently has to drive business growth, and where necessary, blending traditional BI tools with new emerging data analytics tools.

That’s why it’s critical to build a clear understanding of which new technologies will be most beneficial to your business.

Technology alone will not fix broken or outdated processes. Many businesses are spending significant amounts of money on new tools, only to find out later that it’s not the existing tool that is at fault but rather the process itself.

Take the data management process as an example, especially at larger, more complex organizations. Achieving a single view of the truth is a recurring challenge, often because data is confined in silos, inconsistently defined or locked behind access controls.

Often, the missing link is a centralized data governance model. There are too many fragmented ERP systems; data is spread across the business, and it is difficult to identify all data projects and how they are coordinated across the business.

In such cases, even if you acquire the right data analytics tools, the fact that you have a weak data governance model as well as a non-integrated systems infrastructure can act as a stumbling block to generating reliable and useful decision-support insights.

To fix the faltering data management process, you need to establish a data governance model that is flexible across the business and provides a centralized view of enterprise data – that is, the data the organization owns and how it is being managed and used across the business.

This is key to breaking down internal silos, achieving a single view of the truth, and building trust in your data to enable effective analytics and decision making.

Is your organization spending more time collecting and organizing data than analyzing it?

Have you put in place robust data governance and ownership processes required to achieve a single version of the truth?

Are you attracting and retaining the right data analytics talent and skills necessary to drive data exploration, experimentation and decision making?

Savvy CFOs are Using Data Analytics to Mitigate Risks

Data analytics has shifted from being “just a fad” to a business necessity. Once considered the playground of marketing, data analytics has entered mainstream business. Companies are no longer investing in data and analytics solely to aid marketers and drive revenues.

Rather, they are also exploring the opportunities of data analytics application in risk management.

The risk landscape is changing fast and this is driven mostly by increased volatility, heightened economic and political uncertainty, intense regulatory complexity, high-profile data breaches, rising employee fraud, shifting consumer habits and preferences, and increased competition.

As a result of these fundamental changes, the strategic conversation around risk is changing too. Business leaders should embrace risk as a tool to create value and achieve higher performance; it is no longer something only to fear, minimize and avoid.

Applying data and analytics to an organization’s risk efforts plays an important role in strengthening internal controls. Implementing stronger controls is essential for avoiding and reducing substantial financial and reputational losses.

Companies that have previously placed little value or emphasis on strengthening internal controls have learned the hard way, and for many, the wake-up call came too late.

High-profile Data Breaches

The number of cyber attacks and ensuing data breaches is rising at an alarming rate. Hackers are targeting companies across all industries and stealing troves of data for criminal gain. The recent global cyber attack “WannaCry” halted service delivery and brought businesses and countries to their knees, locking people out of their data and demanding a ransom on pain of losing everything.

In the wake of these massive data attacks, companies are waking up to the realization that they need to strengthen their cyber resilience programs.

Investing in data analytics is one way of achieving this, and CFOs are uniquely positioned within the organization to drive the analytics efforts. Although data is the oil of the new digital economy, finance executives must look at data in two ways – as a source of risk and as a means to manage the risk.

Real-time Monitoring of Data

Real-time monitoring is essential for reducing the potential for data breaches and better protecting the company’s strategic data. The CFO can help monitor the company’s data by performing real-time data-flow analysis and outlier analysis.

The former involves tracking the location of data at different points during a business process. The Internet of Things (IoT) has brought new ways of collecting and storing large quantities of data.

For instance, sensors are being installed in machines, clothing, delivery vehicles, wearable devices and company products, and these minute devices can transmit data to an internal server for further analysis and insight generation.

The majority of hacking incidents happen at night, when businesses have shut down for the day; it is during this period that companies are most prone to cyber breaches.

By regularly conducting data-flow analysis, personnel responsible for data security will be able to detect any unusual data queries made on the company’s database during a given period, and compare that number with trends over the last month, quarter, year or longer.

If a trend is identified, this should act as a starting point for asking specific questions around data security and trigger responses.
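As a minimal sketch of that kind of monitoring (synthetic data; the window and threshold are illustrative assumptions), hourly query counts can be compared against a rolling baseline and flagged when they deviate sharply:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
hours = pd.date_range("2019-01-01", periods=24 * 30, freq="H")
queries = pd.Series(rng.poisson(200, len(hours)), index=hours)
queries.iloc[-5] = 900                      # inject one suspicious night-time spike

rolling = queries.rolling(window=24 * 7)    # one-week baseline
zscore = (queries - rolling.mean()) / rolling.std()
print(queries[zscore > 3])                  # hours that warrant investigation
```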

Outlier analysis, mostly used by credit and debit card companies and other financial institutions, helps identify anomalies in the customer’s transaction history. Based on the historical transactions of the credit or debit card holder or customer over a period of time, the company is able to develop a profile for each and every customer.

Suppose one of your clients resides in, and mostly transacts from, Location A. One day, soon after a transaction is recorded in Location A, another large transaction is recorded in Location B, and the distance between the two locations makes it impossible for your customer to have been in both places within that time. That transaction must immediately be flagged as an outlier: something is unusual.

Thus, as your customers’ purchase history data grows, more focus should be placed on real-time outlier analysis. Thanks to technological innovation, today’s computers have the computing power to store and perform this critical analysis on very large datasets.
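A toy version of that “impossible travel” check (the coordinates and speed threshold are hypothetical) flags consecutive card transactions whose implied speed no traveller could achieve:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))

# (timestamp in hours, latitude, longitude) for one cardholder
transactions = [(0.0, 51.5074, -0.1278),   # London
                (1.5, 48.8566, 2.3522),    # Paris, 1.5 h later (plausible)
                (2.0, 40.7128, -74.0060)]  # New York, 0.5 h later (impossible)

MAX_SPEED_KMH = 900                         # roughly a commercial jet
for (t1, la1, lo1), (t2, la2, lo2) in zip(transactions, transactions[1:]):
    speed = haversine_km(la1, lo1, la2, lo2) / max(t2 - t1, 1e-9)
    if speed > MAX_SPEED_KMH:
        print(f"flag: implied speed {speed:,.0f} km/h between transactions")
```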

Make Use of Both Structured and Unstructured Data

Structured data is easy to analyze because it is highly organized and predictable. Unstructured data is essentially the opposite; it takes more effort and time to compile and analyze.

However, much of a company’s data is unstructured, and this is where CFOs can uncover perils and act almost immediately to avert them.

Thus, as social media networks continue to grow in use, finance executives need to find meaningful ways of combining data from multiple sources, regardless of location or format, for analysis.

It is through this combination and analysis of disparate datasets that finance is able to make informed analysis and provide improved decision support.

Many brands have suffered mishaps because of poor or misaligned social media strategies, for instance when a negative tweet goes viral before the company can respond, damaging its reputation.

Thus, a coherent and well-executed social media plan will help you detect external threats to the company’s reputation. One negative tweet has the potential to cost you key customers and shut your doors.

In high-performing companies, CFOs are taking advantage of new technologies and keeping an ear to the ground to hear what is being said about their companies on social media platforms.

Such software can gather and combine data from various social media platforms concerning the company’s products, services, competitors and more.

They have also deployed teams to provide round-the-clock monitoring of social media activities.

When this data is analyzed and insights are gleaned, the company can reach out to the message source, tell its side of the story and resolve any differences. Better still, the company can trigger a response ahead of any negative story.

Retail companies are making use of image-recognition software to detect product issues while they sit on market shelves and ensure these errors are corrected well in advance. Using their smartphones, sales reps can snap photos of the company’s products. The software then makes an instant visual analysis of the photos leading to corrective measures being taken.

Email Risk and Fraud Prevention

As employee fraud continues to skyrocket, email use, in its unstructured form, is getting special attention. If fraud perpetrators are not detected early and their plans are allowed to flourish, the organization stands to lose hundreds of thousands, if not millions, of dollars.

It is therefore imperative that companies invest more time and resources analyzing the email patterns of their employees. For example, real-time monitoring of patterns in the metadata of employees’ email communications can help you reveal risks before they take centre stage.

You need to look at clues such as: Who is the email being sent to? What are the subject and nature of the content? Is the email high or low priority? Who is copied and blind copied? What time of day is the email sent?

Investigating such information can help you monitor incoming and outgoing email traffic of specific groups or individuals, locate high risk areas that you need to look into and also establish if restricted company information is being released to the public either accidentally or on purpose.
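A toy sketch of metadata-based email risk scoring along these lines (the fields, domain and rules are hypothetical; no message content is read) could look like this:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class EmailMeta:
    sender: str
    recipients: list      # "To" addresses
    bcc: list             # blind-copied addresses
    subject: str
    sent_at: datetime
    high_priority: bool = False

def risk_flags(msg, internal_domain="example.com"):
    """Score an email on metadata alone, without reading its content."""
    flags = []
    if any(not addr.endswith("@" + internal_domain) for addr in msg.recipients + msg.bcc):
        flags.append("external recipient")
    if msg.bcc:
        flags.append("blind-copied recipients")
    if msg.sent_at.hour < 6 or msg.sent_at.hour >= 22:
        flags.append("sent outside business hours")
    return flags

msg = EmailMeta("alice@example.com", ["buyer@rival.net"], [],
                "Q3 price list", datetime(2019, 3, 1, 23, 40))
print(risk_flags(msg))  # ['external recipient', 'sent outside business hours']
```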

The success and power of data analytics in achieving value-adding risk management depend heavily on the quality and precision of the questions asked, the organization’s ability to gather data that addresses those questions, the integrity of the data gathered, and the ability of users to draw insights from the data objectively.

Before investing in data analytics software, first identify the challenges your business is currently facing and ask the critical key performance questions that you want answers for.

Transforming Finance Into An Analytics Powerhouse

As organizations continue to change at an unprecedented pace, the role of finance also continues to expand and transform. The function must do more than just report results; it must provide forward-looking analysis that supports strategic decision-making processes and enhances business performance. This increased pressure on CFOs to be business partners and strategic advisers is renewing the call for finance to embrace, and be at the forefront of, data analytics to guide smarter decision making.

CFOs Must Cultivate a Data-Driven Culture

Businesses are operating in an economy that is more technologically driven and data-centric. Digitization, increased globalization, changing business models, increased volatility and a changing regulatory environment continue to pose challenges for businesses, especially with regard to decision making. Unfortunately, data alone is not enough to make smarter decisions.

Making smarter decisions requires organizations to develop capabilities that enable them to quickly and easily transform this raw data into useful insights. These insights must be available to management in real-time otherwise they will end up working with a lot of “Dead Data.”

Finance is already used to dealing with large amounts of data, and because the function is centrally positioned to oversee various key decisions, CFOs should work closely with business teams in driving the analytics agenda. For example, they can:

  • Ask business leaders the critical questions they expect data analytics to answer. The more CFOs and their analytical teams continue to probe, the better the insights generated. In a constantly volatile environment, management must be able to model various what-if scenarios and their outcomes (see the sketch after this list).
  • Provide data-driven insights in the areas of pricing, inventory management, supply chain optimization, customer profitability and M&A, thereby demonstrating the value brought to the business by analytics.
  • Deploy dashboards that not only show financial metrics, but also operational, customer and process metrics and allow business leaders to drill down to the specifics themselves and make improved decisions.
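A minimal what-if sketch (the figures and scenario multipliers are illustrative only) shows how operating profit can be projected under several demand scenarios so leaders can compare outcomes:

```python
# Project operating profit under three demand scenarios.
scenarios = {"downside": 0.90, "base": 1.00, "upside": 1.12}  # demand multipliers

base_revenue, variable_cost_ratio, fixed_costs = 10_000_000, 0.60, 2_500_000

for name, multiplier in scenarios.items():
    revenue = base_revenue * multiplier
    profit = revenue * (1 - variable_cost_ratio) - fixed_costs
    print(f"{name:>8}: revenue ${revenue:,.0f} -> operating profit ${profit:,.0f}")
```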

Expand CFO Influence Outside the Finance Function

Traditional financial data from legacy ERP systems is no longer the main driver of decisions. Today’s businesses have more data (structured and unstructured) than in the past, and the rate at which this data is produced continues to increase at alarming levels. The predicted growth in data is exponential, with some experts predicting a 4,300 percent increase in annual data production by 2020.

Data should not be collected and left to become obsolete and irrelevant before it can be used for its intended purpose. Decision makers depend increasingly on insights derived from data to make better decisions. In the fight against cyber crime, companies are using predictive analytics to identify anomalies within their systems, assess vulnerabilities, predict attacks and resolve them automatically. No longer are companies relying solely on threat signatures to fight cyber crime.

In the retailing industry, companies are using analytics to understand customer preferences, segment customers, create market differentiation and improve margins. In other organizations, analytics are being applied to improve and strengthen operations. IoT devices are helping companies assess, monitor and enhance machine performance.

These examples alone show how data has become a strategic asset. By owning and driving analytics initiatives within their businesses, CFOs can continue to expand their strategic leadership role, strengthen their ties throughout the organization, extend their influence outside the finance function and become strategic business partners.

Adopt Modern Analytical Systems

Advancements in technology and the growth of shared services business models have reduced the amount of time finance executives spend on transactional and routine activities. Today, much of the CFO’s time is spent on strategic issues, for example, helping the CEO and other business leaders execute strategy, identifying M&A opportunities, purchasing and implementing IT systems, creating shareholder value, assessing and monitoring risks, and driving business performance.

In order to continue delivering on the above, CFOs must reduce their reliance on disconnected analytical data processes and legacy analytical systems, and invest in analytical capabilities that enable them to execute strategy more effectively, reduce processing cycle times, improve financial productivity and reduce finance operating costs.

Spreadsheets have their role in analytics, but beyond a certain scale they become limiting. As the business grows and the amount of data produced increases, it is worth investing in a data analytics system that is suited to your business needs and helps you achieve your strategic objectives. This is not just about replacing spreadsheets and old software with a new system and tools; it is about understanding that the new system is an enabler, not a lottery ticket to riches.

By becoming an analytics powerhouse, the finance function will be able to model various what-if scenarios and provide the foresight to predict future outcomes, the insight to make real-time strategic decisions, and the hindsight to analyze and improve historical performance. Overall, the organization will gain an advantage over its competitors.

I welcome your thoughts and comments

 
