TagData Analytics

Finance Analytics: It’s Not About the Size of The Data

As the need to make impactful operational and strategic decisions in real time increases, CFOs are playing a greater role in the adoption and integration of data analytics in their organizations to support data-driven decision making.

Executives and business unit leaders are increasingly relying on insights produced by Finance to better understand enterprise performance. That is, what has happened, why it has happened, what is most likely to happen in the future, and the appropriate course of action to take.

In an era where data is proliferating in volume and variety, decision makers have realized it’s no longer enough to base key enterprise performance and risk decisions on experience and intuition alone.

Rather, experience and intuition must be combined with a fact-based approach. This means CFOs must set up modernized reporting and analytics capabilities, with one of the main goals being the use of data as a tool for business decision making.

Appropriately analyzed and interpreted, data always has a story, and there’s always something to discover from it. However, many finance functions are failing to deliver value from their existing data analytics capabilities.

There is a misconception that to deliver actionable insights, the function needs more data for analysis. As a result, the supply of data keeps rising, while the ability to use it to generate informed insights lags badly.

Yet it’s not about the size of the data. It’s about translating available data and making it understandable and useful.

In other words, it’s about context and understanding that numbers alone do not tell the whole story. Finance leaders should connect the dots in ways that produce valuable insights or discoveries, and determine for example:

  • What is being measured, why, and how?
  • How extensive was the exploration that produced these discoveries?
  • How many additional factors were reviewed for correlation?

Further, to use data intelligently and influence better decision making, CFOs and their teams should recognize that most enterprise data is accumulated not to serve analytics, but as the by-product of routine tasks and activities.

Consider customer online and offline purchases data. Social media posts. Logs of customer communications for billing and other transactional purposes.

Such data is not produced for the purpose of prediction. Yet when analyzed, it can reveal valuable insights that can be translated into action delivering measurable benefits.

Often the company already has the data that it needs to answer its critical business performance questions, but little of it is being aggregated, cleaned, analyzed, and linked to decision making activities in a coherent way.

Exacerbating the issue is the mishmash of incompatible computer systems and data formats accumulated over the years, which makes it difficult to perform granular analysis at the product, supplier, geography, customer, and channel level, among other dimensions.

There is nothing grand about data itself. What matters most is how you are handling the flood of data your systems are collecting daily. Yes, data can always be accumulated but as a finance leader:

  • Are you taking time to dig down into the data and observing patterns?
  • Are the observed patterns significant to altering the strategic direction of the organization?
  • Are you measuring what you really want to know, what matters for the success of the business?
  • Or are you just measuring what is easy to measure rather than what is most relevant?

CFOs do not need more data. What they need right now is the ability to aggregate, clean and analyze the existing data sitting in the company’s computer systems and understand what story it is telling them.

Before they can focus on prediction, they first need to observe what is happening and why. Bear in mind correlation does not imply causation.

Yes, you might have discovered a predictive relationship between X and Y but this does not mean one causes the other, not even indirectly.

For instance, consider employee training hours and sales revenue. Just because there is a high correlation between the two does not mean an increase in training hours is causing a corresponding increase in sales revenue. A third variable might be driving the revenue increase.
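To make this concrete, here is a minimal simulation (all numbers hypothetical) in which a hidden third variable, overall business growth, drives both training hours and revenue, leaving the two strongly correlated even though neither causes the other:

```python
import numpy as np

# Hypothetical simulation: "growth" is an unobserved confounder that
# drives both training hours and revenue. Neither causes the other.
rng = np.random.default_rng(42)
n = 200

growth = rng.normal(size=n)                                 # hidden confounder
training_hours = 10 + 2 * growth + rng.normal(scale=0.5, size=n)
revenue = 100 + 5 * growth + rng.normal(scale=1.0, size=n)

corr = np.corrcoef(training_hours, revenue)[0, 1]
print(f"Correlation between training hours and revenue: {corr:.2f}")
```

The correlation comes out strongly positive, yet changing training hours in this simulated world would have no effect on revenue at all.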

Jumping to conclusions too soon about causality for a correlation observed in data can lead to bad decisions and far-reaching consequences, hence finance leaders should validate whether an observed trend is real rather than misleading noise before providing any causal explanation.

Certainly, big data can be a powerful tool, but it has its limits. Not all data is created equal, or equally valuable. There are situations where big data sets play a pivotal role, and others where small, rich data sets trump big data sets.

Before they decide to collect more data, CFOs should always remember data is comparable to an unexploited resource.

Even though data is now considered an important strategic asset for the organization, raw data is like oil that has been drilled and pulled out of the ground but not yet refined into kerosene and gasoline.

The data oil has not yet been converted into insights that can be translated into action to cut costs, boost revenues, streamline operations, and guide the company’s strategic direction.

Big Data, Insights and Decision Making

Today, we are living through an explosion in the amount and quality of all kinds of available information. Society is facing a deluge of data and there isn’t even a slight indication that this information glut will soon be halted.

The statistics below on data creation highlight the fact that no one is going to stop creating information.

As of 2013, experts believed that 90% of the world’s data was generated from 2011 to 2012.

In 2018, more than 2.5 quintillion bytes of data were created every day.

At the beginning of 2020, the digital universe was estimated to consist of 44 zettabytes of data.

By 2025, approximately 463 exabytes will be created every 24 hours worldwide.

As of June 2019, there were more than 4.5 billion people online.

80% of digital content is unavailable in nine out of every ten languages.

In 2019, Google processed 3.7 million queries, Facebook saw one million logins, and YouTube recorded 4.5 million videos viewed every 60 seconds.

Netflix’s content volume in 2019 outnumbered that of the US TV industry in 2005.

By 2025, there will be 75 billion Internet-of-Things (IoT) devices in the world.

By 2030, nine in every ten people aged six and above will be digitally active.

Source: SeedScientific

In the US, private companies now collect and sell as many as 75,000 individual data points about the average American consumer. And that number is minuscule compared with future expectations.

Why so much interest in customer data? Because the right data can tell business decision makers which customers to avoid and which they can exploit based on the company’s strategy and its stated objectives.

While it’s important to appreciate the benefits of data, we also need to acknowledge and respond to its drawbacks.

Just as people often confuse credit cards with currency, information alone is not intelligence. The process of creating intelligence is not simply a question of access to information.

Rather, it is about asking the right questions, and collecting the right data.

You need a lot of pixels in a photo in order to be able to zoom in with clarity on one portion of it. Similarly, you need a lot of observations in a dataset in order to be able to zoom in with clarity on one small subset of that data.

Source: Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are by Seth Stephens-Davidowitz.

Your business performance will not improve through Big Data alone. You need Rich Data. Deep Data. Even if it comes in the form of Small Data.

The biggest reason investments in data analytics fail to pay off, though, is that most companies are choking on data. They have lots of terabytes but few critical insights.

Instead of being adequately informed, they are excessively informed, taking much of what they already have for granted.

We are exceptional at storing information but fall short when it comes to retrieving the same information. As a result, we get overloaded.

Some important questions to consider before investing in new data:

  • Is more information necessarily good?
  • Does it really improve the decision-making process?
  • Can you extract value from the information you already have?
  • Are you overwhelmed but underserved by today’s information sources?
  • How much of the data under your possession is useful, and how much of it gets in the way? That is, what is your data’s Signal-to-Noise ratio?

What are the ensuing problems of information overload?

  • Indecisiveness due to paralysis by analysis. Endless analysis becomes so overwhelming that it is difficult to know how and when to decide.
  • Endless argumentation. In the era of limitless data, there is always an opportunity to crunch some numbers, spin them a bit and prove the opposite.
  • A total reliance on evidence-based decision making can undermine logical approaches to deliberation and problem solving. The solution is not always Big Data. The judgement of humans and small data is often necessary. We cannot just throw data at every question. Data, whether Big or small, and humans complement each other.

The growth in the amount of data without the ability to process it is not useful in and of itself. Once data has been analyzed, it needs to be summarized in an easy-to-understand way and presented visually to enable decision makers to apply their own expertise and make their own judgements.

Although Big Data offers us an opportunity to analyze new kinds of information and identify trends that have long existed but we hadn’t necessarily been aware of, there are a few things that it does not do well.

  • Data analysis is quite bad at narrative and emergent thinking.
  • It fails to analyze the social aspects of interaction or to recognize context. Human beings are undoubtedly good at telling stories that incorporate multiple causes.
  • Big Data also fails to identify which correlations are more or less likely to be false. The larger and more expansive the datasets, the more correlations there are, both false and true.

Correlation versus Causality is a huge issue in data analysis. The mere fact that two random variables are correlated does not automatically imply causation.

To test for causality, not merely correlations, randomized, controlled experiments (also called A/B Testing) are necessary.

  • People are divided into two groups.
  • The treatment group is shown, given, or asked to do something new.
  • For the control group, the status quo is maintained.
  • Each group is monitored to see how it responds.
  • The difference in outcomes between the two groups is attributed to the causal effect of the treatment.

Undertaking controlled experiments helps us learn which interventions work and which do not, and ultimately improves our decision making.
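The steps above can be sketched in code. This toy simulation assumes a 10% baseline conversion rate and 12% under the treatment, then applies a two-proportion z-test to gauge whether the observed lift is more than noise:

```python
import numpy as np

# Toy A/B test with assumed conversion rates (hypothetical numbers).
rng = np.random.default_rng(7)

# Treatment group sees the change; control keeps the status quo.
control = rng.binomial(1, 0.10, size=5000)     # assumed 10% baseline
treatment = rng.binomial(1, 0.12, size=5000)   # assumed 12% with the change

p_c, p_t = control.mean(), treatment.mean()
lift = p_t - p_c

# Two-proportion z-test: is the observed difference more than noise?
p_pool = (control.sum() + treatment.sum()) / (len(control) + len(treatment))
se = np.sqrt(p_pool * (1 - p_pool) * (1 / len(control) + 1 / len(treatment)))
z = lift / se

print(f"Observed lift: {lift:.3f}, z-statistic: {z:.2f}")
```

In practice, the required group sizes and significance threshold would be set in advance; this sketch only shows the mechanics of comparing the two groups.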

As you can see, the power of data lies in what business teams do with it. Clearly define enterprise data-use cases, aligning them with business strategy.

You don’t always need plenty of data to create key insights that inform decision making. You need the right data blended with other insights and observations gathered offline.

Information is now plentiful and inexpensive to produce, manipulate, and disseminate. Almost anyone can add information. The big question is how to reduce it and base critical decisions on only a tiny sampling of all available data.

Analytics and AI: Humans and Machines are Good at Different Aspects of Prediction

Mostly driven by growth in the IoT, and the widespread use of internet, social media and mobile devices to perform search, send text, email and capture images and videos, the amount of data that we are producing on a daily basis is startling.

Consequently, companies are turning to data analytics and AI technologies to help them make sense of all the data at their disposal, predict the future, and make informed decisions that drive enterprise performance.

Although adoption of analytics and AI systems is extending into ever more mission-critical business processes, the implications of these emerging technologies for business strategy, management, talent and decisions are still poorly understood.

For example, the single most common question in the AI debate is: “Will adoption of AI by businesses lead to massive human job cuts?”

Borrowing lessons from historical technological advances: yes, certain jobs will be lost, and new ones created. However, machines are not taking over the world, nor are they eliminating the need for humans in the workplace.

Jobs will still be there albeit different from the traditional roles many are accustomed to. The majority of these new roles will require a new range of education, training, and experience.

For instance, nonroutine cognitive tasks demanding high levels of flexibility, creativity, critical thinking, problem-solving, leadership, and emotional intelligence do not yet lend themselves to wholesale automation.

Analytics and AI rely on data to make predictions

The more and better data machine learning algorithms are continually fed, the more they learn and the better their predictions become.

Because these applications search for patterns in data, any inaccuracies or biases in the training data will be reflected in subsequent analyses.

But how much data do you need? The variety, quality and quantity of input, training and feedback data required depends on how accurate the prediction or business outcome must be to be useful.

Training data is used to train the predictive algorithms to predict the target variable, while the feedback data is used to assess and improve the algorithm’s prediction performance.
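As a sketch of this split (toy data, with an assumed linear relationship): the training portion fits the model, and a held-out feedback portion assesses its prediction performance:

```python
import numpy as np

# Toy example of the training/feedback split (illustrative numbers).
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=300)
y = 3 * x + 5 + rng.normal(scale=1.0, size=300)   # target variable

# Training data: used to fit the predictive algorithm.
x_train, y_train = x[:240], y[:240]
# Feedback (holdout) data: used to assess prediction performance.
x_test, y_test = x[240:], y[240:]

slope, intercept = np.polyfit(x_train, y_train, deg=1)
predictions = slope * x_test + intercept
mae = np.abs(predictions - y_test).mean()
print(f"Mean absolute error on feedback data: {mae:.2f}")
```

The feedback error tells you whether the prediction is accurate enough to be useful for the business outcome at hand, which in turn informs how much more data is worth acquiring.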

Undoubtedly, advanced analytics and AI systems are only as good as the data they are trained on. The data used to train these learning algorithms must be as free as possible of noise and hidden biases.

You therefore need to understand how predictive technologies learn from data to perform sophisticated tasks such as customer lifetime value modeling and profitability forecasting.

This helps guide important decisions around the scale, scope and frequency of data acquisition. It’s about striking a balance between the benefits of more data and the cost of acquiring it.

Humans and machines both have shortcomings

In the context of prediction, humans and machines both have recognizable strengths and weaknesses.

Unless we identify and differentiate which tasks humans and machines are best suited for, all analytics and AI investments will come to naught.

For instance, faced with complex information with intricate interactions between different indicators, humans perform worse than machines. Heuristics and biases often get in the way of making accurate predictions.

Instead of accounting for statistical properties and data-driven predictions, humans often place more emphasis on salient information unavailable to prediction systems.

And most of the time, that information is deceiving, hence the poor performance.

Although machines are better than humans at analyzing huge data sets with complex interactions among disparate variables, it’s crucial to be cognizant of situations where machines are poor at predicting the future.

The key to unlocking valuable insights from predictive analytics investments involves first and foremost understanding the definite business question that the data needs to answer.

This dictates your analysis plan and the data collection approaches you will choose. Get the business question wrong, and expect the insights and recommendations from the analysis to be wrong too.

Recall, with plentiful data, machine predictions can work well.

But, in situations where there is limited data to inform future decision making, machine predictions are relatively poor.

To quote Donald Rumsfeld, former US Secretary of Defense:

There are known knowns. These are things we know that we know. There are known unknowns. That is to say, there are things that we know we don’t know. But there are also unknown unknowns. There are things we don’t know we don’t know.


Thus, for known knowns, abundant data is readily available. Accordingly, humans trust machines to do a better job than they can. Even so, the level of trust changes the moment we start talking about known unknowns and unknown unknowns.

In these situations, machine predictions are relatively poor because there is little data to feed into the prediction model.

Think of infrequent events (known unknowns) that occur once in a while, or something that has never happened before (unknown unknowns).

At least for infrequent events, humans are occasionally better at predicting with little data.

This is generally because we are good at comparison and prudent judgement: examining new situations and identifying comparable settings from past experience that prove useful in the new one.

We are naturally wired to remember key pieces of information from the little data available or the limited associations we have had in the past.

Rather than being precise, our predictions come with a confidence range that acknowledges their uncertainty.

Faced with unknown unknowns, both humans and machines are relatively bad at predicting their arrival.

The simple truth is that we cannot predict truly new events from past data. Look no further than the current Brexit conundrum.

Nobody precisely knew the unintended consequences of the UK leaving the EU. Leavers and Remainers both speculated as to what the benefits and disadvantages of leaving the EU may be.

Of course, nobody knows what will happen in the future but that doesn’t mean we can’t be prepared, even for the unknown unknowns.

In their book Prediction Machines: The Simple Economics of Artificial Intelligence, Ajay Agrawal, Joshua Gans, and Avi Goldfarb present an additional category of scenarios under which machines also fail to predict precisely – Unknown Knowns.

Per the trio:

Unknown knowns is when an association that appears to be strong in the past is the result of some unknown or unobserved factor that changes over time and makes predictions we thought we could make unreliable.


With unknown knowns, predictive tools appear to provide a very accurate answer, but that answer can be very incorrect, especially if the algorithms have little grasp of the decision process that created the data.

To support their point of view, the authors make reference to pricing and revenue analysis in the hotel industry, although the same viewpoint is applicable elsewhere.

In many industries, higher prices tend to coincide with higher sales, and vice versa.

For example, in the airline industry, airfares are low outside the peak season, and high during peak seasons (summer and festive) when travel demand is highest.

Presented with this data, and lacking an understanding that price movements are often a function of demand and supply, a simple prediction model might advocate raising airfares across routes to sell more empty seats and increase revenues. This is a classic causal inference problem.
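A small simulation (hypothetical airline numbers) illustrates the trap: peak-season demand raises both fares and tickets sold, so a naive model learns a positive fare coefficient, while controlling for season recovers the true negative price effect:

```python
import numpy as np

# Hypothetical airline data: "season" confounds fares and ticket sales.
rng = np.random.default_rng(1)
n = 500

season = rng.binomial(1, 0.5, size=n)                   # 1 = peak season
fare = 100 + 80 * season + rng.normal(scale=5, size=n)
# The assumed true causal effect of fare on demand is NEGATIVE (-0.5),
# but peak-season demand adds +100 tickets, swamping the price effect.
tickets = 300 - 0.5 * fare + 100 * season + rng.normal(scale=10, size=n)

# Naive model: regress tickets on fare alone.
naive_slope = np.polyfit(fare, tickets, deg=1)[0]
print(f"Naive fare coefficient: {naive_slope:.2f}")     # comes out positive

# Controlling for season recovers the negative price effect.
X = np.column_stack([fare, season, np.ones(n)])
fare_coef = np.linalg.lstsq(X, tickets, rcond=None)[0][0]
print(f"Fare coefficient controlling for season: {fare_coef:.2f}")
```

The naive model would happily recommend raising fares to sell more seats; adding the seasonality variable reverses the sign of the price effect.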

But, a human being with a solid understanding of economics concepts will immediately call attention to the fact that increasing airfares is unlikely to increase flight ticket sales.

To the machine, this is an unknown known. But to a human with knowledge of pricing and profitability analysis, this is a known unknown or maybe even a known known provided the human is able to properly model the pricing decision.

Thus, to address such shortcomings, humans should work with machines to identify the right data and appropriate data analysis models that take into consideration seasonality and other demand and supply factors to better predict revenues at different prices.

As data analytics and AI systems become more advanced and spread across industries, and up and down the value chain, companies that will progress further are those that are continually thinking of creative ways for machines to integrate and amplify human capabilities.

In contrast, those companies that are using technology simply to cut costs and displace humans will eventually stop making progress, and cease to exist.

Building Trust in Your Data, Insights and Decisions

Over the years, the adoption of advanced analytics and data visualization tools by enterprises in order to generate key insights that can support quality business performance decisions has considerably evolved.

Nonetheless, not all investments in this arena are producing the desired outcomes.

Although significant sums of money, resources and time have been channeled towards data and analytics initiatives, the quality of insights generated, and ultimately of the decisions made, is far from convincing.

Some possible reasons for these failures include:

  • A sole focus on technology, and less on processes and people.
  • Misguided belief that the more data you have the better.
  • Lack of or poor organizational alignment and cultural resistance.

Delivering trusted insights is not a simple matter of investing in the right technology alone. Neither will simply collecting and storing large amounts of data lead to better decisions.

For instance, investing in state-of-the-art data analytics and visualization tools at the expense of data quality doesn’t automatically empower you to generate valuable insights that drive business performance.

Or you may have the right technology and quality data, but poor organizational alignment coupled with intense cultural resistance.

So how can your organization use data and analytics to generate trusted insights, make smarter decisions and impact business performance?

Prior to generating any business insights, trends or activity patterns, it’s imperative to ensure that the performance data to be analyzed is trusted. That is, the data must be fit for purpose, current, accurate, consistent and reliable.

With so much data and information available today, it’s increasingly difficult for decision makers to determine which data points are essential in understanding what drives the business and incorporate these into business decision making.

As a result, many organizations end up hoarding data and incurring unnecessary expenses in the process.

To ensure your data is fit for purpose, first and foremost, develop an understanding of the real business questions that the data needs to answer. By asking questions at the very beginning, you will be able to identify and define any problems or issues in their context.

Further, asking relevant questions will help you develop an improved understanding of the present factors, past events, or emerging opportunities driving the need for advanced analytics and data visualization investments in the first place.

Conversely, asking the wrong questions results in ineffectual solutions to the wrong business problems.

Generally, people want to kick off their data analysis effort by collecting data first, rather than asking the relevant questions, establishing the business need, and formulating the analysis plan.

But, the quality of the insights and the value of the recommendations delivered are rooted in the quality of the data you start out with.

Data collection should only begin once the purpose and goals of the data analysis are clearly defined. Thus, the more and better questions you ask, the better your insights, recommendations, decisions and ultimately business performance will be.

Data accuracy, consistency and reliability

Ensuring that data used in analysis is accurate, consistent and reliable remains one of the common hurdles hampering analytics initiatives. One of the reasons for this is multiple data sources.

For example, suppose you want to evaluate the impact of a given marketing campaign on customer experience. In the past, you could have managed with data from your CRM system alone.

Today, in order to get a 360-degree view of your customers, analyzing CRM data alone is not sufficient. You also need to analyze data from your web analytics system, customer feedback surveys, data warehouses, email, call recordings, social media or production environments.

Thus, you need to bring these disparate data sources together to generate valuable, actionable insights.

Issues arise when the multiple data sources define key business performance terms or metrics (e.g. sales, conversion, margin growth) differently and there is no common definition shared across the business.

This lack of shared understanding of business performance has far reaching implications on your analysis and makes it futile to compare results.

Data collection, followed by data cleansing and validation, is therefore essential. After you have collected the data per your analysis plan, it’s important to clean and validate it to make it usable and accurate. Select a small data sample and compare it with what you expected.

Ensure the data type matches your expectations. Remember the adage: garbage in, garbage out. Even if you have the best-of-breed analytics tools to slice and dice information, with poor data quality practices your insights and decisions will remain unsubstantiated, erroneous and worthless.
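A minimal sketch of such sample checks, with illustrative field names and validation rules (both are assumptions, not a prescribed schema):

```python
import random

def validate_records(records, sample_size=5, seed=0):
    """Return a list of issues found in a random sample of records."""
    random.seed(seed)
    sample = random.sample(records, min(sample_size, len(records)))
    issues = []
    for i, rec in enumerate(sample):
        # Type check: revenue should be numeric.
        if not isinstance(rec.get("revenue"), (int, float)):
            issues.append(f"record {i}: revenue is not numeric")
        # Range check: revenue should not be negative.
        elif rec["revenue"] < 0:
            issues.append(f"record {i}: negative revenue")
        # Completeness check: region should be a non-empty string.
        if not isinstance(rec.get("region"), str) or not rec["region"]:
            issues.append(f"record {i}: missing region")
    return issues

records = [
    {"revenue": 1200.0, "region": "EMEA"},
    {"revenue": "n/a", "region": "APAC"},    # garbage in...
    {"revenue": 950.5, "region": ""},        # missing region
]
print(validate_records(records))
```

Even such basic checks, run before analysis begins, catch the kind of type and completeness problems that otherwise surface as untrustworthy insights downstream.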

For this reason, and to ensure the consistency of the data used in your analysis, know and understand the source of every piece of your data, its age, its components, what is not included in it, and when and how often it is refreshed.

Without data integrity, it is difficult to generate trusted insights and deliver recommendations that decision makers can confidently act on. Also, delivering trusted insights goes beyond merely attaching a colourful report in an email and hitting the send button.

Instead, the insights must be delivered in a digestible and actionable format. That is, communicated in a format that is more readily understood by decision makers in the organization. Poorly communicated insights can result in no action, negating the value of the insights in the first place. Thus, avoid getting bogged down in details.

Knowing is powerful, but acting based on knowledge is what drives smart decisions, creates true value and impacts business performance.

Converting Data Into Insights: The Right Technology Alone is No Guarantee of Success

The fact that technology is playing an increasingly significant role in executing many traditional finance tasks while at the same time generating greater insights that drive business performance is irrefutable.

However, even though many organizations are investing significantly in advanced data and analytics technologies, the majority of these investments are reportedly yielding disappointing returns.

This is often because they focus mostly on the implementation of new tools, and less on the processes and people. For instance, there is a widespread misconception that data analytics is a technology issue, and that success is about having the most data.

As a result, many organizations are ultimately spending large amounts of money on new technologies capable of mining and managing large datasets.

There is relatively little focus on generating actionable insights out of the data, and building a data-driven culture, which is the people aspect of analysis.

Data analytics is not purely a technology issue. Instead, it is a strategic business imperative that entails leveraging new technologies to analyze the data at your disposal, gain greater insight about your business, and guide effective strategy execution.

Furthermore, having the right data is more important than having the most data. Your organization might have the most data, but if you’re not analyzing that data and asking better questions to help you make better decisions, it counts for nothing.

Reconsider your approach to people, processes and technology

The success of any new technology greatly depends on the skills of the people using it. Additionally, you need to convince people to use the new technology or system, otherwise it will end up being another worthless investment.

Given digital transformation is here to stay, for any technology transformation to be a success, it’s imperative to have the right balance of people, IT and digital skills.

It’s not an issue of technology versus humans. It’s about striking the right balance between technology and people, with each doing what it does best. In other words, establishing the right combination of people and technology.

For example, advanced data analytics technologies are, by far, better than humans at analyzing complex data streams. They can easily identify trends and patterns in real-time.

But, generating useful insights from the data all depends on human ability to ask the right key performance questions, and identify specific questions that existing tools and techniques are currently not able to answer.

Rather than hastily acquiring new data and tools, start by reviewing current data analytics tools, systems and related applications. This helps identify existing capabilities including tangible opportunities for improvement.

It’s not about how much the business has invested or is willing to invest in data analytics capabilities; it’s about how the business is leveraging existing tools, data and insights it currently has to drive business growth, and where necessary, blending traditional BI tools with new emerging data analytics tools.

That’s why it’s critical to build a clear understanding of which new technologies will be most beneficial to your business.

Technology alone will not fix broken or outdated processes. Many businesses are spending significant amounts of money on new tools, only to find out later that it’s not the existing tool that is at fault but rather the process itself.

Take the data management process as an example, especially at larger, more complex organizations. Achieving a single view of the truth is a recurring challenge, often because data is confined in silos, inconsistently defined, or locked behind access controls.

A centralized data governance model is the missing link. There are too many fragmented ERP systems. Data is spread across the business, and it’s difficult to identify all data projects and how they are systematized across the business.

In such cases, even if you acquire the right data analytics tools, the fact that you have a weak data governance model as well as a non-integrated systems infrastructure can act as a stumbling block to generating reliable and useful decision-support insights.

To fix the faltering data management process, you need to establish a data governance model that is flexible across the business and provides a centralized view of enterprise data – that is, the data the organization owns and how it is being managed and used across the business.

This is key to breaking down internal silos, achieving a single view of the truth, and building trust in your data to enable effective analytics and decision making.

Is your organization spending more time collecting and organizing data than analyzing it?

Have you put in place robust data governance and ownership processes required to achieve a single version of the truth?

Are you attracting and retaining the right data analytics talent and skills necessary to drive data exploration, experimentation and decision making?

Savvy CFOs are Using Data Analytics to Mitigate Risks

Data analytics has shifted from being “just a fad” to a business necessity. Once considered the playground of marketers, data analytics has entered mainstream business. Companies are no longer investing in data and analytics with the sole purpose of aiding marketers and driving revenues.

Rather, they are also exploring the opportunities of data analytics application in risk management.

The risk landscape is changing fast and this is driven mostly by increased volatility, heightened economic and political uncertainty, intense regulatory complexity, high-profile data breaches, rising employee fraud, shifting consumer habits and preferences, and increased competition.

As a result of these fundamental changes, the strategic conversation around risk is changing too. Business leaders should embrace risk as a tool to create value and achieve higher performance. It is no longer something to only fear, minimize and avoid.

Applying data and analytics to an organization’s risk efforts plays an important role in strengthening internal controls. Implementing stronger controls is essential for avoiding and reducing substantial financial and reputational losses.

Companies that have previously placed little value or emphasis on strengthening internal controls have learned the hard way, and for many, the wake-up call came too late.

High-profile Data Breaches

The number of cyber attacks and ensuing data breaches is rising at an alarming rate. Hackers are targeting companies across all industries and stealing treasure troves of data for criminal proceeds. The “WannaCry” global cyber attack, for example, halted service delivery and brought businesses and countries to their knees, locking people out of their data and demanding a ransom under threat of losing everything.

In the wake of these massive data attacks, companies are waking up to the realization that they need to strengthen their cyber resilience programs.

Investing in data analytics is one way of achieving this, and CFOs are uniquely positioned within the organization to drive the analytics efforts. Although data is the oil of the new digital economy, finance executives must look at data in two ways – as a source of risk and as a means to manage the risk.

Real-time Monitoring of Data

Real-time monitoring is essential for reducing the potential of data breaches and better protecting the company’s strategic data. The CFO can help monitor the company’s data by performing real-time data-flow analysis and outlier analysis.

The former involves tracking the location of data at different times during a business process. Internet of Things (IoT) has brought about new ways of collecting and storing large quantities of data sets.

For instance, sensors are being installed in machines, clothing items, delivery vehicles, wearable devices, company products and more. These minute devices are capable of transmitting data to an internal server for further analysis and insight generation.

The majority of data hacking incidents happen at night, when businesses have shut down for the day. It is during this period that companies are most prone to cyber breaches.

By regularly conducting data-flow analysis, personnel responsible for data security will be able to detect any unusual data queries being made on the company’s database during a certain period, and compare that number with trends over the last month, quarter, year or longer.

If a trend is identified, this should act as a starting point for asking specific questions around data security and trigger responses.
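A minimal sketch of how such a data-flow check might work: compare tonight’s query count against the historical baseline using a simple z-score test. The counts, window and threshold here are all illustrative assumptions, not a prescribed method.

```python
from statistics import mean, stdev

def flag_unusual_query_counts(history, current, threshold=3.0):
    """Flag a night's database query count that deviates sharply
    from the historical baseline (simple z-score test).

    history -- list of nightly query counts from past periods (illustrative)
    current -- tonight's observed query count
    """
    baseline = mean(history)
    spread = stdev(history)
    if spread == 0:
        return current != baseline
    z = (current - baseline) / spread
    return abs(z) > threshold

# Example: roughly 200 queries per night historically, 950 tonight
past_nights = [198, 205, 190, 210, 202, 195, 208, 201]
print(flag_unusual_query_counts(past_nights, 950))  # True: investigate
```

In practice the baseline would come from the trends over the last month, quarter or year mentioned above, and a flagged night would trigger the specific data-security questions rather than an automatic conclusion.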

Outlier analysis, mostly used by credit and debit card companies and other financial institutions, helps identify anomalies in a customer’s transaction history. Based on each cardholder’s historical transactions over a period of time, the company is able to develop a profile for every customer.

Suppose one of your clients resides in Location A, where he or she mostly transacts. One day, soon after a transaction is recorded in Location A, another large transaction is recorded in Location B within a short period of time, and the travel distance between A and B makes it impossible for your customer to be in both places. This transaction must immediately be flagged as an outlier, telling you that something is unusual.

Thus, as your customers’ purchasing history data increases, more focus should be placed on real-time outlier analysis. Thanks to technological innovation, today’s computers have massive computing power to store and perform this critical analysis on very large datasets.
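The Location A/Location B scenario above is often called an “impossible travel” check. A minimal sketch, assuming each transaction carries a timestamp and coordinates (the locations, times and speed threshold are illustrative):

```python
from datetime import datetime
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def is_impossible_travel(tx_a, tx_b, max_speed_kmh=900.0):
    """Flag consecutive transactions whose implied travel speed exceeds
    a plausible maximum (roughly airliner speed by default)."""
    hours = abs((tx_b["time"] - tx_a["time"]).total_seconds()) / 3600
    distance = haversine_km(tx_a["lat"], tx_a["lon"], tx_b["lat"], tx_b["lon"])
    if hours == 0:
        return distance > 0
    return distance / hours > max_speed_kmh

# Hypothetical pair: a purchase in London, then one in New York 30 minutes later
a = {"time": datetime(2024, 5, 1, 12, 0), "lat": 51.5, "lon": -0.12}
b = {"time": datetime(2024, 5, 1, 12, 30), "lat": 40.7, "lon": -74.0}
print(is_impossible_travel(a, b))  # True: flag as an outlier
```

Real card-fraud systems build richer per-customer profiles than this, but the same comparison against a customer’s historical pattern is the core idea.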

Make Use of Both Structured and Unstructured Data

Structured data is easy to analyze because it is highly organized and predictable. Unstructured data is essentially the opposite: it takes more effort and time to compile and analyze.

However, much of the company’s data is unstructured, and this is where CFOs can uncover perils and act almost immediately to avert hazards.

Thus, as social media networks continue to grow in use, finance executives need to find meaningful ways of combining data from multiple sources, regardless of location or format, for analysis.

It is through this combination and analysis of disparate datasets that finance is able to make informed analysis and provide improved decision support.

Many brands have suffered mishaps because of poor or misaligned social media strategies. For instance, a negative tweet can go viral before the company is able to respond, damaging its reputation.

Thus, having a coherent, well-executed social media plan will help you detect external threats to the company’s reputation. One negative tweet has the potential to cost you key customers and even force you to shut your doors.

In high-performing companies, CFOs are taking advantage of new technologies and keeping an ear to the ground to hear what is being said about their companies on social media platforms.

This software has the capability to gather and combine data from various social media platforms concerning the company’s products, services, competitors and more.

They have also deployed teams to provide round-the-clock monitoring of social media activities.

When this data is analyzed and insights are gleaned, the company can reach out to the message source, tell its side of the story and resolve any differences. Better still, the company is also able to get ahead of any negative story.

Retail companies are making use of image-recognition software to detect product issues while they sit on market shelves and ensure these errors are corrected well in advance. Using their smartphones, sales reps can snap photos of the company’s products. The software then makes an instant visual analysis of the photos leading to corrective measures being taken.

Email Risk and Fraud Prevention

As employee fraud continues to skyrocket, email use, in its unstructured form, is getting special attention. If fraud perpetrators are not detected well in advance and their plans are allowed to flourish, the organization stands to lose hundreds of thousands, if not millions, of dollars.

It is therefore imperative that companies invest more time and resources analyzing the email patterns of their employees. For example, real-time monitoring of patterns in the metadata of employees’ email communications can help you reveal risks before they take centre stage.

You need to look at clues such as – Who is the email being sent to? What is the subject and the nature of the content? Is the email high priority or low priority? Who is copied and blind copied? What time of the day is the email sent?

Investigating such information can help you monitor incoming and outgoing email traffic of specific groups or individuals, locate high risk areas that you need to look into and also establish if restricted company information is being released to the public either accidentally or on purpose.
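A toy sketch of how the email-metadata clues listed above could be scored. The field names, the internal domain and every threshold here are illustrative assumptions, not a standard schema:

```python
def email_risk_signals(meta):
    """Check one email's metadata against a few simple heuristics.

    meta -- dict with hypothetical keys: hour_sent (0-23), bcc (list),
            has_attachment (bool), recipient_domains (list of str).
    """
    signals = []
    # Heuristic 1: messages sent well outside business hours
    if meta["hour_sent"] < 6 or meta["hour_sent"] >= 22:
        signals.append("sent outside business hours")
    # Heuristic 2: blind-copied recipients
    if meta["bcc"]:
        signals.append("recipients blind-copied")
    # Heuristic 3: attachment going to a non-company domain
    external = [d for d in meta["recipient_domains"] if d != "ourcompany.com"]
    if external and meta["has_attachment"]:
        signals.append("attachment sent to an external domain")
    return signals

# A hypothetical late-night message with an attachment to a personal address
msg = {
    "hour_sent": 23,
    "bcc": ["someone@example.com"],
    "has_attachment": True,
    "recipient_domains": ["example.com"],
}
print(email_risk_signals(msg))  # all three heuristics fire
```

A single flagged message proves nothing on its own; as the surrounding text suggests, the value is in monitoring these signals over time for specific groups or individuals and investigating the high-risk patterns.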

The success and power of data analytics in achieving value-adding risk management depends heavily on the quality and preciseness of the questions asked, the organization’s ability to gather data that addresses these questions, the integrity of the data gathered and the ability of users to draw insights from the data in an objective manner.

Before investing in data analytics software, first identify the challenges your business is currently facing and ask the critical key performance questions that you want answers for.

Transforming Finance Into An Analytics Powerhouse

As organizations continue to change at an unprecedented pace, the role of finance also continues to expand and transform. The function must do more than just report results; it must provide forward-looking analysis that supports strategic decision-making processes and enhances business performance. This increased pressure on CFOs to be business partners and strategic advisors is renewing the call for finance to embrace, and be at the forefront of, data analytics to guide smarter decision making.

CFOs Must Cultivate a Data-Driven Culture

Businesses are operating in an economy that is more technologically driven and data-centric. Digitization, increased globalization, changing business models, increased volatility and a changing regulatory environment continue to pose challenges to businesses, especially with regard to decision making. Unfortunately, data alone is not enough to make smarter decisions.

Making smarter decisions requires organizations to develop capabilities that enable them to quickly and easily transform this raw data into useful insights. These insights must be available to management in real-time otherwise they will end up working with a lot of “Dead Data.”

Finance is already used to dealing with large amounts of data and because the function is centrally positioned within the business to oversee various key decisions, CFOs should work more closely with business teams in driving their analytics agenda. For example, they can:

  • Ask business leaders critical questions they expect data analytics to answer. The more CFOs and their analytical teams continue to probe, the better the insights generated. In a constantly volatile environment, management must be able to model various what-if-scenarios and their outcomes.
  • Provide data-driven insights in the areas of pricing, inventory management, supply chain optimization, customer profitability and M&A, thereby demonstrating the value brought to the business by analytics.
  • Deploy dashboards that not only show financial metrics, but also operational, customer and process metrics and allow business leaders to drill down to the specifics themselves and make improved decisions.
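The what-if-scenario modeling called for above can start very simply: a driver-based profit model whose inputs management can vary. All figures and scenario names below are hypothetical:

```python
def project_operating_profit(volume, price, unit_cost, fixed_costs):
    """Driver-based profit model: contribution margin minus fixed costs."""
    return volume * (price - unit_cost) - fixed_costs

# Base case versus two what-if scenarios (illustrative figures only)
scenarios = {
    "base":       dict(volume=100_000, price=25.0, unit_cost=15.0, fixed_costs=600_000),
    "price_cut":  dict(volume=115_000, price=23.0, unit_cost=15.0, fixed_costs=600_000),
    "cost_shock": dict(volume=100_000, price=25.0, unit_cost=17.5, fixed_costs=600_000),
}
for name, s in scenarios.items():
    print(f"{name}: {project_operating_profit(**s):,.0f}")
```

Even a model this small lets a business leader see, for example, that a price cut partially offset by higher volume erodes profit less than a cost shock does; richer models simply add more drivers and probability-weighted outcomes.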

Expand CFO Influence Outside the Finance Function

Traditional financial data from legacy ERP systems is no longer the main driver of decisions. Today’s businesses have more data (structured and unstructured) than in the past, and the rate at which this data is being produced continues to increase at alarming levels. Growth in data is exponential, with some experts predicting a 4,300 percent increase in annual data production by 2020.

It is not a case of collecting data and leaving it to become obsolete and irrelevant before it can be used for its intended purpose. Decision makers are depending more on insights derived from data to make better decisions. In the fight against cyber crime, companies are using predictive analytics to identify anomalies within their systems, assess vulnerabilities, predict attacks and automatically resolve them. No longer are companies relying solely on threat signatures to fight cyber crime.

In the retailing industry, companies are using analytics to understand customer preferences, segment customers, create market differentiation and improve margins. In other organizations, analytics are being applied to improve and strengthen operations. IoT devices are helping companies assess, monitor and enhance machine performance.

These examples alone show how data has become a strategic asset. By owning and driving analytics initiatives within their businesses, CFOs can continue to expand their strategic leadership role, strengthen their ties throughout the organization, extend their influence outside the finance function and become strategic business partners.

Adopt Modern Analytical Systems

Advancements in technology and the growth of Shared Services business models have reduced the amount of time finance executives spend on transactional and routine activities. Today, much of the CFO’s time is spent on strategic issues, for example, helping the CEO and other business leaders execute strategy, identifying M&A opportunities, purchasing and implementing IT systems, creating shareholder value, assessing and monitoring risks, and driving business performance.

In order to continue delivering on the above, CFOs must reduce their reliance on disconnected analytical data processes and legacy analytical systems, and invest in analytical capabilities that enable them to execute strategy more effectively, reduce processing cycle times, improve financial productivity and reduce finance operating costs.

Spreadsheets have their role in analytics but it is important to note that upon reaching a certain level, they become limited. As the business grows and the amount of data produced increases, it is worth investing in a data analytics system that is suited to your business needs and helps you achieve your strategic objectives. This is not just about replacing spreadsheets and the old software with the new system and tools. Instead, it is about understanding the fact that the new system is just an enabler and not your lottery ticket to riches.

By becoming an analytics powerhouse, the finance function will be able to model various what-if scenarios and provide the foresight to predict future outcomes, the insight to make real-time strategic decisions, and the hindsight to analyze and improve historical performance. Overall, the organization will have an advantage over its competitors.

I welcome your thoughts and comments.


Data Analytics and the FP&A Function

Technological advancements in Big Data and Analytics are having a significant impact on the business’s operating model and strategic performance.

Many companies are already exploring how best they can adopt big data and analytics technologies to improve their businesses, reduce costs, streamline processes, improve marketing initiatives, and pursue future profits.

In the majority of these organizations, the marketing and supply chain functions are setting the pace in applying these new analytic capabilities and serving customers.

Unfortunately, finance is lagging behind and still holding on to its legacy systems and primitive technologies.

All is not yet lost; there is still hope for finance to embrace advanced analytical technologies and help drive business performance.

In addition to helping the marketing and supply chain functions, Big Data and Analytics can also play a critical role in helping the finance function fulfill its FP&A role in today’s dynamic business environment.

In a world awash with large volumes of data, unstructured and structured, being able to identify patterns, anomalies and derive strategic insights is key for effective decision making.

Having this ability to access, synthesize and monetize data requires the FP&A function to invest in new skills and data tools and take advantage of the potential uses of new data types.

It is therefore imperative for the CFO to consider the implications of investing in Big Data and Analytics technologies as well as the impact of using data for effective decision making.

Modern technologies are not the domain of the CIO alone; they are the CFO’s concern too. Finance must learn to partner with the business, understand the language of IT and develop an ability to identify and evaluate the various ways data analytics technology can help the FP&A function.

CFOs should be asking themselves, how best can they leverage Big Data and Analytics technology to help improve the organization’s budgeting, planning and forecasting processes? How best can they enrich operational and financial forecasts with the most reliable data and make them more accurate?

While everyone is talking about Big Data and Analytics these days and their potential to transform the organization and create competitive advantage, it is easy for management and executives to join the “Big Data Dream” without first formulating a clear and coherent data strategy.

In the end, these executives end up collecting large volumes of data, most of it being worthless, resulting in the business incurring significant data costs and suffering from ineffective decision making.

The value in data is found when the organization is able to collect, synthesize and analyze it, derive strategic insights from it, and improve the decision-making process.

Key considerations prior to making significant investments in data analytics technologies include the alignment of the data strategy with the broader strategy of the organization, data access and governance issues, new skills requirements, and the implementation road map.

One of the challenges facing many FP&A functions is identifying the relevant data to analyze, spotting trends and gaining valuable strategic insights. When preparing forecasts and analyzing business performance, there is a need, for example, to synthesize data across operational, financial and customer information.

How can all this data be integrated and used for decision-support purposes? Unfortunately, not many finance professionals possess the ability to manage this new data and these new data types in a way that creates visibility across the organization and benefits other functions.

Thus, it is crucial for the FP&A function in today’s economy to develop data mining and analysis capabilities to ensure relevant data is being used for strategic and performance improvement decision making processes.

There is a need, therefore, for the organization to radically change its data approach and evaluate how it fits within the overall strategy of the business. Effective KPIs, measures and metrics must be designed and implemented to help managers run the business.

This approach will ensure that information requirements as well as investments in advanced technologies are not managed in silos but rather, in a deliberate and organized fashion that aligns and supports the broader strategy of the business.

Furthermore, as data becomes a strategic asset to the organization, it is crucial that the CFO collaborates with other senior executives, engages them in planning conversations and works with them to find ways of improving business performance.

In today’s economic environment, data is found everywhere, both internally and externally, and this data can only be accessed if finance decides to leave its comfort zone and begin engaging with the business.

Unfortunately, many finance professionals have a strong technical background and are weak when it comes to soft skills. Walking around the business, initiating conversations with other functions and asking smart questions doesn’t come naturally to many finance professionals.

However, as the role of finance continues to evolve, the finance organization must learn to adapt and acquire new soft skills to avoid being left behind in the back office.

Today’s finance professional is required not only to have data skills, but also the ability to communicate with the broader organization, a strategic mindset, a deeper understanding of the operational areas of the organization, and the ability to identify opportunities and help the business grow by reducing costs and by evaluating and increasing top-line revenues.

FP&A must be able to ask smart questions and identify the relevant business needs that can be addressed by data analytics. How can the business benefit from technology, make smarter and faster decisions and also become more efficient? How can an investment in data and analytics technology help the business identify emerging or unknown risks areas and manage these risks intelligently?

By taking advantage of data analytics technologies, the FP&A function will be able to identify business performance leading indicators based on the data available, adjust forecasts and drive that information into operations.

Uncertainty and volatility in the global economy are on the rise and, because of this, many business executives are worried and cautious about where to make large capital expenditures. As the support function with the broadest visibility across the organization, the FP&A function can help address this uncertainty.

By embracing modern technologies, the function can transition from business intelligence and reporting on what happened, to data mining and understanding why something happened, to predictive analytics and determining what is likely to happen and when.

This will in turn help management and executives put contingency plans in place and become proactive.
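As a toy illustration of that last step, moving from reporting what happened to projecting what is likely to happen, here is a least-squares trend extrapolation over a historical series. The revenue figures are purely illustrative, and real predictive analytics would use far richer models:

```python
def linear_forecast(series, periods_ahead=1):
    """Fit a least-squares straight line to a historical series and
    extrapolate it forward by the requested number of periods."""
    n = len(series)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(series) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, series)) \
        / sum((x - x_mean) ** 2 for x in xs)
    intercept = y_mean - slope * x_mean
    return intercept + slope * (n - 1 + periods_ahead)

# Hypothetical quarterly revenue, in $m
revenue = [10.2, 10.8, 11.1, 11.9, 12.4]
print(round(linear_forecast(revenue, 1), 2))  # next-quarter projection
```

The progression in the text maps onto this neatly: business intelligence reports the `revenue` series, data mining explains its movements, and predictive analytics extends the fitted line forward so contingency plans can be made before the quarter closes.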

Taking a phased approach is necessary when implementing data analytics technologies. Instead of going full force with the implementation, the organization can start small and use a specific business unit as the basis for a pilot project.

Results from this experimentation can then be used to evaluate technology performance in terms of ease of use, speed and benefits. If the results are satisfactory, the next business unit or region is selected, and the process continues until there is a complete roll-out across the entire organization.

© 2021 ERPM Insights
