Reporting and Analytics

Big Data, Insights and Decision Making

Today, we are living through an explosion in the amount and quality of all kinds of available information. Society is facing a deluge of data and there isn’t even a slight indication that this information glut will soon be halted.

The statistics below on data creation highlight that the pace of information creation shows no sign of slowing.

  • As of 2013, experts believed that 90% of the world’s data had been generated in 2011 and 2012 alone.
  • In 2018, more than 2.5 quintillion bytes of data were created every day.
  • At the beginning of 2020, the digital universe was estimated to consist of 44 zettabytes of data.
  • By 2025, approximately 463 exabytes are projected to be created every 24 hours worldwide.
  • As of June 2019, there were more than 4.5 billion people online.
  • 80% of digital content is unavailable in nine out of every ten languages.
  • In 2019, every 60 seconds Google processed 3.7 million queries, Facebook saw one million logins, and YouTube recorded 4.5 million videos viewed.
  • Netflix’s content volume in 2019 outnumbered that of the US TV industry in 2005.
  • By 2025, there are projected to be 75 billion Internet-of-Things (IoT) devices in the world.
  • By 2030, nine in every ten people aged six and above are expected to be digitally active.

Source: SeedScientific

In the US, private companies now collect and sell as many as 75,000 individual data points about the average American consumer. And that number is minuscule compared with future expectations.

Why so much interest in customer data? Because the right data can tell business decision makers which customers to avoid and which they can exploit based on the company’s strategy and its stated objectives.

While it’s important to appreciate the benefits of data, we also need to acknowledge and respond to its drawbacks.

Just as people often confuse credit cards with currency, information alone is not intelligence. The process of creating intelligence is not simply a question of access to information.

Rather, it is about asking the right questions, and collecting the right data.

You need a lot of pixels in a photo in order to be able to zoom in with clarity on one portion of it. Similarly, you need a lot of observations in a dataset in order to be able to zoom in with clarity on one small subset of that data.

Source: Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are by Seth Stephens-Davidowitz.

Your business performance will not improve through Big Data alone. You need Rich Data. Deep Data. Even if it comes in the form of Small Data.

The biggest reason investments in data analytics fail to pay off, though, is that most companies are choking on data. They have lots of terabytes but few critical insights.

Instead of being adequately informed, they are excessively informed, taking much of the data they already hold for granted.

We are exceptional at storing information but fall short when it comes to retrieving it. As a result, we get overloaded.

Some important questions to consider before investing in new data:

  • Is more information necessarily good?
  • Does it really improve the decision-making process?
  • Can you extract value from the information you already have?
  • Are you overwhelmed but underserved by today’s information sources?
  • How much of the data under your possession is useful, and how much of it gets in the way? That is, what is your data’s Signal-to-Noise ratio?

What are the ensuing problems of information overload?

  • Indecisiveness due to paralysis by analysis. Endless analysis becomes so overwhelming that it is difficult to know how and when to decide.
  • Endless argumentation. In the era of limitless data, there is always an opportunity to crunch some numbers, spin them a bit and prove the opposite.
  • A total reliance on evidence-based decision making can undermine logical approaches to deliberation and problem solving. The solution is not always Big Data; human judgement and small data are often needed. We cannot just throw data at any question. Data, whether big or small, and humans complement each other.

The growth in the amount of data without the ability to process it is not useful in and of itself. Once data has been analyzed, it needs to be summarized in an easy-to-understand way and presented visually to enable decision makers to apply their own expertise and make their own judgements.

Although Big Data offers us an opportunity to analyze new kinds of information and identify trends that have long existed but we hadn’t necessarily been aware of, there are a few things that it does not do well.

  • Data analysis is quite bad at narrative and emergent thinking.
  • It fails to analyze the social aspects of interaction or to recognize context. Human beings, by contrast, are good at telling stories that incorporate multiple causes.
  • Big Data also fails to identify which correlations are more or less likely to be false. The larger and more expansive the datasets, the more correlations there are, both false and true.

Correlation versus Causality is a huge issue in data analysis. The mere fact that two random variables are correlated does not automatically imply causation.

To test for causality, not merely correlations, randomized, controlled experiments (also called A/B Testing) are necessary.

  • People are divided into two groups.
  • The treatment group is shown something new, or asked to do or take something.
  • For the control group, the status quo is maintained.
  • Each group is monitored to see how it responds.
  • The difference in outcomes between the two groups is the causal effect of the intervention.

Undertaking controlled experiments helps us learn which interventions work and which do not, and ultimately improves our decision making.
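To make the mechanics concrete, here is a minimal sketch in Python (hypothetical conversion numbers, standard library only) that estimates the causal effect as the difference in conversion rates between the two groups, then applies a two-proportion z-test to check whether that difference could plausibly be chance.

```python
from math import sqrt, erf

# Hypothetical A/B test results (illustrative numbers only)
control = {"users": 5000, "conversions": 400}     # status quo
treatment = {"users": 5000, "conversions": 470}   # shown the new experience

p_c = control["conversions"] / control["users"]
p_t = treatment["conversions"] / treatment["users"]

# Estimated causal effect: difference in conversion rates
effect = p_t - p_c

# Two-proportion z-test: is the difference larger than chance would explain?
p_pool = (control["conversions"] + treatment["conversions"]) / (
    control["users"] + treatment["users"])
se = sqrt(p_pool * (1 - p_pool) * (1 / control["users"] + 1 / treatment["users"]))
z = effect / se
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided

print(f"Control rate:   {p_c:.1%}")
print(f"Treatment rate: {p_t:.1%}")
print(f"Estimated lift: {effect:.1%} (z = {z:.2f}, p = {p_value:.4f})")
```

The key design choice is the random assignment itself: because the two groups differ only in the treatment, the observed lift can be attributed to the intervention rather than to some other factor.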

As you can see, the power of data lies in what business teams do with it. Clearly define enterprise data-use cases, aligning them with business strategy.

You don’t always need plenty of data to create key insights that inform decision making. You need the right data blended with other insights and observations gathered offline.

Information is now plentiful and inexpensive to produce, manipulate, and disseminate. Almost anyone can add information. The big question is how to reduce it and base critical decisions on only a tiny sampling of all available data.

A Proactive Approach to RPA Adoption

CFOs are increasingly turning to RPA so that their team members can focus more of their time on higher-value, higher-satisfaction tasks demanding high levels of flexibility, creativity, critical thinking, problem-solving, leadership and emotional intelligence, rather than on repetitive tasks.

Short for Robotic Process Automation, RPA uses software to complete repetitive, structured, rules-based tasks to automate business processes at scale. It starts with simple, local tasks and scales up to enterprise-wide, intelligent automation, driven by machine learning and artificial intelligence.

In the finance function, RPA is being used to automate tasks that are of a repetitive nature and require tedious manual efforts.

Examples of such tasks include bank reconciliation process, sales ordering and invoicing, fixed asset management, financial and external reporting, inventory management, receivables and payables management, financial statement consolidation, tax planning and accounting, and forecasting.

Given the diverse applications of Robotics within finance, before introducing RPA into the finance function, it’s imperative for the CFO and everyone involved in the process to clearly assess and understand the benefits and risks attached.

Not every finance process is a good fit for RPA

RPA implementation essentially depends on structured data and defined workflows. Thus, for any process to be a viable candidate for automation, the process must involve only structured, digital input and follow a rules-based processing approach.

Correctly identifying the process is therefore a critical step and holds the key to the success of any automation initiative.

However, the current plethora of new digital technologies and applications, all promising to disrupt the way finance work is done, is posing a big challenge for finance leaders trying to separate hype from reality.

As a result, some leaders are deploying robotics into their operations simply due to a speed-to-market goal, resulting in missed benefits and unnecessary costs. When deploying RPA into finance, it’s not about following the herd or what your close competitors are doing.

Rather, you need to perform an objective analysis of your finance processes including an evaluation of which RPA tool or vendor relationship suits your unique needs. There is no one-size-fits-all solution.

Only a handful of processes can be neatly and entirely automated using an RPA tool alone. Therefore, it’s important to start your objective analysis with end-to-end process thinking because you will need to use multiple tools and techniques to realize the most transformative benefits.

Robotics tools are non-invasive

This means organizations do not need to make changes to existing legacy systems when implementing RPA. The technology can be installed on any desktop or computer in a non-disruptive way, with minimal IT involvement or coding ability required.

This faster deployment, however, should not lead CFOs to conclude that IT involvement is not required at all. Though RPA increases efficiency, it also brings with it the risk of system hacking and data breaches.

Also, platform security vulnerabilities, privacy implications and denial of service may yield ramifications that impact the RPA integrity, reliability and downstream business processes.

That is why it’s important to involve IT from the outset, as the team plays a critical role in ensuring strong systems are in place to raise alerts of data breaches or process errors, and to proactively remedy them.

Similar to every other system used in the organization, software bots need to be operationally managed and technically maintained.

Risk control and governance

Automation agendas are exciting and groundbreaking, yet intelligent and informed risk decisions surrounding their implementation need to be made to proactively create value and protect the business. As robots extract, aggregate, transform and upload data, risk and control considerations should become key discussion topics.

A robust governance framework should therefore be put in place to support the robotics deployment. The framework should succinctly address areas of concern such as approval of any system changes to processes that have been automated, scalability, data storage and regulatory compliance.

The RPA tool should be capable of generating a detailed audit trail, highlighting any change or decision taken by the bot.

Just like humans whose performance is assessed, we should also be able to monitor and confirm the accuracy of the tasks being performed by the bots, reliability of the systems and adaptability to process changes.

Robust monitoring and security governance are critical to ensure all the tools and related infrastructure developed in RPA are compliant with IT security policies, regulatory provisions and risk policies across the organization.

CFOs and other leaders thus need to be aware and ready to deal with new complexities that could arise as a result of introducing robotics in finance.

Avoid putting the technology or tool ahead of your people

Many new technology implementation initiatives fail because decision makers leave people out of the equation. People decisions are often an afterthought, secondary to technology decisions.

Implementing robotics is about driving operational efficiency, productivity, quality, customer satisfaction and more. These outcomes will not be realized if the key people who are meant to drive the change are not informed from the start.

As a leader, despite not having complete details about the final benefits of the initiative, you still need to communicate across the team why the organization has decided to bring about the change, what the ultimate organizational structure will look like after the change and the change impact on the existing employees.

People drive change and not technology. Reskilling of the employees who will be interacting and interfacing with the bots is therefore necessary for the overall success of the initiative.

In order to gain maximum benefits out of the automation exercise, have a long-term view and consider its strategic relevance to the business.

Engage employees throughout the organization and focus on automating cross-functional end-to-end processes across multiple stages instead of deploying RPA in pockets.

Analytics and AI: Humans and Machines are Good at Different Aspects of Prediction

Mostly driven by growth in the IoT, and the widespread use of internet, social media and mobile devices to perform search, send text, email and capture images and videos, the amount of data that we are producing on a daily basis is startling.

Consequently, companies are turning to data analytics and AI technologies to help them make sense of all the data at their disposal, predict the future, and make informed decisions that drive enterprise performance.

Although adoption of analytics and AI systems is extending into ever more mission-critical business processes, the implications of these emerging technologies for business strategy, management, talent and decisions are still poorly understood.

For example, the single most common question in the AI debate is: “Will adoption of AI by businesses lead to massive human job cuts?”

Borrowing lessons from historical technological advances, yes, certain jobs will be lost, and new ones also created. However, machines are not taking over the world, nor are they eliminating the need for humans in the workplace.

Jobs will still be there albeit different from the traditional roles many are accustomed to. The majority of these new roles will require a new range of education, training, and experience.

For instance, nonroutine cognitive tasks demanding high levels of flexibility, creativity, critical thinking, problem-solving, leadership, and emotional intelligence do not yet lend themselves to wholesale automation.

Analytics and AI rely on data to make predictions

The more and better the data continually fed to machine learning algorithms, the more they learn and the better their predictions become.

Given that these applications search for patterns in data, any inaccuracies or biases in the training data will be reflected in subsequent analyses.

But how much data do you need? The variety, quality and quantity of input, training and feedback data required depends on how accurate the prediction or business outcome must be to be useful.

Training data is used to train the predictive algorithms to predict the target variable, while the feedback data is used to assess and improve the algorithm’s prediction performance.
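As a rough illustration of that split, the sketch below (synthetic data; scikit-learn assumed, with invented feature names) fits a simple churn classifier on training data and then measures its accuracy on held-out feedback data.

```python
# Minimal sketch: training data teaches the algorithm, feedback data measures it.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical customer features: tenure (months) and monthly spend
X = rng.normal(loc=[24, 50], scale=[12, 20], size=(1000, 2))
# Hypothetical target: did the customer churn? (more likely at low tenure/spend)
y = (X[:, 0] * 0.05 + X[:, 1] * 0.02 + rng.normal(0, 1, 1000) < 2.2).astype(int)

# Training data: used to fit the predictive algorithm
X_train, X_feedback, y_train, y_feedback = train_test_split(
    X, y, test_size=0.3, random_state=0)

model = LogisticRegression().fit(X_train, y_train)

# Feedback data: used to assess how well predictions hold up on unseen cases
print(f"Accuracy on feedback data: {model.score(X_feedback, y_feedback):.2%}")
```

In practice the feedback loop continues after deployment: actual outcomes are fed back to re-assess and retrain the model as conditions change.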

Undoubtedly, advanced analytics and AI systems are only as good as the data they are trained on. The data used to train these learning algorithms must be free of any noise or hidden biases.

You therefore need to understand how predictive technologies learn from data to perform sophisticated tasks such as customer lifetime value modeling and profitability forecasting.

This helps guide important decisions around the scale, scope and frequency of data acquisition. It’s about striking a balance between the benefits of more data and the cost of acquiring it.

Humans and machines both have shortcomings

In the context of prediction, humans and machines both have recognizable strengths and weaknesses.

Unless we identify and differentiate which tasks humans and machines are best suited for, all analytics and AI investments will come to naught.

For instance, faced with complex information with intricate interactions between different indicators, humans perform worse than machines. Heuristics and biases often get in the way of making accurate predictions.

Instead of accounting for statistical properties and data-driven predictions, more emphasis is often placed on salient information unavailable to prediction systems.

And, most of the time, that information is misleading, hence the poor performance.

Although machines are better than humans at analyzing huge data sets with complex interactions amid disparate variables, it’s crucial to be cognizant of situations where machines are poor at predicting the future.

The key to unlocking valuable insights from predictive analytics investments involves first and foremost understanding the definite business question that the data needs to answer.

This dictates your analysis plan and the data collection approaches you will choose. Get the business question wrong, and you can expect the insights and recommendations from the analysis to be wrong as well.

Recall, with plentiful data, machine predictions can work well.

But, in situations where there is limited data to inform future decision making, machine predictions are relatively poor.

To quote Donald Rumsfeld, former US Secretary of Defense:

There are known knowns. These are things we know that we know. There are known unknowns. That is to say, there are things that we know we don’t know. But there are also unknown unknowns. There are things we don’t know we don’t know.


Thus, for known knowns, abundant data is readily available. Accordingly, humans trust machines to do a better job than they would. Even so, the level of trust changes the moment we start talking about known unknowns and unknown unknowns.

With these situations, machine predictions are relatively poor because we do not have a lot of data to ingest into the prediction model.

Think of infrequent events (known unknowns) that occur once in a while, or something that has never happened before (unknown unknowns).

At least for infrequent events or happenings, humans are occasionally better at predicting with little data.

This is generally because we are good at comparing and applying prudent judgement: examining a new situation and identifying other, comparable settings whose lessons are useful in the new one.

We are naturally wired to remember key pieces of information from the little data available or the limited associations we have had in the past.

Rather than being precise, our predictions come with a confidence range that acknowledges their uncertainty.

Faced with unknown unknowns, both humans and machines are relatively bad at predicting their arrival.

The simple truth is that we cannot predict truly new events from past data. Look no further than the current Brexit conundrum.

Nobody precisely knew the unintended consequences of the UK leaving the EU. Leavers and Remainers both speculated as to what the benefits and disadvantages of leaving the EU may be.

Of course, nobody knows what will happen in the future but that doesn’t mean we can’t be prepared, even for the unknown unknowns.

In their book Prediction Machines: The Simple Economics of Artificial Intelligence, Ajay Agrawal, Joshua Gans, and Avi Goldfarb present an additional category of scenarios under which machines also fail to predict precisely – Unknown Knowns.

Per the trio:

Unknown knowns is when an association that appears to be strong in the past is the result of some unknown or unobserved factor that changes over time and makes predictions we thought we could make unreliable.


With unknown knowns, predictive tools appear to provide a very accurate answer, but that answer can be very incorrect, especially if the algorithms have little grasp of the decision process that created the data.

To support their point of view, the authors make reference to pricing and revenue analysis in the hotel industry, although the same viewpoint is applicable elsewhere.

In many industries, higher prices are associated with higher sales, and vice versa.

For example, in the airline industry, airfares are low outside the peak season, and high during peak seasons (summer and festive) when travel demand is highest.

Presented with this data, and without an understanding that price movements are often a function of demand and supply factors, a simple prediction model might advocate raising airfares on different routes in order to sell more empty seats and increase revenues. This is a classic causal inference problem.
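The simulation below (numpy assumed; all figures invented purely for illustration) shows the trap: an unobserved demand factor drives both fares and tickets sold, so the naive correlation between price and sales is strongly positive, while controlling for demand reveals the underlying negative effect of price.

```python
# Illustrative simulation of an "unknown known": hidden demand drives both
# price and tickets sold, making price look like it boosts sales.
import numpy as np

rng = np.random.default_rng(42)
weeks = 104

# Hidden driver: seasonal travel demand (high in summer / festive periods)
demand = 100 + 40 * np.sin(np.linspace(0, 4 * np.pi, weeks)) + rng.normal(0, 5, weeks)

# Airlines raise fares when demand is high...
price = 80 + 0.8 * demand + rng.normal(0, 5, weeks)
# ...and tickets sold rise with demand but fall as price rises
tickets = 2 * demand - 0.5 * price + rng.normal(0, 5, weeks)

# Naive view: price and tickets look strongly positively correlated
print("corr(price, tickets):", round(np.corrcoef(price, tickets)[0, 1], 2))

# Controlling for demand exposes the true negative effect of price on sales
residual_price = price - np.poly1d(np.polyfit(demand, price, 1))(demand)
residual_tickets = tickets - np.poly1d(np.polyfit(demand, tickets, 1))(demand)
print("corr after removing demand:",
      round(np.corrcoef(residual_price, residual_tickets)[0, 1], 2))
```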

But, a human being with a solid understanding of economics concepts will immediately call attention to the fact that increasing airfares is unlikely to increase flight ticket sales.

To the machine, this is an unknown known. But to a human with knowledge of pricing and profitability analysis, this is a known unknown or maybe even a known known provided the human is able to properly model the pricing decision.

Thus, to address such shortcomings, humans should work with machines to identify the right data and appropriate data analysis models that take into consideration seasonality and other demand and supply factors to better predict revenues at different prices.

As data analytics and AI systems become more advanced and spread across industries, and up and down the value chain, companies that will progress further are those that are continually thinking of creative ways for machines to integrate and amplify human capabilities.

In contrast, those companies that are using technology simply to cut costs and displace humans will eventually stop making progress, and cease to exist.

Finance Analytics: Using The Right Data to Generate Actionable Insights

We are living in the information age. Data is everywhere and affecting every aspect of our lives. As the use of data and analytics becomes pervasive, in the future it will be uncommon to find an industry not capitalizing on the benefits they can provide.

CFOs are increasingly leveraging data and analytics tools to optimize operational and financial processes, and drive better decision-making within their organizations. Finance is no longer simply a steward of historical reporting and compliance.

Today, the expectation on finance is to engage with new technologies, collaborate with various business stakeholders to create value and act as a steward of enterprise performance.

That is, deliver the right decision support to the right people at the right time.

Delivering impactful data-driven insights does not happen on its own. You need to have the right tools, processes, talent and culture working well together.

Here are some of the questions you can start asking to kickstart you on this journey:

  • Does your organization have the necessary tools and capabilities to collect and analyze the data (structured and unstructured) and make sense of it all?
  • How robust are your data governance and management processes?
  • How do you define the structure of your finance analytics team? Are you focused heavily on traditional and technical skills or diversity?
  • How are people armed to make better decisions using the data, processes, and analytical methods available?
  • As a finance leader, are you promoting a culture that bases key decisions on gut feel versus data-driven insights?

Both intuitive and analytical decisions are important and can create significant value for the business.

However, I’d recommend against promoting a culture that embraces the randomness of intuitive thinking and avoids analytical thinking completely.

What are you doing with your data?

In a world where data is ubiquitous and plentiful, it can be overwhelming to separate the noise from the important.

According to IBM, 80 percent of today’s data originates from previously untapped, unstructured information from the web, such as social media channels, news feeds, emails, journals, blogs, images, sounds and videos.

And this unstructured data holds the important insights required for faster, more informed decisions. Take news feeds as an example. Breaking news on economic policy can help you reevaluate production and supply chain decisions, and adjust forecasts accordingly.

But, for fear of missing out, many businesses end up hoarding data with little to no analysis performed on it. You don’t need to collect and store every single type of data out there.

Instead, collect only the data that matters most and is key to unlocking answers to your critical business performance questions. The rest is all noise.

That’s why it’s critical to regularly ask the question, “What are we doing with all the data we have?” This way you will also be able to identify any information gaps that require filling via new data sources.

For analytics to work, people need the foresight to ask the right questions

People play a critical role in an organization’s analytics efforts. Without insightful guidance on the right questions to answer and the right hypotheses to test, analytics efforts are foiled. They cannot direct themselves.

Asking the right questions is key to drawing the right conclusions. Let’s look at the steps involved in making better decisions using data and analytics for one of your product lines. After a strong start, you’re now starting to witness a decline in online revenue but are not sure why.

Before jumping into complex analytics:

  • First, identify the problem that the data needs to address. You already know what happened. However, this is not enough. To address the issue, it’s important that you have a better understanding of the impacted segment and the potential reasons for the revenue decline. This is where you start asking questions such as: Why is revenue declining? When did the decline start? Where did it happen (country, region, city)? What might have caused it (potential drivers)? What actions can we take to address it?
  • Then come up with hypotheses about the issue to be addressed. A hypothesis describes a possible explanation, such as a driver or a reason behind your business question. Many people make the mistake of jumping straight into data collection and analysis before forming a hypothesis to answer the critical business question. Going back to our revenue decline example, suppose you have identified several hypotheses and then prioritized two to explain the revenue decline – a slowing economy and impending recession causing customers to tighten their purse strings, and a new website design with enhanced security features but also additional online checkout steps.
  • Based on these two hypotheses you can then identify the exact data that should be collected and analyzed to prove or disprove each hypothesis, as illustrated in the sketch after this list.
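As a hypothetical sketch of that final step (pandas assumed; file, date and column names invented), the snippet below compares online revenue and checkout completion before and after the website redesign, by region, to weigh the checkout-friction hypothesis against the macroeconomic one.

```python
# Hypothetical sketch: test the "extra checkout steps" hypothesis by comparing
# revenue and checkout completion before and after the website redesign.
import pandas as pd

orders = pd.read_csv("online_orders.csv", parse_dates=["order_date"])
redesign_date = pd.Timestamp("2019-06-01")  # assumed go-live of the new site

orders["period"] = (orders["order_date"] >= redesign_date).map(
    {False: "before_redesign", True: "after_redesign"})

summary = orders.groupby(["period", "region"]).agg(
    revenue=("order_value", "sum"),
    orders=("order_id", "count"),
    checkout_completion=("checkout_completed", "mean"),
)

print(summary)
# A drop in checkout completion after the redesign, concentrated online,
# would support the website hypothesis; a broad-based decline across all
# regions would point more towards the macroeconomic explanation.
```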

Irrespective of how advanced your analytical tools are, asking the wrong business performance questions results in flawed data being collected, and ultimately poor insights and recommendations. Hence the importance of diversity within the team.

Remember, no data or analytics model is perfect. Finding good data and making sure it’s cleaned and validated is fundamental but the organization also shouldn’t wait for perfection.

It’s easy to get fixated on perfection and fail to see the forest for the trees.

Building Trust in Your Data, Insights and Decisions

Over the years, the adoption of advanced analytics and data visualization tools by enterprises in order to generate key insights that can support quality business performance decisions has considerably evolved.

Nonetheless, not all investments in this arena are producing the desired outcomes.

Although significant sums of money, resources and time have been channeled towards data and analytics initiatives, the quality of the insights generated, and ultimately of the decisions made, is far from convincing.

Some of the possible reasons for explaining these failures include:

  • A sole focus on technology, and less on processes and people.
  • Misguided belief that the more data you have the better.
  • Lack of or poor organizational alignment and cultural resistance.

Delivering trusted insights is not a simple matter of investing in the right technology alone. Neither will simply collecting and storing large amounts of data lead to better decisions.

For instance, investing in state-of-the-art data analytics and visualization tools at the expense of data quality doesn’t automatically empower you to generate valuable insights that drive business performance.

Nor does having the right technology and quality data help when organizational alignment is poor and cultural resistance intense.

So how can your organization use data and analytics to generate trusted insights, make smarter decisions and impact business performance?

Prior to generating any business insights, trends or activity patterns, it’s imperative to ensure that the performance data to be analyzed is trusted. That is, the data must be fit for purpose, current, accurate, consistent and reliable.

With so much data and information available today, it’s increasingly difficult for decision makers to determine which data points are essential in understanding what drives the business and incorporate these into business decision making.

As a result, many organizations end up hoarding data and incurring unnecessary expenses in the process.

To ensure your data is fit for purpose, first and foremost, develop an understanding of the real business questions that the data needs to answer. By asking questions at the very beginning, you will be able to identify and define any problems or issues in their context.

Further, asking relevant questions will help you develop an improved understanding of the present factors, past events, or emerging opportunities driving the need for advanced analytics and data visualization investments in the first place.

Conversely, asking the wrong questions results in ineffectual resolutions to the wrong business problems.

Generally, people want to kick off their data analysis effort by collecting data first, rather than asking the relevant questions, establishing the business need and formulating the analysis plan.

But, the quality of the insights and the value of the recommendations delivered are rooted in the quality of the data you start out with.

Data collection should only begin once the purpose and goals of the data analysis are clearly defined. Thus, the more and better questions you ask, the better your insights, recommendations, decisions and ultimately business performance will be.

Data accuracy, consistency and reliability

Ensuring that data used in analysis is accurate, consistent and reliable remains one of the common hurdles hampering analytics initiatives. One of the reasons for this is multiple data sources.

For example, suppose you want to analyze and evaluate the impact of a given marketing campaign on customer experience. In the past, you could have managed with data from your CRM system alone.

Today, in order to get a 360-degree view of your customers, analyzing CRM data alone is not sufficient. You also need to analyze data from your web analytics system, customer feedback surveys, data warehouses, email, call recordings, social media and production environments.

Thus, you need to bring these disparate data sources together to generate valuable, actionable insights.

Issues arise when the multiple data sources define key business performance terms or metrics (e.g. sales, conversion, margin growth) differently and there is no common definition shared across the business.

This lack of shared understanding of business performance has far reaching implications on your analysis and makes it futile to compare results.

Data collection, followed by data cleansing and validation, are all therefore important. After you have collected the data to be used in the analysis per your analysis plan, it’s important to clean and validate the data to make it useable and accurate. Select a small data sample and compare it with what you expected.

Ensure the data type matches your expectations. Remember the adage: garbage in, garbage out. Even if you have best-of-breed analytics tools to slice and dice information, with poor data quality practices your insights and decisions will remain unsubstantiated, erroneous and worthless.

For this reason and to ensure the consistency of the data used in your analysis, know and understand the source of every piece of your data, the age of the data, the components of the data, what is not included in the data, when and how often the data is refreshed.
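A minimal validation sketch of these checks (pandas assumed; file and column names hypothetical) might look like this:

```python
# Minimal data-validation sketch: compare a sample against expectations
# before any analysis begins.
import pandas as pd

sales = pd.read_csv("campaign_sales.csv", parse_dates=["transaction_date"])

expected_types = {"customer_id": "int64", "order_value": "float64"}

# 1. Do the data types match expectations?
for col, expected in expected_types.items():
    print(f"{col}: expected {expected}, got {sales[col].dtype}")

# 2. How complete is the data?
print(sales.isna().mean().sort_values(ascending=False).head())

# 3. Are there duplicate records that would inflate the numbers?
print("Duplicate rows:", sales.duplicated().sum())

# 4. How fresh is the data, and when was it last refreshed?
print("Most recent transaction:", sales["transaction_date"].max())

# 5. Spot-check a small sample against what you expected to see
print(sales.sample(5, random_state=0))
```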

Without data integrity, it is difficult to generate trusted insights and deliver recommendations that decision makers can confidently act on. Also, delivering trusted insights goes beyond merely attaching a colourful report in an email and hitting the send button.

Instead, the insights must be delivered in a digestible and actionable format. That is, communicated in a format that is more readily understood by decision makers in the organization. Poorly communicated insights can result in no action, negating the value of the insights in the first place. Thus, avoid getting bogged down in details.

Knowing is powerful, but acting based on knowledge is what drives smart decisions, creates true value and impacts business performance.

Converting Data Into Insights: The Right Technology Alone is No Guarantee of Success

The fact that technology is playing an increasingly significant role in executing many traditional finance tasks while at the same time generating greater insights that drive business performance is irrefutable.

However, even though many organizations are investing significantly in advanced data and analytics technologies, the majority of these investments are reportedly yielding disappointing returns.

This is often because they focus mostly on the implementation of new tools, and less on processes and people. For instance, there is a widespread misconception that data analytics is purely a technology issue, and that success is about having the most data.

As a result, many organizations are ultimately spending large amounts of money on new technologies capable of mining and managing large datasets.

There is relatively little focus on generating actionable insights out of the data, and building a data-driven culture, which is the people aspect of analysis.

Data analytics is not purely a technology issue. Instead, it is a strategic business imperative that entails leveraging new technologies to analyze the data at your disposal, gain greater insight about your business, and guide effective strategy execution.

Furthermore, having the right data is more important than having the most data. Your organization might have the most data, but if you’re not analyzing that data and asking better questions to help you make better decisions, it counts for nothing.

Reconsider your approach to people, processes and technology

The success of any new technology greatly depends on the skills of the people using it. Additionally, you need to convince people to use the new technology or system, otherwise it will end up being another worthless investment.

Given digital transformation is here to stay, for any technology transformation to be a success, it’s imperative to have the right balance of people, IT and digital skills.

It’s not an issue of technology versus humans. It’s about striking the right balance between technology and people, with each doing what it does best. In other words, establishing the right combination of people and technology.

For example, advanced data analytics technologies are, by far, better than humans at analyzing complex data streams. They can easily identify trends and patterns in real-time.

But, generating useful insights from the data all depends on human ability to ask the right key performance questions, and identify specific questions that existing tools and techniques are currently not able to answer.

Rather than hastily acquiring new data and tools, start by reviewing current data analytics tools, systems and related applications. This helps identify existing capabilities including tangible opportunities for improvement.

It’s not about how much the business has invested or is willing to invest in data analytics capabilities; it’s about how the business is leveraging existing tools, data and insights it currently has to drive business growth, and where necessary, blending traditional BI tools with new emerging data analytics tools.

That’s why it’s critical to build a clear understanding of which new technologies will be most beneficial to your business.

Technology alone will not fix broken or outdated processes. Many businesses are spending significant amounts of money on new tools, only to find out later that it’s not the existing tool that is at fault but rather the process itself.

Take the data management process as an example, particularly at larger, more complex organizations. Achieving a single view of the truth is a recurring challenge, often because data is confined in silos, inconsistently defined or locked behind access controls.

A centralized data governance model is the missing link. There are too many fragmented ERP systems, data is spread across the business, and it’s difficult to identify all data projects and how they are organized across the business.

In such cases, even if you acquire the right data analytics tools, the fact that you have a weak data governance model as well as a non-integrated systems infrastructure can act as a stumbling block to generating reliable and useful decision-support insights.

To fix the faltering data management process, you need to establish a data governance model that is flexible across the business and provides a centralized view of enterprise data – that is, the data the organization owns and how it is being managed and used across the business.

This is key to breaking down internal silos, achieving a single view of the truth, and building trust in your data to enable effective analytics and decision making.

Is your organization spending more time collecting and organizing data than analyzing it?

Have you put in place robust data governance and ownership processes required to achieve a single version of the truth?

Are you attracting and retaining the right data analytics talent and skills necessary to drive data exploration, experimentation and decision making?

More Data Doesn’t Always Lead to Better Decisions

Thanks to advancements in technology, our digital lives are producing expansive amounts of data on a daily basis.

In addition to this enormous amount of data that is produced each day, the diversity of data types and data sources, and the speed with which data is generated, analyzed and reprocessed has increasingly become unwieldy.

With more data continuously coming from social spheres, mobile devices, cameras, sensors and connected devices, purchase transactions and GPS signals to name a few, it does not look like the current data explosion will ebb soon.

Instead, investments in IoT and advanced analytics are expected to grow in the immediate future. Thinking back, investing in advanced data analytics to generate well-informed insights that effectively support decision making and drive business performance has long been the preserve of big corporations.

And smaller businesses and organizations, as a result, have for some time embraced the flawed view that such investments are beyond their reach. It’s no surprise then that adoption has been at a snail’s pace.

Thanks to democratization of technology, new technologies are starting to get into the hands of smaller businesses and organizations. The solutions are now being packaged into simple, easy-to-deploy applications that most users without specialized training are able to operate.

Further, acquisition costs have significantly reduced thereby obviating the upfront cost barrier, that for years, has acted as a drag on many company IT investments.

While the application of data management and advanced analytics tools is now foundational and becoming ubiquitous, growing into a successful data-driven organization is about getting the right data to the right person at the right time to make the right decision.

Distorted claims such as “data is the new oil” have, unfortunately, prompted some companies to embark on unfruitful data hoarding sprees. It is true that oil is a valuable commodity with plentiful uses. But it is also a scarce resource not widely available to everyone.

The true value of oil is unearthed after undergoing a refinement process. On the contrary, data is not scarce. It is widely available. Nonetheless, akin to oil, the true value of data is unlocked after we have processed and analyzed it to generate leading-edge insights.

It’s a waste of time and resources to just hoard data and not analyze it to get a better understanding of what has happened in the past and why it has happened. Such insights are crucial to predicting future business performance scenarios and exploiting opportunities.

More data doesn’t necessarily lead to better decisions. Better decisions emanate from having a profound ability to analyze useful data and make key observations that would have otherwise remained hidden.

Data is widely available, what is scarce is the ability to extract informed insights that support decision-making and propel the business forward.

To avoid data hoarding, it is necessary to first carry out a data profiling exercise, as this will help you establish whether any of your existing data can easily be used for other purposes. It also helps ascertain whether existing records are up to date and whether your information sources are still fit for purpose.
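A rough profiling sketch (pandas assumed; file names and the last_updated column are hypothetical) that surfaces the size, completeness and freshness of each existing source could look like this:

```python
# Rough data-profiling sketch: which existing sources are complete, current
# and therefore reusable for other purposes?
import pandas as pd

sources = {
    "crm_contacts": "crm_contacts.csv",
    "web_analytics": "web_sessions.csv",
    "billing": "invoices.csv",
}

profile = []
for name, path in sources.items():
    df = pd.read_csv(path, parse_dates=["last_updated"])
    profile.append({
        "source": name,
        "rows": len(df),
        "columns": df.shape[1],
        "pct_missing": round(df.isna().mean().mean() * 100, 1),
        "last_updated": df["last_updated"].max(),
    })

print(pd.DataFrame(profile).set_index("source"))
```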

At any given time, data quality trumps data quantity. That is why it is important to get your data in one place where it can easily be accessed for analysis and produce a single version of the truth.

In the past, data was kept in different systems that were unable to talk to each other, making it difficult to consolidate and analyze data to facilitate faster decision making. Today, the price of computing and storage has plummeted and those systems are increasingly being linked.

As a result, companies can now use data-mining techniques to sort through large data sets to identify patterns and establish relationships to solve problems through data analysis. If the data is of poor quality, insights generated from the analysis will also be of poor quality.

Let’s take customer transactional data as an example. In order to reveal hidden correlations or insights from the data, it’s advisable to analyze the information flow in real-time; by the hour, by the day, by the week, by the month, over the past year and more. This lets you proactively respond to the ups and downs of dynamic business conditions.
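A simple way to look at that flow, assuming pandas and a hypothetical transactions file, is to roll the data up by hour, day, week and month:

```python
# Sketch: roll up customer transactions at several time grains to track the
# flow in near real time rather than waiting for a monthly snapshot.
import pandas as pd

tx = pd.read_csv("transactions.csv", parse_dates=["timestamp"]).set_index("timestamp")

for label, freq in [("hourly", "H"), ("daily", "D"), ("weekly", "W"), ("monthly", "M")]:
    summary = tx["amount"].resample(freq).agg(["count", "sum"])
    print(f"\n{label} view (latest periods):")
    print(summary.tail(3))
```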

Imagine what could happen if you waited months before analyzing the transactional data. By the time you did, your insights would be the product of “dead data”. Technology is no longer the inhibitor; culture and the lack of a leadership mandate are.

As data become more abundant, the main problem is no longer finding the information as such but giving business unit managers precise answers about business performance easily and quickly.

What matters most is data quality, what you do with the data you have collected and not how much you collect. Instead of making more hay, start looking for the needle in the haystack.

Migrating From On-Premise to Cloud-Based Solutions

Gartner forecasts cloud computing to be a $278 billion business by 2021, as companies increasingly adopt cloud services to realize their desired digital business outcomes. In a separate survey of 439 global financial executives, the research company found that finance is moving to the cloud much faster than expected.

By 2020, 36 percent of enterprises are expected to be using the cloud to support more than half of their transactional systems of record.

With cloud technology promising to provide the speed and agility that the business requires, including generating significant cost savings and new sources of revenue, it’s not surprising the cloud market is experiencing a boom.

While companies have experienced both organic and inorganic growth, many have struggled to keep pace with the rapidly changing technology landscape. Their businesses are saddled with legacy technology infrastructures that are slow to react to change, slowing progress towards the achievement of strategic objectives.

Given the need to react more quickly, at the same time become more proactive and drive business performance, senior finance executives are turning to cloud-based financial solutions to provide real-time management information reporting and decision support.

Adopting cloud is not a simple and quick decision

Looking at the expected growth metrics for the cloud market and hearing of all the novel opportunities offered by cloud computing, CFOs and CIOs are enticed to believe that it is only a matter of time before the organization’s computing technology and data move to the cloud.

However, over the course of my career I have come to understand that the only certainty about a forecast is that it will be wrong; the question is merely whether it lands close to reality or far from it.

Cost savings is cited most often by senior finance executives as the reason for adopting cloud technology. Considering that cloud is essentially a form of outsourcing arrangement in which client organizations use the internet to connect to a variety of applications, storage services, hardware resources, platforms and other IT capabilities offered by cloud service providers, upfront capital investment and maintenance costs are minimal.

This is because the cloud service provider owns the necessary hardware and other resources needed to deliver these services and is also responsible for employing the staff required to support them. Unlike on-premise solutions where the business has to incur large capital outlays for hardware and IT support staff, cloud is an operating expense (except implementation costs which can be capitalized), that is payable as and when incurred.

With cloud, a business is able to add new licences to its subscription when it needs them and scale its licensing costs with its growth, rather than buying in bulk upfront. This is in direct contrast to traditional on-premise software, where companies have over-invested in IT systems in anticipation of business growth, hoping to add more users once that growth is achieved.
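A back-of-the-envelope sketch (all figures invented purely for illustration) shows how the two cost profiles differ: on-premise front-loads the spend for the user base you hope to have, while the cloud subscription scales with the users you actually have.

```python
# Illustrative cost-profile comparison; every figure here is an assumption.
years = range(1, 6)
users_by_year = [50, 65, 80, 100, 120]       # assumed headcount growth

onprem_upfront = 250_000                      # hardware + bulk licences for 120 users
onprem_annual_support = 40_000                # IT support and maintenance per year

cloud_price_per_user_year = 900               # subscription scales with actual users

cloud_cum = 0
for year, users in zip(years, users_by_year):
    onprem_cum = onprem_upfront + onprem_annual_support * year
    cloud_cum += cloud_price_per_user_year * users
    print(f"Year {year}: on-premise {onprem_cum:>9,}  vs  cloud {cloud_cum:>9,}")
```

With these assumed numbers the subscription remains cheaper over the five-year horizon, but the point of the exercise is the shape of the curves, not the specific figures: change the growth assumptions and the comparison changes with them.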

However, when deciding whether to invest in cloud or not, CFOs should look beyond cost benefits. In addition to cost savings, they should also consider tactical and more strategic and structural issues. Unfortunately, the challenge for many finance professionals is that when evaluating investments the focus is solely on cost. We fail to examine and evaluate the non-financial risks and strategic impact of the investment on the business.

Strategic and structural considerations

As I have written in the past, most organizations get caught up in the hype of new technologies and end up making technology investments that are unaligned to the strategy of the business. There is no strong business case for investing in the new technology. Don’t allow yourself to fall into the same trap.

Investing in cloud is not an IT or finance only decision. Decisions of what to migrate to the cloud, when to migrate it, and how to transition from an on-premise environment to a cloud-based environment all require a collaborated effort if the organization is to achieve its stated objectives.

Further, transitioning to a cloud computing environment is not a matter of flicking a switch. You need a deeper understanding of the cloud deployment models available on the market (public cloud, private cloud, community cloud and hybrid cloud), their delivery models (SaaS, PaaS and IaaS) and how these all fit into your business and technology model.

Understanding the cloud model will help you determine whether cloud is appropriate in the context of your business purpose and risks. For example, in the case of public cloud, the applications and cloud-based services are positioned as utilities available to anyone who signs on.

If over the years your company has built and strengthened IT capabilities that differentiate you from competitors, migrating to the cloud can mean walking away from your success recipe and exposing yourself to vulnerabilities.

Therefore, if you are planning to migrate your on-premise computing environment to the cloud, take a long-term view of your IT environment and determine which types of applications are candidates for the cloud and which will not be transitioned until the distant future.

Ideally, you should start with applications that carry low risk or that have a business need that cannot be met using traditional computing capabilities. Once you have built greater comfort and trust in the cloud, you can then scale to include other applications.

The pain of disintegration

It is no secret that many businesses today are re-evaluating their technology infrastructure and leveraging new technologies to respond faster and be more flexible. But is cloud computing right for your business? Is speed to deploy more important to you than having systems that talk to each other?

In a world where each cloud service provider is claiming that their solution is better than the next one on the same shelf, CFOs and CIOs are grappling with the decision of which cloud service provider to go with. As a result, the company ends up doing business with more than one vendor to compensate for the shortfalls of the other system.

The problem arises when the cloud-based application from one vendor is unable to work with an application from another provider, resulting in more than one version of the truth. In other cases, the company’s on-premise IT infrastructure does not allow data sharing with multiple cloud-based applications, which in turn results in finance spending time consolidating and reconciling data from disparate systems in Excel.

Given that the cloud model depends on providing essentially a standardized package across the board, it is important to weigh the pros and cons of foregoing customization versus rapid implementation. Because the cloud market is projected to grow in the coming years, many IT solution providers are channeling money towards cloud-based solutions. This has resulted in some vendors withdrawing IT support on legacy ERP systems and phasing them out.

In some cases, the vendors have installed upgrades to the same solutions. The problem with these solutions is that they were not originally built with modern business requirements in mind; hence they can only buy you a few more years of support.

It is therefore important for CFOs and CIOs to be cognizant whether the solution was originally designed as a cloud-based solution, or it is a modified version of a solution initially designed to function in a traditional on-premise client-ownership model.

With data being found everywhere today and advanced analytics playing a critical role in supporting key decision making, delivering one version of truth has never been so important. In order to make sense of all the data at its disposal a company should focus its efforts on centralizing data management and IT policies. Having a single data repository ensures everyone is drinking from the same well for information and insights.

However, in companies where IT governance is weak the tendency is for teams to autonomously adopt cloud-based solutions that meet their individual needs. This is counter-productive to data management centralization efforts as data normally ends up spread across multiple cloud-based systems that are dis-aggregated.

Just imagine a scenario where FP&A, tax, treasury, procurement, supply chain, and other finance functions each identify and select their own cloud solution. Consolidating and analyzing relevant data from these various systems to get a big picture view of business performance in itself is a huge task to complete as that data is divided across many domains.

While each cloud-based move may look beneficial in isolation, adopted together they may increase operating expenses to a level that undermines the anticipated savings.

Control versus no control

Although cloud-based solutions offer more affordable options and more flexible payment plans than traditional software providers, the issue of data control is still a concern. Cyber criminals are getting smarter by the day, and the fact is, whether organizational data resides on the internet or offline on the company’s network, it’s never completely immune to attack.

When it comes to data security, it is imperative for CFOs and CIOs to know that the moment data is no longer in-house, the business may end up having less control over who has access to key systems and data. The fact that you have outsourced your IT systems and data control to a third party does not make your company immune to a lawsuit in the event of a breach. You therefore need to sign agreements that protect you against various types of misfortunes.

Although the cloud service provider employs a team of IT support staff to monitor and perform regular threat assessments and deploy the latest patch immediately, the organization should not relax and assume the house is in order. You need to get strong assurances from your cloud vendor that your business data is safe.

You also need to know where your data is stored, the applicable data laws, how often it is backed up and whether you are able to perform audits on the data. Other factors to consider include the ability to terminate agreements for convenience. Many software vendors try to lock in client organizations for a significant period of time, which in turn makes it difficult for client organizations to change vendors without incurring costs or changing systems in some way.

Thus, when negotiating an agreement with a cloud-based service provider, try by all means to ensure that you are not locked-in for lengthy periods just in case you need to change technology providers in the near future.

Don’t rush to the cloud without a clearly defined vision and road map. Deliberate thoroughly to ensure the new surpasses the old, pilot technologies with lower risk and limited impact on the business, and then scale the adoption.

Delivering Reliable Insights From Data Analytics

Data and analytics are becoming increasingly integral to business decisions. Organizations across all sectors are leveraging advanced real-time predictive analytics to balance intuitive decision-making and data-driven insights to monitor brand sentiment on social media, better serve their customers, streamline operations, manage risks and ultimately drive strategic performance.

Traditional business intelligence reporting tools with static charts and graphs are being replaced by interactive data visualization tools that enable business insights to be shared in a format that is easily understood by decision makers, in turn enabling better, faster operational and strategic decision-making.

Given the constantly changing business landscape, driven by increased competition, macro and geopolitical risks, intense regulatory demands, complex supply chains, advanced technological changes and more, decision makers are turning to finance teams for actionable insights to help them navigate this volatility and uncertainty.

As business unit managers increasingly rely on finance decision support for enhanced business performance, it is imperative for CFOs and their teams to ensure the performance insights they are delivering are informed and actionable. However, a 2016 survey report by KPMG revealed that 60% of the survey participants are not very confident in their data and analytics insights.

Data Quality or Quantity?

In today’s world of exponential data growth, the ability for finance to deliver reliable and actionable insights rests not on the quantity of data collected, analyzed and interpreted, but rather on the quality of that data. The starting point for any successful data analytics initiative involves clarifying why the business needs to collect a particular type of data.

One of the challenges facing many businesses today is identifying and selecting which internal and external sources of data to focus on. As a result, these companies end up investing in data warehouses full of massive amounts of useless data. To avoid being data rich and insight poor, CFOs need to understand  the role of quality in developing and managing tools, data and analytics.

Before inputting data into any analytical model, it is important to first assess the appropriateness of your data sources. Do they provide relevant and accurate data for analysis and interpretation? Instead of relying on a single source of data for analysis, you need to have the ability to blend and analyze multiple sources of data. This will help make informed decisions and drive business performance.

Further, businesses are operating in a period of rapid market change, and market and customer data becomes outdated quickly. As a result, being agile and able to respond quickly to changing market conditions has become a critical requirement for survival. The business cannot afford to sit on raw data for long periods. Capabilities that enable data to be accessed, stored, retrieved and analyzed on a timely basis should be enhanced.

Thus, in order to provide business users with access to real-time and near-real-time operational and financial data, the organization should focus on reducing data latency. Reducing data latency allows finance teams to run ad-hoc reports to answer specific business questions, which in turn enables decision makers to make important decisions more quickly.

In the event that finance provides business insights or recommendations based on inaccurate data, analysis and predictions, this will quickly erode, if not extinguish, business trust and shake the confidence of those decision makers who rely on these predictions to make informed decisions.

As data volumes increase and new uses emerge, the challenge will only increase. It is therefore critical for finance to put in place robust data governance structures that assess and evaluate the accuracy of data analytics inputs and outputs.
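
One way such a governance structure can be made concrete is with automated quality checks on model inputs. The sketch below is a minimal example; the column names, thresholds and blocking rule are illustrative assumptions rather than a prescribed standard.

```python
# Minimal sketch of automated data-quality checks a governance process might
# run on analytics inputs; columns and thresholds are illustrative.
import pandas as pd

def assess_input_quality(df: pd.DataFrame, key: str, amount_col: str) -> dict:
    """Return simple completeness and accuracy indicators for one input data set."""
    return {
        "rows": len(df),
        "duplicate_keys": int(df[key].duplicated().sum()),
        "missing_amounts": int(df[amount_col].isna().sum()),
        "negative_amounts": int((df[amount_col] < 0).sum()),
    }

sales = pd.read_csv("monthly_sales.csv")  # assumed input extract
report = assess_input_quality(sales, key="invoice_id", amount_col="amount")
print(report)

# A governance rule might block the model run if quality falls below a threshold
if report["missing_amounts"] / max(report["rows"], 1) > 0.02:
    raise ValueError("More than 2% of amounts are missing; investigate before modeling.")
```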

Work with business partners to set objectives up front

Churning out performance reports that do not influence decision making is a waste of time, yet that is what many finance teams spend their time doing. The numbers that make sense to finance will not always make sense to business partners.

The biggest problem in the majority of these cases is lack of understanding by finance of the business objectives. Instead of collaborating with the business to develop a better understanding of their operations and how performance should be measured and reported, many finance analytics teams are working in their own silos without truly linking their activities back to business outcomes.

To improve finance’s reporting outcomes, the function should take stock of the reports it produces each month, quarter or year, then evaluate the nature and purpose of each report and what key decisions it helps to drive. It is not the quantity of reports that matters, but the quality.

Business partners need to be engaged at the start of, and throughout, the analytics process. They need to be involved as finance explores the data and develops insights, to ensure that when the modeling is complete the results make sense from a business perspective.

By working with business partners and setting objectives up front, finance will be able to focus its efforts and resources on value-add reports that tell a better business story. Further, the function will be able to assess and monitor the effectiveness of its data models in supporting business decisions.

Simplify interconnected analytics

With so many variables impacting business performance, the organization cannot continue to rely on gut instinct to make decisions. It has no choice but to use data to drive insights. As a result, organizations are relying on a number of interconnected analytical models to predict and visualize future performance.

However, given that one variable might have multiple effects, it is important for the business to understand how changes in one variable will affect all the models that use that variable, rather than just one individual model. By maintaining a meta-model, the organization would be able to visualize and control how different analytical models are linked.

It also helps ensure consistency in how data is used across different analytical models. Ultimately, decision makers will be able to prioritize projects with the greatest potential of delivering the highest value to the business.
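
A meta-model need not be complicated. The sketch below keeps a simple registry of which models consume which variables and walks it to show every model affected when one variable changes; the model and variable names are invented for illustration.

```python
# Minimal sketch of a meta-model: a registry of which analytical models use
# which inputs, so the ripple effect of changing one variable is visible.
# Model and variable names are illustrative.
from collections import defaultdict

model_inputs = {
    "revenue_forecast":   ["price", "volume", "fx_rate"],
    "cash_flow_forecast": ["revenue_forecast", "payment_terms", "fx_rate"],
    "churn_model":        ["price", "service_tickets"],
}

# Invert the registry: variable -> models that depend on it
dependents = defaultdict(list)
for model, inputs in model_inputs.items():
    for var in inputs:
        dependents[var].append(model)

def impacted_models(variable, seen=None):
    """Find every model affected by a change, directly or via other models."""
    seen = seen if seen is not None else set()
    for model in dependents.get(variable, []):
        if model not in seen:
            seen.add(model)
            impacted_models(model, seen)  # a model's output can feed other models
    return seen

print(impacted_models("fx_rate"))  # {'revenue_forecast', 'cash_flow_forecast'}
print(impacted_models("price"))    # all three models, via revenue_forecast
```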

Build a data analytics culture

Advanced data analytics is a growing field and as such competition for talent among organizations is high. Due to their traditional professional training, many accounting and finance professionals lack the necessary data and analytics skills.

Moreover, decision makers who do not know enough about analytics are reluctant to adopt it. Because of cognitive bias, it is human nature to subconsciously feel that successful past decisions justify the continued use of old sources of data and insight. What we tend to forget is that what got us here won’t get us there, hence the need to learn, relearn and unlearn old habits.

To move forward, the organization should focus on overcoming cognitive biases developed over the years and closing this skills gap, developing training and development programs that can help achieve the desired outcomes. Using advanced analytics to generate trusted insights requires an understanding of the various analytics methodologies, their rigor and their applicability.

It is difficult for people to embrace analytics if they lack the technical capabilities. However, building a data analytics culture does not mean the organization should focus on developing data science skills alone. You also need to develop analytics translators.

These are individuals who are able to apply their domain expertise in your industry or function, using a deep understanding of the drivers, processes and metrics of the business to better leverage the data and turn it into reliable insights and concrete, measurable actions.

Building a data analytics culture that promotes making decisions based on data-driven insights is a continuous endeavour that spans the data analytics life cycle from data through to insights and ultimately to generating value. Successful use cases can be used to further drive the culture across the organization.

A Practical Approach to Using Artificial Intelligence for CFOs – Part I

Part I Leveraging AI in the CFO Suite

In their role as curator of critical information for their company, Chief Financial Officers must create processes and develop systems that filter out noise and focus only on the most important, actionable information. The volume of data being created is growing at an astronomical rate, making this role both more crucial and more difficult. In this article we’ll explore how CFOs can take a practical approach to integrating artificial intelligence (AI) into their operations.

First let’s define AI in a manner that applies to its use in finance.

AI is information derived from algorithms applied to data set(s), normally with little or no human intervention.

  • Algorithm: a set of steps that are followed to solve a mathematical problem or to complete a computer process
  • Information: knowledge you get about something: facts or details about a subject
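
As a minimal illustration of that definition, the sketch below applies an algorithm (a least-squares trend line) to a data set (monthly revenue figures invented for the example) and produces information (a projection) with no human judgment in the loop.

```python
# Minimal illustration of the definition above. Revenue figures are invented.
import numpy as np

revenue = np.array([100, 104, 109, 113, 118, 124], dtype=float)  # the data set
months = np.arange(len(revenue))

slope, intercept = np.polyfit(months, revenue, deg=1)  # the algorithm
next_month = slope * len(revenue) + intercept          # the information

print(f"Projected revenue next period: {next_month:.1f}")
```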

In his book, The Design of Business, Roger Martin describes the stages of learning that go from mystery to heuristic to algorithmic. The financial processes at many companies are heuristic: made up of general guidelines, containing many steps, developed by trial and error, and known only to the process owner. These processes lock corporate technology in the minds of one or a few individuals, creating technology risk and a training burden when staffing transitions occur. Developing an AI framework to effectively select the processes that should incorporate more AI is rapidly becoming a core skill required for CFO success.

Until recently, many financial applications of AI have helped uncover altogether new techniques or capabilities. For example, program trading in the financial markets came about because AI could “crunch” numbers (the price of a basket of individual stocks) fast enough to allow traders to arbitrage an index against a portfolio of individual stocks.

In addition to the speed factor, AI is now being used to replace repetitive, linear tasks and increase our information output capacity. Both uses have wide implications for the CFO, including:

  • Choosing a system architecture that will capture AI most effectively for your organization
  • Managing your talent in a manner that is socially responsible
  • Developing and acquiring talent that captures the benefits of your AI system investment
  • Mastering the ability to manage the Decision Pyramid

How to Leverage AI in Finance

To date, most business investments in AI have been concentrated in specific industries: stock trading, portfolio management, banking and insurance underwriting. Customer development and customer service have also benefited from large investments in AI. The CFO responsibility areas, although ripe for automation, have not adopted AI to the extent these other industries or functions have. The opportunity is vast, but we need a methodology to identify where to start and continue our AI investment.

The main benefits of AI are derived from three aspects: speed, accuracy and volume. Logically, we should apply AI to areas where the total value (increased revenue and reduced cost) of the following three variables is greatest:

  • Speed: The incremental value of time as applied to a process or delivering information
  • Accuracy: The incremental cost of error or lack of precision in a process or information
  • Volume: The incremental cost of each unit of volume in a process or in reports and analysis.

Identifying the value of these different variables is the key to selecting an appropriate AI strategy and developing a work plan to implement it.
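
One simple way to operationalize this is a scoring grid over candidate processes. The sketch below is illustrative only: the processes, 1-to-5 scores and equal weights are assumptions you would replace with your own assessment.

```python
# Minimal sketch of ranking candidate processes by speed, accuracy and volume value.
# Processes, scores (1-5) and weights are illustrative assumptions.
candidates = {
    "procure_to_pay":        (3, 5, 5),  # (speed, accuracy, volume)
    "budgeting_forecasting": (4, 4, 2),
    "month_end_close":       (5, 4, 3),
}

def total_value(scores, weights=(1.0, 1.0, 1.0)):
    """Weighted sum of the three value drivers for one process."""
    return sum(s * w for s, w in zip(scores, weights))

# Rank candidates so the highest-value targets for AI surface first
for process, scores in sorted(candidates.items(), key=lambda kv: total_value(kv[1]), reverse=True):
    print(f"{process:<24} total value: {total_value(scores):.1f}")
```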

There are two main ways AI can enhance the CFO responsibility areas.

  1. As a Process Improvement Mechanism. In this case AI is applied to transactional work to complete it more quickly, more accurately and/or in greater volume.
  2. As a Decision Support Mechanism. Here AI is applied to the data used to create the information in a report or analysis, enhancing decision support through quicker, more accurate and/or more information.

Illustrations of these two types of AI applications can be visualized using two examples:

  1. Procure to Pay (P2P): Using AI on the P2P process may yield big improvements in process effectiveness, which will lead to lower costs and a reduction in errors.
  2. Budgeting and Forecasting: Using AI in the forecasting process expands the scope of data that can be incorporated into the model, shifting from exclusively internal data to a model that also includes external data. This improves decision making by reducing the noise in the outputs through more robust inputs (a minimal sketch follows below).
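
The sketch below illustrates the forecasting case in item 2: a model fitted on internal actuals alone is compared with one that also uses an external driver. The figures and the external indicator are invented for illustration, and the regression is deliberately simple.

```python
# Minimal sketch of augmenting an internal forecast with an external driver.
# All figures and the external index are illustrative.
import numpy as np

internal_sales = np.array([200, 210, 205, 220, 228, 235], dtype=float)  # internal actuals
external_index = np.array([98, 101, 100, 104, 106, 109], dtype=float)   # assumed market indicator
months = np.arange(len(internal_sales))

# Internal-only model: a trend on time alone
trend_only = np.polyfit(months, internal_sales, deg=1)

# Blended model: time plus the external driver
X = np.column_stack([months, external_index, np.ones_like(months, dtype=float)])
coef, *_ = np.linalg.lstsq(X, internal_sales, rcond=None)

next_point = np.array([len(internal_sales), 111.0, 1.0])  # next period, assumed index value
print("Internal-only forecast:", round(np.polyval(trend_only, len(internal_sales)), 1))
print("Blended forecast:      ", round(float(next_point @ coef), 1))
```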

We have developed a worksheet to assist in targeting where AI will bring you the most value. The worksheet is patterned after the Four Pillars of CFO Success and includes the major CFO technical competencies (i.e. CFO competencies ripe for AI application). Some critical thinking about each competency will allow you to develop a comparative scoring schedule to assist you in building an AI strategy.

Click to get your AI Identification Worksheet

Next Up Tomorrow: Part II The Benefits of AI and What You Will Need to Make It a Success
