
Finance Analytics: Using Relevant Data to Generate Actionable Insights

We are living in the information age. Data is everywhere, affecting every aspect of our lives. As the use of data and analytics becomes pervasive, it will soon be uncommon to find an industry not capitalizing on the benefits they can provide.

CFOs are increasingly leveraging data and analytics tools to optimize operational and financial processes, and drive better decision-making within their organizations. Finance is no longer simply a steward of historical reporting and compliance.

Today, the expectation is for finance to engage with new technologies, collaborate with business stakeholders to create value, and act as a steward of enterprise performance.

That is, deliver the right decision support to the right people at the right time.

Delivering impactful data-driven insights does not happen on its own. You need to have the right tools, processes, talent and culture working well together.

Here are some of the questions you can start asking to kickstart you on this journey:

  • Does your organization have the necessary tools and capabilities to collect and analyze the data (structured and unstructured) and make sense of it all?
  • How robust are your data governance and management processes?
  • How do you define the structure of your finance analytics team? Are you focused heavily on traditional and technical skills or diversity?
  • How are people armed to make better decisions using the data, processes, and analytical methods available?
  • As a finance leader, are you promoting a culture that bases key decisions on gut feel, or one grounded in data-driven insights?

Both intuitive and analytical decisions are important and can create significant value for the business.

However, I’d recommend against promoting a culture that embraces the randomness of intuitive thinking and avoids analytical thinking completely.

What are you doing with your data?

In a world where data is ubiquitous and plentiful, it can be overwhelming to separate the noise from the important.

According to IBM, 80 percent of today’s data originates from previously untapped, unstructured information from the web, such as social media channels, news feeds, emails, journals, blogs, images, sounds and videos.

And this unstructured data holds the important insights required for faster, more informed decisions. Take news feeds as an example. Breaking news on economic policy can help you reevaluate production and supply chain decisions, and adjust forecasts accordingly.

But, for fear of missing out, many businesses end up hoarding data with little to no analysis performed on it. You don’t need to collect and store every single type of data out there.

Instead, collect only the data that matters most and is key to unlocking answers to your critical business performance questions. The rest is noise.

That’s why it’s critical to regularly ask the question, “What are we doing with all the data we have?” This way you can also identify any information gaps that need filling via new data sources.

For analytics to work, people need the foresight to ask the right questions

People play a critical role in an organization’s analytics efforts. Analytics cannot direct itself: without insightful guidance on the right questions to answer and the right hypotheses to test, these efforts are foiled.

Asking the right questions is key to drawing the right conclusions. Let’s look at the steps involved in making better decisions using data and analytics for one of your product lines. After a strong start, you’re now starting to witness a decline in online revenue but are not sure why.

Before jumping into complex analytics:

  • First, identify the problem that the data needs to answer. You already know what happened. However, this is not enough. To address the issue, it’s important to develop a better understanding of the impacted segment and the potential reasons for the revenue decline. This is where you start asking questions such as: Why is revenue declining? When did the decline start? Where did it happen (country, region, city)? What might have caused it (potential drivers)? What actions can we take to address the decline?
  • Then come up with hypotheses about the issue to be addressed. A hypothesis describes a possible explanation, such as a driver or a reason behind your business question. Many people make the mistake of jumping straight into data collection and analysis before forming a hypothesis to answer the critical business question. Going back to our revenue decline example, suppose you have identified several hypotheses and prioritized two to explain the decline – a slowing economy and impending recession causing customers to tighten their purse strings, and a new website design with enhanced security features but also additional online checkout steps.
  • Based on these two hypotheses you can then identify the exact data that should be collected and analyzed to prove or disprove each hypothesis.
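As a rough illustration of the last step, the checkout hypothesis could be tested against a sales extract. The dataset, column names and relaunch date below are all hypothetical, a sketch rather than a prescribed method:

```python
import pandas as pd

# Hypothetical daily online-sales extract; all column names and figures
# are made-up assumptions for illustration.
sales = pd.DataFrame({
    "date": pd.to_datetime(["2019-01-15", "2019-01-20", "2019-04-10",
                            "2019-04-12", "2019-04-20", "2019-04-25"]),
    "region": ["EMEA", "APAC", "EMEA", "APAC", "EMEA", "APAC"],
    "sessions": [1000, 900, 1100, 950, 1200, 980],
    "completed_orders": [80, 70, 55, 66, 58, 69],
})

# Hypothesis: the redesigned checkout (assumed live from March 2019)
# lowered conversion. Compare conversion before vs after the relaunch.
relaunch = pd.Timestamp("2019-03-01")
sales["period"] = sales["date"].lt(relaunch).map({True: "before", False: "after"})

# Conversion = completed orders / sessions, per period and region.
conv = sales.groupby(["period", "region"])[["completed_orders", "sessions"]].sum()
conv["conversion"] = conv["completed_orders"] / conv["sessions"]
print(conv["conversion"].round(3))
```

A marked drop in conversion after the relaunch, across regions, would support the checkout hypothesis; a drop confined to one geography would point back towards the economic one.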

Irrespective of how advanced your analytical tools are, asking the wrong business performance questions results in flawed data being collected, and ultimately poor insights and recommendations. Hence the importance of diversity within the team.

Remember, no data or analytics model is perfect. Finding good data and making sure it’s cleaned and validated is fundamental but the organization also shouldn’t wait for perfection.

It’s easy to get fixated on perfection and fail to see the forest for the trees.

Building Trust in Your Data, Insights and Decisions

Over the years, enterprise adoption of advanced analytics and data visualization tools to generate insights that support quality business performance decisions has evolved considerably.

Nonetheless, not all investments in this arena are producing the desired outcomes.

Although significant sums of money, resources and time have been channeled towards data and analytics initiatives, the quality of the insights generated, and ultimately of the decisions made, is far from convincing.

Some possible reasons for these failures include:

  • A sole focus on technology, and less on processes and people.
  • Misguided belief that the more data you have the better.
  • Lack of or poor organizational alignment and cultural resistance.

Delivering trusted insights is not a simple matter of investing in the right technology alone. Neither will simply collecting and storing large amounts of data lead to better decisions.

For instance, investing in state-of-the-art data analytics and visualization tools at the expense of data quality doesn’t automatically empower you to generate valuable insights that drive business performance.

Nor does having the right technology and quality data help when organizational alignment is poor and cultural resistance is intense.

So how can your organization use data and analytics to generate trusted insights, make smarter decisions and impact business performance?

Prior to generating any business insights, trends or activity patterns, it’s imperative to ensure that the performance data to be analyzed is trusted. That is, the data must be fit for purpose, current, accurate, consistent and reliable.

With so much data and information available today, it’s increasingly difficult for decision makers to determine which data points are essential in understanding what drives the business and incorporate these into business decision making.

As a result, many organizations end up hoarding data and incurring unnecessary expenses in the process.

To ensure your data is fit for purpose, first and foremost, develop an understanding of the real business questions that the data needs to answer. By asking questions at the very beginning, you will be able to identify and define any problems or issues in their context.

Further, asking relevant questions will help you develop an improved understanding of the present factors, past events, or emerging opportunities driving the need for advanced analytics and data visualization investments in the first place.

Conversely, asking the wrong questions results in ineffectual resolutions to the wrong business problems.

Generally, people want to kick off their data analysis by collecting data first, rather than asking the relevant questions, establishing the business need and formulating an analysis plan.

But, the quality of the insights and the value of the recommendations delivered are rooted in the quality of the data you start out with.

Data collection should only begin once the purpose and goals of the data analysis are clearly defined. Thus, the more and better questions you ask, the better your insights, recommendations, decisions and ultimately business performance will be.

Data accuracy, consistency and reliability

Ensuring that data used in analysis is accurate, consistent and reliable remains one of the common hurdles hampering analytics initiatives. One of the reasons for this is multiple data sources.

For example, suppose you want to analyze and evaluate the impact of a given marketing campaign on customer experience. In the past, you would have managed with data from your CRM system only.

Today, in order to get a 360-degree view of your customers, analyzing CRM data alone is not sufficient. You also need to analyze data from your web analytics system, customer feedback surveys, data warehouses, email, call recordings, social media or production environments.

Thus, you need to bring these disparate data sources together to generate valuable, actionable insights.

Issues arise when the multiple data sources define key business performance terms or metrics (e.g. sales, conversion, margin growth) differently and there is no common definition shared across the business.

This lack of shared understanding of business performance has far-reaching implications for your analysis and makes comparing results futile.

Data collection, followed by data cleansing and validation, is therefore important. After you have collected the data to be used in the analysis per your analysis plan, clean and validate it to make it usable and accurate. Select a small data sample and compare it with what you expected.

Ensure the data types match your expectations. Remember the adage: garbage in, garbage out. Even if you have best-of-breed analytics tools to slice and dice information, with poor data quality practices your insights and decisions will remain unsubstantiated, erroneous and worthless.
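A minimal sketch of that kind of sample check, using an entirely hypothetical CRM extract, might look like this:

```python
import pandas as pd

# Hypothetical sample pulled from a CRM extract; column names and the
# deliberately dirty values are assumptions for illustration.
sample = pd.DataFrame({
    "customer_id": [101, 102, 103, 104],
    "order_date": ["2019-05-01", "2019-05-03", "not a date", "2019-05-07"],
    "order_value": [250.0, -40.0, 125.5, None],
})

issues = []

# Type check: every order_date should parse as a date.
parsed = pd.to_datetime(sample["order_date"], errors="coerce")
issues += [f"row {i}: unparseable date" for i in sample.index[parsed.isna()]]

# Range check: order values should be present and positive.
bad_value = sample["order_value"].isna() | (sample["order_value"] <= 0)
issues += [f"row {i}: missing or non-positive order_value"
           for i in sample.index[bad_value]]

for issue in issues:
    print(issue)
```

Even a handful of checks like these, run against a small sample before the full analysis, catches the obvious garbage before it contaminates the insights.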

For this reason, and to ensure the consistency of the data used in your analysis, know and understand the source of every piece of your data, its age, its components, what it excludes, and when and how often it is refreshed.

Without data integrity, it is difficult to generate trusted insights and deliver recommendations that decision makers can confidently act on. Also, delivering trusted insights goes beyond merely attaching a colourful report in an email and hitting the send button.

Instead, the insights must be delivered in a digestible and actionable format. That is, communicated in a format that is more readily understood by decision makers in the organization. Poorly communicated insights can result in no action, negating the value of the insights in the first place. Thus, avoid getting bogged down in details.

Knowing is powerful, but acting based on knowledge is what drives smart decisions, creates true value and impacts business performance.

Converting Data Into Insights: The Right Technology Alone is No Guarantee of Success

The fact that technology is playing an increasingly significant role in executing many traditional finance tasks while at the same time generating greater insights that drive business performance is irrefutable.

However, even though many organizations are investing significantly in advanced data and analytics technologies, the majority of these investments are reportedly yielding disappointing returns.

This is often because they focus mostly on the implementation of new tools, and less on processes and people. For instance, there is a widespread misconception that data analytics is purely a technology issue, or that success hinges on having the most data.

As a result, many organizations are ultimately spending large amounts of money on new technologies capable of mining and managing large datasets.

There is relatively little focus on generating actionable insights out of the data, and building a data-driven culture, which is the people aspect of analysis.

Data analytics is not purely a technology issue. Instead, it is a strategic business imperative that entails leveraging new technologies to analyze the data at your disposal, gain greater insight about your business, and guide effective strategy execution.

Furthermore, having the right data is more important than having the most data. Your organization might have the most data, but if you’re not analyzing that data and asking better questions to help you make better decisions, it counts for nothing.

Reconsider your approach to people, processes and technology

The success of any new technology greatly depends on the skills of the people using it. Additionally, you need to convince people to use the new technology or system, otherwise it will end up being another worthless investment.

Given digital transformation is here to stay, for any technology transformation to be a success, it’s imperative to have the right balance of people, IT and digital skills.

It’s not an issue of technology versus humans. It’s about striking the right balance between technology and people, with each doing what it does best. In other words, establishing the right combination of people and technology.

For example, advanced data analytics technologies are, by far, better than humans at analyzing complex data streams. They can easily identify trends and patterns in real-time.

But, generating useful insights from the data all depends on human ability to ask the right key performance questions, and identify specific questions that existing tools and techniques are currently not able to answer.

Rather than hastily acquiring new data and tools, start by reviewing current data analytics tools, systems and related applications. This helps identify existing capabilities including tangible opportunities for improvement.

It’s not about how much the business has invested or is willing to invest in data analytics capabilities; it’s about how the business is leveraging existing tools, data and insights it currently has to drive business growth, and where necessary, blending traditional BI tools with new emerging data analytics tools.

That’s why it’s critical to build a clear understanding of which new technologies will be most beneficial to your business.

Technology alone will not fix broken or outdated processes. Many businesses are spending significant amounts of money on new tools, only to find out later that it’s not the existing tool that is at fault but rather the process itself.

Take the data management process as an example, particularly at larger, more complex organizations. Achieving a single view of the truth is a recurring challenge, often because data is confined in silos, inconsistently defined or locked behind access controls.

A centralized data governance model is frequently the missing link. There are too many fragmented ERP systems, data is spread across the business, and it’s difficult to identify all data projects and how they are systematized.

In such cases, even if you acquire the right data analytics tools, the fact that you have a weak data governance model as well as a non-integrated systems infrastructure can act as a stumbling block to generating reliable and useful decision-support insights.

To fix the faltering data management process, you need to establish a data governance model that is flexible across the business and provides a centralized view of enterprise data – that is, the data the organization owns and how it is being managed and used across the business.

This is key to breaking down internal silos, achieving a single view of the truth, and building trust in your data to enable effective analytics and decision making.

Is your organization spending more time collecting and organizing data than analyzing it?

Have you put in place robust data governance and ownership processes required to achieve a single version of the truth?

Are you attracting and retaining the right data analytics talent and skills necessary to drive data exploration, experimentation and decision making?

More Data Doesn’t Always Lead to Better Decisions

Thanks to advancements in technology, our digital lives are producing expansive amounts of data on a daily basis.

In addition to this enormous amount of data that is produced each day, the diversity of data types and data sources, and the speed with which data is generated, analyzed and reprocessed has increasingly become unwieldy.

With more data continuously coming from social spheres, mobile devices, cameras, sensors and connected devices, purchase transactions and GPS signals to name a few, it does not look like the current data explosion will ebb soon.

Instead, investments in IoT and advanced analytics are expected to grow in the immediate future. Historically, investing in advanced data analytics to generate well-informed insights that effectively support decision making and drive business performance has been the preserve of big corporations.

And smaller businesses and organizations, as a result, have for some time embraced the flawed view that such investments are beyond their reach. It’s no surprise then that adoption has been at a snail’s pace.

Thanks to democratization of technology, new technologies are starting to get into the hands of smaller businesses and organizations. The solutions are now being packaged into simple, easy-to-deploy applications that most users without specialized training are able to operate.

Further, acquisition costs have fallen significantly, removing the upfront cost barrier that, for years, acted as a drag on many companies’ IT investments.

While the application of data management and advanced analytics tools is now foundational and becoming ubiquitous, growing into a successful data-driven organization is about getting the right data to the right person at the right time to make the right decision.

Distorted claims such as data is the new oil have, unfortunately, prompted some companies to embark on unfruitful data hoarding sprees. It is true that oil is a valuable commodity with plentiful uses. But, it is also a scarce resource not widely available to everyone.

The true value of oil is unearthed after undergoing a refinement process. On the contrary, data is not scarce. It is widely available. Nonetheless, akin to oil, the true value of data is unlocked after we have processed and analyzed it to generate leading-edge insights.

It’s a waste of time and resources to just hoard data and not analyze it to get a better understanding of what has happened in the past and why it has happened. Such insights are crucial to predicting future business performance scenarios and exploiting opportunities.

More data doesn’t necessarily lead to better decisions. Better decisions emanate from having a profound ability to analyze useful data and make key observations that would have otherwise remained hidden.

Data is widely available, what is scarce is the ability to extract informed insights that support decision-making and propel the business forward.

To avoid data hoarding, first carry out a data profiling exercise, as this will help you establish whether any of your existing data can easily be used for other purposes. It also helps ascertain whether existing records are up to date and whether your information sources are still fit for purpose.
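A basic profiling pass of the kind described can be sketched as follows; the customer table, its columns and the one-year freshness threshold are illustrative assumptions:

```python
import pandas as pd

# Hypothetical customer master extract; column names are assumptions.
customers = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],  # note the duplicate id
    "email": ["a@x.com", None, "b@x.com", "c@x.com"],
    "last_updated": pd.to_datetime(["2017-01-01", "2019-04-01",
                                    "2019-04-15", "2018-06-30"]),
})

# Completeness and uniqueness per column.
profile = pd.DataFrame({
    "non_null": customers.notna().sum(),
    "nulls": customers.isna().sum(),
    "unique": customers.nunique(),
})
print(profile)

# Freshness: how stale is each record relative to a review date?
review_date = pd.Timestamp("2019-06-01")
stale = (review_date - customers["last_updated"]).dt.days > 365
print(f"{stale.sum()} of {len(customers)} records older than a year")
```

Duplicate identifiers, null rates and stale records surfaced this way tell you quickly whether a dataset is still fit for purpose or merely being hoarded.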

At any given time, data quality trumps data quantity. That is why it is important to get your data in one place where it can easily be accessed for analysis and produce a single version of the truth.

In the past, data was kept in different systems that were unable to talk to each other, making it difficult to consolidate and analyze data to facilitate faster decision making. Today, the price of computing and storage has plummeted and these systems are being linked.

As a result, companies can now use data-mining techniques to sort through large data sets to identify patterns and establish relationships to solve problems through data analysis. If the data is of poor quality, insights generated from the analysis will also be of poor quality.

Let’s take customer transactional data as an example. To reveal hidden correlations or insights, it’s advisable to analyze the information flow in real time: by the hour, by the day, by the week, by the month, over the past year and beyond. This lets you proactively respond to the ups and downs of dynamic business conditions.
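As an illustration, rolling the same transaction stream up at several grains is straightforward with pandas resampling; the data below is entirely made up:

```python
import pandas as pd

# Hypothetical stream of timestamped transactions; values are illustrative.
txns = pd.DataFrame(
    {"amount": [20.0, 35.5, 12.0, 80.0, 44.0, 15.5]},
    index=pd.to_datetime(["2019-05-01 09:10", "2019-05-01 09:40",
                          "2019-05-01 14:05", "2019-05-02 10:30",
                          "2019-05-08 11:00", "2019-05-08 16:45"]),
)

# Roll the same stream up at different grains: hourly, daily, weekly.
hourly = txns["amount"].resample("H").sum()
daily = txns["amount"].resample("D").sum()
weekly = txns["amount"].resample("W").sum()

print(daily[daily > 0])
```

The same pipeline feeding all three views keeps the hourly, daily and weekly numbers reconciled, which is exactly the single version of the truth discussed above.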

Imagine what could happen if you waited months before analyzing the transactional data. By the time you do, your insights are a product of “dead data”. Technology is no longer the inhibitor; culture and the lack of a leadership mandate are.

As data become more abundant, the main problem is no longer finding the information as such but giving business unit managers precise answers about business performance easily and quickly.

What matters most is data quality, what you do with the data you have collected and not how much you collect. Instead of making more hay, start looking for the needle in the haystack.

Migrating From On-Premise to Cloud-Based Solutions

Gartner forecasts cloud computing to be a $278 billion business by 2021, as companies increasingly adopt cloud services to realize their desired digital business outcomes. In a separate survey of 439 global financial executives, the research company found that finance is moving to the cloud much faster than expected.

By 2020, 36 percent of enterprises are expected to be using the cloud to support more than half of their transactional systems of record.

With cloud technology promising to provide the speed and agility that the business requires, including generating significant cost savings and new sources of revenue, it’s not surprising the cloud market is experiencing a boom.

While companies have experienced both organic and inorganic growth, many have struggled to keep pace with the rapidly changing technology landscape. Their businesses are saddled with legacy technology infrastructures that are slow to react to change, stalling progress towards strategic objectives.

Given the need to react more quickly while becoming more proactive and driving business performance, senior finance executives are turning to cloud-based financial solutions for real-time management information reporting and decision support.

Adopting cloud not a simple and quick decision

Looking at the expected growth metrics for the cloud market and hearing of all the novel opportunities offered by cloud computing, CFOs and CIOs are enticed to believe it is only a matter of time before the organization’s computing technology and data move skyward.

However, over the course of my career I have come to understand that the only thing right about a forecast is that it’s wrong. Either the forecast is close to reality or very far from reality.

Cost savings is cited most often by senior finance executives as the reason for adopting cloud technology. Considering that cloud is essentially a form of outsourcing arrangement in which client organizations use the internet to connect to a variety of applications, storage services, hardware resources, platforms and other IT capabilities offered by cloud service providers, upfront capital investment and maintenance costs are minimal.

This is because the cloud service provider owns the necessary hardware and other resources needed to deliver these services and is also responsible for employing the staff required to support them. Unlike on-premise solutions where the business has to incur large capital outlays for hardware and IT support staff, cloud is an operating expense (except implementation costs which can be capitalized), that is payable as and when incurred.

With cloud, a business is able to add new licences to its subscription when it needs them and scale its licensing costs with its growth, rather than buying in bulk upfront. This is in direct contrast to traditional on-premise software where companies in anticipation of business growth have over invested in IT systems hoping to add more users to the user list soon after the growth is achieved.
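A back-of-the-envelope comparison of the two licensing models might look like the sketch below; every figure is a hypothetical assumption, not vendor pricing:

```python
# Illustrative comparison: on-premise bulk licensing bought upfront in
# anticipation of growth, vs a cloud subscription scaled with headcount.
# All figures are hypothetical assumptions.

upfront_licences = 500            # bought in bulk, anticipating growth
cost_per_perpetual_licence = 1200

cloud_cost_per_user_month = 30
users_by_year = [120, 180, 260]   # actual headcount growth over three years

on_premise_cost = upfront_licences * cost_per_perpetual_licence
cloud_cost = sum(users * cloud_cost_per_user_month * 12
                 for users in users_by_year)

print(f"on-premise upfront: ${on_premise_cost:,}")
print(f"cloud over 3 years: ${cloud_cost:,}")
```

With these assumed numbers, the bulk purchase pays for capacity that is never used in the period, while the subscription tracks actual growth. The comparison naturally reverses under other assumptions (long horizons, flat headcount), which is why the analysis has to be run with your own figures.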

However, when deciding whether to invest in cloud or not, CFOs should look beyond cost benefits. In addition to cost savings, they should also consider tactical and more strategic and structural issues. Unfortunately, the challenge for many finance professionals is that when evaluating investments the focus is solely on cost. We fail to examine and evaluate the non-financial risks and strategic impact of the investment on the business.

Strategic and structural considerations

As I have written in the past, most organizations get caught up in the hype of new technologies and end up making technology investments that are unaligned to the strategy of the business. There is no strong business case for investing in the new technology. Don’t allow yourself to fall into the same trap.

Investing in cloud is not an IT-only or finance-only decision. Decisions about what to migrate to the cloud, when to migrate it, and how to transition from an on-premise environment to a cloud-based one all require a collaborative effort if the organization is to achieve its stated objectives.

Further, transitioning to a cloud computing environment is not a matter of flicking a switch. You need a deeper understanding of the cloud resources (public cloud, private cloud, community cloud and hybrid cloud) available on the market, their delivery models (SaaS, PaaS and IaaS) and how these all fit together into your business and technology model.

Understanding the cloud model will help you determine whether cloud is appropriate in the context of your business purpose and risks. For example, in the case of public cloud, the applications and cloud-based services are positioned as utilities available to anyone who signs on.

If over the years your company has built and strengthened IT capabilities that differentiate you from competitors, migrating to the cloud can mean walking away from your success recipe and exposing yourself to vulnerabilities.

Therefore, if you are planning to migrate your on-premise computing environment to the cloud, take a long-term view of your IT environment and determine which types of applications are candidates for the cloud and which will not be transitioned until the distant future.

Ideally, you should start with applications that carry low risk or that have a business need that cannot be met using traditional computing capabilities. Once you have built greater comfort and trust in the cloud, you can then scale to include other applications.

The pain of disintegration

It is no secret that many businesses today are re-evaluating their technology infrastructure and leveraging new technologies to respond faster and be more flexible. But is cloud computing right for your business? Is speed to deploy more important to you than having systems that talk to each other?

In a world where each cloud service provider claims that their solution is better than the next one on the same shelf, CFOs and CIOs are grappling with which provider to go with. As a result, companies often end up doing business with more than one vendor, each compensating for the shortfalls of the other.

The problem arises when the cloud-based application from one vendor is unable to work with an application from another provider, resulting in more than one version of the truth. In other cases, the company’s on-premise IT infrastructure does not allow sharing data with multiple cloud-based applications, which in turn results in finance spending time consolidating and reconciling data from disparate systems in Excel.

Given that the cloud model depends on providing essentially a standardized package across the board, it is important to weigh the pros and cons of foregoing customization versus rapid implementation. Because the cloud market is projected to grow in the coming years, many IT solution providers are channeling money towards cloud-based solutions. This has resulted in some vendors withdrawing IT support on legacy ERP systems and phasing them out.

In some cases, vendors have released upgrades to these same solutions. The problem is that they were not originally built with modern business requirements in mind, so they can only buy you a few more years of support.

It is therefore important for CFOs and CIOs to be cognizant of whether the solution was originally designed as a cloud-based solution, or is a modified version of one initially designed to function in a traditional on-premise client-ownership model.

With data being found everywhere today and advanced analytics playing a critical role in supporting key decision making, delivering one version of truth has never been so important. In order to make sense of all the data at its disposal a company should focus its efforts on centralizing data management and IT policies. Having a single data repository ensures everyone is drinking from the same well for information and insights.

However, in companies where IT governance is weak, the tendency is for teams to autonomously adopt cloud-based solutions that meet their individual needs. This is counter-productive to data management centralization efforts, as data normally ends up spread across multiple, disaggregated cloud-based systems.

Just imagine a scenario where FP&A, tax, treasury, procurement, supply chain, and other finance functions each identify and select their own cloud solution. Consolidating and analyzing relevant data from these various systems to get a big picture view of business performance in itself is a huge task to complete as that data is divided across many domains.

While each cloud-based move may look beneficial in isolation, adopted together they may increase operating expenses to a level that undermines the anticipated savings.

Control versus no control

Although cloud-based solutions offer more affordable options and more flexible payment plans than traditional software providers, the issue of data control is still a concern. Cyber criminals are getting smarter by the day, and the fact is, whether organizational data resides on the internet or offline on the company’s network, it’s never completely immune to attack.

When it comes to data security, it is imperative for CFOs and CIOs to know that the moment data is no longer in-house, the business may end up having less control over who has access to key systems and data. The fact that you have outsourced your IT systems and data control to a third party does not make your company immune to a lawsuit in the event of a breach. You therefore need to sign agreements that protect you against various types of misfortunes.

Although the cloud service provider employs a team of IT support staff to monitor and perform regular threat assessments and deploy the latest patch immediately, the organization should not relax and assume the house is in order. You need to get strong assurances from your cloud vendor that your business data is safe.

You also need to know where your data is stored, the applicable data laws, how often it is backed up and whether you are able to perform audits on the data. Other factors to consider include provisions for ending the agreement for convenience. Many software vendors try to lock in client organizations for a significant period of time, which makes it difficult for clients to change vendors without incurring costs or changing systems in some way.

Thus, when negotiating an agreement with a cloud-based service provider, try by all means to ensure that you are not locked-in for lengthy periods just in case you need to change technology providers in the near future.

Don’t rush to the cloud without a clearly defined vision and road map. Deliberate carefully to ensure the new surpasses the old, pilot technologies that carry less risk and limited impact on the business, and then scale the adoption.

© 2019 ERPM Insights
