Reporting and Analytics

Building Trust in Your Data, Insights and Decisions

Over the years, enterprise adoption of advanced analytics and data visualization tools to generate insights that support sound business performance decisions has evolved considerably.

Nonetheless, not all investments in this arena are producing the desired outcomes.

Although significant sums of money, resources and time have been channeled towards data and analytics initiatives, the quality of the insights generated, and ultimately of the decisions made, is often far from convincing.

Some of the possible reasons for these failures include:

  • A sole focus on technology, with less attention to processes and people.
  • A misguided belief that more data is always better.
  • Poor or absent organizational alignment, and cultural resistance.

Delivering trusted insights is not a simple matter of investing in the right technology alone. Neither will simply collecting and storing large amounts of data lead to better decisions.

For instance, investing in state-of-the-art data analytics and visualization tools at the expense of data quality doesn’t automatically empower you to generate valuable insights that drive business performance.

Likewise, you may have the right technology and quality data, yet still fall short because of poor organizational alignment coupled with intense cultural resistance.

So how can your organization use data and analytics to generate trusted insights, make smarter decisions and impact business performance?

Prior to generating any business insights, trends or activity patterns, it’s imperative to ensure that the performance data to be analyzed is trusted. That is, the data must be fit for purpose, current, accurate, consistent and reliable.

With so much data and information available today, it’s increasingly difficult for decision makers to determine which data points are essential in understanding what drives the business and incorporate these into business decision making.

As a result, many organizations end up hoarding data and incurring unnecessary expenses in the process.

To ensure your data is fit for purpose, first and foremost, develop an understanding of the real business questions that the data needs to answer. By asking questions at the very beginning, you will be able to identify and define any problems or issues in their context.

Further, asking relevant questions will help you develop an improved understanding of the present factors, past events, or emerging opportunities driving the need for advanced analytics and data visualization investments in the first place.

Conversely, asking the wrong questions results in ineffectual resolutions to the wrong business problems.

Generally, people want to kick off their data analysis effort by collecting data first, rather than asking the relevant questions, establishing the business need and formulating an analysis plan.

But, the quality of the insights and the value of the recommendations delivered are rooted in the quality of the data you start out with.

Data collection should only begin once the purpose and goals of the data analysis are clearly defined. Thus, the more and better questions you ask, the better your insights, recommendations, decisions and ultimately business performance will be.

Data accuracy, consistency and reliability

Ensuring that data used in analysis is accurate, consistent and reliable remains one of the common hurdles hampering analytics initiatives. One of the reasons for this is the proliferation of data sources.

For example, suppose you want to analyze and evaluate the impact of a given marketing campaign on customer experience. In the past, you could have managed with data from your CRM system alone.

Today, in order to get a 360-degree view of your customers, analyzing CRM data alone is not sufficient. You also need to analyze data from your web analytics system, customer feedback surveys, data warehouses, email, call recordings, social media or production environments.

Thus, you need to bring these disparate data sources together to generate valuable, actionable insights.

Issues arise when the multiple data sources define key business performance terms or metrics (e.g. sales, conversion, margin growth) differently and there is no common definition shared across the business.

This lack of shared understanding of business performance has far reaching implications on your analysis and makes it futile to compare results.
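
To make the point concrete, here is a minimal sketch in Python (the metric, names and figures are hypothetical) of what a shared definition might look like: every data source is measured against the same formula, so results from different systems remain comparable.

    # Hypothetical example: one shared business definition of "conversion rate"
    # used by every team and data source, instead of each defining it differently.
    def conversion_rate(orders: int, unique_visitors: int) -> float:
        """Shared definition: orders divided by unique visitors."""
        if unique_visitors == 0:
            return 0.0
        return orders / unique_visitors

    # The CRM extract and the web-analytics extract are measured the same way,
    # so the two results can be compared meaningfully.
    print(conversion_rate(orders=120, unique_visitors=4_000))  # CRM figures (illustrative)
    print(conversion_rate(orders=95, unique_visitors=3_200))   # web analytics figures (illustrative)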

Data collection, followed by data cleansing and validation, is therefore important. After you have collected the data to be used in the analysis per your analysis plan, clean and validate it to make it usable and accurate. Select a small data sample and compare it with what you expected.

Ensure the data types match your expectations. Remember the adage: garbage in, garbage out. Even with best-of-breed analytics tools to slice and dice information, poor data quality practices will leave your insights and decisions unsubstantiated, erroneous and worthless.
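
As a rough illustration of this sampling-and-validation step, the sketch below (Python with pandas; the file and column names are assumptions) pulls a small sample, checks data types and flags missing or obviously out-of-range values.

    import pandas as pd

    # Hypothetical extract, purely for illustration.
    df = pd.read_csv("campaign_responses.csv", parse_dates=["response_date"])

    # Inspect a small sample and compare it with what you expected.
    print(df.sample(n=10, random_state=1))

    # Check that data types match expectations.
    print(df.dtypes)

    # Basic validation: missing values and suspect figures.
    print(df.isna().sum())
    print(df[df["order_value"] < 0])                             # negative order values are suspect
    print(df["response_date"].min(), df["response_date"].max())  # are the dates plausible?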

For this reason, and to ensure the consistency of the data used in your analysis, know and understand the source of every piece of your data, the age of the data, the components of the data, what is not included in the data, and when and how often the data is refreshed.

Without data integrity, it is difficult to generate trusted insights and deliver recommendations that decision makers can confidently act on. Also, delivering trusted insights goes beyond merely attaching a colourful report in an email and hitting the send button.

Instead, the insights must be delivered in a digestible and actionable format. That is, communicated in a format that is more readily understood by decision makers in the organization. Poorly communicated insights can result in no action, negating the value of the insights in the first place. Thus, avoid getting bogged down in details.

Knowing is powerful, but acting based on knowledge is what drives smart decisions, creates true value and impacts business performance.

Converting Data Into Insights: The Right Technology Alone is No Guarantee of Success

The fact that technology is playing an increasingly significant role in executing many traditional finance tasks while at the same time generating greater insights that drive business performance is irrefutable.

However, even though many organizations are investing significantly in advanced data and analytics technologies, the majority of these investments are reportedly yielding disappointing returns.

This is often because they focus mostly on the implementation of new tools, and less on the processes and people. For instance, there is a widespread misconception that data analytics is purely a technology issue, and that success comes from having the most data.

As a result, many organizations are ultimately spending large amounts of money on new technologies capable of mining and managing large datasets.

There is relatively little focus on generating actionable insights out of the data, and building a data-driven culture, which is the people aspect of analysis.

Data analytics is not purely a technology issue. Instead, it is a strategic business imperative that entails leveraging new technologies to analyze the data at your disposal, gain greater insight about your business, and guide effective strategy execution.

Furthermore, having the right data is more important than having the most data. Your organization might have the most data, but if you’re not analyzing that data and asking better questions to help you make better decisions, it counts for nothing.

Reconsider your approach to people, processes and technology

The success of any new technology greatly depends on the skills of the people using it. Additionally, you need to convince people to use the new technology or system, otherwise it will end up being another worthless investment.

Given that digital transformation is here to stay, for any technology transformation to succeed, it’s imperative to have the right balance of people, IT and digital skills.

It’s not an issue of technology versus humans. It’s about striking the right balance between technology and people, with each doing what it does best. In other words, establishing the right combination of people and technology.

For example, advanced data analytics technologies are, by far, better than humans at analyzing complex data streams. They can easily identify trends and patterns in real-time.

But generating useful insights from the data depends on the human ability to ask the right key performance questions, and to identify specific questions that existing tools and techniques are currently unable to answer.

Rather than hastily acquiring new data and tools, start by reviewing current data analytics tools, systems and related applications. This helps identify existing capabilities including tangible opportunities for improvement.

It’s not about how much the business has invested or is willing to invest in data analytics capabilities; it’s about how the business is leveraging existing tools, data and insights it currently has to drive business growth, and where necessary, blending traditional BI tools with new emerging data analytics tools.

That’s why it’s critical to build a clear understanding of which new technologies will be most beneficial to your business.

Technology alone will not fix broken or outdated processes. Many businesses are spending significant amounts of money on new tools, only to find out later that it’s not the existing tool that is at fault but rather the process itself.

Take the data management process as an example. At larger, more complex organizations in particular, achieving a single view of the truth is a persistent challenge, often because data is confined in silos, inconsistently defined or locked behind access controls.

A centralized data governance model is often the missing link. There are too many fragmented ERP systems, data is spread across the business, and it’s difficult to identify all data projects and how they are organized across the organization.

In such cases, even if you acquire the right data analytics tools, the fact that you have a weak data governance model as well as a non-integrated systems infrastructure can act as a stumbling block to generating reliable and useful decision-support insights.

To fix the faltering data management process, you need to establish a data governance model that is flexible across the business and provides a centralized view of enterprise data – that is, the data the organization owns and how it is being managed and used across the business.

This is key to breaking down internal silos, achieving a single view of the truth, and building trust in your data to enable effective analytics and decision making.

Is your organization spending more time collecting and organizing data than analyzing it?

Have you put in place robust data governance and ownership processes required to achieve a single version of the truth?

Are you attracting and retaining the right data analytics talent and skills necessary to drive data exploration, experimentation and decision making?

More Data Doesn’t Always Lead to Better Decisions

Thanks to advancements in technology, our digital lives are producing expansive amounts of data on a daily basis.

In addition to this enormous amount of data produced each day, the diversity of data types and data sources, and the speed with which data is generated, analyzed and reprocessed, have become increasingly unwieldy.

With more data continuously coming from social spheres, mobile devices, cameras, sensors and connected devices, purchase transactions and GPS signals to name a few, it does not look like the current data explosion will ebb soon.

Instead, investments in IoT and advanced analytics are expected to grow in the immediate future. Thinking back, investing in advanced data analytics to generate well-informed insights that effectively support decision making and drive business performance has long been the preserve of big corporations.

As a result, smaller businesses and organizations have for some time held the flawed view that such investments are beyond their reach. It’s no surprise, then, that adoption has been at a snail’s pace.

Thanks to the democratization of technology, new technologies are starting to get into the hands of smaller businesses and organizations. The solutions are now being packaged into simple, easy-to-deploy applications that most users can operate without specialized training.

Further, acquisition costs have fallen significantly, obviating the upfront cost barrier that, for years, has acted as a drag on many companies’ IT investments.

While the application of data management and advanced analytics tools is now foundational and becoming ubiquitous, growing into a successful data-driven organization is about getting the right data to the right person at the right time to make the right decision.

Distorted claims such as “data is the new oil” have, unfortunately, prompted some companies to embark on unfruitful data hoarding sprees. It is true that oil is a valuable commodity with plentiful uses. But it is also a scarce resource not widely available to everyone.

The true value of oil is unearthed after undergoing a refinement process. On the contrary, data is not scarce. It is widely available. Nonetheless, akin to oil, the true value of data is unlocked after we have processed and analyzed it to generate leading-edge insights.

It’s a waste of time and resources to just hoard data and not analyze it to get a better understanding of what has happened in the past and why it has happened. Such insights are crucial to predicting future business performance scenarios and exploiting opportunities.

More data doesn’t necessarily lead to better decisions. Better decisions emanate from having a profound ability to analyze useful data and make key observations that would have otherwise remained hidden.

Data is widely available; what is scarce is the ability to extract informed insights that support decision-making and propel the business forward.

To avoid data hoarding, it is necessary to first carry out a data profiling exercise, as this will help you establish whether any of your existing data can easily be used for other purposes. It also helps ascertain whether existing records are up to date and whether your information sources are still fit for purpose.
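
A data profiling pass can start as simply as the sketch below (Python with pandas; the dataset, column names and staleness threshold are assumptions), which summarises completeness, distinct values and record age so you can judge whether existing data is still fit for purpose.

    import pandas as pd

    # Hypothetical customer master extract.
    df = pd.read_csv("customer_master.csv", parse_dates=["last_updated"])

    # Column-level profile: type, completeness and cardinality.
    profile = pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "non_null_pct": (df.notna().mean() * 100).round(1),
        "distinct_values": df.nunique(),
    })
    print(profile)

    # How stale are the records? Anything untouched for two years is flagged for review.
    age_days = (pd.Timestamp.today() - df["last_updated"]).dt.days
    print(f"Records older than 2 years: {(age_days > 730).mean():.0%}")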

At any given time, data quality trumps data quantity. That is why it is important to get your data in one place where it can easily be accessed for analysis and produce a single version of the truth.

In the past, data was kept in different systems that were unable to talk to each other, making it difficult to consolidate and analyze data to facilitate faster decision making. Today, the price of computing and storage has plummeted, and those systems are increasingly being linked.

As a result, companies can now use data-mining techniques to sort through large data sets to identify patterns and establish relationships to solve problems through data analysis. If the data is of poor quality, insights generated from the analysis will also be of poor quality.

Let’s take customer transactional data as an example. In order to reveal hidden correlations or insights from the data, it’s advisable to analyze the information flow in real time: by the hour, by the day, by the week, by the month, over the past year and more. This lets you proactively respond to the ups and downs of dynamic business conditions.
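
A rolling view of transaction flow at several grains of time could be produced along the lines of the following sketch (Python with pandas; the file, column names and frequencies are assumptions).

    import pandas as pd

    # Hypothetical transactional extract with a timestamp and an amount column.
    tx = pd.read_csv("transactions.csv", parse_dates=["timestamp"]).set_index("timestamp")

    # Transaction counts and revenue by hour, day, week and month.
    for freq, label in [("H", "hourly"), ("D", "daily"), ("W", "weekly"), ("M", "monthly")]:
        summary = tx["amount"].resample(freq).agg(["count", "sum"])
        print(f"--- {label} ---")
        print(summary.tail(3))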

Imagine what could happen if you waited months before analyzing the transactional data. By the time you do so, your insights are a product of “dead data”. Technology is no longer the inhibitor; culture and the lack of a leadership mandate are.

As data become more abundant, the main problem is no longer finding the information as such but giving business unit managers precise answers about business performance easily and quickly.

What matters most is data quality, what you do with the data you have collected and not how much you collect. Instead of making more hay, start looking for the needle in the haystack.

Migrating From On-Premise to Cloud-Based Solutions

Gartner forecasts cloud computing to be a $278 billion business by 2021, as companies increasingly adopt cloud services to realize their desired digital business outcomes. In a separate survey of 439 global financial executives, the research company found that finance is moving to the cloud much faster than expected.

By 2020, 36 percent of enterprises are expected to be using the cloud to support more than half of their transactional systems of record.

With cloud technology promising to provide the speed and agility that the business requires, including generating significant cost savings and new sources of revenue, it’s not surprising the cloud market is experiencing a boom.

While companies have experienced both organic and inorganic growth, many have struggled to keep pace with the rapidly changing technology landscape. Their businesses are saddled with legacy technology infrastructures that are slow to react to change, slowing progress towards the achievement of strategic objectives.

Given the need to react more quickly, at the same time become more proactive and drive business performance, senior finance executives are turning to cloud-based financial solutions to provide real-time management information reporting and decision support.

Adopting cloud not a simple and quick decision

Looking at the expected growth metrics for the cloud market and hearing of all the novel opportunities offered by cloud computing, CFOs and CIOs are enticed to believe that it is only a matter of time before the organization’s computing technology and data move to the cloud.

However, over the course of my career I have come to understand that the only thing right about a forecast is that it’s wrong. Either the forecast is close to reality or very far from reality.

Cost savings is cited most often by senior finance executives as the reason for adopting cloud technology. Considering that cloud is essentially a form of outsourcing arrangement in which client organizations use the internet to connect to a variety of applications, storage services, hardware resources, platforms and other IT capabilities offered by cloud service providers, upfront capital investment and maintenance costs are minimal.

This is because the cloud service provider owns the necessary hardware and other resources needed to deliver these services and is also responsible for employing the staff required to support them. Unlike on-premise solutions where the business has to incur large capital outlays for hardware and IT support staff, cloud is an operating expense (except implementation costs which can be capitalized), that is payable as and when incurred.

With cloud, a business is able to add new licences to its subscription when it needs them and scale its licensing costs with its growth, rather than buying in bulk upfront. This is in direct contrast to traditional on-premise software, where companies, in anticipation of business growth, have over-invested in IT systems in the hope of adding more users once that growth is achieved.
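
As a rough, entirely hypothetical illustration of that difference, the sketch below compares an upfront on-premise licence purchase sized for anticipated growth with a subscription that scales with actual usage; all figures are invented for the example.

    # Hypothetical figures, purely to illustrate paying as you grow versus buying upfront.
    users_by_year = [60, 80, 110, 150]        # actual users over four years
    on_prem_licences_bought = 150             # sized upfront for anticipated growth
    on_prem_cost_per_licence = 1_200          # one-off licence cost
    cloud_cost_per_user_per_year = 400        # annual subscription cost

    on_prem_total = on_prem_licences_bought * on_prem_cost_per_licence
    cloud_total = sum(users * cloud_cost_per_user_per_year for users in users_by_year)

    print(f"On-premise, bought upfront: {on_prem_total:,}")
    print(f"Cloud, paid as you grow:    {cloud_total:,}")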

However, when deciding whether to invest in cloud, CFOs should look beyond cost benefits. In addition to cost savings, they should also consider tactical, strategic and structural issues. Unfortunately, the challenge for many finance professionals is that when evaluating investments the focus is solely on cost. We fail to examine and evaluate the non-financial risks and the strategic impact of the investment on the business.

Strategic and structural considerations

As I have written in the past, most organizations get caught up in the hype of new technologies and end up making technology investments that are unaligned to the strategy of the business. There is no strong business case for investing in the new technology. Don’t allow yourself to fall into the same trap.

Investing in cloud is not an IT-only or finance-only decision. Decisions about what to migrate to the cloud, when to migrate it, and how to transition from an on-premise environment to a cloud-based environment all require a collaborative effort if the organization is to achieve its stated objectives.

Further, transitioning to a cloud computing environment is not a matter of flicking a switch. You need a deeper understanding of the cloud deployment models available on the market (public cloud, private cloud, community cloud and hybrid cloud), their delivery models (SaaS, PaaS and IaaS) and how these fit into your business and technology model.

Understanding the cloud model will help you determine whether cloud is appropriate in the context of your business purpose and risks. For example, in the case of public cloud, the applications and cloud-based services are positioned as utilities available to anyone who signs on.

If over the years your company has built and strengthened IT capabilities that differentiate you from competitors, migrating to the cloud can mean walking away from your success recipe and exposing yourself to vulnerabilities.

Therefore, if you are planning to migrate your on-premise computing environment to the cloud, take a long-term view of your IT environment and determine which types of applications are candidates for the cloud and which will not be transitioned until much later.

Ideally, you should start with applications that carry low risk or that have a business need that cannot be met using traditional computing capabilities. Once you have built greater comfort and trust in the cloud, you can then scale to include other applications.

The pain of disintegration

It is no secret that many businesses today are re-evaluating their technology infrastructure and leveraging new technologies to respond faster and be more flexible. But is cloud computing right for your business? Is speed to deploy more important to you than having systems that talk to each other?

In a world where each cloud service provider claims that its solution is better than the next one on the same shelf, CFOs and CIOs are grappling with the decision of which cloud service provider to go with. As a result, the company often ends up doing business with more than one vendor, each compensating for the shortfalls of the other’s system.

The problem arises when the cloud-based application from one vendor is unable to work with an application from another provider, resulting in more than one version of the truth. In other cases, the company’s on-premise IT infrastructure does not allow data to be shared with multiple cloud-based applications, which in turn results in finance spending time consolidating and reconciling data from disparate systems in Excel.

Given that the cloud model depends on providing essentially a standardized package across the board, it is important to weigh the pros and cons of foregoing customization versus rapid implementation. Because the cloud market is projected to grow in the coming years, many IT solution providers are channeling money towards cloud-based solutions. This has resulted in some vendors withdrawing IT support on legacy ERP systems and phasing them out.

In some cases, the vendors have instead released upgrades to those same solutions. The problem with these solutions is that they were not originally built with modern business requirements in mind; hence they can only buy you a few more years of support.

It is therefore important for CFOs and CIOs to be cognizant of whether the solution was originally designed as a cloud-based solution, or whether it is a modified version of a solution initially designed to function in a traditional on-premise, client-ownership model.

With data found everywhere today and advanced analytics playing a critical role in supporting key decision making, delivering one version of the truth has never been so important. In order to make sense of all the data at its disposal, a company should focus its efforts on centralizing data management and IT policies. Having a single data repository ensures everyone is drinking from the same well for information and insights.

However, in companies where IT governance is weak, the tendency is for teams to autonomously adopt cloud-based solutions that meet their individual needs. This is counter-productive to data management centralization efforts, as data normally ends up spread across multiple, disaggregated cloud-based systems.

Just imagine a scenario where FP&A, tax, treasury, procurement, supply chain, and other finance functions each identify and select their own cloud solution. Consolidating and analyzing relevant data from these various systems to get a big-picture view of business performance is in itself a huge task, as that data is divided across many domains.

While each cloud-based move may look beneficial in isolation, adopted together they may increase operating expenses to a level that undermines the anticipated savings.

Control versus no control

Although cloud-based solutions offer more affordable options and more flexible payment plans than traditional software providers, the issue of data control is still a concern. Cyber criminals are getting smarter by the day, and the fact is, whether organizational data resides on the internet or offline on the company’s network, it’s never completely immune to attack.

When it comes to data security, it is imperative for CFOs and CIOs to know that the moment data is no longer in-house, the business may end up having less control over who has access to key systems and data. The fact that you have outsourced your IT systems and data control to a third party does not make your company immune to a lawsuit in the event of a breach. You therefore need to sign agreements that protect you against various types of misfortunes.

Although the cloud service provider employs a team of IT support staff to monitor and perform regular threat assessments and deploy the latest patch immediately, the organization should not relax and assume the house is in order. You need to get strong assurances from your cloud vendor that your business data is safe.

You also need to know where your data is stored, the applicable data laws, how often it is backed up and whether you are able to perform audits on the data. Other factors to consider include the ability to terminate agreements for convenience. Many software vendors try to lock in client organizations for a significant period of time, which makes it difficult for client organizations to change vendors without incurring costs or changing systems in some way.

Thus, when negotiating an agreement with a cloud-based service provider, try by all means to ensure that you are not locked-in for lengthy periods just in case you need to change technology providers in the near future.

Don’t rush to the cloud without a clearly defined vision and road map. Engage in plenty of deliberation to ensure the new surpasses the old, pilot technologies with lower risk and limited impact on the business, and then scale the adoption.

Delivering Reliable Insights From Data Analytics

Data and analytics are becoming increasingly integral to business decisions. Organizations across all sectors are leveraging advanced real-time predictive analytics to balance intuitive decision-making and data-driven insights to monitor brand sentiment on social media, better serve their customers, streamline operations, manage risks and ultimately drive strategic performance.

Traditional business intelligence reporting tools with static charts and graphs are being replaced by interactive data visualization tools that enable business insights to be shared in a format that is easily understood by decision makers, in turn enabling better, faster operational and strategic decision-making.

Given the constantly changing business landscape, driven by increased competition, macro and geopolitical risks, intense regulatory demands, complex supply chains, advanced technological changes and more, decision makers are turning to finance teams for actionable insights to help them navigate this volatility and uncertainty.

As business unit managers increasingly rely on finance decision support for enhanced business performance, it is imperative for CFOs and their teams to ensure the performance insights they are delivering are informed and actionable. However, a 2016 survey report by KPMG revealed that 60% of the survey participants are not very confident in their data and analytics insights.

Data Quality or Quantity?

In today’s world of exponential data growth, the ability for finance to deliver reliable and actionable insights rests not on the quantity of data collected, analyzed and interpreted, but rather on the quality of that data. The starting point for any successful data analytics initiative involves clarifying why the business needs to collect a particular type of data.

One of the challenges facing many businesses today is identifying and selecting which internal and external sources of data to focus on. As a result, these companies end up investing in data warehouses full of massive amounts of useless data. To avoid being data rich and insight poor, CFOs need to understand the role of quality in developing and managing tools, data and analytics.

Before inputting data into any analytical model, it is important to first assess the appropriateness of your data sources. Do they provide relevant and accurate data for analysis and interpretation? Instead of relying on a single source of data for analysis, you need to have the ability to blend and analyze multiple sources of data. This will help make informed decisions and drive business performance.
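
Blending sources can start as simply as the sketch below (Python with pandas; the files, join key and columns are assumptions), which joins CRM, web-analytics and survey extracts on a common customer identifier before analysis.

    import pandas as pd

    # Hypothetical extracts sharing a common customer_id key.
    crm = pd.read_csv("crm_customers.csv")
    web = pd.read_csv("web_analytics.csv")
    survey = pd.read_csv("feedback_survey.csv")

    # Left-join the behavioural and survey data onto the customer base.
    blended = (
        crm.merge(web, on="customer_id", how="left")
           .merge(survey, on="customer_id", how="left")
    )
    print(blended.head())
    print(f"{blended['satisfaction_score'].notna().mean():.0%} of customers have survey feedback")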

Further, businesses are operating in a period of rapid market change. Market and customer data gets outdated quickly. As a result, being agile and having the ability to quickly respond to changing market conditions has become a critical requirement for survival. The business cannot afford to sit on raw data for long periods. Capabilities that enable data to be accessed, stored, retrieved and analysed on a timely basis should be enhanced.

Thus, in order to provide business users with access to real-time and near-time operational and financial data, the organization should focus on reducing data latency. Reducing data latency allows finance teams to run ad-hoc reports to answer specific business questions which in turn enables decision makers to make important decisions more quickly.

In the event that finance provides business insights or recommendations based on inaccurate data, analysis and predictions, this will quickly erode, if not extinguish, business trust and shake the confidence of those decision makers who rely on these predictions to make informed decisions.

As data volumes increase and new uses emerge, the challenge will only increase. It is therefore critical for finance to put in place robust data governance structures that assess and evaluate the accuracy of data analytics inputs and outputs.

Work with business partners to set objectives up front

Churning out performance reports that do not influence decision making is a waste of time, yet that is what most finance teams are spending their time doing. It is not always the case that the numbers that make sense to finance will also make sense to business partners.

The biggest problem in the majority of these cases is lack of understanding by finance of the business objectives. Instead of collaborating with the business to develop a better understanding of their operations and how performance should be measured and reported, many finance analytics teams are working in their own silos without truly linking their activities back to business outcomes.

To improve finance’s reporting outcomes, the function should take stock of the reports it produces per month, quarter or annually. Then evaluate the nature and purpose of each report produced and what key decisions it helps to drive. It is not about the quantity of reports that matters, but rather the quality of the reports.

Business partners need to be engaged at the start of the process and throughout the analytics process. They need to be involved as finance explores the data and develops insights, to ensure that when the modeling is complete, the results make sense from a business perspective.

By working with business partners and setting objectives up front, finance will be able to focus its efforts and resources on value-add reports that tell a better business story. Further, the function will be able to assess and monitor the effectiveness of its data models in supporting business decisions.

Simplify interconnected analytics

With so many variables impacting business performance, the organization cannot continue to rely on gut instinct to make better decisions. The organization has no choice but to use data to drive insights. As a result, organizations are relying on a number of interconnected analytical models to predict and visualize future performance.

However, given that one variable might have multiple effects, it is important for the business to understand how changes in one variable will affect all the models that use that variable, rather than just one individual model. By maintaining a meta-model, the organization would be able to visualize and control how different analytical models are linked.

It also helps ensure consistency in how data is used across different analytical models. Ultimately, decision makers will be able to prioritize projects with the greatest potential of delivering the highest value to the business.
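
One lightweight way to keep such a meta-model, sketched below in Python with hypothetical model and variable names, is a simple mapping from each shared variable to the models that consume it, so the impact of changing a variable can be traced before any model is rerun.

    # Hypothetical meta-model: which analytical models depend on which shared variables.
    meta_model = {
        "discount_rate":   ["pricing_model", "cash_flow_forecast"],
        "churn_rate":      ["revenue_forecast", "customer_lifetime_value"],
        "conversion_rate": ["marketing_mix_model", "revenue_forecast"],
    }

    def impacted_models(variable: str) -> list[str]:
        """Return every model that must be reviewed when this variable changes."""
        return meta_model.get(variable, [])

    print(impacted_models("churn_rate"))  # ['revenue_forecast', 'customer_lifetime_value']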

Build a data analytics culture

Advanced data analytics is a growing field and as such competition for talent among organizations is high. Due to their traditional professional training, many accounting and finance professionals lack the necessary data and analytics skills.

Moreover, decision makers who do not know enough about analytics are reluctant to adopt it. Because of cognitive bias, it is human nature to subconsciously feel that successful decisions in the past justify the continued use of old sources of data and insight. What we tend to forget is that what got us here won’t get us there, hence the need to learn, relearn and unlearn old habits.

To move forward, the organization should focus on overcoming the cognitive biases developed over the years, closing this skills gap, and developing training and development programs that can help achieve the desired outcomes. Using advanced analytics to generate trusted insights requires an understanding of the various analytics methodologies, their rigor and their applicability.

It’s difficult to have people understand if they don’t have the technical capabilities. However, building a data analytics culture does not imply that the organization should focus on developing data science skills alone. You also need to develop analytics translators.

These are individuals who are able to apply their domain expertise in your industry or function, using a deep understanding of the drivers, processes, and metrics of the business to better leverage the data and turn it into reliable insights and concrete, measurable actions.

Building a data analytics culture that promotes making decisions based on data-driven insights is a continuous endeavour that spans the data analytics life cycle from data through to insights and ultimately to generating value. Successful use cases can be used to further drive the culture across the organization.

