Enterprises need guardrails for top data-driven decisions

21 May 2024 Consultancy.uk

Data is the lifeblood of a business - but the way a business manages its data is often not as connected as the circulatory system. Paolo Platter, CTO at Agile Lab, warns that without effective governance mechanisms, companies will continue to fall short of getting the most from their data.

Data drives business operations and decision-making. But while the raw data itself is valuable, it's the intelligence and insights that can be gleaned from it that truly fuel innovation and growth.

This vital intelligence is the foundation on which organisations build long-term strategies, optimise processes, and identify new opportunities. It provides the essential context and evidence necessary to support informed decision-making across all aspects and functions within an enterprise.

And, in today’s digital age, there’s certainly no shortage of data to draw on for in-depth analysis, with IoT and AI creating volumes at an unprecedented rate. It has come to a point where many large enterprises have data lakes and warehouses overflowing with untapped potential.

Why governance needs guardrails

However, organisations are struggling to leverage this information efficiently and cost-effectively. Worryingly, many are also finding it impossible to enforce strict governance frameworks that ensure data is consumed and produced in alignment with internal standards for quality, integrity, architecture, compliance, and security. So, while there is growing recognition of the value this data holds, there is mounting frustration at how difficult it is to harness.

It’s a problem that isn’t easy to solve as the original data, and multiple copies, often reside in disparate systems, formats, and locations. This makes it laborious to access, consolidate, process, and apply consistent governance standards. Considerable manual intervention is often necessary, taking up valuable time and resources.

In their attempts to leverage data more effectively, organisations have continued to invest in expensive data management systems and analytics platforms. In most cases this has made the situation worse, creating more duplication, competing technical standards, and confusion over the integrity and whereabouts of data assets.

As a result, extracting meaningful intelligence often requires a combination of sophisticated tools and highly skilled staff – and painfully long lead times of months or years. Saddled with a burgeoning set of tools, growing silos of data, and rising costs, organisations desperately need a new approach before the issue gets even further out of control.

Regaining control with computational governance

What’s needed is a reliable method of imposing enterprise-wide governance rules that, like guardrails, remain in place throughout the lifecycle of data - wherever it resides.

This is where computational governance comes in. Different from data management tools, it doesn’t create, copy, or move data around. Instead, it’s a solution in the form of an automated platform that imposes a consistent governance framework throughout an enterprise. Its purpose is to enforce internal standards and security controls but, at the same time, empower data consumers and producers by expediting data discovery and project development.

Overseeing all data tools and technologies, rather than replacing them, computational governance represents a sea change in enterprise-wide data management. It’s technology-agnostic, enabling organisations to quickly take advantage of the potential value in their existing data silos without the need for further consolidation. Futureproofing is assured, as customers can adopt new tools whenever required and bring in structured and unstructured data, whatever their business demands.

Inbuilt, customisable guardrails ensure that every project adheres to relevant standards at global and local levels and cannot reach production unless the pre-defined policies are followed. This encompasses all areas where standards must be followed, including data quality, integrity, architecture, compliance, and security. Bypassing the system isn’t an option. Therefore, relying simply on trust for adherence is a thing of the past.
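To make this more concrete, here is a minimal sketch in Python of how such a guardrail might work in principle. The descriptor fields, policy names, and the deployment_gate function are all hypothetical illustrations under assumed standards, not any vendor's actual implementation.

```python
# Hypothetical sketch of a computational governance "guardrail":
# a project cannot be promoted to production unless every
# pre-defined global and domain-level policy evaluates to True.

from dataclasses import dataclass, field
from typing import Callable

@dataclass
class ProjectDescriptor:
    name: str
    owner: str
    pii_encrypted: bool
    schema_registered: bool
    quality_checks: list[str] = field(default_factory=list)

# Each policy is simply a named predicate over the descriptor.
Policy = tuple[str, Callable[[ProjectDescriptor], bool]]

GLOBAL_POLICIES: list[Policy] = [
    ("owner-assigned", lambda p: bool(p.owner)),
    ("pii-encrypted", lambda p: p.pii_encrypted),
    ("schema-registered", lambda p: p.schema_registered),
]

DOMAIN_POLICIES: list[Policy] = [
    ("has-quality-checks", lambda p: len(p.quality_checks) > 0),
]

def deployment_gate(project: ProjectDescriptor) -> None:
    """Block promotion to production if any policy is violated."""
    violations = [name for name, check in GLOBAL_POLICIES + DOMAIN_POLICIES
                  if not check(project)]
    if violations:
        raise PermissionError(
            f"{project.name} blocked from production: {', '.join(violations)}")
    print(f"{project.name} approved for production")

if __name__ == "__main__":
    candidate = ProjectDescriptor(
        name="customer-360",
        owner="sales-domain",
        pii_encrypted=True,
        schema_registered=False,   # violates a global policy
        quality_checks=["null-rate", "freshness"],
    )
    try:
        deployment_gate(candidate)
    except PermissionError as err:
        print(err)   # customer-360 blocked from production: schema-registered
```

The point of the sketch is that the check runs automatically at deployment time, so adherence does not depend on individual teams remembering the rules.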

This approach brings a new level of certainty and reliability to decision-making and strategic planning. By eliminating much of the time spent re-checking and re-validating data, it also slashes time to market.

A leap in performance and agility

Saving time and improving performance have been driving forces behind the creation of computational governance. Businesses report that data practitioners spend nearly 50% of their time on non-revenue-generating activities, such as finding data and validating its integrity, before projects can even begin. The problem is so restrictive that it threatens to all but destroy business agility. Some organisations have considered building their own governance platforms, but with estimated timescales of three to four years, this would eat into budgets and divert focus from core business operations.

Instead, a modern third-party platform is much quicker to deploy and is designed to be up and running fast with minimal disruption to existing operations. The first stage is creating and customising specifications that set out required data practices, internal policies, compliance rules, and architecture standards. Once these fundamentals are in place, data teams can start new projects or bring in existing and legacy ones. Intelligent templates help automate any technology and practice, dramatically improving the data practitioner experience by reducing mental effort and cutting project delivery from years to months.
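As a rough illustration of that first stage, the sketch below assumes a hypothetical enterprise specification (ENTERPRISE_SPEC) and a scaffold_project template that produces projects compliant with it by construction; the field names, naming pattern, and retention limit are invented for the example.

```python
# Hypothetical sketch: an enterprise-wide specification captured as data,
# plus an "intelligent template" that scaffolds a new project so it
# complies with the specification from day one. All names are illustrative.

ENTERPRISE_SPEC = {
    "naming": {"pattern": "{domain}.{product}.v{version}"},
    "required_metadata": ["owner", "classification", "retention_days"],
    "compliance": {"gdpr": True, "max_retention_days": 730},
    "architecture": {"storage": "parquet", "interface": "rest"},
}

def scaffold_project(domain: str, product: str, owner: str,
                     classification: str = "internal",
                     retention_days: int = 365) -> dict:
    """Create a project skeleton that already satisfies the specification."""
    if retention_days > ENTERPRISE_SPEC["compliance"]["max_retention_days"]:
        raise ValueError("retention exceeds the compliance limit")
    name = ENTERPRISE_SPEC["naming"]["pattern"].format(
        domain=domain, product=product, version=1)
    return {
        "name": name,
        "metadata": {
            "owner": owner,
            "classification": classification,
            "retention_days": retention_days,
        },
        "storage": ENTERPRISE_SPEC["architecture"]["storage"],
        "interface": ENTERPRISE_SPEC["architecture"]["interface"],
    }

if __name__ == "__main__":
    project = scaffold_project("finance", "invoices", owner="finance-team")
    print(project["name"])   # finance.invoices.v1
```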

Additionally, users with no technical background can easily uncover new opportunities via a user-friendly interface similar to an online shopping cart. Where previously they might have needed assistance from multiple data managers, now anyone with the right permissions can search for and retrieve business-relevant information straightaway.
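A minimal sketch of that kind of permission-aware, self-service discovery, assuming an invented in-memory catalogue and role model, might look like this:

```python
# Hypothetical sketch: a self-service catalogue search where results
# are filtered by the requesting user's roles, so non-technical users
# can find business-relevant data without going through a data manager.

CATALOGUE = [
    {"name": "sales.orders", "description": "Confirmed customer orders",
     "roles": {"sales", "finance"}},
    {"name": "hr.payroll", "description": "Monthly payroll runs",
     "roles": {"hr"}},
    {"name": "marketing.campaigns", "description": "Campaign performance",
     "roles": {"marketing", "sales"}},
]

def search(term: str, user_roles: set[str]) -> list[dict]:
    """Return catalogue entries matching the term that the user may see."""
    term = term.lower()
    return [
        entry for entry in CATALOGUE
        if (term in entry["name"].lower() or term in entry["description"].lower())
        and entry["roles"] & user_roles
    ]

if __name__ == "__main__":
    for hit in search("orders", user_roles={"sales"}):
        print(hit["name"], "-", hit["description"])
    # hr.payroll is never returned to a sales user, whatever the search term.
```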

This leap in performance is key to agility, radically speeding up responsiveness to new opportunities or changes in the economic climate. It also aligns closely with the relatively new concept of data mesh that seeks to promote agility by distributing data ownership to business domains.

Alignment with the data mesh concept

Taking the view that domain experts understand their own data best, the mesh model advocates a self-service approach to accelerate decision-making. For organisations wanting to make this shift, a computational governance platform will smooth the transition. It addresses the key concerns of consistency and control that can impede the adoption of a mesh model, while aligning with its four key principles: Domain-Oriented Ownership, Self-Service Data Infrastructure as a Platform, Federated Computational Governance, and Data-as-a-Product.

Utilising computational governance also reassures organisations adopting these principles that decisions are based on data that is accurate and reliable, and that compliance and security standards meet relevant regulations and legislation, mitigating legal, financial, and reputational risks. It also preserves the organisation’s architecture framework, facilitating interoperability and integration across systems.

Effectively, computational governance provides the guardrails to allow the breaking down of data silos, and gives domain experts autonomy to unlock the full potential of their data assets. This will drive innovation and agility, allowing enterprises to adapt quickly and confidently to their evolving business needs.