
The Principles of an Effective Data-Driven Organisation By Pat McQuillan

Patrick McQuillan is an analytics executive and strategy consultant with a passion for using data-driven tools to transform business strategy.
He’s held key leadership roles including Global Head of Data Governance and Operational Effectiveness at Wayfair. He’s also led consulting teams to drive AI strategy for Fortune 100, government, and higher education clients. Patrick is a sought-after expert in data governance, business intelligence, and AI, with ongoing speaking engagements and a host of publications.
In this post, Patrick reflects on the common roadblocks companies experience when implementing a data strategy. He highlights the importance of data literacy – for senior leadership as well as IT teams – and the need for cross-company communication:


Companies are tasked with the ever-increasing responsibility to be data-driven. This makes sense: the global economy has more access than ever to digital information, and with the rise of resources like big data and ChatGPT it’s no surprise that accurate performance measurement and technological differentiation have become king for remaining competitive in the marketplace. One of the greatest challenges in leveraging these resources properly is a misalignment of data literacy across organisations, particularly with senior leadership. To some leaders, being data-driven means using any sort of analytical tool (e.g. Excel, Tableau) to make informed decisions; to others, it’s about utilising advanced analytical methods and AI to perform otherwise impossible tasks. Neither of these definitions is wrong, but the value is not in the tool itself – it’s in the wellspring of information the tool draws from. The key to a successful data-driven enterprise – no matter its objective or resources – is its ability to create a high-fidelity, end-to-end data strategy that infuses reliable insights into the day-to-day decision-making of the business.

In order to integrate an effective data strategy that will survive and grow into the long term, common hurdles need to be overcome. These include:

Data illiteracy at the decision-making level:

Often the bottlenecks begin at the top. While there has been an explosion in the creation of executive data leadership roles (e.g. CTO, CIO, CDO) over the past decade, the lion’s share of organisations still do not have positions like these in place to prioritise a strong data and IT infrastructure. This creates a culture where advanced analytics and strong empirical decision-making are replaced with an oversimplified set of executive key performance indicators (KPIs) that live in a dashboard or static report updated every month or so.

Outcomes misalignment:

Time and time again I’ve seen companies of all sizes – from $10 million startups to Fortune 100 conglomerates – lack clear alignment and stewardship on the KPIs they rely on to track outcomes. Examples of this include: (1) multiple redundant KPIs with similar names tracking near-identical outcomes, (2) definitions too technical for non-technical stakeholders to use when communicating performance, and (3) unclear ownership of KPIs at the individual or team level. These systematic issues breed overconfidence in reporting accuracy, increase the frequency of data blackouts and errors (and the affiliated costs of fixing them), and create an inherent lack of trust in the entire data infrastructure.

Overlooking the human factor:

Even with a strong data infrastructure in place and clear alignment on outcomes, these mean little without proper data storytelling to contextualise performance. It is particularly dangerous when these types of discussions are siloed across various teams within an organisation, leading to an organisation-wide lack of transparency and disorganised reporting. The Data Strategist has a responsibility to partner with key stakeholders across the organisation and communicate a clear narrative that coincides with the KPIs being reported.

Of the countless companies I have worked with over the years, I have rarely seen one that is not struggling with at least one of these issues at its core. There is a need for a common set of principles to guide efforts as data-driven organisations modernise and incorporate data into their routine decision-making. In particular, these principles must give rise to a process that is self-sustaining and adaptable – even during periods of rapid change – to ensure that leaders are planting the seeds for success that is both sustainable and achievable at scale.


The root of all analytics is ensuring access to reliable data, sourced responsibly with clear alignment on ownership, methodology, and intent. This means creating a data strategy explicitly rooted in a partnership between data and IT leaders and the non-technical stakeholders whose teams are responsible for reporting on and delivering specific KPIs. This approach creates an environment of open communication among all parties involved in building an accountable narrative around performance.

These parties are collectively responsible for a sound, well-governed data infrastructure that minimises the risks of information blackouts and inaccurate reporting. This infrastructure should be based on three key principles:

Managing an internal data dictionary:

From KPI definitions (and how they’re calculated) to the logic behind setting targets, all stakeholders must speak the same language around how performance is being measured. A data dictionary is often a one-stop-shop that ensures clear alignment and communication around outcomes, measurement methodology, and data lineage.
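As a minimal sketch of what a data dictionary entry can capture – the KPI name, fields, and values below are illustrative assumptions, not a prescribed standard:

```python
# An illustrative internal data dictionary: one entry per KPI, holding
# the definition, calculation, ownership, lineage, and target logic.
# All names and values here are hypothetical examples.
data_dictionary = {
    "repeat_purchase_rate": {
        "definition": "Share of customers placing 2+ orders in a trailing 90-day window",
        "formula": "repeat_customers / total_customers",
        "owner": "Customer Analytics",
        "source": "orders_fact table, refreshed nightly",
        "target_logic": "Target set annually at +2pp over prior-year actual",
    },
}

def describe(kpi: str) -> str:
    """Return a one-line, shareable description of a KPI for non-technical readers."""
    entry = data_dictionary[kpi]
    return f"{kpi}: {entry['definition']} (owner: {entry['owner']})"
```

Even a lightweight structure like this forces every KPI to name an owner and a calculation, which is most of the battle in keeping stakeholders speaking the same language.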

Invest in a resilient back end:

Whether you call it a data lake, data warehouse, data mart, or any of the other trendy names we hear about, any organisation that hopes to be competitively intelligent needs a high-fidelity data storage location and ETL process in place. This means sourcing clear and consistent data to pull into your back end, having rigorous cleaning processes that account for the various errors that can occur and that can be scaled rapidly if needed, and automating as much of this as possible to minimise manual maintenance.

Ultimately the data that is output from this process should be ready to be queried and loaded into reporting tools.
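A rough sketch of what one such cleaning step might look like – the specific rules (drop blank IDs, coerce amounts to numbers, deduplicate on key) are example assumptions; real pipelines codify whatever errors they actually encounter:

```python
# Illustrative cleaning step for raw rows landing in the back end.
# Cleaned rows go on to reporting; rejected rows can be logged for QA.
def clean_rows(raw_rows):
    seen = set()
    cleaned, rejected = [], []
    for row in raw_rows:
        order_id = (row.get("order_id") or "").strip()
        if not order_id or order_id in seen:
            rejected.append(row)          # missing or duplicate key
            continue
        try:
            amount = float(row.get("amount", ""))
        except ValueError:
            rejected.append(row)          # unparseable amount
            continue
        seen.add(order_id)
        cleaned.append({"order_id": order_id, "amount": amount})
    return cleaned, rejected

rows = [
    {"order_id": "A1", "amount": "19.99"},
    {"order_id": "A1", "amount": "19.99"},   # duplicate key
    {"order_id": "",   "amount": "5.00"},    # missing key
    {"order_id": "A2", "amount": "oops"},    # bad amount
]
good, bad = clean_rows(rows)
```

Keeping the rejected rows, rather than silently dropping them, is what makes the later QA principle possible.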

Routine QA, even on automated processes:

Too many times companies have told me that they don’t need to run regular checks on their back end because “it’s automated.” This blissful ignorance benefits no one when it comes to maintaining an IT asset – it still makes sense to check a car engine periodically even when the car runs fine. A sanity check once in a blue moon can save millions by preventing data loss or faulty reporting.
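A hedged sketch of what such a sanity check might cover – the specific thresholds (a 50% volume swing, a 5% null rate) are illustrative assumptions, not recommended values:

```python
# A hypothetical sanity check to run even on "automated" pipelines:
# compare today's load against simple expectations before reporting on it.
def sanity_check(row_count, prior_row_count, null_rate):
    issues = []
    if row_count == 0:
        issues.append("table is empty")
    # Flag a >50% swing in volume versus the previous load (threshold is illustrative)
    if prior_row_count and abs(row_count - prior_row_count) / prior_row_count > 0.5:
        issues.append("row count changed by more than 50%")
    if null_rate > 0.05:  # 5% null threshold is an example assumption
        issues.append("null rate above 5%")
    return issues

# A suspicious load: volume dropped sharply and nulls spiked
issues = sanity_check(row_count=3_000, prior_row_count=10_000, null_rate=0.10)
```

Wiring a check like this to an alert means the automation watches itself, instead of failing silently until a business review surfaces the damage.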

Laying the groundwork for these principles will place your organisation ahead of the vast majority of competitors in the market, who often realise too late that this work needs to be done and then invest twice the resources to reverse-engineer a solution.


Now that the back end is in a good place, we can talk about the fun stuff: business intelligence (BI). This is a critical inflection point where empirical analysis of the source data is translated into digestible insights for all involved stakeholders to drive decision making. These insights are often shared using one of three mediums: self-service dashboards, shared spreadsheet tools, and recurring reports issued on a regular basis. No matter the choice of medium, the result should be a reliable self-service tool that maximises value for the specific audience and relevant topic the metrics are designed to explain.

The Data Strategist responsible for designing and owning BI resources should always keep their audience in mind. Each self-service tool should have a unique layout, set of KPIs, and forum for discussion that best suits its intended users (e.g. board of directors, C-suite, individual team managers). This also means constructing these tools using the medium best suited to your audience so that they can glean the most information in the least amount of time. The best way to evangelise data-driven efforts in any organisation is through BI, which is often one of the easiest points for gaining buy-in from senior leadership for future data-focused initiatives.

In all cases, the data loaded into these tools should be semi-automated and communicated as simply as possible. This means not only choosing the right amount of detail and flexibility (e.g. date filters, KPI segmentation), but also using footnotes and other contextual cues to succinctly highlight any assumptions made in the analysis. The data presented in any interactive BI tool should also be as close to real-time as possible, meaning it should refresh as often as new data becomes available. This can usually be done by scheduling the underlying queries to run at a set frequency that best suits the specific data being sourced and the questions that need to be answered.
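The scheduling decision above can be reduced to a simple freshness rule – this sketch assumes a fixed refresh interval per tool, which is only one possible policy:

```python
# Illustrative freshness check: decide whether a BI extract needs a
# refresh, given how often its source data arrives (interval is assumed).
from datetime import datetime, timedelta

def needs_refresh(last_refreshed: datetime, refresh_every: timedelta,
                  now: datetime) -> bool:
    """True when the extract is older than its agreed refresh interval."""
    return now - last_refreshed >= refresh_every

now = datetime(2024, 1, 1, 12, 0)
# An hourly dashboard last refreshed at 08:00 is stale by noon
stale = needs_refresh(datetime(2024, 1, 1, 8, 0), timedelta(hours=1), now)
```

In practice the interval would differ per tool: a board-level monthly report has no need for the hourly cadence an operational dashboard might demand.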

The objective here is to create BI tools that – while they may look different from each other, draw on different data, and be intended for entirely disparate audiences – all follow the same core principles around data-driven storytelling that deliver confidence to their users and provide rich context around the KPIs being reported.


The data narrative is a collective effort that relies not only on the Data Strategist to collect and prioritise insights, but also on individual stakeholder teams to provide deeper context into the specific workflows behind each key insight. It is critical that all the key players who share the responsibility of measuring and delivering on performance meet regularly and strategise, using BI tools that draw on a well-governed data infrastructure to steer discussion. These meetings can take many forms, but my experience has shown that the best results come from hosting something called “business reviews.”

A business review is nothing more than a meeting to discuss performance around a certain topic, and with a certain audience. These can be a monthly 5-person call with executive leadership, or a weekly 120-person call with all managers and directors of a national business segment. What matters more is the fact that all of the right people are using data-driven BI tools to discuss performance and make decisions on a regular basis. There are two sides to this format: the Data Strategist and the non-technical partnered stakeholders. The Data Strategist is responsible for the items we’ve discussed so far – sourcing reliable data, creating compelling data visualisations and reports to tell a cohesive story, and providing a forum for discussion. On the other side of the equation, partnered stakeholders are there to give deeper context into each KPI, and work with the Data Strategist and senior leadership to offer solutions to pressing challenges affecting the business, such as:

• Why is customer engagement down from the last quarter?

• What are we doing about the supply chain bottlenecks in EMEA?

• Can software engineering prioritise resolving the breakage issue with the customer portal?

The main point here is that a ‘don’t shoot the messenger’ model, where the data team is held responsible for all aspects of performance, does not work, nor should it. Success is collective, and – while the Data Strategist is at the beginning of all performance discussions – there are other stakeholders in the room whose entire responsibility at times is to move the needle in a certain direction and deliver on KPI targets. Hosting business reviews gives them the environment to speak to their work and gain visibility with leadership, all in partnership with the data-minded individuals who developed the BI tools they are using and who have a bird’s-eye view of the sum total performance of the entire organisation. This data-driven model enables streamlined partnerships with core teams to foresee risks and navigate challenges effectively.


Innovations in the data and analytics space are evolving at a rate that outpaces most other areas of business today. An end-to-end data strategy rooted in an unchanging set of underlying principles is paramount for modern businesses to grow, as it allows them to have consistent access to clean data and tell compelling stories regardless of the new analytics tools or methods they choose to adopt in the long run. This ultimately comes down to senior leadership being able to invest intelligently in a data-focused centre of excellence designed to partner across the organisation to increase transparency among stakeholder teams, provide accurate insights at every major level of the business, and hold contributors accountable for their work. This also means being able to manage tech debt effectively and determine what is worth doing now, and what needs to be invested in over time as a sustainable solution at scale.

PATRICK MCQUILLAN has a successful history leading data-driven business transformation and strategy on a global scale and has held data executive roles in both Fortune 500 companies as well as various strategy consulting firms. He is the Founder of Jericho Consulting and a Professor at Northeastern University and Boston University, where he teaches graduate programmes in Analytics and Business Intelligence.

© Data Science Talent Ltd, 2024. All Rights Reserved.