Vista’s Krystel Walker gives us the run-down on how Channel Platform Enablement (CPE) is addressing data quality issues.
Data is arguably the most powerful tool a company has today, yet as a problem-solving device, data can produce just as many questions as it answers. In a debriefing with Krystel Walker, Senior Data Analyst in DnA’s Channel Platform Enablement (CPE) team, we explore how data quality issues impact trust and decision making – and how improving data quality enables Vista’s transformation into a more confidently data-driven organization.
What does CPE do?
We cover a lot of ground, from providing education and context on in-demand data products, to consolidating key cross-channel concepts like spend, to tracking data stability and quality. Essentially, our role is to ensure information is gathered and used in a way that makes sense, benefits our colleagues, and supports wider business goals.
Data enablement is not always straightforward. So, to be effective, we’re always learning, growing, and adapting. By staying on top of this, we can guide the business in telling richer stories or navigating multiple answers to a single question.
For example, we’ve been building a report which tracks return on ad spend (ROAS) by different attribution methodologies. Viewing ROAS based on the channel a customer ultimately converted on may tell a different – yet still relevant – story than ROAS by the channel of site entry. We aim to educate the wider company on these methods to assist with more tactical optimization across channels.
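To make the distinction concrete, here is a minimal sketch of how ROAS can diverge between attribution methodologies. The channel names, spend figures, and order records are illustrative assumptions, not Vista's actual data model; the point is only that the same orders produce different ROAS figures depending on whether revenue is attributed to the channel of site entry or the converting channel.

```python
# Illustrative orders: each has an entry channel, a converting channel, and revenue.
orders = [
    {"entry_channel": "paid_search", "converting_channel": "email", "revenue": 120.0},
    {"entry_channel": "paid_search", "converting_channel": "paid_search", "revenue": 80.0},
    {"entry_channel": "display", "converting_channel": "paid_search", "revenue": 60.0},
]
# Hypothetical ad spend per channel.
ad_spend = {"paid_search": 100.0, "display": 50.0, "email": 10.0}

def roas_by(orders, ad_spend, attribution_key):
    """Sum revenue per channel under the chosen attribution, then divide by spend."""
    revenue = {}
    for order in orders:
        channel = order[attribution_key]
        revenue[channel] = revenue.get(channel, 0.0) + order["revenue"]
    return {ch: revenue.get(ch, 0.0) / spend for ch, spend in ad_spend.items()}

roas_entry = roas_by(orders, ad_spend, "entry_channel")        # ROAS by channel of site entry
roas_convert = roas_by(orders, ad_spend, "converting_channel")  # ROAS by converting channel
```

With these toy numbers, Paid Search looks far stronger under entry-channel attribution than under converting-channel attribution, while the opposite holds for email: both views are valid, they simply answer different questions.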
What is essential to deliver adoptable products?
Our team has quickly come to recognize the importance of including users of our data products early in the development process. This was a crucial lesson from our recent work on a new data quality dashboard built specifically for Paid Search data – a lesson we’ll take with us as we launch other data monitoring products. Stakeholders, data product teams, and marketing specialists were consulted at various steps, and each provided unique input on visualizations and on the data sources and report comparisons they needed included.
Initial feedback suggested that although our first dashboard was useful for the Paid Search team, the way it was presented left them feeling overwhelmed. We also realized later in development that embedded analysts and technology teams were potential stakeholders, but that our messaging and approach had to evolve to resonate with them. We don’t expect to get everything perfect at the start, which is exactly why proactively seeking out this feedback and expanding our reach early on is critical – not only to learn and grow, but also to deliver the right product to the right teams.
How did you act on the feedback you gathered?
The product evolved from one dashboard that contained everything, to a suite of related dashboards, each performing different functions. One dashboard focuses on comparing the same set of metrics at an aggregated level as they appear in multiple sources; another one monitors those specific metrics at a more granular level; and a third one maps product overlap and highlights the potential for missing data across reports.
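The first of those functions – comparing the same aggregated metrics as they appear in multiple sources – can be sketched in a few lines. The source names, metric values, and the 2% tolerance below are illustrative assumptions; the underlying idea is simply to surface the relative discrepancy per metric and flag anything outside tolerance for investigation.

```python
# Hypothetical aggregated metrics for one period, pulled from two reporting sources.
source_a = {"clicks": 10_000, "impressions": 250_000, "cost": 5_000.0}
source_b = {"clicks": 10_150, "impressions": 249_000, "cost": 5_600.0}

def compare_sources(a, b, threshold=0.02):
    """For each metric present in both sources, report the relative difference
    and whether it exceeds the discrepancy threshold."""
    report = {}
    for metric in a.keys() & b.keys():
        rel_diff = abs(a[metric] - b[metric]) / max(abs(a[metric]), 1e-9)
        report[metric] = {"rel_diff": round(rel_diff, 4), "flagged": rel_diff > threshold}
    return report

report = compare_sources(source_a, source_b)
```

With these toy numbers, clicks and impressions agree within tolerance, while cost is flagged – exactly the kind of signal that tells an analyst which numbers are safe to quote and which need triage first.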
The suite of dashboards helps to build trust with the Paid Search team and other power users of Paid Search data, allowing decision-makers to feel confident answering business questions with the right data at their disposal. Plus, the different breakdowns within the dashboards allowed stakeholders outside the Paid Search team to engage at the level that was right for them. We also included text call-outs next to each visualization as a guide to help users determine what the visual is showing and how to interpret the data. Ultimately, the dashboards have reduced – and will continue to reduce – time spent on data quality triage, freeing up more time for product development and insight generation.
How has the product been used by the wider DnA team?
When we launched Vista’s new UK platform, an embedded analyst was getting mixed messages from the reports they were leveraging to tell the migration story, leaving them unsure of the true effect of the website rollout on customer traffic.
The way site traffic is assigned to marketing channels changes after a market migrates, so making sure these volume shifts are “reasonable” is critical. Our product monitors the discrepancy between reporting products and signals whether the shifts analysts see are concerning and potentially require investigation, or whether analysis can safely continue. The suite enables analysts to explore, quantify, and monitor traffic data – critical for this migration question – alongside other metrics like order volume, revenue, and impressions from varied sources in multiple views. We created clarity for this analyst, giving them more confidence in the data sources they were using to develop their performance narrative.
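A minimal sketch of that “reasonable shift” signal: compare each channel’s share of total traffic before and after a migration, and flag channels whose share moved beyond a tolerance. The channels, traffic figures, and the 5-point tolerance are illustrative assumptions, not the product’s actual thresholds.

```python
# Hypothetical channel traffic before and after a market migration.
before = {"paid_search": 4_000, "organic": 5_000, "direct": 1_000}
after = {"paid_search": 2_500, "organic": 6_200, "direct": 1_300}

def share_shift_signals(before, after, tolerance=0.05):
    """Flag channels whose share of total traffic shifted by more than `tolerance`."""
    total_before, total_after = sum(before.values()), sum(after.values())
    signals = {}
    for channel in before.keys() | after.keys():
        shift = after.get(channel, 0) / total_after - before.get(channel, 0) / total_before
        signals[channel] = {"shift": round(shift, 3), "investigate": abs(shift) > tolerance}
    return signals

signals = share_shift_signals(before, after)
```

With these toy numbers, Paid Search and organic shift enough to warrant a look, while direct stays within tolerance – giving the analyst a quick answer to “is this drop real or just reattribution?”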
Generally, as we build domain-driven architecture and products, we must address gaps in trust and understanding of data with context-driven governance, education, and documentation. Our dashboard suite is an example of how to achieve this. These practices make it easier to adopt data products and to democratize data, which is arguably one of the most important contemporary drivers of business success.
At a high level, what has been your experience delivering this product so far?
After presenting our work on comparative quality monitoring to over 50 people in the company involved in a variety of data-driven projects, we can see that there’s an enthusiasm for this type of product suite and a desire for anything that helps to ease product adoption and improve data literacy.
We’ve learned that the same product in the hands of a data product team PO can deliver different value than when leveraged by an embedded analytics consultant. Conversely, two products tracking similar metrics, but built to solve different use cases, can give different answers to the same question. The CPE team thinks of data like water: not all sources will be perfect, but ideally they should be clean, and realistically their level of cleanliness should be easily known. Therefore, especially in our newer organization filled with people eager to dive into data, the value of delivering context with an empathetic mindset cannot be overstated.
Improving the confidence of our stakeholders and the quality of our data tangibly benefits decision making which can ultimately improve user engagement and success. By ensuring that the data running through DnA products is the best it can be, we give ourselves the best chance of delivering a worthwhile customer experience for small business owners.
This all means we’ll continue to think more broadly about our stakeholders, champion data literacy, develop solid documentation, and more. We’ve learned there’s real power in collaboration – of people, of products – and the fuel is understanding.
Want to become part of a truly data and analytics-driven organization? Explore our career opportunities at Vista DnA.