Hi everyone, thank you for joining us on today’s webinar “Data Quality: The missing piece from your data-driven business.” My name is Sean Coombs and I lead thought leadership for Experian’s data quality business. I’ll be moderating today’s webinar. During this session, we are going to talk about a subject that is very relevant to many of our businesses. That is how quality, actionable data is increasingly becoming integral to effective decisioning, and how you can operationalize data quality to truly enable greater business efficiencies.
We will save some time at the end of today’s webinar to answer some questions, so please feel free to submit your questions using the Q&A box on your screen.
On the line with me, I have Jeff Meli and Michael Berard, who will be our speakers today.
Throughout his 20 years at Experian, Jeff has held analytical and consulting roles focused on partnering with clients to drive value from their credit and marketing investments. His clients have spanned multiple industries across North America. Jeff currently heads up a team of data analysts and statisticians responsible for the development of scorecards for acquisition, applicant decisioning and account management, as well as business intelligence solutions and ad-hoc statistical analysis. His team also has expertise in the development, integration and deployment of CCAR loss forecasting models.
Michael Berard is the Strategic Director of Data Management at Experian Data Quality, a leading provider of data quality and data management solutions. Michael is focused on helping organizations leverage their data to gain actionable insights, make faster, data-driven business decisions, and gain a competitive advantage. With more than 7 years of experience, Michael has assisted organizations of all sizes to implement data governance programs, reduce regulatory risk, and streamline data migration projects.
Thank you both for being here. [Pause]
With that, I’d like to hand it over to Michael to get us started.
At Experian, we’re here to help businesses make actionable decisions based on data they can trust.
Discuss the methodology for the 2018 Global Data Management Benchmark Report.
What we found is that data is fundamental to the world we live in.
The role that data plays in strategic initiatives cannot be overstated, and organizations across industry verticals are looking to their data as a primary driver of business opportunities. The largest data-driven opportunity we see in the next five years is around analytics. This is driven by organizations looking to data both for daily operations and for strategic decision-making. This is a trend we’ve seen for a few years now, as businesses double down on investments in hiring analyst roles.
Other key areas are real-time processing, data as a service, internet of things and automation.
A lot of data is inaccurate.
But we can see from the research that while businesses want to do more with data, they simply can’t. A third of data is believed to be inaccurate.
There are significant concerns from the C-suite around whether their data will allow them to provide a good customer experience (remember, that is a top priority) and to meet regulatory obligations like GDPR.
The main reasons for this inaccuracy are things like human error, a lack of communication, and poor data strategies. Now, some of those challenges come directly from the fact that we have legacy systems that can’t keep up with the way data is currently used. There is also a lack of transparency around data usage internally, as well as poor data governance practices.
Experian is a leading provider of data management software and solutions, solving complex data needs of organizations globally. We are focused on helping organizations leverage their data to gain actionable insights, make faster, data-driven business decisions, and gain a competitive advantage.
If you are doing projects in any of these areas, you need to have a strategy around data quality to tie them together.
Companies today realize the importance of investing in the right technology, to gain the ability to make actionable decisions based on data they trust. In order to stay competitive in the current digitally driven market, organizations are often required to migrate legacy data from outdated systems. This process can be tedious, which is why many migrations exceed budget or timelines. This results in increased costs, lower productivity, and lost opportunities.
Lots of businesses have data management projects planned for the next year, and data migration is at the top of the list.
We see GDPR as a catalyst for modernizing data management practices. We also see data governance make a big leap this year, to 34%.
Give examples of data quality issues: sampling, legacy, IT vs. business, silos
We are helping organizations build a baseline to complete successful data migration projects on time and on budget. Once the data has been successfully migrated to the new system, there should be continued processes in place to proactively manage it. The continued ability to understand data at a holistic level allows organizations to proactively analyze it, find discrepancies, and identify relationships within the information. In order to move data from one system to another, organizations are seeking technology to ensure their data will be fit for purpose along the way, and once it gets to its destination.
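To make that tangible, here is a minimal sketch in Python (using pandas) of the kind of reconciliation check that helps confirm data is still fit for purpose after a migration. The tables, column names, and checks are hypothetical illustrations, not an Experian tool.

```python
# Hypothetical sketch: a basic reconciliation check comparing row counts and
# per-column null rates between a legacy extract and the migrated table.
import pandas as pd

def profile(df):
    """Summarize row count and per-column null rate for a table."""
    return pd.DataFrame({"rows": len(df), "null_rate": df.isna().mean()})

# Stand-ins for the legacy extract and the newly migrated table.
legacy = pd.DataFrame({"email": ["a@x.com", None, "c@x.com"],
                       "state": ["MA", "NY", "CA"]})
migrated = pd.DataFrame({"email": ["a@x.com", "c@x.com"],
                         "state": ["MA", None]})

report = profile(legacy).join(profile(migrated),
                              lsuffix="_legacy", rsuffix="_migrated")
print(report)  # differences in counts or null rates flag migration issues
```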
Jeff, as Experian’s chief scientist, what’s your perspective on how data quality has changed from when you started at Experian until now, and why is it so important to your role?
@Jeff, can you discuss why data prep is necessary for your work, and some challenges or experiences you’ve had with data prep?
This is known as data preparation, and 36% of organizations in our study say they are planning data preparation projects in the next 12 months (up from 27% the prior year).
While it may seem as if data preparation is the easiest data-related project, it may surprise you to learn that 91% of C-level executives believe that preparing data for insight ultimately costs their business in terms of resources and efficiency. That’s because employees spend up to 44% of total project time preparing their data for insights, rather than spending that time on more meaningful work.
DEFINE: Operational data quality refers to taking a proactive approach to managing the data you collect to ensure it is accurate, standardized, and up-to-date.
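As a concrete, hypothetical illustration of that proactive approach, the short Python sketch below runs a few basic checks (validity, standardization, freshness) as a record is collected. The field names and rules are assumptions for the example only.

```python
# Illustrative only: simple "operational" data quality checks that run as
# records are collected, rather than after the fact.
import re
from datetime import datetime, timedelta

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def check_record(record):
    """Return a list of data quality issues found in a single customer record."""
    issues = []

    # Accuracy: the email should at least look like an email address.
    if not EMAIL_RE.match(record.get("email", "")):
        issues.append("invalid email")

    # Standardization: state codes should be two uppercase letters.
    state = record.get("state", "")
    if len(state) != 2 or not state.isupper():
        issues.append("non-standard state code")

    # Freshness: flag records not updated in roughly the last two years.
    updated = record.get("last_updated")
    if updated is None or datetime.now() - updated > timedelta(days=730):
        issues.append("stale record")

    return issues

print(check_record({"email": "pat@example.com", "state": "ma",
                    "last_updated": datetime(2016, 1, 15)}))
# -> ['non-standard state code', 'stale record']
```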
Without baseline data quality practices in place, organizations leave themselves open to risk. In fact, many business problems are tied to poor data quality. Some, like returned mail, are easy to identify. Others, like IT or marketing overspend, imperfect customer service, reduced customer retention, or inaccurate forecasting, might be more difficult to identify, but are still tied to poor data quality. By investing in the right data quality solutions, businesses are able to solve these problems and also gain the ability to leverage this information to support larger goals and initiatives. With a healthy foundation of data, marketing will be able to run accurate and personalized targeting efforts, customer service can maximize efficiency in call time handling and upsell opportunities, operations will face fewer returned packages, and analytics will be able to gain deeper insights and build more accurate financial models.
@Jeff, can you speak to some real world data quality examples that you have encountered in your day-to-day?
Before a business user can begin building dashboards and generating reports, the data needs to be cleansed, standardized, deduplicated, and otherwise transformed so that it can be used in the analysis.
This process is known as data preparation, and it can be a big hassle.
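To give a rough sense of what those steps involve, here is a small, hypothetical pandas sketch of the cleanse, standardize, and deduplicate work; the columns and rules are illustrative assumptions, not a description of Experian tooling.

```python
# Hypothetical data preparation sketch: cleanse, standardize, and deduplicate
# a small customer extract before it is used for analysis.
import pandas as pd

raw = pd.DataFrame({
    "name":  ["Jane Smith ", "JANE SMITH", "Bob Jones", None],
    "email": ["jane@example.com", "Jane@Example.com", "bob@example.com", "x"],
    "state": ["ma", "MA", "New York", "CA"],
})

prepared = (
    raw
    .dropna(subset=["name"])                               # cleanse: drop records missing a name
    .assign(
        name=lambda d: d["name"].str.strip().str.title(),  # standardize casing and whitespace
        email=lambda d: d["email"].str.lower(),
        state=lambda d: d["state"].str.upper().replace({"NEW YORK": "NY"}),
    )
    .drop_duplicates(subset=["name", "email"])             # deduplicate on name + email
)

print(prepared)
```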
@Jeff, can you walk us through some of the use cases or applications you see from your position? Give some real-life stories or examples.
Experian’s Decision Analytics team develops a variety of sophisticated models for our clients. Yet the process leading up to model development requires a substantial amount of work in order to improve the quality of the data. Our team needed a way to reduce the time it takes to ingest and analyze data for our modeling business.
A single customer view is a consolidated, consistent and holistic representation of the data known by an organization about its customers.
Improving the customer experience requires a deep understanding of needs and motivations. While everyone has data on their customers, the silos between databases and flaws in the information can make understanding your customer difficult.
• Knowing the attributes of your best customers enables you to find more like them
• Understanding the purchase patterns of each customer segment helps better predict inventory requirements
• Locating where your best customers live can help logistics save money and identify optimal locations for store placements
• Identifying payment preferences can uncover credit business opportunities
• Categorizing lifestyles and life stages helps with market sizing and product capabilities
In today’s digitally driven business environment, there are multiple touch points at which a consumer can interact with a business. But when customer information is held in disparate systems, obtaining a clear customer view can be difficult, which limits business opportunities. This means businesses must link these consumer interactions together.
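As a simplified illustration of that linking step, the hypothetical Python sketch below matches records from two systems on a normalized email key to build one consolidated view; in practice, matching would also use fuzzy name and address comparison.

```python
# Hypothetical sketch: linking customer records from two systems into a
# single customer view by matching on a normalized email key.
import pandas as pd

crm = pd.DataFrame({
    "email": ["Jane@Example.com", "bob@example.com"],
    "phone": ["617-555-0100", "617-555-0101"],
})

web = pd.DataFrame({
    "email": ["jane@example.com", "carol@example.com"],
    "last_purchase": ["2018-05-01", "2018-04-12"],
})

# Normalize the match key the same way in both systems.
for df in (crm, web):
    df["match_key"] = df["email"].str.strip().str.lower()

# An outer join keeps customers who appear in only one system.
single_view = crm.merge(web, on="match_key", how="outer",
                        suffixes=("_crm", "_web"))

print(single_view[["match_key", "phone", "last_purchase"]])
```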
*Provide anecdote on millennials (Gen Y) interacting with brands vs. Gen X
Less than 5% of organizations say they have a consolidated view of their customers and 25% of organizations plan to undergo a single customer view project in the next year.
These charts show the drivers behind those initiatives and what is holding businesses back from success.