
Power of community: become a data hero too!

Head of Customer Success
Data Sharing Community in Frankfurt

Twice a year the CDQ Data Sharing Community gathers for a two-day onsite workshop to discuss all things master data in a unique format. During the event, data professionals from our member companies get the chance to strengthen their peer networks, discuss common work topics, exchange experiences and good practices, and, finally, have a little bit of fun.

In my view, this community is a strength in itself. We believe that working together, collaborating, and cooperating brings out the best not only in data, but in many ideas. The level of trust and openness within the community is truly an asset, enabling members to learn from each other – sometimes also from failures or mistakes. Of course, the CDQ Data Sharing Community is about sharing data, but it's also about sharing knowledge and experiences.

Our latest workshop was held in Frankfurt, where we welcomed 74 participants from our esteemed customer pool, joining us both in person and online. We kicked things off with a quick scan of participants' expectations to understand what makes a good workshop for them. Here's a list of this edition's key focus interests:

  • Insights into the services CDQ can provide, e.g. CDQ First Time Right and master data cleansing
  • The opportunity to talk to industry colleagues and share experiences and pain points
  • Exploring the implementation of CDQ solutions within SAP MDG
  • Learning best practices from other companies
  • Implementing the CDQ Bank Trust Score to reduce the risk of invoice fraud, and sanction screening to reduce the risk of compliance breaches

Starting with some food for thought

We have always talked about data sharing as the defining aspect of our community, but also about building a common understanding beyond the CDQ Data Sharing Community. And even though we have reached a tipping point, meaning everyone agrees on the value of data sharing, there are still ways to improve the line of argumentation when it comes to managing data quality.

My co-host, Tobias, brought a few studies to share with the community, among them one remarkable piece: the “2024 Data and AI Leadership Executive Survey”. It is fascinating that this survey has been conducted since 2012, so you can trace 12 years of evolution in data topics, stay on top of key trends, and learn which new topics are coming up that you as data experts should be aware of.

Even though we have been talking about data and data quality for ages, only half of the respondents said they really manage data as a business asset.

Another trend Tobias brought up was AI. And in fact, AI also starts with business partner data: if our business partner master data is not ready and not in proper shape, we cannot make use of AI models and the promises that generative AI, and AI in general, make. Everyone at C-level talks about AI; no one wants to take care of the data. But data is the basic fuel for everything done in AI, and being part of our community supports increasing the maturity of data activities and, by that, the individual organizations' AI journeys.

Community members on the stage

The most exciting part of our onsite workshops is always the stories shared by community members. On the one hand, this is a great opportunity to learn from each other; on the other, it is a reality check for the presenters, with many questions and impulses exchanged in the discussions that follow.

This year, we had three great stories shared by our customers Schwarz IT, Voith, and Henkel, so let me spill the beans.

Cross-company location management with Schwarz Group

Schwarz IT presented their project on improving location data sharing. They outlined the challenges of manual communication with business partners, which often led to operational workflow errors. Our speakers emphasized the need for a standardized, cross-company solution built on a global common standard, specifically the Global Location Number (GLN) provided by GS1.

Together with our CDQ colleagues, Schwarz IT delved into the importance of the GLN for identifying physical locations globally and how it forms the core of their data model. They extend this core by defining additional location types and providing logistics-specific information. When sharing their internal view of the solution, Schwarz emphasized integration with existing system landscapes and communication with external networks.

The common vision involved a network based on the GS1 standard, where participants can share data via home data pools, ensuring adherence to the global data model. Data is made public to the network, allowing subscribers to access and synchronize information seamlessly.
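
To make the home data pool idea a bit more tangible, below is a minimal Python sketch of a GLN-keyed location record being published to network subscribers. The class and field names are illustrative assumptions on my part, not Schwarz's or CDQ's actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class LocationRecord:
    """A physical location identified by its GS1 Global Location Number."""
    gln: str                # 13-digit GS1 Global Location Number
    name: str
    country: str
    location_type: str      # extension beyond the GS1 core, e.g. "WAREHOUSE"
    logistics_info: dict = field(default_factory=dict)  # e.g. dock gates, opening hours

class HomeDataPool:
    """Each participant maintains its locations in a home data pool and
    publishes them to the network; subscribers synchronize by GLN."""

    def __init__(self):
        self._records: dict[str, LocationRecord] = {}
        self._subscribers: list = []

    def publish(self, record: LocationRecord) -> None:
        # Adherence to the global data model: a GLN is always 13 digits.
        assert len(record.gln) == 13 and record.gln.isdigit(), "invalid GLN"
        self._records[record.gln] = record
        for subscriber in self._subscribers:
            subscriber.sync(record)  # push the update to all subscribers
```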

Overall, the aim is to establish a global network for efficient and standardized location data sharing across industries. If you’re operating in retail or with consumer-packaged goods (CPG), CDQ GLN Connect should definitely be on your to-check-out list!

Data creation “First Time Right” with Voith

Voith colleagues described the process improvements achieved through workshops with CDQ and their internal IT, focusing on the implementation of their SAP MDG system. They emphasized the use of CDQ bank trust scores and shared the challenges they faced, including technical complexities and communication issues with IT. Lessons learned from Voith:

  • Already mature, secure, and compliant processes like the creation and maintenance of business partner data can be significantly accelerated with the help of CDQ First Time Right.
  • Key success factor: agree on the future process together with business and IT and stick to it during the implementation.
  • Fail fast, learn fast, correct fast – and take decisions.

Outcome? Immediately after go-live of the new process, a reduction in workload could be observed. The degree of automated decision-making will be further increased in the future – we're looking forward to Voith's update at the next community meetings.

First Time Right journey with Henkel

Exactly 4 workshops ago, Henkel presented a Proof of Concept on CDQ First Time Right. This year, they were ready to share the full implementation, which serves as a testament to the power of collaboration and knowledge exchange.

As a global company with two major business segments, Henkel faced challenges managing vast amounts of data from various systems and entry points. They implemented CDQ to address data quality issues, especially in handling business partners and prospects across systems. Just to give you an idea of this complexity: we're talking about millions of business partners and prospects across different systems, operated by 6,000 users. The implementation of CDQ services involved a PoC, a risk assessment, data field mapping, and duplicate configuration, among other steps. Henkel also emphasized the importance of cross-system checks and validation, which CDQ facilitates.

The implementation process included setting up the CDQ API integration, defining data quality rules, and establishing workflows for data validation and fraud prevention, particularly in vendor account creation.
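
To illustrate where such a check sits in the entry workflow, here is a minimal sketch of validating a business partner record before it is created. The endpoint URL, header, and response fields are hypothetical assumptions for illustration, not the actual CDQ API.

```python
import requests

# Hypothetical validation endpoint, standing in for the real CDQ API
CDQ_VALIDATE_URL = "https://api.example.com/businesspartners/validate"

def validate_before_create(partner: dict, api_key: str) -> dict:
    """Check a new business partner record at entry time, before it is
    written to the master data system – the 'First Time Right' idea."""
    response = requests.post(
        CDQ_VALIDATE_URL,
        json=partner,
        headers={"X-API-Key": api_key},
        timeout=10,
    )
    response.raise_for_status()
    result = response.json()
    # Block creation if any data quality rule fired or a likely
    # duplicate was found in the connected systems.
    if result.get("violations") or result.get("duplicates"):
        raise ValueError(f"Fix before creation: {result}")
    return result
```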

Lessons learned:

  • Flexibility and effectiveness of CDQ's API integration
  • Tailored use of 70+ global open data sources
  • Greatest challenge: to change user behavior

Next steps for Henkel involve ongoing integration with SAP MDG, duplicate cleansing, and Salesforce integration.

Of course, the need for continuous improvement in data quality management and the importance of adapting tools to evolving business needs are obvious companions on the journey, and I'm thrilled to see where this journey takes us by the next CDQ Data Sharing Community workshop.

CDQ Product updates

An important aspect of our onsite events for community members is the direct exchange with the CDQ team. Our community had a chance to learn more about the building blocks of our CDQ Suite for Business Partners, and I'd like to give you a quick summary of these discussions.

CDQ Zero Maintenance

This session shed light on the challenges organizations face in aligning internal processes with evolving business needs. The diverse nature of business processes across companies underscores the need for adaptable solutions tailored to individual requirements – hence the importance of validation rules and of handling raw data updates to maintain data integrity. A crucial strategy for anticipating future needs and preventing potential issues is proactive data assessment.

Integration of external data sources into existing frameworks was another focal point in this round, with an emphasis on harmonizing the data model mapping to accommodate additional attributes while ensuring compatibility with existing structures. For CDQ customers, this challenge is resolved by the CDQ ID, which serves as the primary reference for cross-referencing data from various vendors. Not only does it streamline data integration, it also enhances efficiency.
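
A minimal sketch of that cross-referencing idea: records from different vendors, each with their own identifiers, are merged under one shared CDQ ID. All identifiers and values below are dummy data for illustration.

```python
# Each vendor delivers its own attributes, keyed by the shared CDQ ID
vendor_a = {"CDQID-123": {"duns": "150483782", "name": "ACME GmbH"}}
vendor_b = {"CDQID-123": {"vat_id": "DE129273398", "name": "ACME GmbH"}}

def cross_reference(cdq_id: str, *sources: dict) -> dict:
    """Merge all vendor attributes that share the same CDQ ID."""
    merged: dict = {"cdq_id": cdq_id}
    for source in sources:
        merged.update(source.get(cdq_id, {}))
    return merged

print(cross_reference("CDQID-123", vendor_a, vendor_b))
# {'cdq_id': 'CDQID-123', 'duns': '150483782', 'name': 'ACME GmbH', 'vat_id': 'DE129273398'}
```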

Key take-away: embrace flexibility, proactivity, and seamless integration strategies to navigate the complexities of modern data management with confidence and precision.

CDQ Fraud Prevention and Sanction Screening

We also delved into the complexities of bank data validation and the CDQ trust score system. As per members' requests, we explained how the trust score works and discussed bank data-related topics.

In a nutshell, the CDQ trust score is a harmonization of shared bank account information, with a great contribution from community members: the score is calculated based on successful payments to a bank account. Participants in this session discussed various methods for uploading bank data to the system, openly shared their concerns regarding data accuracy and system integration, and exchanged proposals to optimize data sharing and processing efficiency.
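
To give a feel for the underlying idea – a score that grows as more successful payments to an account are confirmed by more community members – here is a deliberately simplified sketch. The saturating formula is my own illustrative assumption; the actual CDQ trust score calculation is more sophisticated.

```python
def trust_score(confirmed_payments: int, distinct_companies: int) -> float:
    """Toy trust score: more successful payments, confirmed by more distinct
    community members, mean higher trust in a shared bank account."""
    raw = confirmed_payments * distinct_companies
    return raw / (raw + 10)  # saturates toward 1.0

print(trust_score(confirmed_payments=3, distinct_companies=2))   # ~0.38
print(trust_score(confirmed_payments=50, distinct_companies=8))  # ~0.98
```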

The driving force was community engagement and collaboration – which are also essential for refining and improving the bank data validation process.

CDQ Cloud Platform

Part of our community was also eager to learn a bit more about our cloud platform. Together with our DevOps colleagues, workshop participants dove deep into the site reliability engineering (SRE) practices within the software industry, with a focus on ensuring performance, availability, and scalability of CDQ services. At CDQ, we implement SRE principles by setting service level objectives during requirements engineering, conducting performance tests for critical functionalities, and utilizing elastic scaling for resource allocation. Additionally, we employ multi-region support to enhance service availability. These practices ensure that CDQ's services meet performance expectations and can adapt to changing demand levels efficiently.
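
For readers less familiar with SRE, here is a minimal sketch of what "setting a service level objective and checking it" can look like. The SLO name, target, and numbers are illustrative assumptions, not CDQ's actual objectives.

```python
from dataclasses import dataclass

@dataclass
class ServiceLevelObjective:
    name: str
    target: float  # e.g. 0.999 means 99.9% of requests must succeed

def slo_compliance(successes: int, total: int, slo: ServiceLevelObjective) -> bool:
    """Compare the measured success ratio against the objective and
    report how much error budget is left."""
    ratio = successes / total
    budget_left = ratio - slo.target
    print(f"{slo.name}: {ratio:.4%} measured vs {slo.target:.1%} target "
          f"(error budget: {budget_left:+.4%})")
    return ratio >= slo.target

# Illustrative check over one month of requests
slo = ServiceLevelObjective(name="lookup-availability", target=0.999)
slo_compliance(successes=998_700, total=1_000_000, slo=slo)  # False: budget overspent
```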

CDQ Data Quality Cockpit

Based on the topics of interest from our members, the CDQ Data Quality Cockpit session revolved mainly around our data sources and data quality rules. The data coverage is based on both official registers and the data sharing pool contributed by the community. The growing pool of ready-to-use data quality rules wouldn’t be possible without the collaborative spirit of the Data Sharing Community.

Upcoming plans around data sources involve the integration of new ones, with a strong emphasis on the United States in the next quarter. Again, the community plays an important role in the process and is invited to help prioritize by voting for desired data sources through CDQ's idea portal. We also discussed the governance of data quality rules and addressed the need for structured management of rule changes (due to regulatory updates or new rule creations).

In addition, this session gave our members a chance to delve into the topic of geo coordinates. We outlined our triple-layered approach to data enhancement, leveraging Google Maps, HERE Maps, and CDQ's own data pool for precise positioning. We took the opportunity to demonstrate the use of geo coordinates through CDQ's web apps and the API integration process, emphasizing user-friendliness and visualization in Google Maps.
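
The triple-layered approach can be pictured as a simple fallback chain: try the first provider, fall back to the next, and finally to the shared data pool. The provider function names below are placeholders, not the actual CDQ implementation.

```python
from typing import Callable, Optional

Coordinates = tuple[float, float]  # (latitude, longitude)

def geocode_with_fallback(
    address: str,
    providers: list[Callable[[str], Optional[Coordinates]]],
) -> Optional[Coordinates]:
    """Try each geocoding layer in order and return the first hit."""
    for provider in providers:
        coords = provider(address)
        if coords is not None:
            return coords
    return None

# Hypothetical layer functions standing in for Google Maps, HERE Maps,
# and the CDQ data pool; each takes an address and returns (lat, lon) or None:
# coords = geocode_with_fallback(address, [google_maps, here_maps, cdq_pool])
```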

Next, the focus shifted to reporting capabilities, where we detailed recent changes made to enhance reporting efficiency. We explained the shift from repeated job running to continuous monitoring, resulting in optimized resource consumption and faster report generation.

With a community-first approach, we also discussed future plans for deprecating old reporting versions and migrating to new stacks, along with potential developments related to storage for commercial data sources.

Community vibes

The CDQ Data Sharing Community event was an absolute blast! Beyond the work sessions, we shared meals, grabbed coffee breaks, and had some real heart-to-heart talks after workshops. Evenings were filled with laughter and fun at the bar, and joint dinners gave us a chance to deepen our connections. This personal touch made working together even more enjoyable and fostered a collaborative, honest, and open spirit that truly set the tone for the entire event.

 

The aftermath

Take a fully packed agenda around trusted business partner data, a group of enthusiastic data professionals and an amazing collaborative spirit, and what do you get? I can only speak for myself, but here’s some feedback from our dear participants:

  • “The agenda and topics in Deep Dive were very suitable for our daily work.”
  • “I joined online and the technical quality of the remote presentation was really good.”
  • “Great concept for brainstorming together with community members and CDQ employees.”

I am super excited to see where the journey is going and how our community members are making the most of our services. If you feel like joining this data pack, please get in touch with us – the next CDQ Data Sharing Community workshop is coming in September, so why not join us there?

 
