Good data is the cornerstone of good decisions, good processes and organizational efficiency. However, upon peeling back the layers of their databases, many organizations discover that they're actually suffering from bad data hygiene.
Some companies are still kicking around redundant processes for no better reason than "we've always done it this way." With smart data management solutions, however, and a conversion to the "single point of entry" religion, companies can benefit from reduced cost and waste, along with a more trustworthy base of information.
Don't know if your organization is one of the unwashed masses suffering from poor data hygiene? Here's a list of the worst symptoms that plague organizations.
- Multiple points of entry. Some organizations input data into multiple systems rather than build bridges among systems and identify authoritative data sources. This leads to increased cost, as multiple people essentially perform the same tasks, and to inconsistency across those systems.
- Lack of data accountability. Someone has to be held accountable for the quality of data
in an organization. Without accountability, fingers point in all directions when data-driven
decisions fail, and it's difficult to fix data when no one owns it.
- Reporting difficulty. Accurate and reliable reporting becomes difficult at best when different teams are held to separate standards. At worst, individual reports show inconsistent information, grinding executive decisions to a halt while data sources are confirmed and reports corrected. Bad decisions happen when information discrepancies are not discovered in a timely manner. What's more, downstream data users must either re-create pertinent data on their own or wait for clean data, hindering process flow and their ability to do their jobs and steadily eroding organizational efficiency.
- Increased costs. With some people doing redundant work and others running around trying to fact-check inconsistent data, the company will not operate as efficiently as it could. It is either losing money or spending more money -- either way, a bad strategy for business.
Do any of these sound familiar? If they do, don't panic. Here's how we're fixing things at Westminster College.
Organizational efficiency through data governance
Westminster College is taking proactive steps toward data efficiency with a number of goals in mind, primarily creating a "single version of the truth" that automatically populates other systems relying on the same data. Because organizational culture led us to where we are, we instituted a data governance model that helps keep everyone on the straight and narrow when it comes to data decisions. We no longer stick data somewhere simply because users tell us "but no one is using that field." We no longer allow a department to create duplicate records to work around an operational issue; instead, we demand a single point of entry for data. Similarly, we don't allow a new division vice president (VP) to walk in and turn his unit's data world upside down without solid business justification and a true understanding of the change's impact on the organization.
Data owners should be held accountable for data quality and, more importantly, empowered to make decisions and take action to make sure that data quality remains high and supports organizational efficiency. An executive leader should direct individual data managers to maintain consistency across the organization -- even if they don't reside in the same organizational silo. Our Module Managers Committee was mostly inactive and had no real authority, but we gave it some teeth: It now has the power to say "no" or impose requirements before allowing a data change to move forward. All significant data decisions must be run through this group. The group is not intended simply to refuse requests, of course, but to ensure that changes make sense. For instance, it ensures that a new VP understands second-order consequences before moving forward with his own data management solutions.
All division VPs were instructed to modify their job descriptions to include their committee membership and participation as assessed performance metrics. The director of information services -- who reports to me, the CIO -- leads the matrix management-type committee.
Now that we have a structure in place to begin to address the culture, we have been turning our attention to cleaning up existing data. In the past year, we've undertaken two major efforts. One has resulted in the college normalizing all of the data for one division. During this process, weaknesses were also identified in other campus divisions that ultimately feed this group's efforts, and the cleanup scope has been expanded to address data deficiencies as they are identified.
It goes without saying that when something must be done twice, the potential for error multiplies. Wherever possible, we are designating an authoritative source for each data element and capturing data through a single point of entry rather than rekeying it or trying to keep it current in two locations. For example, our coaches currently maintain their athletics rosters manually on the college website and, at some point, this information is also manually entered into the campus ERP system, where it's used for fundraising operations. We're changing this process: We plan to create a SharePoint Web part for the college website that pulls and displays the roster information directly from the ERP system so that we have a single point of entry. The coaches will then maintain roster information only in the ERP system, resulting in more timely information on our website and consistent data available down the line for fundraising.
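The single-point-of-entry pattern behind that roster project can be sketched in a few lines. Our actual implementation is a SharePoint Web part, but the idea is language-agnostic: the website never stores its own copy of the roster; it only fetches and renders what the ERP holds. The URL, API shape and field names below are hypothetical, used purely for illustration:

```python
"""Sketch of single point of entry: the website renders roster data
pulled from the ERP (the authoritative source) and never keeps its
own copy. Endpoint and field names are hypothetical."""
import json
from html import escape
from urllib.request import urlopen

# Hypothetical ERP endpoint assumed to return JSON like:
#   [{"name": "A. Smith", "position": "QB"}, ...]
ERP_ROSTER_URL = "https://erp.example.edu/api/athletics/roster"

def fetch_roster(url=ERP_ROSTER_URL):
    """Pull roster records from the ERP's (assumed) JSON API."""
    with urlopen(url) as resp:
        return json.load(resp)

def render_roster(players):
    """Turn ERP records into an HTML fragment for the website.
    The page is read-only; all edits happen in the ERP alone."""
    rows = "".join(
        f"<tr><td>{escape(p['name'])}</td><td>{escape(p['position'])}</td></tr>"
        for p in players
    )
    return f"<table>{rows}</table>"

if __name__ == "__main__":
    print(render_roster(fetch_roster()))
```

Because the rendering step is a pure function of the ERP data, a stale or conflicting website roster becomes impossible by construction; a coach's correction in the ERP shows up on the site on the next page load.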
This is just one of dozens of efforts under way to make things a bit better at Westminster, but it exemplifies the need for an organization to commit to making its data work for it, not the other way around.
This was first published in June 2011