Wednesday 11 July 2007

#44 The Social Life of Performance Data

One of my clients is drowning in dozens of reports that collectively contain over 100 measures. Where he expects two measures from separate reports to have the same values, they don't. Where he expects a measure's value to be accepted by his customer, it is disputed. Where he thinks he's looking at the right measure to answer his question, someone warns him otherwise. The tangle of reports and measures is unwieldy, but it has become the dogma of decision-making. Untangling them all into a streamlined, sensible suite of reports is not as simple as setting up a swanky scorecard.

Data quality worries most users of performance measures. An obscene number of reported measures generate nothing but dialogue about how unreliable the underlying data is. But what can you do about the quality of performance data? I've heard some performance measure experts proclaim that performance data must have 100% integrity. Hogwash! It never will, and here are some of the reasons why.

performance data is gathered by people

A vast proportion of our performance measures rely on data that has been touched at least once by human hands. People design data collection forms and processes, people fill out those forms, people enter the data from the forms into computer databases, people extract and manipulate data from those databases, and people filter and analyse the data to produce performance measures.

So human error and misunderstanding, ambiguity or absence of clear data definitions, ad hoc data collection and analysis processes, and vague measure definitions (how measure values are calculated) all contribute to the low confidence people have in reported measures.

How many of your performance measures are defined in enough detail to avoid miscalculation or use of the wrong data? How many of your data collection processes are documented consistently and ingrained into work practices? How many of the people who collect your data have been trained to do it according to the documented process? Does your organisation have a data dictionary that is available outside of the IT team?
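To make "defined in enough detail" concrete, here is a minimal sketch (not from the original post; the field names, measure and example values are entirely hypothetical) of a measure definition recorded as structured data, the kind of entry a data dictionary could hold so that anyone outside the IT team can see exactly how a reported number is produced.

```python
# Hypothetical sketch: a measure definition written down as structured data,
# so the formula, data items, source and collection process are visible to
# everyone who collects or uses the measure, not just the IT team.
from dataclasses import dataclass

@dataclass
class MeasureDefinition:
    name: str                 # what the measure is called on the report
    intent: str               # the question this measure is meant to answer
    formula: str              # how the value is calculated, in plain words
    data_items: list          # the raw data items the formula draws on
    data_source: str          # the system or form the data items come from
    collection_process: str   # reference to the documented collection steps
    owner: str                # who is accountable for the measure
    frequency: str            # how often the value is reported

# Example entry (illustrative values only)
on_time_delivery = MeasureDefinition(
    name="On-Time Delivery Rate",
    intent="Are we delivering orders by the date we promised the customer?",
    formula="orders delivered on or before the promised date / total orders delivered, per month",
    data_items=["order_id", "promised_date", "actual_delivery_date"],
    data_source="order management system; dispatch confirmation form",
    collection_process="Dispatch procedure v2, step 7: record actual delivery date",
    owner="Distribution Manager",
    frequency="monthly",
)

print(f"{on_time_delivery.name}: {on_time_delivery.formula}")
```

The point is not the code itself, but that the name, intent, formula, data items, source and collection steps are all written down in one place where the people who collect and use the data can see and challenge them.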

people know that performance data can sting

Unfortunately, many of our organisations are still carrying the burden of a blame culture. People can still remember (or are still experiencing) data being used as a big stick to humiliate, take resources away from, demote or sack the so-called poor performers. We know that in this kind of environment people swing into self-preservation mode (it's only natural) and weigh up their choices: cop another whack with the data stick, or sweep that nasty data under the rug?

Managers and decision-makers need to earn back employees' trust that data will not be used against anyone. Performance measures and data need to be seen more often being used to honestly assess the performance of systems and processes, more often being used to explore root causes and learn from the past, and more often being used to stimulate dialogue about how the future can be influenced.

How many of your managers and decision-makers look for root causes of undesirable performance in the systems and processes (as opposed to the people)? How many performance measures are supported by diagnostic measures of causal factors (as opposed to just slicing and dicing the data into smaller fragments)? Have you got an improvement process that automatically kicks in when a performance measure reveals a problem?

data has no meaning apart from its context

An event must occur before data can be produced. And the data is the product of the event being observed, interpreted and coded. When a person is doing the observing (as opposed to a machine, such as a temperature gauge), they unconsciously - and occasionally consciously - apply filters that affect how the event is interpreted and how it is coded.

These filters are influenced by beliefs the person has about the event, their interactions and relationships with others around them, their physical and mental health on the day, what they are thinking about at the time, their values and priorities regarding their work, and the list goes on.

Have you explored the context around the types of performance data you collect? Have you thought about the factors that might influence the way someone interprets and codes what they observe when they are capturing performance data? Do you have guidelines and examples in your data collection instructions to help data collectors capture quality data?

don't just rely on technical solutions to data integrity problems

Yes, there's certainly more to the social life of data than the three parts discussed here. Most of these issues can be discovered and dealt with through better communication among the people involved in data capture: from designing measures, to developing data collection processes, to collecting data, to storing and analysing it. Don't rely just on technical solutions - think through what needs to change in the social systems surrounding the data. And be concerned more with how much integrity your decisions can survive with than with achieving 100% integrity.
