The PuMP Community Forum



Keeping it real in measure design

#1298
Cheryl Welch

Member

March 29, 2012

I have facilitated 3 of the 20 or so groups I will need to do to develop measures for our whole organization. A couple of times already, I have run into Measures Team members who think a particular measure is just great, but I disagree. I want this to be their process, but I know it's my job to guide them through this successfully. Folks have come up with measures that they will only look at once a year (% of annual safety training completed), and ones that involve something subjective from others (# of customers contacting us to tell us our service was good), etc. In some cases I have said “We'll try this and see how it works for you.” In others, I have suggested we move it to a Task List, because their topics are good things to accomplish, but not necessarily good performance measures. What are some good ways to allow them to own the measures, while trying not to waste time on measures that might not be helpful to them?

#1308
Stacey Barr

Member

March 3, 2011

You are sure going to be kept busy for a while, Cheryl, with 20 groups to set up with measures!! But based on your success so far, the impact you'll have is stupendous.

I run into similar situations, where the team loves a particular measure that I think is quite “ho-hum” and not going to be very powerful for them. Here's how I generally handle it:

1) I remind myself that their ownership or buy-in is more important than the sophistication of the measures.

2) I also acknowledge that this is new to them, and there is a learning curve. Much of the learning happens through their implementation, and they need to be allowed to do that at their pace.

3) Rather than disagree with them, I reinforce to them that it's always their decision about what they measure and my role is to expand their possible choices.

4) So then I will simply explain what I see as the limitations of their measures (referring back to the definition of what a performance measure is*, and which parts of that definition their measure doesn't satisfy well).

5) Along with that, I will suggest some alternatives and why I feel they are better (following the Measure Design process, of course).

6) Then I suggest the team decides what to do with my suggestions: discuss them further, act on them now, document them for their next performance measure review cycle, or ignore them completely.

7) If the team has made an informed and conscious decision about their measures, and you have helped them consider a wider set of choices, that's the best you can hope for, ever.

Cheryl, does this give you some ideas?

Smiles, Stacey.

* Definition of a performance measure: A comparison that provides objective evidence of the degree to which a performance result is occurring over time.

#1309
Cheryl Welch

Member

March 29, 2012

Yes, all good ideas (and this is a long-term plan for the next 3 years or so!). These are along the lines I have been working on, but you added some great points. Re: #6, sometimes I am at a loss to come up with better measures.

% of annual safety training completed – I'm not sure how to measure this any better, since the trainings required differ for each employee and they just need to be completed within the calendar year. But it seems like it may be less than useful to have a measure that only makes sense on December 31. It does send a signal as the year ends – if the crew is not getting close to 100%, some trainings still need to be taken. Any ideas on better ways to measure that one, or is it one of those things that is good to DO, but not necessarily a good PERFORMANCE MEASURE?

# of customers contacting us to tell us our service was good – This information is good to know, but it really depends on whether a customer decides to take the time to contact us or not.  There could be a lot of good work being done, and the customers are just too busy to write in.  Any thoughts on this one?

Thanks for your time, Stacey!  –Cheryl

#1310
Stacey Barr

Member

March 3, 2011

Cheryl, I think the issue is with the Measure Design process. I'd like to see the performance results that relate to the 2 examples you gave, and then I can give you a more practical answer.

On the face of it though, neither sounds like a good measure, because they don't meet all the requirements of what a good performance measure is:

  • % of annual safety training completed – this is not regular feedback through time, and it's not feedback of a performance result (as you say, it's something good to DO, so it is a strategy for improving a performance result, but not a performance result in itself).
  • # of customers contacting us to tell us our service was good – this one lacks objectivity, since it doesn't have the power of consistency for the reasons you mention (sometimes customers are too busy to volunteer feedback). Generally I don't like measures that rely on data that is volunteered.
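Just to make that first point concrete (and this is only a rough, made-up illustration, not a PuMP recipe): one way a training measure can give regular feedback through time is to report, each month, the percentage of trainings "due to date" that are already completed, pro-rating each person's annual requirement across the year. The names, numbers and pro-rating rule in this sketch are hypothetical:

    from datetime import date

    # Hypothetical records: (employee, trainings required this year, trainings completed so far)
    training_records = [
        ("A. Field Tech", 6, 3),
        ("B. Crew Lead", 4, 4),
        ("C. Engineer", 5, 1),
    ]

    def ytd_completion_rate(records, as_of):
        """Percent of trainings 'due to date' that are already completed,
        pro-rating each person's annual requirement evenly across the year."""
        year_fraction = as_of.timetuple().tm_yday / 365
        due_to_date = sum(round(required * year_fraction)
                          for _, required, _ in records)
        completed = sum(min(done, round(required * year_fraction))
                        for _, required, done in records)
        return 100.0 * completed / due_to_date if due_to_date else 100.0

    # Reported monthly, the figure moves during the year rather than only
    # making sense on December 31.
    print(f"YTD safety training completion: "
          f"{ytd_completion_rate(training_records, date(2012, 6, 30)):.0f}%")

Tracked like that, the number gives a signal well before year end – but whether it's worth keeping still comes back to whether it gives evidence of the performance result itself, rather than of a strategy for improving it.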
#1311
Cheryl Welch

Member

March 29, 2012

Of course I should have included the results!  For the first one it is:

 Field Customer Service processes are safe

Our Safety Coordinator and the crew members wanted a leading indicator, not just the Lost time from Injuries that we track already, so they came up with one related to training. 

 

The result related to the second measure is:

 Crew members are fully trained and know resources relevant to their work [i.e. we know what to do on our own or know how to get the information needed]

We do customer satisfaction surveys every two years, but that's not often enough for a performance measure.  But we do have customers send notes or call in frequently (average 5 or so a month throughout the District) talking about the good jobs our people are doing.  Those identified get a 'Pat on the Back', & are recognized at an all-employee meeting. So they wanted to track Pats on the Back.

A similar one from Engineering is:

 High level of internal and external customer service is maintained

We talked about either developing a periodic survey or giving comment cards to customers as they deal with the Engineers, but it's going to be hard to get a statistically meaningful sample: only a small pool of people interact with the Engineers each month, and it would again involve people volunteering information.

Any suggestions on these would be welcome!  Thank you.  –Cheryl

