Indicators Part 1: Where do I start with Human Error?

It has to be addressed: indicators. Three things make up the bulk of what many Human Performance professionals do: Coordinate, Teach, and Track… You need to know where you are before you can tell where you are going. A GPS, for example, needs to know your location before it can provide guidance to a destination. Indicators help us know how well or badly we are doing, and whether or not we are improving.

Good news! This post will explore a little of what seems to be useful, and what seems to be a waste of time. I’ve recently been asked, “What constitutes a useful indicator?” In my opinion, performance must have a way to be measured; otherwise you never know where you are, or whether you are getting worse or better, and the only tool at your disposal is something you could call “Cognitive Assumption,” which in reality sounds something like this: “I believe we are getting better; however, I have no objective evidence to support my assumption. It just feels better.” Cognitive analysis is okay, but not entirely scientific. Don’t let anyone mislead you; performance analysis IS science.

Remember the five steps of science?

  1. Observing
  2. Scoring
  3. Measuring
  4. Analyzing
  5. Applying

Does it sound like performance indicators are a science, yet?

The first place to start is by determining what data you already have. How are errors or events currently tracked or processed at your facility? This can be tricky and may involve communicating with people outside your department.

Each indicator should have the following parts:

  • Definition – the concept being measured
  • Parameters – What are the attributes of the measure and how do they actually impact performance?
  • Criticality – How important the measure is and why we should care about it. Does it relate to the corporate mission?
  • Data Collection – Where does the data come from, and by when will the information be provided?
  • Metrics – What does the visual representation of the data look like?
  • Dependencies – Does this measure correlate with another indicator in some way?
  • Analysis – (The most important part!!!) As performance changes, can you relate it to changes and efforts to improve the measure? What is causing the measure to be this way and what does that imply?
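To make these parts concrete, here is a minimal sketch (in Python, with entirely hypothetical field values of my own invention) of how one indicator could be captured as a structured record:

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    """One performance indicator, broken into the parts listed above."""
    definition: str          # the concept being measured
    parameters: str          # attributes of the measure and how they impact performance
    criticality: str         # why we should care; ties to the corporate mission
    data_collection: str     # data source and delivery schedule
    metrics: str             # how the data is visually represented
    dependencies: list[str] = field(default_factory=list)  # correlated indicators
    analysis: str = ""       # the most important part: what is causing the measure?

# A hypothetical example record
example_indicator = Indicator(
    definition="Monthly event rate per department",
    parameters="Coded events normalized by worker-hours",
    criticality="Directly supports the corporate safety mission",
    data_collection="Event codes from the corrective action system; worker-hours from payroll, monthly",
    metrics="12-month trend line with department breakdown",
    dependencies=["lower-threshold error rate"],
    analysis="Relate changes in the rate to specific improvement efforts",
)
```

Even if you never automate anything, filling in these fields once per indicator forces the conversation about source, schedule, and ownership.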

Todd Conklin weighs in (allow me to paraphrase):

At the recent HPRCT Conference in June 2014, Todd Conklin gave an amazing keynote speech, and even though I wasn’t able to be there this year, I was able to watch it (three times!!! – mainly because Todd rocks). You can click here and join the Human Performance Association (307-637-0958). I believe it cost me $279 to become a member for a year. Todd reminded us that you can’t get better until you measure, and how important it is to figure out how to measure the things you’re doing correctly. I had not considered that before. It is so much easier to track failure by incident than positive progress by task. We are stuck looking backwards, not even in a present mindset for current performance. Metrics might predict future performance and point to areas of interest and improvement, but they still have not given a clear measure of what performance actually is – more a clear picture of what failure is, and whether it is diminishing or getting worse.

So where are we?

Knowing what your worker hours are (typically from payroll), you should be able to calculate a monthly event rate for your company, and perhaps even by department. What constitutes an event should not be subjective; it should be as standardized as possible, following a strict library of codes. If you code a lot of issues, you may be able to calculate a lower-threshold error rate as well, but that gets into more subjective territory, because not all lower-threshold information is reported or consistently coded. With that in mind, an event rate seems to be the best common denominator between facilities if you want to compare apples to apples.
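As a sketch of that arithmetic (the 200,000-hour normalization base is my assumption; it is a common industrial-safety convention, and your site may use a different base):

```python
def event_rate(events: int, worker_hours: float, per_hours: float = 200_000) -> float:
    """Event rate normalized per `per_hours` worker-hours."""
    if worker_hours <= 0:
        raise ValueError("worker_hours must be positive")
    return events / worker_hours * per_hours

# Hypothetical month: 3 coded events across 120,000 worker-hours
print(event_rate(3, 120_000))  # 5.0 events per 200,000 worker-hours
```

Because the normalization base is fixed, a small department and a large site can be compared on the same scale.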

But what about measuring good performance and not just failure?

Ah yes. This is the golden nugget we are hoping to find some solution to in the very near future. Do you have a suggestion on how to measure positive performance? How many times you’ve completed a work order or job satisfactorily? How many component manipulations you’ve performed successfully? How would you effectively measure and track that data set? Who would do it? Can it be automated? Keep in mind that HOW we get results is sometimes more important than the results themselves. A positive outcome from a job performed rushed and poorly may show up on this new measure as a good thing… This is why measuring good performance is not simple. Human Performance is about the behaviors of workers and leadership team members, and how hard is it to quantify a behavior?
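One hedged sketch of what a positive measure could look like, assuming each completed task is coded with both an outcome and whether the expected HU tools were used (the codes and the pairing here are entirely my own illustration, not an industry standard):

```python
# Hypothetical task records: (outcome, hu_tools_used)
tasks = [
    ("success", True),
    ("success", True),
    ("rework", False),
    ("success", False),   # good outcome, but rushed: tools skipped
    ("success", True),
]

# A naive success rate counts the rushed job as a win...
naive = sum(1 for outcome, _ in tasks if outcome == "success") / len(tasks)

# ...so also track "quality successes": good outcome AND good behaviors
quality = sum(1 for outcome, tools in tasks if outcome == "success" and tools) / len(tasks)

print(naive, quality)  # 0.8 0.6
```

The gap between the two numbers is exactly the “good result, bad process” problem described above: the outcome-only measure hides it, while the paired measure surfaces it.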

On this suggestion I have more questions than answers. I haven’t seen it done yet, and I’m trying to figure out how to do it. Please send suggestions to

Click these supporting Links:

What makes a good Metric?

Developing Performance Measurements

PDF – Creating and Using Effective Performance Metrics


Situational Metrics

How to develop KPIs

Tips for making Infographics

Example of a Human Error Infographic

What do we do with the data gathered from indicators?

Look for some answers in an upcoming post! Suggestions for topics? Any performance improvement questions or challenges you want help solving? Send an email to the site or comment on this post. After a very busy and distracting summer, we are looking to bring new content to this site and take it new places. In case you were wondering, more posts are coming very soon, including brand new podcasts. Thanks for stopping by, and have an event-free day.

Are you going to post more HU-related content? Absolutely!!

Hello HU colleagues worldwide. This site is still being updated and will continue to be a source of new HU information and links very soon. Over the past couple of months, the main author has been working, spending extra time with his kids, designing training, and landing an amazing new HU job opportunity. A lot of exciting things are going on, and the second half of 2014 is going to be phenomenal.

The HU Toolbox community is over 100 members and continuing to grow! It remains free to sign up, and it ensures you never miss a new blog post or podcast.

Contact us

The emails that have come in have been great. This community is filled with interesting and amazing talent dealing with humans and behavior. Your questions, challenges, wisdom, and insight remain vital for the best quality posts. Keep them coming.

Call to action – new T&D HU network needed

Please contact this site if you are an HU professional in Transmission and Distribution. This site’s main author is building a new network for benchmarking and support purposes. Please let us know what you do and email your contact info.

Quick challenge question:

What is your favorite HU Book?

Podcast Episode 6: Error prevention at a Connecticut Hair Salon (Interview)

Nestled in Norwich, Connecticut, you can find a hair salon/studio called “Details.” A few months ago the owner sat down with me to talk about how our two industries relate when it comes to error prevention.

Heidi Duff is someone who completely understands her “calling.” Even after 26-plus years in the business, she approaches her daily work with high energy, enthusiasm, and a constant desire to be one of the best in her field. I totally respect that type of attitude, and I feel even more energized with every conversation I have with her. On this podcast, we share a candid conversation about what it’s like being the salon owner, and some of the common human errors that need to be avoided in this billion-dollar and VERY personal industry. We talk about how important reputation is to each salon, and how, through timely feedback, employee meetings, and training, her staff not only become top-notch but stay that way, too. I love their slogan: “Inspire. Design. Evolve.”

This was such a fun interview, and not just because home-made shepherd’s pie was involved! Anyone that has ever met Heidi knows that she has something to say AND it’s worth listening to – I was able to learn the most important step in the salon service process – the consultation with the client.

Common error avoidance, having a good community reputation, worker feedback, training, pre-job briefs…. Get ready to learn how this all relates to Human Performance Tool usage…

Are you afraid of misinterpretations of INPO’s Cumulative Impact Document?

I don’t mean to alarm you, but I have already read that some HU practitioners are excited to get back to a “common sense approach” to error reduction. If you currently believe that, I truly hope you understand things differently by the end of this post. The Cumulative Impact document needs to be understood before drastic measures are taken that counteract the effectiveness of your event-prevention and error-reduction programs. We must always be careful with recommendations that may suit the average site but not our own organization. Based on reactions to this document, I have also heard some say that it is potentially the most dangerous document ever put out by INPO. That is not to say the document is flawed, but how an organization responds to it could be. Prior to Human Performance programs, we had a predominant culture that blamed the last person who touched it and just wanted to move on. Since then, this industry has made many amazing strides in the area of performance improvement, and it should be very careful about turning those positive gains around. Am I worried about the impact of this document? Yes.

To me, the best way to describe “common sense” is what was used prior to having human performance programs in the nuclear industry (not to say we don’t use it today). What I see as the catalyst for changing current programs is a comprehensive Effectiveness Follow-Up (EFU) on all the things we’re spending time on that aren’t actually driving performance improvement. Where this gets us in trouble is when we cut the things that are actually “moving the needle” in the right direction. This has made ineffective observation programs a target for elimination or severe overhaul.

I’d like to highlight this portion of INPO’s October 16, 2013 “Industry Cumulative Impact Summary Report,” and I am not taking it out of context:

“Section C: Human Performance, Supervisory time with workers will shift from observing and documenting, to engaging and reinforcing expectations. The burden associated with documenting these observations will be reduced to improve the focus on coaching worker behaviors and reducing emphasis on observation documentation. Simple methods to quickly capture key gaps will be developed to allow high-level trending without challenging coaching effectiveness.”

[Author’s note: Apologies, as I could not find a link to the entire document at this time]

It makes me want to ask: what exactly is “high-level trending,” as opposed to other “levels” of trending?

Do not eliminate your observation program; fix it

The number one item that scares me is cutting Observation programs out completely. This document is not an excuse to get rid of a bad observation program, but rather a heads-up that if your program is not effective, you should change it so it will be. It is not supposed to be treated as a permission slip to stop doing something just because it has been ineffective to date. I know of some stations that are completely transforming their Observation programs, even eliminating the formal process. I could appreciate this if the process were truly causing a burden on leadership, but this type of “burden” seems to be a fallacy to me.

“My Observation Program is a leadership burden” – Is this a justifiable statement?

The answer is “No.” Formal Observations are designed to have 4 phases:

  1. Preparation
  2. Performance
  3. Feedback – where engaging and reinforcing should already be happening!
  4. Documentation

The “feedback” phase is for the workers being observed and is the most important part of the process, but the complaints I’ve heard say the burden comes from the “documentation” phase. Once familiar with the Observation program software, a user should be able to document an observation in 3-10 minutes. If you cannot do it in that timeframe, you need a better software solution (click here for your best option).

Why should we document Observations?

Observations are documented for many reasons:

  • tracking and trending timely positive and negative information;
  • obtaining, keeping, or re-obtaining training accreditation;
  • proving leadership engagement for recommendations based on SOER 10-02 (which practically insists you document paired observations to prove observing supervisors are effectively engaging the workers);
  • documenting At-Risk practices linked to a condition report system;
  • proving housekeeping areas were walked down at the appropriate intervals;
  • discovering and documenting shortfalls in knowledge areas that may need a training intervention; and
  • having proactive performance data not obtainable or documented by other means.

It is truly a mistake to think that overall performance will improve with fewer documented observations.

Is coaching different than giving feedback?

Yes. This distinction has to be clear, concise, and most importantly, consistent. In a previous post, I defined coaching as “what someone says to someone else to guide them into correcting an undesired behavior.” A lot of people weighed in on that in LinkedIn forums; some agreed, while others said it’s also the act of reinforcement. To me, reinforcement is feedback, not coaching, but I see the point, especially as it relates to sports coaching.

Human Performance Tool recommendation or requirement?

The thinking in this type of cumulative impact response reminds me of the industry pullback on the “requirement” to use the circle-and-slash method of placekeeping. Even though it is a much more robust tool than simpler versions of placekeeping (signoffs and checkblocks), it became a recommendation rather than a requirement, based on the concern that a human performance tool may actually cause someone to disengage from thinking while performing a continuous-use or reference-use procedure, because they are too wrapped up in circling and slashing each step. No human performance tool should be used if it is known to distract workers from the task. It may sound simple, but without practice, the act of circling and slashing each procedure step can actually distract the procedure reader from the task being performed, and reader engagement can be vital to the success of that procedure. Circle-and-slash is an amazing tool, totally based on STAR principles and self-checking each step, but it should be practiced and mastered before it is employed. If you don’t think so, do an observation on someone who has never used it, compared to someone who is very familiar with it.

Cumulative Impact Related Links:
Powerpoint for how cumulative burden is being addressed

NEI’s version called, “Cumulative Impact of Industry and NRC Actions”

NEI Nuclear notes: Regulation, Nuclear Energy and the Cafeteria

U.S. Nuclear Power Survival Part 2 (I really appreciate this article and I highly recommend reading it.)

Check out how DevonWay is starting to help the Cumulative Impact effort (YouTube video)


NEI’s November 7, 2013 presentation on “Cumulative Impacts”

From Slide 18:
“•Changing cultures -What is perceived as important to [a] specialist may be of low relative safety significance”
I’m having a real hard time accepting that a specialist’s (or Subject Matter Expert’s) experience has been disregarded in such a manner. More practitioners need to be paying attention to this particular bullet point. I interpret it as “someone else (who is not a specialist) thinking that they know better, and trying to change ‘culture’ based on their expertise rather than that of actual experts.” After doing some of my own research, I now understand why some practitioners are saying this document could be “dangerous.” Will this be a return to the 1990s culture, where HU practitioners had to convince executive management that event-elimination and error-reduction programs (including observations) were necessary and within the realm of possibility? If so, practitioners are going to have more work in front of them than just managing a program. I hoped this was behind us for good.
There still is a balance that needs to be struck between production and protection (click here and go to page 15).
I want to make something absolutely crystal clear: I personally view INPO (and NEI) and the bulk of their products as excellence. I am constantly striving to improve what I do and how I do it, and they have a similar mission; that is something I can appreciate. I also freely admit that I am not collectively smarter than all of the people who have done the hard work of implementing cumulative impact reduction. I am cautiously optimistic that when sites evaluate how this will be translated into their leadership cultures, they will still use conservative decision making and a graded approach. If you are a practitioner, my advice is to not ignore how this is being implemented at your facility.

Practitioner Spotlight: James D. Newman – Are you looking to hire a Human Performance Practitioner?

Does your organization need a fresh look at Human Performance? How satisfied are you with your current Human Performance Training Program? Is your Observation program effective? Could you use an expert on measuring and improving performance in your organization?

James D. Newman is the answer to your problem. This post is spotlighting this site’s main author and creator.

Why is James any different than anyone else?

To sum up what makes James unique, he is dedicated and experienced.

James is a committed father with a vast background of interests.  He has worked primary and secondary systems at BWR and PWR nuclear power plants, has been a mobile Disc Jockey since he was 16, was a lead singer for a few rock bands, and has also built robots for competition.

He currently mentors college students and other HU practitioners, offers HU and Knowledge Transfer improvements as a keynote speaker at conferences and training venues, develops poster presentations and designs websites, actively participates in professional organizations (ASTD and NA-YGN), creates and teaches courses for thousands of nuclear workers and leadership, and is an author and podcast host—all while pursuing higher education.

Personal philosophy

The most frequent question I am asked is, “Where do I find the time?” There are usually a few hours left daily after the kids go to bed to research and make phone calls or e-mails. The rest of it is carefully fit into my life’s schedule. If you care about something enough, you make the time.

Always give the person you’re interviewing following an event the benefit of the doubt—there is a great chance the error he or she made is not the cause for the problem, but more the consequence of a deeper problem.

As an HU Practitioner, never be a full-time Root Cause team member because it takes away from your other full-time responsibilities to other priorities at the station. Instead, offer to check in at a regular frequency to help with human performance issues and interviews.

Always consider tapping into your network of contacts and associates when you’re questioning a process or criterion.

When designing training, always use ADDIE and make sure your evaluation process is challenging and significant.

Use the best software available to track, trend, and report issues and observations.

Why are you leaving your dream job?

Based on life changes, I will be moving to the MA/RI/CT area within the next few months (after acquiring a stable position). My extended family lives in the tri-state area, so I am currently looking for a Human Performance position where I can live and work there. I am willing to travel, but not willing to spend weeks away from my children. The ideal location would be the area between Springfield, Boston, and Providence, but I would consider nearby options.

A little more about me

I’m 41 and have been working in commercial nuclear power since I was 19. I started right out of a Nuclear Engineering Technology degree program as a contracting junior mechanic and quickly transferred in-house into Instrument & Controls, where I spent 15 years performing all aspects of a technician’s work – surveillances, preventive maintenance, troubleshooting, corrective maintenance, and new installations – from all areas of the plant to the Control Room. In 2007 I was chosen as the station’s full-time Human Performance Coordinator. In 2010 I took that experience to another nuclear station as I started my Workforce Education Development degree over 1,500 miles away. For nearly 18 months I flew back and forth on weekends to go through an amazing program, taking full advantage of the long weekends offered by working four 10-hour days.

For the past 4 years I’ve been developing and implementing training, surveys, reports, and metrics while building an amazing world-wide network of Performance Improvement and Training professionals. Last year in May, I started this blog and a podcast of the same name, and that has opened the network even more. One of my largest passions is to teach out of the Affective Realm – to help someone value an idea, concept, expectation or requirement. I am extremely inventive with my training technique and activity creation to make memorable topical points. I am also very passionate about having a Safety Conscious Work Environment. I have done a lot of work with the Nuclear Safety Culture Monitoring Panel at my current station, and even considered Employee Concerns as a future possibility.

In the past year, along with my fulfilling work at a utility, I’ve given presentations and support to a Midwestern Transmission and Distribution company, and provided Human Performance training to multiple CEOs and Vice Presidents from various large Kansas companies. The Kansas City chapter of ASTD recognized my work by awarding me a “Best Practice” for training and a subsequent presentation of the course to the chapter. The Omaha ASTD chapter has since asked me to present there as well. At my own expense, I’ve attended Omaha’s ASSE Chapter meeting for OPPD’s presentation on a stripped-down HU program, keynote addresses at ICMA conferences (Jim Collins, Amy Cuddy, and Daniel Pink), and the New Media Expo conference in Las Vegas to become better at blogging and podcasting.

I am looking for stability, but also looking for an opportunity to really help an organization (or consultant firm helping organizations) with my experience, creativity, motivation and attitude. I also bring my valuable team of industry friends and contacts for benchmarking at a moment’s notice.

Click here for a Power Point presentation I gave to the Young Professionals Congress in 2007.

Some Prezis I’ve put together:

Leveraging Blogs and Podcasts for Human Performance Training (Built for a poster presentation at INPO’s Human Performance Conference I was asked to give in September 2013)

Awesome Podcast Content (My favorite Human Performance Podcasts)

The Me (A compilation of work I’ve been doing and tools used. One of my SIU Professors asked me to present in October 2013 to a WED class what work in this field could be like. This was 1550 miles away from my home on my own time and budget.)

Note that some of my training products will be posted in ebook form in the near future for a cost through this site.

Contacting me

Click here for my LinkedIn profile where you can find my resume and more about me. If you are not a LinkedIn member, please email me for a resume.

If you would like to contact me about a Human Performance position or opportunity, please email me directly or call me after 5pm EST at 860-917-5768.

A note for returning HU Tool blog readers

There is a lot more to me than what this post suggests, but I recognize that the readers of this blog are looking for something more than learning about its author. I hope you do not mind that I posted about myself instead of a different topic. To date, all of the costs of operating the blog and podcasts have been my own, so I feel comfortable using it to leverage potential opportunities that might not be discovered otherwise. Thank you for your continued support. Every email I receive that says thanks for authoring this blog means a lot to me.

Please stay tuned – I have been compiling upcoming posts on:

  • INPO’s Cumulative Impact document reactions
  • Preventing disasters
  • Decision making
  • Leadership alignment