OK, So Why Won't Government Employees Admit Even the Smallest Error?

I got some attention with a post the other day about an example of something I see constantly -- government employees unwilling to admit even the smallest error.

One reason is that even as someone who runs a company that partners with government agencies frequently, I am still an outsider and a member of the general public. And government agencies train everyone in their organizations never to give the public any information that is not fully vetted and controlled. Government agencies have had their training budgets slashed, but the one course everyone still gets (along with diversity training) is training on how to reveal (or really, not reveal) information to the public.

But I think there is a more important reason for this behavior, and it is one I want to spend a bit of time on in part because it is one of my favorite business topics: incentives.   There is nothing in an organization that is harder to get right than incentives.  And this is doubly true of government agencies because most government agencies don't have, or don't choose to measure, any output variables.

What do I mean by output variables?  Organizations tend to measure both what I call input variables and output variables.  Let's consider a salesperson.  An output variable is a business result, e.g. number of units sold, number of new customers added, revenue of products or services sold, gross margin of products sold, satisfaction rating from customers.  An input variable is a measure of how well the process steps leading to that sale were completed, e.g. percent conformance to pricing guidelines, number of sales calls made, number of quotes produced.  If well selected, input variables tend to lead to the output variables, but they don't in themselves pay the rent.
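To make the distinction concrete, here is a minimal sketch in Python; all the numbers and the conversion model are invented purely for illustration. The input variables (calls and quotes) are things the salesperson controls directly; the output variable (revenue) depends on them only probabilistically:

```python
import random

random.seed(42)  # deterministic, so the example is repeatable

# Input variables: process measures the salesperson controls directly.
calls_made = 120        # sales calls made this quarter (invented number)
quotes_produced = 45    # quotes produced from those calls (invented number)

# Assumed link between inputs and outputs -- illustration only.
CONVERSION_RATE = 0.30  # chance any one quote becomes a sale
AVG_DEAL_VALUE = 2500   # dollars per sale

# Output variables: the business results that actually pay the rent.
units_sold = sum(1 for _ in range(quotes_produced)
                 if random.random() < CONVERSION_RATE)
revenue = units_sold * AVG_DEAL_VALUE

print(f"Inputs:  {calls_made} calls, {quotes_produced} quotes")
print(f"Outputs: {units_sold} units sold, ${revenue:,} in revenue")
```

The point of the toy model is the direction of the arrow: good inputs raise the odds of good outputs, but an organization that rewards only the inputs will sometimes be rewarding activity that produces no rent-paying result at all.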

Because I am most familiar with them, I am going to use government recreation agencies like a state parks organization as an example.  I have yet to find a government recreation agency that measures its employees primarily on output variables, e.g. customer satisfaction of park visitors, fee revenue collected at park, net income of the park, change in deferred maintenance accounts, etc.  Instead their metrics are -- at best -- based on conformance to process, e.g. was the budget completed on time, was the planning process done right, was all necessary reporting done on time, etc.  I say "at best" because most government agencies have no formal performance metrics at all.  And this is where I get to my favorite incentives / metrics topic of all -- informal performance metrics.

No organization truly has zero performance metrics. It may have no formal, written standards, but every organization has to evaluate and promote talent.  If there are no formal standards, there have to be some informal or unwritten standards that are applied instead.  And I would argue from my experience that even when formal standards do exist, there may still be informal standards that are more important.

One informal incentive that exists naturally in almost every organization is "don't get caught in a mistake."  On its face this seems like a good incentive -- sure, I would love to have an organization where no one makes mistakes.  But many companies have found that in competitive markets, allowing this informal incentive to become powerful can spell a company's doom.  It has at least two negative effects: it limits honest communication, because people start hiding their mistakes, which in turn keeps information from the rest of the organization that may need it; and it limits risk-taking, which is necessary for most companies to survive in competitive markets, because almost everything a company does to improve contains risks.

Powerful formal performance systems are one way to limit counterproductive informal incentives like this.  But many companies also put a lot of work into their communications and culture to help employees be more open to taking risks and making mistakes.  A vast portion of my communication with my own managers and employees is on this topic.  We try to make very clear the subset of mistakes that are career-fatal and where we DO want risk aversion (e.g. racism, harassment, abuse, etc.) and treat everything else as a learning exercise.  My response to one of my managers' mistakes is very likely to be, "sorry, that was my fault, I did a bad job of training you (or preparing you, or whatever) for that issue."

Recognize, though, that all of these corporate steps to head off problems with the informal incentive "don't get caught making a mistake" have largely been lessons of the marketplace.  Time warp back to the 1950's, when American companies were fat and happy and not yet really faced with scrappy global competition, and you might well have found highly risk-averse cultures where people were afraid of being caught in a mistake.  I do not have experience at companies like GM, but I would not be surprised at all to learn that risk aversion dominated its culture, and that, faced with market extinction, it has spent much of the time since the 1970's trying to purge this risk aversion from its culture.

But in large part, a government organization doesn't face these market corrective forces.  If an agency becomes weak and senescent, it does not get competed into oblivion, it simply goes on and on.  Maybe it gets more tax money to make up for its inefficiency, or maybe it cuts somewhere (such as deferred maintenance in public parks) to make ends meet.   Which means that in most government agencies I have worked with, informal incentives -- particularly "don't get caught in a mistake" -- are extremely powerful.

Most people are familiar with the fact that the default government answer to anything new is "No."  But did you ever wonder why?  I have heard a lot of folks say that it is because government employees are jerks or lazy under-performers or have evil intentions.  But that is really not the case.  With just a couple of small exceptions**, people who enter government are no different from people who enter private organizations.  If they do things that seem bad, it is not because they are bad people but because their information and incentives cause them to do things we perceive as bad.  Take the case of saying "No."  Without any output metrics, most government employees have no incentive to say "yes."  There is no incentive to, say, generate 20% more visitor revenue in parks, so there is no incentive to approve new visitor facilities or services that might generate that revenue.  And there is every reason to say "no."  "No" is almost always safe, particularly if one does not actually say "No" but instead says something like, "well, that is an interesting idea, but we need to do X, Y, and Z intensive 20-year studies first."  There is virtually no way for any government employee to get caught in a mistake saying that.  So that is the answer most of us get from the government.

Coming back to the original question, I hope this helps explain why agency employees who don't admit error act the way they do -- they are not bad people, they are normal people reacting to a bad incentive.  Imagine in my business if I, say, reversed two numbers on one of the 25 state and local sales tax returns we file each month.  When the error is pointed out to me, I have no problem admitting the mistake, because I know it is easily correctable and that it has little to do with my true performance.  But in the government world, things are completely different.  They don't have output variables.  Executives can have fully successful careers running parks where the infrastructure is allowed to fall apart, the headquarters becomes bloated, and visitation stagnates.  But they can be fired for getting something wrong in the process.  Not very often, but just enough pour encourager les autres (to encourage the others), particularly in an environment where there are really no other formal metrics to override this fear.


**postscript:  I have found two ways that people who enter government are different from people who enter private business (people are more different at the end of their careers, after they have been shaped by the incentives and culture for a long period of time, but I am talking about upon entry into work).  First, people who enter government tend to prioritize security (e.g. good benefits, difficult to fire) over other aspects of employment.  Note that this just tends to reinforce the risk aversion to making or admitting a mistake even more.  Second, people who enter government tend to be more confident of government solutions to problems and more skeptical of private solutions than people who enter private business.  This latter point is another reason why my company, which offers private solutions for traditional government functions, hears "no" a lot.


10 Comments

  1. Matthew Slyfield:

    " because almost everything a company does to improve contains risks."

    No, everything a company does, for any reason at all, contains risks. Even sitting on your ass doing nothing carries risks.

  2. Matthew Slyfield:

    One major bad incentive in the Federal government bureaucracy, and I think it's likely most states do the same thing, is the way salaries are set for management positions. All the way from first line supervisors to agency heads, salaries are determined by a combination of budget + employee head count.

    This means that government agency management has a strong positive incentive to be as inefficient and ineffective as possible. If they are efficient and effective, their budget might get cut; but by being inefficient and ineffective, they have a perpetual argument for ever-larger budgets, in which the managers have a personal financial stake.

  3. me:

    Honestly, this kind of don't-care-but-cover-your-ass mentality is endemic in big organizations. Try to get meaningful support from the likes of Google, Microsoft or Ford and see what you get. The phone chain run-around is very common, as is not admitting to anything or being unhelpful out of fear of litigation.

    Apple is one of the few large organizations that have customer service down pat and are very organized, but even they have lately been showing cracks in their process.

    My theory is that government is particularly bad, because the chains of responsibility are particularly long: whereas in a private organization, there are very few steps between CEO and support line employee, in government, those chains are long, and typically all middle layers need to practice CYA but the top of the chain (the president or organizational head in this case) is so far removed that he just can't care enough about the things that go wrong on a daily basis.

  4. Ward Chartier:

    Elsewhere in a few places I've read: [1] the imperative of a politician is to win election or re-election; [2] the imperatives of a government official are [2A] to maintain or increase the department's budget, [2B] to maintain or increase the department's headcount; and [2C] to not make any mistakes; [3] the imperatives of the civil servant are [3A] to avoid having to make a decision, [3B] to avoid having to take responsibility, [3C] to avoid making mistakes, and [3D] to stay under the parapet long enough to earn full retirement compensation and benefits.

    As a professional in the private sector, I would have been fired from any of my jobs for pursuing these.

  5. Jim Bow:

    I've often thought that one of the things we've done a very poor job of doing is teaching people how to say
    "I really messed that up, I think I know where I went wrong, but I'd like your honest opinion of my mistake and your thoughts on how to limit the damage."
    The other half of this is of course,
    "I agree that was poorly handled and I appreciate you coming forward. Tell me where you think you went wrong and how we can work together to repair the damage"

  6. PapaMAS:

    I have seen this. In my military career I saw plenty of officers I called "drones". Drones initiated nothing and would do little. Their biggest concern was that they not be blamed for doing something wrong. So, they would sit around and do nothing other than make sure their people were following "the process". They took the inevitable beatings by superior officers in stride and steadfastly refused to do anything which would stop the beatings. I don't think any of them had any clue they were simply dead wood.

  7. Dan Wendlick:

    That's Type I vs. Type II risk. The problem is that Type II errors tend to be more difficult to spot. A Type I error is "we made a change and the metric got worse." A Type II error is "we didn't change anything and the metric didn't get any better." Nearly all large organizations are biased towards generating Type II errors. It is a lot easier to measure a drop in revenue or profitability that did occur than it is to measure the enhanced revenue or profitability that didn't happen because a change was not made (see the sketch after the comments for a toy simulation of this asymmetry).

    When a company or organization embraces this mindset, you get IBM, Ford or GM in the 1960s and 70s, or pretty much any government agency today. The cost of not making changes in a changing world ultimately is irrelevancy and extinction.

  8. Matthew Slyfield:

    "Type II error is "we didn't change anything and the metric didn't get any better.""

    Except that not getting better is not the only risk attached to doing nothing. There are times when doing nothing can make things worse.

    "Nearly all large organizations are biased towards generating Type II errors"

    True, but not even the largest multinational for-profit companies are anywhere near as heavily biased towards Type II errors as the government is.

  9. cc:

    Just sent a question to the state about taxes. Got a reply in <24 hrs -- the person's name was Stonewall, and they didn't stonewall!

  10. JHW:

    Actually, the reason why government officials and government agencies can't admit error is quite simple. However, it's not one that is easily identifiable to anyone from commerce.

    The motivators in business are based on markets and the changes that occur in them. Thus, in business, the constant need to evaluate, innovate, and change in pursuit of revenue, growth, and survival is grounded in trial and error, as well as risk. Errors are both motivators for change and liabilities. Admission of wrong action is directly correlated to the level of loss and liability, both financial and to the brand image.

    Government, on the other hand, is an institutionalized system designed, like most systems, to set and maintain a status quo.
    And, like any system, it performs its function by rules.

    These rules (laws, policies, and regulations) dictate how the system will respond to any issue it must address.

    When a situation is encountered, the officials must determine how the rules should best be applied to it.

    This has the effect of establishing a "precedent," an extrapolation of the rules that becomes, for all intents and purposes, the foundational process for addressing similar situations in the future.

    It also has the effect of entrenching the institution in a position, one that, for better or worse, becomes the rule of law.
    Should the decision prove wrong, it threatens the status quo. The institution now finds itself confronted by a situation in which every prior decision made under these rules, and this precedent, is called into question.

    Since these decisions are far-reaching and tend to affect large numbers of people, the ramifications can reverberate through not only the institution but all those under its purview.

    This can be costly to both the institution and those dependent on it for services.

    Given the alternatives, the institution will not admit error and will continue to espouse its position, even in the face of contrary evidence, and even with officials knowing the truth.

    This is called mala fides, or bad faith. And it explains why, in such circumstances, there appears to be a cognitive dissonance.

    Orwell called it "doublethink."

    Speaking of precedent, the US Supreme Court has acknowledged this scenario in several decisions.

    Several examples of this are frighteningly true. Over 20 years ago, the Illinois justice system tried, convicted, and sentenced to death an African-American man for a brutal rape and murder. The man fought his conviction for 18 years, claiming his innocence. The Illinois State's Attorney resisted at every step.

    When DNA technology reached the point where it was capable of testing semen recovered from the victim, the Innocence Project, representing the man, fought to have the evidence tested. The State's Attorney fought the attempt up through the courts to the US Court of Appeals.

    Their stance was that the man was the perpetrator of the crime and that the testing was unnecessary. During the process, the State's Attorney even petitioned the Illinois court to have the DNA destroyed, using the argument that all state appeals had been exhausted.

    The Court of Appeals ordered the State's Attorney to release the evidence for independent testing.

    The test result excluded the man as the source, with odds on the order of one in nearly twice the population of the Earth.

    The State's Attorney then insisted that they had still convicted the right man. They purported that the DNA had been left by an unnamed accomplice and that the man had used a condom, even though no mention of a second perpetrator had ever been raised and there was absolutely no evidence to indicate more than one offender.

    This became known by defense attorneys as the "Unindicted Co-Ejaculator" theory.

    The man was pardoned by the Governor.
    The error opened the floodgates. The Illinois court system was then inundated with thousands of appeals demanding DNA retesting of evidence, many in cases that were not even homicides.

    Another example is the forensics supervisor at the VA State Crime Lab who was found to have been falsifying DNA results, in the state’s favor, for almost 7 years.

    It forced the Commonwealth of VA to retest thousands of DNA samples (which the state resisted), caused tens of thousands of appeals, and led to thousands of lawsuits.

    It also had the untoward effect of prompting the VA legislature to pass a law limiting the introduction of new evidence in capital cases to 3 years.

    We're seeing this now in partisan politics, and nowhere more so than in one institution: the news media. Both sides have picked a position, developed a narrative furthering that position, and become entrenched in that narrative.

    They are invested. They are committed to that narrative, even as information directly contravening that narrative comes to light.

    And one side or the other will be significantly damaged, financially and in reputation, when the outcome becomes evident. The side in error will find all their previous, and all their future, reporting suspect.

    Sartre said, "I must know that truth very precisely, in order to hide it from myself the more carefully — and this not at two different moments of temporality."
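
As a footnote to Dan Wendlick's Type I / Type II point in comment 7, here is a minimal Python sketch of the measurement asymmetry he describes. The payoff distribution is invented purely for illustration; the point is that a Type I error shows up as a visible drop in the metric, while the cost of a Type II error is a counterfactual that never appears in any report. We can only "see" it here because a simulation lets us peek at the road not taken:

```python
import random

random.seed(0)

BASELINE = 100.0   # current value of the metric (invented units)
TRIALS = 10_000    # number of simulated decisions

def metric_after_change():
    """A risky change: positive expected value, but it can backfire."""
    return BASELINE + random.gauss(5.0, 10.0)  # assumed distribution

changed = [metric_after_change() for _ in range(TRIALS)]

# Type I errors are directly observable: we changed something and the
# metric came in worse than the baseline we already had in hand.
type_1_rate = sum(1 for v in changed if v < BASELINE) / TRIALS

# Type II errors are invisible in the books: had we stood pat, the gain
# we forfeited would never have been measured anywhere. Only the
# simulation lets us compute the counterfactual.
avg_forgone_gain = sum(max(v - BASELINE, 0.0) for v in changed) / TRIALS

print(f"Change visibly backfired in {type_1_rate:.0%} of trials")
print(f"Average gain forgone by never changing: {avg_forgone_gain:.1f} "
      f"(never shows up in any report)")
```

An organization that punishes only the visible error will, by construction, accumulate the invisible one.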