Agenda: the soul-crushing politics that hide failures under the hood

“You can make mistakes, but you aren’t a failure, until you start blaming others for those mistakes.” - John Wooden

Every one of us who has ever worked for a corporation or other large organisation shivers at the words "organisational politics". It's those small agendas that too many people seem to have that make us hate going to work. Besides tactics like withholding decisions or controlling access to information, the same people often resort to manipulative behaviour such as blaming, accusing or talking behind people's backs. It goes without saying that all of this turns people against each other, poisoning the workplace culture and all the effort that has been put into it.

 

Decision Holding

One of the most typical forms of organisational politics is withholding decisions and information flow and using them for personal agendas such as promotion or recognition. This undermines the system of learning from failure in two ways:

  1. Unless your organisation has a completely flat hierarchy, many new practices require a management mandate and budget even for proof-of-concepts. Without decisions, there is no experimentation and there are no new practices.
  2. When the front line learns that the incidents and problems they have reported stall in middle management or don't get acted upon, they internalise the belief that reporting anything isn't worth their time. This is learned helplessness in action.

Some middle managers may want to sugarcoat information before reporting it upwards. But it's good to understand that although top management may not like bad news, they want to receive it promptly and unfiltered. More often than not, holding back and sugarcoating information ends up hiding the true problem.

Just listen to the words of one of the most successful people of all time, Warren Buffett, from his 2004 letter to Berkshire Hathaway shareholders:

“Most of the complaints we have received are of “the guy next to me has bad breath” variety, but on occasion I have learned of important problems at our subsidiaries that I otherwise would have missed. The issues raised are usually not of a type discoverable by audit, but relate instead to personnel and business practices. Berkshire would be more valuable today if I had put in a whistleblower line decades ago.”

It's also good to remember that a well-functioning middle management is well worth its cost and serves a real purpose. It's just a fact that today's fast-changing VUCA world requires middle managers to shift from their more traditional roles as evaluators and gatekeepers of information towards more empathetic coaching.

 

Blame Game

Continuing with empathy: defined as "the ability to understand and share the feelings of another", it's a word we tend to use a lot when designing for a better user experience on our platform. But it's also useful as the opposite of our next major barrier to learning from problems: the blame game.

In health and safety, authors typically define two types of unsafe acts: errors and violations. Errors are unintentional actions that originate from systemic conditions, whereas violations are deliberate, intentional actions taken by an individual. Blaming people for errors results in defensive pushback and, obviously, lessons not learned. But it's good to understand that NOT blaming for violations is a mistake in itself. For example, in the aviation industry, pilots expect violations to be punished because they know that everyone suffers the consequences when things go badly wrong.

We are not the only ones to call this the blame game, and there's a great book that covers this topic (among others) that we highly recommend every HSEQ, Facility and Customer Experience Manager read: Black Box Thinking by Matthew Syed.

Blame is obviously a sort of feedback loop, but it's worse than no feedback loop at all. People may try once or twice more without clear feedback, but being punished for actions that are not your fault is something nobody should be subjected to. Again, in aviation, all stakeholders trust that they are free and encouraged to report anything adverse, because a genuine no-blame culture for errors, combined with the industry-wide benefit of fewer accidents, motivates them to give full disclosure.

 

Not Invented Here Syndrome (NIH)

NIH, or Not Invented Here syndrome, is originally a software development term defined as a tendency to reject feasible external solutions to software development problems in favour of internally developed ones. Outside of software development, NIH syndrome can also be organisational, showing up as a tendency to hire rather than work with external experts. What is often much worse is when NIH is individual or managerial. This shows itself as jealousy, presenting others' ideas as one's own, not valuing others' work, or even openly laughing at the contributions of others.

Most problems, mistakes, failures, errors and defects (especially the most severe ones) require communication, investigation, teamwork, analysis and collaboration to be transformed into solutions, actions and ultimately better systems. It probably goes without saying that when NIH syndrome is present, not many want to suggest their ideas, corrective actions or initiatives for something better.

 

Final thoughts

As a manager, it's your responsibility to do everything you can to keep organisational politics at bay. This includes what you say, what you do, and what you reward and recognise people for. Having more gatekeepers (read: information messengers or middle managers) increases the risk of decision holding, the blame game and Not Invented Here syndrome.

Look at your current processes (especially those related to recording failures, defects and errors) and see whether you can remove gatekeepers from them, or set up a channel that bypasses them completely.

 

Continue to the blog posts of the series here:

 

Want to learn more about the subject? Download our VUCA white paper to learn more about how the operating environment is changing and why all organisations need to become more agile today rather than tomorrow:



We are building the world's first operational involvement platform. Our mission is to make the process of finding, sharing, fixing and learning from issues and observations as easy as thinking about them and as rewarding as being remembered for them.

By doing this, we are making work more meaningful for all parties involved.

More information at falcony.io.
