How do HROs (Healthcare High Reliability Organizations) achieve Zero Error?

There are organizations that truly operate with Zero Error.

For them, even a single mistake is not an acceptable option, although one may occasionally happen. The best-known HROs are nuclear power plants, aircraft carriers, missile installations, semiconductor factories, air traffic control, and the aviation industry in general.

In these organizations, mistakes are not acceptable. The criterion they work with is that of never events: under no circumstances can these events happen. Many other industries, by contrast, work with Six Sigma guidelines, that is, fewer than 4 errors per million events or products. In healthcare, roughly 10% of care episodes involve adverse events (AEs), that is, errors in procedures that cause harm, and about 50% of these are avoidable. These figures come from the World Health Organization and Anvisa. Taking just the 10% and the 50%, we arrive at 50,000 errors per million, a far cry from the 4 per million of Six Sigma companies or the 12 accidents per million flights in commercial aviation. Although medicine is a much more complex science than aviation, once we enter the territory of human error, system error, and the discipline of safety, the values become comparable.
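To make the arithmetic behind these figures explicit, here is a minimal sketch that simply reproduces the comparison. The 10%, 50%, 4-per-million and 12-per-million values are the ones quoted above; everything else is plain proportion.

```python
# Rough comparison of error rates per million events, using the figures
# quoted above (10% adverse-event rate, of which 50% are avoidable).

MILLION = 1_000_000

adverse_event_rate = 0.10   # ~10% of care episodes involve an adverse event (WHO/Anvisa figure above)
avoidable_fraction = 0.50   # ~50% of those are considered avoidable

avoidable_errors_per_million = adverse_event_rate * avoidable_fraction * MILLION
six_sigma_per_million = 4   # Six Sigma target: fewer than 4 defects per million (figure cited above)
aviation_per_million = 12   # ~12 accidents per million commercial flights (figure cited above)

print(f"Healthcare (avoidable AEs): {avoidable_errors_per_million:,.0f} per million")
print(f"Six Sigma target:           {six_sigma_per_million} per million")
print(f"Commercial aviation:        {aviation_per_million} per million")
# -> 50,000 per million for healthcare, i.e. more than 10,000 times the Six Sigma level
```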

We must emphasize that 95% or more of AEs are caused by system deficiencies, not by human error, so we can take advantage of the safety criteria that other industries have adopted to reach much lower numbers.

After visiting hospitals in more than 20 countries, we found a huge disparity in the level of safety between them:

  • Many countries still do not talk about patient safety.
  • In many places, removing potassium chloride from hospital wards and keeping it only in the pharmacy, a simple task, becomes a heroic effort.
  • Many medicine packages look alike, and no one helps to change that.
  • Basic technology such as barcode scanning is missing in many hospitals.
  • The 30% waste in hospitals prevents investment in automation.
  • The lack of a senior quality manager in hospitals.
  • A culture of blame still applies to healthcare professionals.
  • The authority gradient among staff puts a brake on comments, suggestions and warnings.
  • Governments buy expensive equipment, yet invest little in basic systems such as sanitation (water and sewage).
  • Health is a universal right in many countries and its provision is the responsibility of governments, yet investment is lacking because it is not treated as a priority.
  • Investment in prevention and health education, which would reduce overall spending, is insufficient.
  • There is no obligation for other sectors to help, for example laboratories, the food industry, and education in schools.

Therefore, there is still a long way to go to improve the health system, and the first change we need to make is in attitude and concern for what is happening. Within this change of attitude, there is growing interest in the influence of cognitive processes on results; in particular, the technique of mindfulness is being discussed.

The Opuspac System offers free e-books on administering medications consciously. See the link: https://www.opuspac.com/br/downloads/

We have summarized below the repertoire of policies and organizational criteria of the HROs:

  • Concern about failures

There is an awareness that mistakes can happen: to err is human, and failure is almost a natural condition. Automatic systems, with their algorithms, do not interpret every situation found in reality; they only handle the logical situations that were foreseen, and the coincidence of several simultaneous factors leads to unforeseen results. HROs study small mistakes and near misses with the same determination as if they were major failures. Management is oriented towards studying failures as much as towards maintaining productivity.

  • Reluctance to simplify interpretations

In a complex system, simplification is a methodological error. In no way can we give a simple or simplistic answer to a problem that occurs within a complex system. We must consider the limitations imposed by the context: the limits of our own minds, of the physical structure, of logical thinking, and the parts of the whole we are not seeing.

  • Sensitivity to Operations

“Having the bubble” is a term used in the US Navy to describe a commander who perceives the complex reality of the surroundings in its entirety, together with its operational dynamics. It means perceiving the whole through instruments integrated with human action, while staying alert to possible misinterpretations, near misses, system overloads, distractions, surprises, confusing signals, interactions, and more.

  • Commitment to Resilience

It refers to anticipating possible problems, including several occurring at once, and training to resolve the situation when they arise. Resilience here means absorbing the surprises that incidents can produce and resisting them: being prepared for error, accepting its inevitability, and being ready to respond when the incident arrives.

  • More horizontal hierarchical structures

Studies show that in institutions with a strong hierarchical orientation, errors spread more quickly, and even more so when the incident starts at the highest organizational levels rather than the lowest. An organization that works more by consensus helps to prevent these problems. The term “organized anarchy” has been used to describe this looser means of control.

The responsibility for resolving an incident falls more heavily on experts in the field than on hierarchical leaders. The fact that they are present at the time of the incident also determines who can make decisions. A defined but looser hierarchical structure is a desirable characteristic.

An organization is reliable when, faced with an abnormal fluctuation in the internal and external conditions of the system, it maintains a result within the desired range.

Having desirable results when input parameters are controlled is not enough. High Reliability Theory (HRT), which developed in debate with Charles Perrow's Normal Accident Theory, holds that it is not the stability of the inputs that gives us the stability of the results.
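One way to picture this idea is the toy simulation below. It is only an illustration with hypothetical numbers, not a model of any real hospital process: a process whose inputs drift keeps its results in the desired range only when it compensates internally, which is the point HRT makes about input versus output stability.

```python
import random

# Toy illustration: reliable results must come from the system's own
# compensation, not from assuming stable inputs. All numbers are hypothetical.

TARGET = 100.0      # desired result
TOLERANCE = 5.0     # acceptable band around the target

def run(compensate: bool, steps: int = 1000) -> float:
    """Fraction of steps whose result stays within TARGET +/- TOLERANCE."""
    random.seed(42)   # same input fluctuations for both runs, for a fair comparison
    drift = 0.0       # slowly accumulating external disturbance (unstable input)
    correction = 0.0  # internal adjustment made by the organization
    in_range = 0
    for _ in range(steps):
        drift += random.gauss(0.0, 1.0)            # input conditions fluctuate
        result = TARGET + drift + correction       # observed outcome
        if compensate:
            correction -= 0.8 * (result - TARGET)  # feedback: counter the deviation
        in_range += abs(result - TARGET) <= TOLERANCE
    return in_range / steps

print(f"Relying on stable inputs: {run(False):.0%} of results in range")
print(f"Compensating internally:  {run(True):.0%} of results in range")
```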

So the conclusion is:

a) Mistakes will happen and you need to be prepared.
b) Your goal should be zero error.

Our preferred safety model is to always have two systems working simultaneously: a computerized system that automatically manages all operations through predefined algorithms, and human control that supervises every step, correcting errors and improving safety.
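As a hypothetical sketch of this dual-control idea (the class and function names below are illustrative, not part of any Opuspac product), an algorithmic first layer checks a medication administration against the prescription, and the operation is released only after an independent human confirmation.

```python
from dataclasses import dataclass

# Hypothetical sketch of the two-layer model described above: an automatic,
# rule-based check plus a mandatory human confirmation. Names are illustrative only.

@dataclass
class Prescription:
    patient_id: str
    drug_code: str      # e.g. barcode of the prescribed drug
    dose_mg: float

@dataclass
class Administration:
    patient_id: str
    scanned_drug_code: str
    dose_mg: float

def algorithmic_check(rx: Prescription, adm: Administration) -> list[str]:
    """First layer: predefined rules flag any mismatch automatically."""
    issues = []
    if adm.patient_id != rx.patient_id:
        issues.append("patient mismatch")
    if adm.scanned_drug_code != rx.drug_code:
        issues.append("drug barcode does not match prescription")
    if abs(adm.dose_mg - rx.dose_mg) > 1e-9:
        issues.append(f"dose {adm.dose_mg} mg differs from prescribed {rx.dose_mg} mg")
    return issues

def release(rx: Prescription, adm: Administration, human_confirmed: bool) -> bool:
    """Second layer: even a clean automatic check still requires human confirmation."""
    issues = algorithmic_check(rx, adm)
    if issues:
        print("Blocked by system:", "; ".join(issues))
        return False
    if not human_confirmed:
        print("Blocked: waiting for human double-check")
        return False
    print("Released: both the algorithm and the professional agree")
    return True

# Usage: a wrong barcode is stopped by the first layer; a correct one still
# needs the professional's confirmation before release.
rx = Prescription("P001", "7891234567890", 500.0)
release(rx, Administration("P001", "7890000000000", 500.0), human_confirmed=True)
release(rx, Administration("P001", "7891234567890", 500.0), human_confirmed=False)
release(rx, Administration("P001", "7891234567890", 500.0), human_confirmed=True)
```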