Welcome to our session on Learning From Our Mistakes: Creating a Just Culture. If you have already viewed the session on Human Factors, then you already have some idea that the systems and processes staff work within are responsible for a large portion of the root causes of the errors that occur. If you have not yet viewed that session, I encourage you to do so, as it will help you make your root cause analysis more robust. I am Melody Malone, and I work in quality improvement with TMF Health Quality Institute. I am happy to welcome you to this session.

But before we begin, let's remember that as we discuss how a just culture can facilitate our learning about errors and mistakes, we must never forget the potential human toll of those clinical errors and errors in our business practices. Even though we will speak about errors as data a great deal in our quality improvement work, we must always remember that each piece of data may represent a human being who was actually hurt or who could have been hurt. Every resident matters. Every staff member matters. Every visitor and guest matters.

Professor Sindoor [assumed spelling] writes, in his chapter on failure mode and effects analysis in medical errors, that "human errors are not drawn from an infinite set of possibilities. Instead, they are drawn from a limited set of meaningful things that an individual can do in a defined situation. So our adverse events in medicine are really problems of psychology and engineering." And as Murphy's Law says, if something can go wrong, it will. The task is to discover what can be done wrong and then to predict what will happen when it is done.

So in talking about just culture, I want you to think back to a time when you made a mistake. Perhaps it was one that actually hurt someone, either in a healthcare setting, on the road in your car, or at home. Your perception of the relative safety of disclosing that mistake is the issue at the heart of a just culture. If you felt you would be dealt with in a fair and just manner -- viewed as blame free, not disciplined for something that happened beyond your control -- then you may have reported the event. However, if you felt you would become the center of the investigation -- perhaps found guilty of malicious behavior, even if that was not what you were trying to do -- with disciplinary action to follow, and how ugly can that become, then you may not have reported it. So to answer the question, "How hard is it to admit you made a mistake?" It depends. It depends upon your perception of how blame free or blaming your work environment is.

So when we are talking about human factors versus disciplinary action, it's important to know when to use discipline or punishment. That's a key to a just culture. In some cases, a near miss where no one was hurt might result in significant discipline, while in another situation, even when a resident is harmed, analysis may reveal that equipment or systems were the underlying cause of the event and no one should be disciplined. It's important to understand human behaviors and the intention behind them. You know, to err is human. We all make mistakes. A slip or a lapse is completely unintentional. A rule violation, on the other hand, is intentional, and the intentional part is the key. This is what differentiates the behavior as a conscious decision that places residents or others at risk of harm.
Would staff do something dangerous despite having good systems and processes in place? If so, the appropriate response may be punitive. Recklessness can occur when staff feel it is their only option or the best option at the time. In these situations, it's important to realize that the individual may not be intentionally doing something wrong, and the response should focus on the incentive for the behavior rather than on the individual. This is addressed through coaching and looking at the incentives driving the behavior. Look at your policies and procedures and available resources to ensure that they align with workflow and do not create the need to develop a workaround.

I want you to think about your family for a moment. In thinking about just culture, most of us have a family, or we were a part of a family. You may have children, right? So think about that family. When something goes wrong and somebody makes a mistake, do we automatically excommunicate our family member and kick them out of the house? No. We treat them justly. We give them opportunities to learn and to correct. We provide forgiveness. And we work through it. It's only in that extreme situation, when we have done everything we could do, that we might actually excommunicate someone from our family. But that's usually a rare event. It doesn't even happen in every family. This is what I am talking about with regard to just culture. We have got to really work through how we treat people kindly when an error has happened.

So I want you to think about just culture as safety thinking. Just culture promotes a questioning attitude: what systems and processes allowed this issue to occur? A just culture is very resistant to complacency. It encourages the challenging thought, "How can we do things better?" A just culture is committed to excellence. And it fosters personal accountability and corporate self-regulation in safety matters. So if I am an employee in an organization, and I understand that we have a just culture -- we have this atmosphere of trust -- and I uncover a potentially dangerous situation, then I am free to report it, and I won't feel like I am going to be blamed or disciplined. I am going to be celebrated for reporting a problem. So in a just culture, we have this whole culture of excellence and safety. We are always asking, "How can we do things better?" rather than feeling that if we report a problem, we are going to be disciplined.

And this dovetails so nicely into our goals for quality improvement. In quality improvement, we want to constantly be identifying our problem areas. We want to constantly be identifying sources of variation -- those workarounds. We want to figure out, "How can we simplify?" So when we are doing this, we should be eliminating punitive error-reporting systems so that reporting can be made safe for employees. When it becomes safe to report errors without fear of retaliation, we can then establish methods to track errors and the effectiveness of corrective action. When we stop punishing people for errors, we can remove error concealment. Then we can get to sustainable quality improvements and truly have a culture of safety that gives system design and corporate responsibility a stronger role in managing human error. But most importantly, quality improvement is an opportunity for us to move beyond blame.
When we develop this culture of being just and apply the concepts of human factors, we identify our real root causes. Then we can develop a system of error reporting, move through the quality improvement process with success, and gain sustainable changes. So for your next steps, review a recent root-cause analysis you have completed. Did you consider those human factors? Did you treat the employees who were involved justly? And did you identify a test of change? If so, you are ready to move on to the Model for Improvement. If not, go back. Revisit your root-cause analysis. Revisit the Human Factors session. Make sure we are developing this blame-free culture so that we can treat people justly and have blame-free reporting. I wish you well in your quality improvement efforts. And feel free to contact us if we can help you.