Just Culture

Restoring Trust and Accountability in Your Organization

Drawing from safety practices in transportation and medicine, Sidney Dekker outlines how to (and how not to) create a culture of trust, learning, and accountability. Key to his analysis is that learning from mistakes requires a blameless culture: blame closes off avenues for understanding how and why something happened, preventing the kind of productive and open conversation necessary to learn. His work underlies many current engineering best practices (including the use of blameless post-mortems) but is broadly applicable to other functions, as is evident from his wide range of examples. Most importantly, Dekker is explicit about how blameless cultures are an exercise of restorative justice—a set of practices that prioritize learning, expressing remorse, and healing over criminalization and retribution. Among the very best things I’ve read about how trust and accountability can and should work in any organization.

Reading notes

On accountability

The word “accountability” gets a bad rap. It’s often confusing—does it mean the same thing as being responsible? (Sometimes, yes, but not always.) Is saying that someone is accountable the same thing as saying they are the final decision maker? (Not necessarily.) Is the accountable person the one who gets fired when things go wrong? (Hopefully not.) Sidney Dekker has a better framing:

In both retributive and restorative approaches, people are held accountable for their actions. Nobody gets “off the hook.” Retributive justice asks what a person must do to compensate for his or her action and its consequences: the account is something the person has to settle. Restorative justice achieves accountability by listening to multiple accounts and looking ahead at what must be done to repair the trust and relationships that were harmed. This makes it important for others to understand why it made sense for the person to do what they did. Their account is something to tell. This also offers an opportunity to express remorse. Restorative approaches are open to multiple voices, and are willing to see practitioners not as offenders or causes of an incident, but as recipients or inheritors of organizational, operational, or design issues that could set up others for failure too. Restorative approaches are therefore more likely to identify the deeper conditions that allowed an incident to happen. They are better at combining accountability and learning, at making them work in each other’s favor. Where retributive approaches to a just culture meet hurt with more hurt, restorative approaches meet hurt with healing, and with learning.

Dekker, Just Culture, page x

The notion of an account as something you tell transforms accountability from a nebulous set of responsibilities or duties or punishments into a clear and explicit directive. It remains, of course, a non-trivial task: Dekker goes on to talk at length about how to build systems where accountability, in the restorative justice context, is possible. Much of the difficulty can be ascribed to the ever-present instinct towards retribution, which seeks easy conclusions at the expense of understanding. But while accountability in the retributive context promises that someone will be punished, only restorative accountability offers the hope of preventing future harm. That ought to make it an easy tradeoff; that it still isn’t in most organizations shows how deep the commitment to retribution runs in our cultures. It also lights up where we have work to do.

Refusal

In Conflict Is Not Abuse, Sarah Schulman describes a common scenario in which conflict between two or more people has degenerated to the point where one or more of them refuses any further engagement. We’ve all witnessed this, or even been in that position ourselves, I’d wager. This is the move that often accompanies edicts like, “Don’t contact me again,” or, “I can no longer work on a team with that person.” If the situation has involved actual abuse—that is, power over, in which one person has harmed the other (e.g., through an act of violence, whether physical, verbal, or economic)—then some temporary separation may be necessary, unless and until repair can be made. But in many if not most cases, these situations are about power with—i.e., conflict—in which both parties have contributed to the circumstances and both are therefore responsible for negotiating their resolution. The refusal to engage in that negotiation amounts to an abdication of that responsibility. She writes:

The refusal…of looking at the order of events, or actually investigating what happened, is a kind of “dissociative” state, a level of anxiety about being challenged that is so high that they can’t even remember what the actual conflict is about, and don’t want to be reminded either. All they know is that they feel threatened. What really happened becomes unreachable. In other words, it is a state of being unaccountable.

Schulman, Conflict Is Not Abuse, page 173

This brings to mind Sidney Dekker’s work on accountability, in which he notes that just cultures require disclosure—i.e., they require the giving and receiving of each person’s account of what happened, not with the aim of finding the one true account (something that is neither achievable nor desirable), but with the goal of exposing the multiplicity of accounts, of the complexity of the circumstances. Refuse that disclosure—refuse to be accountable—and you prevent justice, or worse, create an injustice. Schulman notes elsewhere that:

shunning is wrong. It is unethical. Group shunning is the centerpiece of most social injustice. To bond, or to establish belonging by agreeing to be cruel to the same person, is dehumanizing and socially divisive. It causes terrible pain, and it is unjust.

Schulman, Conflict Is Not Abuse, page 279

So much of our common-sense notions of accountability have become entangled with ideas of punishment that we often talk of holding someone to account as if it were synonymous with punishing them. But no account is revealed through punishment. In shunning, especially, the account itself is silenced, lost, muted. Blocked. That’s an impoverished, and—I think Schulman is very right here—unethical move. Real accountability requires both speaking and listening, both disclosing your own experience and agreeing to acknowledge others’ experiences too.

Accountability sinks

In The Unaccountability Machine, Dan Davies argues that organizations form “accountability sinks,” structures that absorb or obscure the consequences of a decision such that no one can be held directly accountable for it. Here’s an example: a higher-up at a hospitality company decides to reduce the size of its cleaning staff, because it improves the numbers on a balance sheet somewhere. Later, you are trying to check into a room, but it’s not ready and the clerk can’t tell you when it will be; they can offer a voucher, but what you need is a room. There’s no one to call to complain, no way to communicate back to that distant leader that they’ve scotched your plans. The accountability is swallowed up into a void, lost forever.

Davies proposes that:

For an accountability sink to function, it has to break a link; it has to prevent the feedback of the person affected by the decision from affecting the operation of the system.

Davies, The Unaccountability Machine, page 17

Once you start looking for accountability sinks, you see them all over the place. When your health insurance declines a procedure; when the airline cancels your flight; when a government agency declares that you are ineligible for a benefit; when an investor tells all their companies to shovel so-called AI into their apps. Everywhere, broken links between the people who face the consequences of the decision and the people making the decisions.

That’s assuming, of course, that a person did make a decision at all. Another mechanism of accountability sinks is the way in which decisions themselves cascade and lose any sense of their origins. Davies gives the example of the case of Dominion Voting Systems v. Fox News, in which Fox News repeatedly spread false stories about the election. No one at Fox seems to have explicitly made a decision to lie about voting machines; rather, there was an implicit understanding that they had to do whatever it took to keep their audience numbers up. At some point, someone had declared (or else strongly implied) that audience metrics were the only thing that mattered, and every subsequent decision followed from that. But who can be accountable for a decision that wasn’t actually made?

It’s worth pausing for a moment to consider what we mean by “accountable.” Davies posits that:

The fundamental law of accountability: the extent to which you are able to change a decision is precisely the extent to which you can be accountable for it, and vice versa.

Davies, The Unaccountability Machine, page 17

Which is useful. I often refer back to Sidney Dekker’s definition of accountability, where an account is something that you tell. How did something happen, what were the conditions that led to it happening, what made the decision seem like a good one at the time? Who were all of the people involved in the decision or event? (It almost never comes down to only one person.) All of those questions and more are necessary for understanding how a decision happened, which is a prerequisite for learning how to make better decisions going forward.

If you combine those two frameworks, you could conclude that to be accountable for something you must have the power to change it and understand what you are trying to accomplish when you do. You need both the power and the story of how that power gets used.

The comparisons to AI are obvious, inasmuch as delegating decisions to an algorithm is a convenient way to construct a sink. But organizations of any scale—whether corporations or governments or those that occupy the nebulous space between—are already quite good at forming such sinks. The accountability-washing that an AI provides isn’t a new service so much as an escalated and expanded one. Which doesn’t make it any less frightening, of course; but it does perhaps provide a useful clue. Any effort that’s tried and failed to hold a corporation to account isn’t likely to have more success against an algorithm. We need a new bag of tricks.