Common denominator: Three kinds of self-orientation
Maister’s argument makes a certain intuitive sense. If I think you are all about getting what is best for you and not at all concerned with what is best for me, I am not likely to trust you no matter how smart, punctual, or well informed you are. That said, it is also intuitively clear that self-orientation is probably a larger and more intellectually slippery concept than simply the unbridled pursuit of what one wants.
For that reason, I would like to suggest some of the various ways self-orientation can express itself, particularly in the workplace, if only so that people who wish to follow Maister’s advice for building trust have a somewhat more tangible game plan than “Note to self: Stop being so selfish.” I also want to make the case that, in important ways, “self-orientation” may be less about the direction the “self” is focused and more about the kind of self one brings to complicated interactions.
Unilateral problem solving
“Don’t bring me a problem. Bring me a solution.” We hear it all the time, usually from a well-intended boss or manager who wants to nurture self-sufficiency within an empowered workforce. Usually, there is nothing wrong with asking people to be problem solvers. And that is precisely why our oft-rewarded instinct to be problem solvers can be so hard to break free from in situations where it does not serve us well.
Broadly speaking, professional problem solvers consume information, process it in their enormous noggins, and then deploy solutions based on their sense of what the information means. Too long a line outside your restaurant? Raise prices. Constantly running out of gardening spades in the summer? Rethink your inventory management systems. And so on and so on. Absorb the information. Figure out what it means. Deploy a solution based on your interpretations. This is the process one uses to bring solutions instead of problems.
All well and good, until we end up trying to solve people as if they were problems. Then, the pattern looks something like this: I see that you are not doing what I think you should do, or not thinking what I think you should think, or not feeling what I think you should feel. Therefore, I decide inside my own head what I can do or say that will change your acting, thinking, or feeling from what it is to what I think it should be. I deploy a solution, which usually sounds either like criticism or reassurance. Either “cut that out!” or “don’t worry!”. If you have ever told an angry person to “calm down!” only to find that they mysteriously get more angry instead of less, you have experienced the limitations of this approach.
Treating people like problems, instead of like people who have an independent and potentially valid experience of the world, is in many ways a kind of self-orientation. It assumes that the only thinking that is required is the problem solver’s, and that it is ok for the entire journey from data to solution to be carried out by the problem solver, alone in the recesses of her or his own head, in a way that is invisible (and closed) to the other person.
Barry Jentz and Joan Wofford have written usefully about the difference between unilateral and collaborative problem solving, and like Ron Heifetz and others, they make the point that moving from thinking about or for someone to thinking with someone requires creating some distance between our thought process and what we see as our essential self. The reason is fairly simple: the alternative to me unilaterally solving a problem in my head without your help is for me to explain my thinking to you so that you can test it against your own thinking and make me aware of anything you know or see that I do not. To tell you about my thinking, I have to abstract myself from it, so that I can hold it at a distance and consider it.
The implication here is that unilateral problem solving, which can most assuredly erode trust, especially when we apply it to people, may be problematic not just because it implies a potentially narcissistic preference for my own thinking over that of others, but also because it allows the problem solver to remain embedded in and undifferentiated from his or her thinking. It does not require being someone who has thinking that can be shared, evaluated, and if necessary changed. It only requires thinking.
If this is true, a person who has the cognitive ability to pull herself out of her thinking may be better equipped to build trust than a person who can only take in data and deploy or discuss the solutions her or his thinking suggests. By extension, any practice or technique that helps us develop the cognitive ability to think about our thinking, to hold it at arm’s length, show it to others, and make choices about it, increases our capacity for building trust.
Failure to reconstruct multiple perspectives
Comedian George Carlin famously joked that anyone driving faster than you is a “madman,” and anyone driving slower than you is an “@#$&%!!” His joke beautifully encapsulates a truism of human nature: if someone is doing, thinking, or feeling something different than we are, we are likely to immediately assume that they are wrong, bad, stupid, or in some way up to no good. We call this powerful reflex attribution of conscious negative intent. In other words, not only is the person in question doing wrong, we assume they must know it.
The truth, of course, is that very few people wake up in the morning saying “today I will work hard to be irritating and wrongheaded.” Instead, we are for the most part the heroes of our own stories. The thing that looks so obviously wrong to you probably, in some way, looks heroic to the person doing it. Instead of thinking something like “boy, I sure am up to no good now!,” they are likely thinking something like “I have to do _____, because______,” or “as much as I hate ______, I have to protect ______,” or “I have earned the right to ______ by _______.” The ability to construct multiple perspectives is the ability to suspend our own interpretation of an action or event so that we can piece together how it looks from the point of view of someone else.
Constructing a hypothesis of how a thing looks to someone else from his point of view and then explicitly testing that hypothesis to see if you have got it right is what Jentz and Wofford call reflective listening. While doing this reflective listening seems like it ought to be easy, for most of us it is extremely hard, particularly when the “wrongness” of the action, thought, or feeling seems obvious to us. The reflex to hold on to our story about a thing so tightly that we cannot even momentarily reconstruct and consider another person’s differing story about that same thing is another kind of self-orientation.
Here again, the nature of this self-orientation may be more about what you are able to consider than about the place you direct your attention. Our quickness to ascribe conscious negative intent to people whose actions or thoughts make no sense to us is tied to the fact that, for the most part, we see our interpretations or stories about things not as stories but as truth. We are, in effect, stuck inside our interpretation so completely that we cannot objectively consider, critique, or modify it. It follows that, if we see our story about a thing not as a story but as self-evident truth, it would be nonsensical to be open to the possibility that someone else’s differing interpretation could be valid. Even provisionally allowing that perspective to come out of our mouths as a way to see what the other person believes is likely to feel uncomfortable if not downright dishonest, as if our saying a thing would make it true or deprive us of our ability to disagree. This reflex, which makes it so difficult to articulate a point of view that is not ours, demonstrates how tight the grip of “embeddedness” can be.

And, to a client, boss, or co-worker, a person who only seems able to talk about how we are in some way wrong, without ever really taking in the reasons why we feel we are right, is likely to come across as highly self-oriented. At Kenning, we have seen countless examples of how this rigid-looking inability to imagine alternative perspectives undermines trust.
Lack of system awareness
Systems thinkers see the world as a web of interactions, knock-on effects, and unintended consequences.
The great lesson of systems thinking is that we rarely operate in a world where what we do is unrelated to what comes back to us. A fish, unarmed with the ability to see systems, cannot discern the connection between his appetite for snails and the alarming drop in his water’s oxygen. He just knows that he has gotten very good at catching tasty snails and that he is beginning to suffocate. Were the fish able to imagine himself as part of a system rather than an independent operator, he might be able to figure out that 1) the more fish eat the snails, the fewer snails there are in the pond; 2) the fewer snails there are to eat algae, the more algae blooms in the pond; and 3) the more algae there is in the pond, the less oxygen there is for the fish. The fish in question might also figure out that 4) the less oxygen there is for fish, the fewer fish there are to eat snails. This last step means that what we have is in effect a balancing loop, where a boom in snail consumption will be followed by a boom in algae growth and a corresponding drop in oxygen, which will lead to a fish die-off and, eventually, a drop in snail consumption, which will be followed by a fall in algae, an increase in oxygen, and a drop in fish suffocation. And on and on, through alternating cycles of boom and bust.

What really matters, though, if you are the fish, is understanding that your consumption of snails is directly linked to the amount of oxygen you have. Only by seeing that connection could the fish understand that his best chance for not suffocating would be reducing the number of snails in his diet.
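The first three links of the pond’s loop can be sketched as a toy calculation. Every number here, the coefficients and the starting snail count alike, is invented purely for illustration; the only thing the sketch preserves is the direction of each causal link.

```python
def oxygen_after(fish_appetite, snails=100.0):
    """Trace links 1-3 of the pond loop once, with made-up coefficients.

    Only the direction of each effect matters, not the numbers.
    """
    snails_left = max(snails - fish_appetite, 0.0)  # 1) more eating -> fewer snails
    algae = 200.0 - 1.5 * snails_left               # 2) fewer snails -> more algae
    oxygen = max(100.0 - 0.4 * algae, 0.0)          # 3) more algae -> less oxygen
    return oxygen

# The loop closes at step 4: low oxygen thins the fish population,
# which in turn reduces snail consumption -- the balancing feedback.
print(oxygen_after(fish_appetite=80.0))  # heavy snail diet -> low oxygen
print(oxygen_after(fish_appetite=20.0))  # lighter diet -> more oxygen
```

Comparing the two calls makes the fish’s hidden trade-off explicit: the bigger the appetite for snails, the lower the oxygen that comes back around the loop.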
While we human beings are much better than fish at being able to pull ourselves far enough out of our own immediate experiences to imagine and investigate the broader systems through which our actions influence the things that happen to us, the discipline to do so does not come naturally to most of us. Imagine that you are a worker on a production line. Your job is to assemble thingamabobs. You get paid by the number of thingamabobs you produce, so you are motivated to make as many as you can as quickly as you can. The more you make, it would seem, the better you do. Given that simple rule of thumb, all you need to do to optimize your results is to pay close attention to your piece of the process – your work station. You don’t need to worry about the rest of the steps in the process.
Or so it seems, until you get laid off. It turns out that, in an effort to maximize your results, you figured out how to churn out a thingamabob every 5 minutes, 24/7. The problem is that the people who install your thingamabobs in the whatchamacallits your company manufactures could only do so 8 hours a day, 5 days a week. As a result, your company had to invest in storage space to hold the inventory piles that built up over the evenings and weekends. Worse yet, since thingamabobs go bad if they are not installed within 5 hours of assembly, your process led to huge amounts of waste. Waste and inventory costs were passed through to customers as price increases. Price increases led to a dramatic drop in sales. The dramatic drop in sales led to production layoffs. So, here you are feeling great about massively increasing thingamabob production (and your own pay), right up to the moment someone accuses you of ruining the company and hands you a pink slip.
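The arithmetic behind the layoff can be checked directly. One assumption the story does not state: that the installers consume thingamabobs at the same twelve-per-hour rate while they are on the clock.

```python
RATE_PER_HOUR = 60 // 5  # one thingamabob every 5 minutes = 12 per hour

# Your station runs around the clock...
weekly_production = RATE_PER_HOUR * 24 * 7
# ...but installation happens only 8 hours a day, 5 days a week
# (assuming, hypothetically, installers also consume 12 per hour).
weekly_installation = RATE_PER_HOUR * 8 * 5

weekly_surplus = weekly_production - weekly_installation
print(weekly_production, weekly_installation, weekly_surplus)  # 2016 480 1536
```

Under these assumptions, more than three-quarters of each week’s output piles up uninstalled, and since a thingamabob spoils five hours after assembly, nearly all of that surplus is waste rather than usable inventory.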
Learning to map systems and to imagine oneself as operating in systems is a bigger topic than can be addressed in this post, but there is no doubt that people who seem to be focused only on optimizing their own outcomes will be experienced by others as “self-oriented” (i.e., not a team player), and people will trust them less. As with the previous two examples, what is experienced by others as self-orientation is in many ways a developmental stage rather than a measure of self-interest. To see myself as operating in a system, I have to be able to abstract myself from my immediate experience so that I can consider and make choices about it.
Robert Kegan has made a compelling case that continuing adult cognitive development is in many ways a matter of building one’s ability to create some distance between what we think of as our “self” and our experiences, beliefs, values, and thinking, so that we can hold all of these at arm’s length and make conscious choices about them. At Kenning, we believe that developing one’s ability to be self-aware and to have conscious control over more and more things is a key leadership asset in an increasingly complex world.
Learning to think and talk about your thinking is, in our view, a developmental move. Learning to reconstruct and consider alternative points of view is, in our view, a developmental move. Learning to see oneself as a player in complex systems rather than an autonomous actor is, in our view, a developmental move. Why go through the difficult work of developing these mental practices? If for no other reason, because people who have mastered them have increased opportunities to “show up” as having a low self-orientation. And, as Maister has powerfully demonstrated, nothing is more important than perceived level of self-orientation when it comes to building trust.
Notes

1. Trust = (Competency + Intimacy + Reliability) / Self-Orientation. David Maister, The Trusted Advisor.
2. Barry Jentz and Joan Wofford, Talk Sense.
3. Ron Heifetz, Leadership Without Easy Answers.
4. For a more complete introduction to systems thinking and its business applications, see Systems, Sense-Making and Organizational Change by Kenning Associates Partner Cathy Boeckmann and Thomas Heise.
5. Robert Kegan, In Over Our Heads.