Organizations, especially in a bureaucratic sense, seem faceless and machine-like. When one imagines how the IRS processes tax returns, one might picture a huge, robotic, emotionless, Ford-factory-like production line of paper pushers in suits. An organization is like a machine in its operation, but just as Bolman and Deal explain in Reframing Organizations, organizations can be ambiguous and prone to very human errors. These organizations are made up of numerous individuals, all prone to human error, cognitive bias, and faulty communication and information.
David Hume, my favorite philosopher, loved to “expose the limitations of reason, and to explain how we make the judgements we do,” insisting that even though people seem to use reason to support their decisions, they are really making judgments that are “ultimately founded only in human sentiment” (Honderich, 377). In similar fashion, my favorite public administration author, Herbert Simon, insisted that humans operate on “bounded rationality”; he studied the “limits of rationality on a systematic and empirical basis” and found that there are “inexorable limits of human nature” (Fry & Raadschelders, 293-294). Simon described two psychological models: the Economic Man and the Satisficing Man. The former has complete knowledge of a given situation, knows all possible consequences and alternatives, and possesses all the skills required to complete the necessary actions. Of course, no such person as the Economic Man exists. The Satisficing Man is more realistic: one who does not, or rather cannot, consider all possible alternatives and cannot possibly have all the information necessary to make the absolute best decision. The latter model is the one under which most people operate – not because they are incompetent, but because they have fundamental human limitations. And it should be made clear: the Satisficing Man does not make purposefully uninformed decisions; he assumes he has all the information he needs and is acting rationally. Additionally, people have innate limits on their processing capacity, economize cognitively (using “rules of thumb”), and carry natural cognitive biases (Bolman and Deal, 36). It is in this gap in rationality (or “bounded rationality”) that people make mistakes and render organizations fallible, ambiguous, and faulty.
Bolman and Deal (2017) explain how smart people do really dumb things and make bad decisions. This is a common phenomenon. Michael Shermer, founder of the Skeptics Society, was asked why smart people do and believe foolish things, and he answered, “Smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons” (Shermer, 2002). A perfect answer to what happened just last month in my organization. One department wanted to use this year’s funds next year, but because of accrual accounting, that cannot happen. Our office argued and argued, but it seemed impossible because they were so damn good at defending their position! They were doing mental gymnastics to ensure their department could use its funds how it wanted. The matter was eventually escalated and settled by the Controller, but one can easily see how these limitations of human nature can affect anyone, even really smart people.
In a similar vein, our department works with every other department on campus and must coordinate accordingly. But each department operates in a “dense fog that shrouds what happens from day to day,” and it is often difficult to “get the facts and even harder to know what they mean or what to do about them” (Bolman and Deal, 32). On top of that, cognitive biases create ambiguity in an organization by leading one to assume another department is doing what you expect. For example, our department’s travel card processes often have to coordinate with the university purchase card and to balance dates, rules, and software. These issues lead to confusion and errors and leave plenty of room for administrators to assume the processes are the same.
These problems, in the cases above and in others, are unavoidable when dealing with people, though technology and AI are making them more predictable and easier to anticipate. The organization-as-machine metaphor is useful, but machines are not perfect or flawless either. Organizations, too, have ineffective parts that produce confusing and irrational decisions, and they should be assumed to be just as flawed as the people operating within them.
Bolman, Lee G., and Terrence E. Deal. Reframing Organizations. 6th ed. Hoboken, NJ: John Wiley and Sons, 2017.
Fry, Brian R., and Jos C. N. Raadschelders. Mastering Public Administration: From Max Weber to Dwight Waldo. Los Angeles: Sage, 2014.
Honderich, Ted, ed. The Oxford Companion to Philosophy. Oxford: Oxford University Press, 1995.
Shermer, Michael. “Smart People Believe Weird Things.” The Skeptics Society. Accessed September 7, 2019. https://michaelshermer.com/2002/09/smart-people-believe-weird-things/