 
Leadership vs. Management

By Michael Driver

Text Copyright © 2017 By Michael Driver

Table of Contents

Introduction

Part One: Background and History

Part Two: Administration

Part Three: Issues

Part Four: Future

About the Author

# Introduction

When I was a child, there were two dreaded questions that adults, especially older adults, inevitably asked little boys. The first, "Do you have a girlfriend?" was absurd for the prepubescent set but adults seemed to enjoy asking and never pressed the issue. But the second question, "What do you want to be when you grow up?" was asked in earnestness and demanded a reply from every boy old enough to string together a short sentence: "I want to be a...." Specifics of the reply didn't matter much—at first—when occupational choices available to the average four-year-old boy born in 1950 consisted of policeman, fireman, cowboy and, for many, farmer.

Adults, inconsistent in so many ways, were strangely insistent on these questions and they persisted for many years as occupational permutations expanded: doctor, lawyer, teacher, preacher. The seriousness of the inquiries mounted as years passed and worldly possibilities flowered into scientist, accountant, engineer and artist. As time progressed, children were expected to provide increasingly thoughtful responses bearing connection to the real world.

Still, the question persisted and, over ensuing years, all children adapted into adulthood, most of us more or less successfully. I assume, for example, that the inventive ten-year-old boy who invited taunts from other little boys by saying that he wanted to be a kindergarten director eventually retired as a school administrator. Realism increasingly crept into the background of the question. When an adult demanded an answer from the thirteen-year-old me and was told that I wanted to be an essayist, I was promptly informed that very few essayists make any money. The adult proved correct, but, as it happened, so did I, and I have scores of essays to prove the adult's point.

Somewhere along the line, tenuous realism gave way to hard-edged necessity and my generation wound up populating the ranks of management, sales, and sundry marketable professions. Those professionals, if asked, would likely fail to mention the millions of truck drivers, factory workers and clerks who faithfully supplied people power to the twentieth century, who manufactured, bought and sold its goods and services and who struggled to a conclusion that begs for a fresh evaluation.

Somewhere along the line, too, the questions abruptly stopped. For some, it was when they committed to a course of study that would lead to a remunerative career. For some, it was when they landed a good job with a satisfactory, middle-class future. For others of us, questions persisted a while longer: "What are you going to do with a history degree?" It was only after tilting into academia or falling into some comprehensible form of business that we were granted peace, ironically, just as the employment scene began to feel heat.

But the reasons that adults stopped querying the younger generation about their career plans reveal a great deal about prevalent attitudes toward the evolving nature of work as well as the future. For a variety of unrelated reasons, the question is no longer asked, making it necessary to step back a generation to observe from an earlier perspective. Let's look at some of the reasons in no particular order:

The adolescent or young adult made practical plans that the adult considered sufficient to create a livelihood for a reasonable future. Enough said.

The younger person demonstrated commitment to a goal that the older person believed to be unwise but which was clearly being pursued seriously. Perhaps the younger person wanted to become an actor, a notoriously unstable career, but one which, with training and persistence, could be pursued. Argument, at some point, becomes useless.

The youth demonstrated commitment to a goal which disappointed the older person but which was clearly viable. How many times have young people sought professional training that would mean abandoning the family business? It makes no sense for adults to oppose legitimate plans.

A young person laid plans for a career that awed older people. Members of a generation who were the first in their family to leave the farm often had children who were the first to attend college and, as such, sometimes found career interests in professions that struck the older folks dumb with admiration. Nothing could be said.

The younger person sought a career completely incomprehensible to the older person. As the scope of learning increased, youths sometimes uncovered previously nonexistent fields that older people were completely unprepared to address, not infrequently involving activities only recently invented. Nanotechnology anyone?

The older person was so outraged by plans broached by a younger person that silence was the only path to peace between them. This could be anything from attending a school contrary to the preferences of a scheming busybody, to disreputable employment, to unethical conduct: stripper, drug dealer? Where, in this, would a highly successful clandestine arms supplier fall? In any event, there's not much to say.

When the older crowd accepted a young man as a fledgling adult, most were apt to leave him alone, kicked into the world to soar or plummet as fate arranged or, perchance, to glide to a safe landing in retirement, secure in the arms of God, society and the Social Security Administration. Eventually, older people were guaranteed to slink away into their own familiar worlds, sometimes uneasy about the future and trying not to think too much about it. But what was once a comforting certainty that they, too, had raised questions for their parents but had turned out okay, became a vague mistrustfulness of a world that seemed to be teetering beyond comprehension in a way that made previous change seem tame by comparison.

The fact that uncertainty morphed into doubt and then into fear for that generation of older adults takes on new poignance when we realize that the younger generation absorbed the full toxicity, including paranoia and paralysis. Those of us in that younger generation, which has since grown old, tired and sick, thrash about for answers, reasons for what happened, especially as we see a yet newer generation struggle to climb out of the morass. Urgency is all the more acute because some in this now middle generation seem to have given up while others hint of anarchy. Those who remain conscious, lucid amid chaos, probing for answers, are the ones now asking questions.

One of the questions my generation never asked is "What do you want to be when you grow up?" Why we never asked that question is among the most fruitful inquiries to make. Here are some reasons in no particular order:

We were too busy to ask.

It never occurred to us to ask.

Youngsters were too busy to answer.

We didn't see younger people very often and when we did, we were as much in a hurry to go somewhere else as they were.

We didn't know how to ask the question.

We were afraid of the answer.

We were afraid we wouldn't understand the answer.

We didn't want to seem to be interfering.

We were afraid of how the younger person would react.

Why the question went unasked by the now-middle generation is at least as important as why the older generation once asked it. So, what's going on here? A quick glance reveals two forces at work: first, being too busy; second, a multitude of fears. Often, the two elements combined.

For my generation, being too busy was not only an excuse, it was a daily struggle against forces that conspired to rob us of time. There were many places to go, of course, and things to do, things that seemed to multiply as our time diminished. These were not mere distractions but imperatives of life. Food had to be purchased and prepared; children, the newest generation, created an ever-widening round of activity as they grew, going places and doing things that required participation from my generation.

With increased activity, families became more insular as time drained away from contact with uncles and aunts and cousins and more focused on core family units that increasingly lived greater distances from each other. Despite freeways and airlines, travel time increased, with adults often spending long hours commuting to and from work. Even when formerly close relatives lived in the same general vicinity, travel time and focus on core family activities reduced ties among members of an extended family and produced a fundamental shift in society that culminated in what is derisively and inaccurately known as the "me generation."

Often, it had been these same now distanced uncles and aunts that had asked the occupational question most frequently. Older friends and acquaintances of the family had asked, too, but as insularity increased, nonfamily members became reluctant to press what was beginning to be seen as a personal question. Time was also a factor among friends and neighbors pushing noninvolvement still further. It didn't take long for an element of discomfort to make the career question as unpalatable for adults as for children. And time was lurching forward at an astonishing rate, not merely moving inexorably ahead.

A solution arrived in that very same nick of time. Who better to ask the question than a professional? As job descriptions began to delineate the fantastical, they also encompassed the utterly practical, if heretofore unknown. A host of career counselors arose specializing in every aspect of work and preparation for work, including its termination. Better to let the pros handle the matter. Why? Fear. Fear for numerous reasons and on many levels.

Older people soon became aware that the younger generation was confronting a much more complex world than the one their generation had inherited. My generation has since variously lauded the older people as the "greatest generation" and condemned them for fouling everything up and leaving a mess for future generations to clean up. While neither extreme is entirely accurate, there is enough uncertainty to sow a bumper crop of fear for everyone. And, perhaps with delicious irony, perhaps in spiteful punishment, depending on your perspective, the older generation lived long enough to experience fear fully in every aspect. For the few who could avoid casting blame on any group, class, minority, political party or religion, it must have been cause for tremendous introspection but it remains fear, nonetheless.

The career question ceased to be asked entirely when fear became intractable among older people. No longer able to comprehend the future, they lost all but the dry husk of a faith in concepts and people and institutions that they had nurtured and which had nurtured them. Nothing remained for them except unquestioned habit; fear expanded to fill the void. It was a fear that, in the long run, portended the massive change with which we grapple today.

In the run-up, silence became the safest posture, risking less embarrassment, less commitment, less of themselves in any respect. Many, unable to adapt to existing change, became unable to participate adequately in the workforce; many others saw their employment snatched away; and all felt alienated from the future. In these things, the older generation found common ground in an unexpected place, the younger generation, my generation that was rapidly becoming the middle generation.

Fear saps the energy and will to compete. Unable to see a way for themselves in the future and observing that my generation was struggling with the same issues, the older people retreated, utterly frozen by fear into silence and inaction. Thus, they stopped asking the career question. My generation, never experiencing time as the luxury once available to older people, moved quickly to adopt fear on a basis similar to that of the older generation. Thus, we never asked the career question.

None of this should be interpreted to mean that parents stopped talking to their children about pursuing career options. Instead, it establishes vital background to help understand fundamental changes that are taking place not only in the world of work, but also in the internal space of values and of life.

That values have changed radically is no surprise. Unknown to little boys of my generation, little girls our age were being asked the career question, too, but from a different perspective and with different expectations. Part of the fear being created as our generation matured came from the fact that little girls confounded the anticipations of adults with a desire to delve fully into career opportunities that had previously been reserved for men. This produced a large-scale change in life that further unsettled older people and paved the way for even more rapid change in a future being transformed at an astonishing pace. Yet, it all comes down to values and life and they are changing, as well.

With the generosity of hindsight, a review of the background underlying the change we observe will explain much. Or it might seem to explain. But history and background alone lack a vital element of perspective. We must consider an unseen factor, also—leadership—the coping mechanism designated by acclamation that we employ both to guide us through our travails and to produce substantive worldly results. An honest evaluation of leadership will reveal that we wrongly conceived contradictory expectations for leadership that exacerbated our problems. In a nutshell, the reason is that we conflated leadership with management. How this happened, again, the background and history, will help us find a path forward, and, as we will see, a growing number of people are dedicated to rectifying the error that continues at the core of the preponderance of businesses, nonprofit organizations and governmental agencies.

It is helpful, at this point, to leap ahead and highlight some important facets of leadership. One of the most striking is the near universal contemporary agreement that leadership is the route to salvation in all endeavors that involve more than personal introspection. It is a mistake to overlook the role of introspection in the formation of leadership but the popular consensus is basically valid on its face. Where there is more than one person involved in something, yes, leadership is the answer. Finding that answer has occupied thinkers for generations and is now ginned up to a fever pitch in the written media, on the Internet and among motivational speakers. There is a veritable leadership industry, a growth industry, judging from the number of words being generated about it as well as the dollars expended in pursuit. Here, after all, is yet another book on the subject, a free book, which agrees that leadership is the answer.

A fundamental point of this book involves the failure that sustains the leadership industry and the failure that is easily observable in organizations of all kinds. Often, this failure is attributed to "a failure of leadership" without realizing that it was really an absence of leadership that caused the failure. Still, it is claimed that had leadership succeeded, the problems would have been solved, therefore, there was "a failure of leadership."

This misdiagnosis occurs because of an overwhelming misconception about leadership that is almost as pervasive as the belief that leadership is the answer. Another major point of this book is to explain how huge numbers of people in every corner of business and society have wrongly conflated leadership with management. This is not a unique perspective. Fortunately, an increasing number of scholars and gurus of various types recognize this fact and are working to shift businesses and organizations onto a different path. This book takes the position that the confusion of management with leadership is self-evident and will treat the subject in the background and history section. But there are new wrinkles in this general misperception that many have lost in a rush to offer prescriptions for what ails many organizations and businesses. This book examines the issue with new analysis that opens the door to effective solutions overlooked in the onslaught.

One of the new perspectives is a reassessment of the old concept of administration, for many years the butt of jokes in business organizations, a maligned activity relegated to employees thought to be dull and uncreative. Often, in fact, that was the case, which didn't help matters. But we will take a fresh look at administration and find an avenue for success that has eluded many who are lost in the jungle while searching for management that is both creative and effective. Stopping here would threaten a more fundamental premise of this book, that leadership and management are not synonymous.

Many have spent their lives trying to exercise management beneficially. For the benefit of whom? Many have veered over the line of outright criminality in the pursuit of management. Many others have spent their lives trying to improve management by refining leadership only to fail because tailoring leadership to management is like embedding diamonds in the chain of a leash only to have the dog run away with it. This book takes a different perspective to make a cogent argument to get rid of management by allowing leadership to flourish. We will attack the confusion between management and leadership, then observe how leadership can better supply the needs of organizations and the people involved with them. We have bought a bill of rotten goods from management; it's time to demand a refund, time to clear the confusion and devote ourselves to leadership instead.

Leadership is not linear; it does not move from point to point to point. That's what management does. Leadership is immersive; it envelops like the tide, it nurtures on all sides, it sustains, it lifts, it carries. And it's natural. Management is an artificial construction that attempts to circumvent nature, channeling all resources to fit a contrived outcome, devastating all in its wake.

How did it happen? Why did we just stand here and let ourselves be gored by the beast, then devoured from the inside out? The answer is that we were drugged in one way or another: thinking that we needed the salaries and wages, seduced by compliments, fearful of losing the flattery of zombie supervisors, managers, owners and shareholders, we stayed on, risking our souls for ever more of the tonic of delusion. If this sounds like a harsh, over-the-top condemnation of the means by which organizations of the twentieth century operated, of the conduit of our livelihoods, of progress itself, well, it is what it is and it's true. So why didn't we rouse from the stupor years ago? First, answer this question: Why not simply sleep at the teat, ready to suckle again and again and again?

We'll come to more detail on all of this but it's important to understand that everyone, including the perpetrators, has been badly damaged. It's even more important to understand how it happened, how management co-opted and weaponized leadership and fought tenaciously for years, wielding, like sharpened steel for greed, what should have been a benign force for good. But multitudes accepted delivery of the ongoing disaster and lived for generations amid the carnage.

There were a few all along the way who saw through the bloody mess and even more now who understand. We'll praise them later and add some new trumpet blasts to what is becoming a symphony. But first, we must acknowledge some discordant notes. In one of the less laudatory developments in the history of ideas, some of the earliest and most persistent voices in the house of horrors sing a litany of platitudes. Platitudes! And listicles. I have nothing against lists; they're fine in their place (and you'll notice that I've already used a couple). But spare us the platitudes entirely and reserve lists for use like adjectives, like sauce on the steak, without confusing the dressing for the entrée.

The irony that cuts worst is the one that slashed us to pieces earlier and that returns to the job again after we thought we were healed. That's the problem with platitudes. Management strung everyone along (or up) partly with the use of platitudes and now that many believe they see the light, platitudes are back, this time in the employ of people who tell us they're the good guys. See their white hats? Believe. That's what they want you to do: believe. Believe the words whatever they say, whatever they mean. Believe and all will be well again. Many seem to believe that belief is all that is necessary; these are the purveyors of magic words, incantations for the guileless, business voodoo for the desperately busy to substitute for the effort involved in actually locating and slaying dragons. We are, after all, so busy with real work, you know, answering email and going to meetings, that we need armor that's ready to wear, not tailored, weapons that magically find their target. Most of all, believe that it will work. Believe that if you follow these seven easy steps your employees will hail you as the greatest leader ever and sales will increase exponentially.

The gullible ensure the success of endless clickbait, reading lists like they're Shakespeare and platitudes as if they were poems from the gods.

The more literate take a platitude and expand it into an article, an essay that uses a few hundred words to say the same thing over and over again, hopefully over again often enough for us to believe if we weren't hooked on the short version. And the truly scholarly add some data along with a generous heap of pop psych and blow the whole thing up into a book, that, if they're lucky, they can peddle at motivational gigs disguised as business conventions. All of this from the lowly platitude.

But the damage isn't done yet. Well-intentioned observers of all kinds use platitudes honestly as a shorthand for serious meaning. What this really does is expose the dissatisfaction within ourselves and those around us. Our remedy tends to be grasping not only at whatever straw is proffered but also at the past, out of the delusion that surely whatever kernel of worthiness it ever possessed remains if we can only find it and plant it in the present. Never mind that the disjointedness spells doom even if all else is accurate and sure. Grasping backward seeks stability, the figment of reliability that we recall falsely, having forgotten that nostalgia merely conceals nostrums behind sweet memories that were really bitter in the action. We strive for relief but there is no comfort in the effort, where strive is close to strife in the doing of it. The admission of inadequacy that is uncovered is merely evidence of the dissatisfaction and hunger that result from a diet of junk food. The digital world has encouraged the trend all the while our attention span diminishes and work escalates, demanding more, accomplished faster. Platitudes and the certainties found in lists only cloud understanding while sowing a lack of cohesion that prevents ideas from connecting and bearing fruit. And now that a little light has crept between the 0s and 1s, between the emails, among the customers, amid the chaos, we want better, too, and true. What's a writer to do if not resort to platitudes?

Read on and find out. Comes now yet another book about leadership, but one with a money back guarantee (this book is free) that claims not to rely on platitudes. Or bullshit. Instead, together, we will dig deeper and think our way through the quagmire of sophistry that seeks to entomb leadership in management or hijack it for the benefit of elites. To accomplish that, we'll take one of their favorite devices and give it a hard twist.

Telling stories, we have come to understand, is much more effective than spitting out data, revealing facts or expounding theories. One of the most popular journalistic methods is to start by telling the story of a specific individual who is said to represent a multitude of others. It's a device for grabbing and holding attention that permits the journalist to move from the person being profiled to a more expansive exposition of facts and figures, specifics and details that are the real meat of the article. Then, they inevitably swing back to the personal example they are using for a neat, tacked-on conclusion that often represents hopefulness for the future if not outright sweetness and light. Here, I picture the journalist being thrilled by his own cleverness and an editor smiling with corporate satisfaction. (You have permission to retch at this point and, having written some of this stuff myself, I'm right there with you.)

The typical journalistic approach to telling stories, while usually entertaining and often informative, lacks the philosophical satisfaction of grappling successfully with ideas to a conclusion that is both widely applicable and sufficiently concrete. On the other hand, much of the writing about leadership bends too far in one direction or the other. Often, it seems merely self-promotional, so limited in detailing the adventures of a person or organization that it may as well be fantasy or science fiction as far as being applicable in the lives of others. From the opposite direction, some leadership writing focuses on processes and ideas almost to the exclusion of the people it is intended to serve or influence. Perhaps this is a good point to insert the word "abstruse." If so, it is entirely alien to the point of writing about leadership, an undeniably people-oriented endeavor.

Interest in telling stories coincides with an ability to tell them, not simply through the functional use of language but through a growing emphasis on its creative and effective use, as well as on the physical means of delivering the goods. The digital age has spawned not only a widely adopted and inexpensive method of conveying stories, unencumbered by the limitations and costs of paper and ink, but also the means of becoming aware of them, of reading or hearing or even watching them, and of responding, with the possibility of initiating escalating communication involving numerous people or smaller, focused groups. One of the reasons that stories are being told is simply that people can now tell them with a reasonable possibility that they will be heard.

But it would be a mistake to attribute the increase of storytelling merely to the availability of the Internet. More and more people are on a quest for answers, seeking commonality and exploring avenues of mutual support as well as exchanging information that has the potential for both wide-ranging and focused impact. They crave the understanding that is derived through collaboration and sharing. Instinctively and increasingly, they reject imposition from sources that would manipulate and control them, holdover influences from the Industrial Age management mindset. As a result, people taste the liberation of self-determination and value guidance that nurtures and sustains while rejecting constraint. They arrive at this point through experience, both objective and subjective, and they're placing bets for the future based on analysis of their experience and that of others.

Hence, people have stories to tell, stories they are desperate to tell, some, it is true, in hope of glory or gold from the telling, but most sincerely wanting others to avoid the horrors they have experienced or find the better path they scouted. Having reached the point of beginning to understand the past and the present, with a glimpse of how the future can be improved, people are developing the confidence that propels them to tell their stories, that makes them eager to speak their state of mindfulness in a way that has never previously been known; and those who comprehend what is happening, and who respect others, understand that the rest of us have good reason to listen. Stories will out, including the ones that are blurted extemporaneously with neither plan nor design, even those spewed rudely in our faces. We need to expect it and like it; we need to seek it and cherish it because these stories embody the manifold experiences that will lead us to a better future.

Having begun to understand the past and their own present, people feel a vast yearning to tell their stories, a zeal to confront inadequacy wherever it is encountered, and to chart a better course for themselves while leading others. Unlike the past, when many saw themselves in a static position, people now perceive themselves as travelers and they understand they are approaching a crossroads. They see multiple routes available and they hear a cacophony attempting to mimic the sirens' lure to draw them in their direction. But today, people are wising up, resisting what they sense are tourist traps because they are more experienced travelers now and seek authentic cuisine, realizing that, as never before, their journey is long, featuring numerous but brief stops before they're on their way again, and wondering why anyone would settle for something less than real, especially knowing that they would soon be up and out again.

None of the crossroads we face will be more decisive than the one we are approaching. Many understand this and are maneuvering to be in the proper lane as they approach. Others continue to be distracted, but most of these are rapidly becoming skeptical and anxious about the direction they should take, nearing the point of decision at high speed. The time is upon us to analyze the uncertainties as rapidly as possible, shift gears and move into the turn lane in time to screech through the curve without wrecking. Counterintuitively, perhaps, it is also time to coalesce with others on their journeys, which, for some, will require quick retooling, because so many of us have bought a false identification with a fake individuality that we resist getting in line and acting in unison for a more successful conclusion.

And the direction we take will determine the ending of our story, whether it is a comedy of errors culminating in tragedy or an exposition of experience leading to satisfaction. The urgency to tell our story is great and growing rapidly for it is our story, as we must first understand, not a compendium of individual tales. Experience from the twentieth century will help us comprehend why, in the twenty-first century, there is a crying need for interdependence, meaning, in some instances, organizations, and in others, collaborations that meet and part and meet again as need requires. Admitting this present perspective provides license to understand the past and thus tell a story with a past, present and future.

Unlike typical journalism, the approach this book will take is from the general to the particular, cited as examples, and back to the general. Along the way, I trust that our story can be told interestingly enough, with verve and enthusiasm, but informatively, too, in a broader sense than is the case where an abundance of facts and figures is thought sufficient justification. People, whether acting singly or as workers and managers employed by businesses or other organizations, have both individual and collective souls that aspire to peace and prosperity, to lofty goals and worthy ambitions, or, more modestly, to simple satisfaction. They are also subject to individual and collective fear, uncertainty and doubt. In all of these, having sought leadership as a means of success or remedy, and having both experienced and observed a range of failure and success, they yearn for explanation and answers. It is this primal craving for satisfaction through the attainment of understanding and the offering of solutions that this book seeks to address. Toward that end, I hope that the foregoing has not been a digression but a necessary predicate.

It is the gnawing lack of fulfillment through leadership, now become urgent, that this book seeks to remedy. Well-intentioned words alone, even solid advice backed by rigorous studies, are insufficient. In 2015, new leadership books appeared at the rate of four every day, and that counts only print editions, excluding the more numerous articles and ebooks. Altogether, Amazon features about 60,000 books on leadership, many of them offering excellent prescriptions for what ails us. We read them and urge our colleagues to read them. We apply their suggestions, sometimes whether or not they fit our needs, and still we are not satisfied.

What is required is to revisit the cause of our dissatisfaction, to determine its source in order to reposition ourselves. We vaguely accept that, despite a shell of prosperity, disaster befell us in the twentieth century, but we fail to follow up this obvious lead to its point of origin and subsequent point of departure from our basic values. We see change all around us and feel that somehow this change is a transition to something else, but instead of seeking to discover what or where the future lies, thus enabling us to influence its direction and outcome, we seem content to drift, vaguely aware that any solutions we craft along the way will be temporary and that none of this confronts, let alone solves, the problems we face. What is needed is, first, an understanding and, next, a strategy that will yield supple solutions that can be applied widely and successfully.

Oddly, we have failed even to attempt an understanding of our past, having been too quick to dump leadership into a present moment fraught with confusion and artificial control, leaving it to scramble for relevance while we accept as valid even the slightest measures taken against symptoms alone. This is because the industrial juggernaut that steamrollered the economy and individuals in its path after the Civil War dwarfed almost every subsequent attempt to rise above it during the twentieth century. To support and protect itself, it created a voracious monster known as management, which claimed title to all known resources.

What management could not control was the mind of anyone willing to exercise sovereignty over their own faculties. In an elitist putsch, much contemporary concentration on leadership leaps to stress the role of independent academic, entrepreneurial and other white-collar thinkers who broke ranks with management. While we are grateful to all intellectuals and affluent individuals who made the break, theirs is only part of the story. As we know all too well, it is the victors who write history, and management was prepared to spend the bucks to ensure that their side was recorded as having won. It's fine, apparently, to confuse victory with wealth; that, after all, is what much of the twentieth century was about.

Come we now to the twenty-first century when anyone with a computer and intelligence enough to glimpse truth can write a different history, one accurately nuanced with struggles of both the well-educated and the experienced workers of every economic level. Despite the unwillingness of elitists to admit it, the struggle at all levels has been for leadership as well as income.

A measure of individual initiative survived here and there among employed white-collar workers and quasi-independent professionals, almost all of whom depended for income on the management class embedded in business organizations. Their stories are becoming clearer as contemporary thinkers delineate what happened to them, and, most especially, as they age and come clean in retirement about their experiences. I fall into the latter group and I am clawing my way into the former because I see something else that happened.

It has become rightfully commonplace among more enlightened observers to cite the managed workplace of the twentieth century as a source of much of our trouble, a development necessarily simultaneous with the degradation of people in general and the utter control of "leaders" in particular. As we begin to shake ourselves awake, academics and business gurus have begun to cast light on the situation from the perspective of their own training and elevated experiences. All of that is helpful and we will examine some of this as we go forward in this book. But the experience and rich actual knowledge of both the workplace and leadership stemming from front-line exposure should not be discounted.

Failure to credit personal experience with legitimacy and hear its valid lessons feeds elitist objectives. Top dogs bark loudly and rouse members of their pack to a noisy chorus in order to overwhelm dissident yelps of alarm. If we attempt to raise our objections solely by competing in decibels, we will miss the vital nuance of other, sometimes more informative voices educated in the trenches. To the extent that I have one of those voices, I want to use it to illuminate aspects that have been overlooked both in the confinement of the workplace and in the effort to break free from it.

Simply listing ways to make work life better or more tolerable does nothing but put lipstick on the pig, and the boar, bearing a blood lust for dissidents, would rather wipe away the offensive ornamentation and kill the objectors outright than acknowledge truth. By listening widely and paying careful attention, we can go deeper in our understanding, and what we will uncover is that, although there are many things that can be done to improve the work experience, the fundamental need is to reconsider the meaning and role of leadership and allow it to flourish. This is not about flowers and butterflies on a perfect spring day. If we can understand leadership from a fundamental perspective that has been both intentionally and accidentally obscured over the years, we stand a chance of committing ourselves to a brighter future.

By now, it has become axiomatic that much trouble originated in the experience of twentieth century management. That's true and it deserves careful examination, but it is customary to approach the subject exclusively from the standpoint that management controls everything, which, again, is true. We must go deeper, to the point at which management sought to corral leadership for its own purposes, then to destroy leadership wherever it appeared outside its own confines. To the extent that management succeeded in subduing leadership, the rest of us suffered. Actions taken by management in this process helped to construct a new type of professional elitism apart from financial elitism and, although the two were conjoined in many ways, the new elitist element became the more powerful and holds sway today, culturally if not through financial domination. While it may be argued that financial domination calls the tune and cultural elites merely respond, the management class is far wealthier than old-money elites ever were and fleshes out the structure according to its own presumptions. Regardless, the two segments of elitism are closely aligned, each accepting the other.

It is necessary to defeat elitism in order to fully liberate leadership and make its benefits available to everyone. That is another of the peculiar perspectives from which this book is written. While it is necessary to attack all the manifestations of management control in the workplace, it is even more important to separate leadership from management and, more important still, to detach it from elitism. It is elitism that keeps us in bondage to attitudes that separate people, and true leadership unites rather than separates. The particular expressions assumed by elitism are intentionally complex, the better to obscure themselves. Non-elitists tend to be guileless, blissfully unaware that they're being taken for chumps and robbed not only of their income but of their humanity, to say nothing of decency. Elitists believe that workers are objects to be led, that they're barely sentient, wholly incapable of leadership. Thus, they excuse and justify not only control but much, much worse, with the result that workers are denied not only the fruits of their labor but also the rights associated therewith, including that of leadership.

We will look at how everyone can be a leader and why being a leader is not an elitist perk, as it is often treated. It is no stretch to extend the concept of leadership to everyone; it's been there all the time. A concept already sufficiently broad needs no broadening, only the recognition it has always lacked, and it is recognition that must be increased. Elitists of every stripe, and the management of everything, fight this recognition as if their lives depend on it. In truth, their lives do depend on it, because without acknowledging the virtually infinite capacity for leadership in everyone, the restrictions they impose on others will turn against them and they will choke to death on their own control mechanisms and misbegotten beliefs. They tried to hide the truth by saying it was something else entirely. Now, they're shown to have lied.

The fact that big leadership has been there all the time and is only now being uncovered, much to the chagrin of elitists and management, must be thoroughly galling to them: The impertinence of those upstarts. Who do they think they are? Some, it turns out, are highly credentialed academics. "Oh, boy," the elitists say, "we've got that covered." Who, after all, endows most of the chairs at elite universities? They can easily trot out other academics in opposition. In fact, they keep whole "think" tanks of them on call for just such a purpose. These institutions are actually drunk tanks where a combination of discredited academics and political hacks gather to hallucinate under intoxication at the money teat.

Some who have uncovered truth are anti-academics, successful business leaders who eschew the customary resume contents of managers but who have demonstrated the efficacy of what they say in the real world of business. The elitist strategy here is to deny, to marginalize, to contest, to dig deeper into familiar stories of purported success that somehow turn out to be more about exploitation when their cover is blown.

Some who dispute the elitists are workers. Oooh. That must hurt particularly badly, the very idea of workers, not merely remonstrating with their bosses but holding morality over their heads. Is it fair to put Pope Francis under suspicion for some of this nonsense? These must be commies, sub rosa reds of some sort. And union members for sure. Gotta be union thugs involved. Gotta be. They're disgruntled ingrates at the very least. Who do they think provides jobs? "We'll see about this," they huff. And they do see about it with public relations firms, lawyers, security guards, politicians, and more lobbyists than there are members of Congress. But we won't leave workers standing here flatfooted. We'll circle back. I promise.

Other challengers on the field are managers themselves, or ex-managers like me, people who are not so much turncoats as thinkers who have wised up. Just as importantly, many of us had positive leadership experiences in a relatively halcyon period before confronting a beast that broke loose on the world. Experience is one of the keys here, and while experience is one of the credentials we have, an attempt is made to turn it into a liability. Management and elites have come to think of us as having been mere functionaries, and when they criticize us, they do so with the assumption that everyone shares that perspective. That's a perilous mistake, but they keep making it. When we identify with workers instead of management, we are vilified as traitors, dismissed with that epithet as if it settled the matter, when, in fact, it elevates what we offer.

Dismissiveness of experience is both counterintuitive and antiscientific, but that is the basis of the argument made against those who bring experience to bear in opposition to elitism and managerial abuse. It is the possession of experience that permits us to glimpse a truth that eludes mere description. We are informed by experience both individually and collectively, and our experience is both individual and collective. We act and we observe, and we learn from both. What management wants us to do is hear and learn, and they only want us to hear what they say, what they tell us to do. Astoundingly, they also want us to see only what they tell us to see. In gazing (dazed?) at their business, for example, they want us to note only the perfection of execution leading to profits instead of the corruption that may contribute to those profits. This is a good place for a picture of the proverbial tip of the iceberg.

What we witness independently, as well as experience in our own lives, is that everyone is capable of exercising leadership in some capacity. What we know from having participated in the now-verboten exercise of discretion is that people can manage themselves better than managers can. It would shock owners and top-tier managers to know how often successful units of their business operate clandestinely through self-governance and how often their vaunted departments fail because of adherence to top-down leadership.

Rejection of compartmentalization focuses on form, and somewhere in all the millions and millions of words written recently about HR and management, some are bound to laud the value of this self-evident fact. What those with experience want to propose is going further, to reject the compartmentalization of leadership as well. Plumbing the depths of roles doesn't accomplish a damn thing unless leadership is unchained from them. Let people be unconscious of supervision and they will perform to excellence. And how to be unconscious of supervision? Remove it.

We learn well through absorption, something any management hack will tell you on-the-job training is designed to accomplish. But absorption is more than mindless repetition. The same mind that absorbs information recognizes imposition and arrogance. When repetition is designed to enforce instead of to educate, learning is relegated to an auxiliary requirement of task completion. Animals deserve more respect and better treatment than many employees receive. Think small cages.

There was a time during the twentieth century, even in the post-war period, when managers typically had some degree of independence to exercise discretion. Timing varied widely, as did allowable latitude. I was fortunate to get started during that window of opportunity before it slammed shut for virtually everyone in the early eighties, at which point straitjackets became de rigueur for managers. That early experience was vital for me; it made it impossible for me to drink the Kool-Aid subsequently, and it preserved a knowledge of freedom, the mindfulness of which enabled critical comprehension later. While I stress the experience of freely expressed leadership, it was really the whole experience, including the oppression of the later years, that ripened my understanding and allowed me to relate to the many who grew up knowing nothing but delusion, iron-fisted control and the worst possible working conditions. It is that total experience that I offer as valid credential.

Something that possibly disturbs elitists the most is that valid experience implies the capacity to observe, analyze and report accurately beyond the scope of direct experience alone. Not only are many former and current managers on the same page with our perspective, we are seeking the acknowledgement of unity even as workers everywhere are beginning to coalesce. While it might be tempting to interpret our growing unity as doom writ large for the Industrial Age management mindset, as we will see, there is too much to be done for a declaration of imminent victory. The old mind boys and girls of modern business find new ways every day to squelch leadership and control workers, a term that must consistently be read to include managers. The intensity of their reaction in the face of growing demands for change serves an important purpose of our own. Recognizing the determination of the dismal, we can see the necessity of strengthening our own resolve.

The evidence is clear. It is time to reevaluate leadership. The misbegotten past is manifestly unsatisfactory. And the new is breaking through. But how?

We need a strategy that is not only a plan of action but also a means of understanding. Although that indicates a simultaneously broad assessment and close examination, it also means that we must be prepared for each individual to determine what personally fits, all the while knowing that the outlines of a collaborative future are evident but the trajectory of each individual is theirs to determine. That recognition entails forgoing definition that would be a crutch for comfort rather than a key to comprehension. Can we be both relaxed and wide-eyed with excitement? I believe we can, and I believe that the twenty-first century is so vastly different from what has gone before that attention to it demands exactly that seemingly paradoxical commitment. Attaining understanding is essential.

# Part One: Background and History

Both as individuals and as a society, we seem always to be in a hurry, and if we slow down for any reason there is usually someone around to urge us to move faster. That applies in both personal and work lives and has become so infectious that we push ourselves even when no one else is around to do it to us. Note that I did not say "for" us. There is no better example than sleep deprivation, sourced ultimately in ourselves whether or not we blame others. An employer may be making excessive demands for increased production that result in longer hours devoted to work, but our own drive to break free of that work may be pushing us deeper into the night on private projects or courses of study designed to secure better employment. In any event, it's rush, rush, rush.

Mistakes are inevitable but become more frequent under pressure to do more and do it faster. Lacking time for reflection, we're apt to make costlier mistakes, not of function and calculation but of judgment and evaluation. All the rushing makes it probable that we will entirely miss important perspectives and complexities that yield far greater consequences than delivering a report a day later or designing an icon in blue instead of green, all of which should have been done last week. And why aren't we putting out more knowing the pressure the company is under? It is difficult under these circumstances to be sufficiently aware, adaptive and strategic to satisfy all important demands, including those that we make on ourselves.

As a solution, the overwhelming majority of us seem to have settled on leadership with the apparent expectation that leadership would show us a way out of our troubles. Hold the phone for the time being on contradictions and misunderstandings while we simply go with what we seem to expect from leadership which is quite a lot. For many, leadership seems to be science, religion and business covered in the arts and rolled into a superjoint from which every successful person and organization is expected to take endless drags.

It's all quite medicinal, of course, with active ingredients compounded to address what ails us. Many objects of attention focus on "teamwork" because, presumably, leaders lead teams. Some aspects are purely organizational because, again, leaders lead organizations, right? Much of it is directed toward sales and profit because surely leaders concentrate their attention on what's important. Oops. Maybe what's important is people and if so, that's covered by leadership, too. Amazing how widely applicable this leadership snake oil is, especially because it goes on to cover a host of much smaller problems that may plague individuals, things like fear of public speaking, ways to reach reluctant followers and how to showcase a person's abilities. And on and on with platitudes and lists and studies supporting each other all the way to the bank.

Notice how business oriented all of that is? The reason is pretty obvious. Business hijacked leadership with the complicity of just about everyone. So, while we may rightfully look to leadership to provide a solution, the problems we expect it to solve may not be within range of the limitations we impose on leadership. A kind of universal fog has settled over our collective minds, obscuring greater dimensions of possibility than we allow. In ceding to business what was once and properly remains an individual concern, we abandoned not only the true nature of leadership but its most valuable potential. To correct our misconception, we must first clear our sight of obstructions placed there by business interests.

The best way to help see through business obstruction is to start looking at a period of history before businesses tried to monopolize leadership. It would be good to find that sweet spot out there in the dim, dark, distant past where business did not corrupt the perception of leadership, but I have no idea when that might have been. My guess is that it would have antedated business development entirely, which would make it virtually prehistory, which, by definition, is out of our range.

While it would be wrong to accuse business of always being greedy, it is readily seen that the tendency is not only toward greed but monopoly—absolute greed—or at least its closest attainable approximation. Medieval landlords prove the point concisely. In a circumscribed region they were the ultimate monopolists, owning and controlling everything, including the people, within the confines of their domain. Even so, perhaps under the theory that the best defense is offense, nary one among them seems to have been content to have merely everything where they stood.

Merchants, then, before or after the medieval lords, might be said to make the point better if not more purely. Better because of their direct participation exclusively in commerce, a more modern niche than, say, direct ownership of everything. The fact that somebody forgot to share the modern game plan with John D. Rockefeller, who famously owned everything from the ground to the pump, says something, too, along with the fact that merchants throughout history have benefited from government action, whether through trade agreements or war. Perhaps the point is made sufficiently that barriers to expansion are of dubious durability. In our own time, we faithfully give lip service to antitrust laws and sacrifice the occasional merger while loosening corporate belts to accommodate expanding waistlines where the stakes are highest. Excuses are as plentiful as the money available to promote them. Monopolistic tendencies, in other words, know no bounds.

But bounds were severely circumscribed as the world entered the "modern" era and it is in that period that we can find a kind of leadership unmolested by big business. In the United States, we can think of the Civil War as the rough approximation of the endpoint of this era. At that time, although the Industrial Age had been underway for decades, its dominance had yet to gel. In many respects, the nation marched off to war small potatoes and returned home Captain America. The South needed the Spanish-American War to straggle into line but essentially the middle of the nineteenth century is a convenient spot to mark the emergence of corporate America and the beginning of large scale business influence, including its sway over leadership. That period, especially as we know it from the twentieth century, deserves careful examination.

First, we need to mine the antebellum period for evidence of leadership prior to its pollution and manipulation by business. A caveat is needed, too, because the incipient leadership being addressed co-existed with its antithesis. Slavery excluded its victims from, and disqualified its practitioners for, any consideration of leadership. The labor and economic system of slavery so badly scarred the South that it would require two world wars before the region would even begin to emerge. The effects of this unnatural system continue to the present moment with impact felt throughout the nation, leaving a legacy that must yet be overcome. The effort will require further study beyond the scope of this book, but it should be noted at the outset that there is no place in leadership for any form of discrimination except to separate the willing from the slothful. Unfortunately, that leaves a wide gap, but we will demonstrate that leadership is capable of successfully bridging the divide. In fact, it is within the nature of leadership to do exactly that, one of the reasons that the twentieth century was lost in some important ways.

A broad-brush caution needs to be stressed. Although the trajectory of this book is definitely positive, we will be mired in deep negativity before the turn is made. In seeking an approach to leadership, we must defer consideration of its definition until a factual basis has been established. Perspective and context are equally important and equally elusive without first delving into the past. We must never forget the tendency to relegate leadership to workplace issues and business. Scrambling to earn a living reinforces this tendency as does its conscious development by business in its overwhelmingly successful effort to corral leadership during the twentieth century. Commentators have largely been unable to break the hold over thinking about leadership that business continues to exercise. Altering that mindset is an important objective of this book and it will require weaving through the thicket of misperception that characterizes the subject. The route is littered with conditions and circumstances that require interpretation.

Having established a period of United States history prior to the Civil War as a beginning point for the consideration of leadership, it is necessary to make clear that this should not be taken to mean that it is the only point of departure for relearning leadership or interpreting its history. Leadership, after all, is a universal quality; it was not created in any one place but arose independently everywhere. We begin at this point in American history for three reasons. First, it contains the elements needed for our study from a relatively late point in history, as opposed to trying to ferret out incipient characteristics from, say, eighteenth century French villages or even Paris. Second, by using the American example, we can thread our way with some sense of continuity into and through the twentieth century, during which American business thought seems to have dominated the western world. Finally, the period involves material more readily available to my limited knowledge. I trust that anyone who might happen to read this will sympathize and recognize the universal application of its elements.

The necessary sense of freedom and independence was also available in abundance outside of slave states in antebellum America. While there were certainly poor people and ne'er-do-wells, two distinctly different, if disadvantaged, groups, it is generally recognized that success was available to those with the requisite energy to pursue it. In an era of few statistics and government reports, there is an abundance of evidence in contemporary literature of all types. Journals, newspapers, travelogues and even novels attest to effectively applied energy of Americans, both native-born and immigrants. There was despair, too, even chicanery, that sometimes made the pursuit of financial success a turbulent adventure with failure always on the table, but so, also, hope, and typically with it, modest prosperity and sometimes rousing affluence. How so?

With considerable justification, freelancers sometimes claim to exemplify the original work paradigm stretching back into prehistory. Self-employment, they correctly note, was the original form of employment. Working as an employee, they point out, came much, much later, after businesses developed and grew in size to dominate the market. If that sounds like the twentieth century, that's because it is, an atypical period in the work history of humankind, according to freelancers. Freelancing and small-time entrepreneurialism were exactly the primordial soup in which workers in antebellum America found themselves and prospered. And it is there that we can begin to distinguish characteristics of leadership and identify specifics and patterns capable of providing both an explanation of what was happening then and a contrast with later developments. Ultimately all of this will point to conclusions applicable to our future.

Let's start with ever-important context. In many cases, freelancing and entrepreneurialism were virtually indistinguishable during the antebellum period. The two approaches shared numerous characteristics not the least of which was independence. Workers, whether freelancers or entrepreneurs, regarded themselves as workers, neither perceiving separation from other workers nor inclined to invent separation. They appreciated independence but as yet did not revel in it or make a fetish of it because they shared independence with so many other workers. These workers might be said to be less ruggedly individualistic than contributing members of a community. And they readily acknowledged that their own community was part of a larger community which, in turn, on and on, was part of a yet larger community. (Without dwelling on slavery or explaining further, the astute observer of American history can see fundamental differences between these northern workers and citizens of southern states.)

Their independence was collective as well as personal with the concept of responsibility accruing to each aspect and endowing subsequent generations with the will to oppose whatever force rose to sever their sense of community. It was thus that they gathered the skirts of national honor and human respect and later marched to war and later still squirmed uncomfortably with the rise of big business and periodically justified resort to utter resistance. But this gets ahead of the story, which, for the moment, is one of gestation, nurture, growth and harmony. It was deep inside of each worker and within their collective consciousness that workers nourished the qualities that made them successful and manifested leadership.

Courage played no small part. In an era when the depredations of disease and all manner of external misfortune could unexpectedly terminate life, not to mention mere plans, it was a personally momentous act to move away from family members and familiar surroundings to strike out for what could prove to be difficult circumstances with unknown consequences looming ahead. But the element of courage must also be understood in the context both of need and the examples of others as well as sheer self-reliance.

Plucking characteristics out of the context of their inception, growth, utilization and benefit does a disservice not only to history but to those who presently would gain from their study as a unified environment. It is all too often the case that we dissect components of leadership without regard for how they function together; they certainly do not operate apart from each other. At best, such results might be good for one-off tasks, mules, so to speak, good for hauling but socially inept, incapable of reproduction into any sort of future. Unfortunately, that's all that many businesses seek. In their natural habitat, which should be recognized as the world at large, the environment increasingly common to all of us, the characteristics of leadership attributable to workers of antebellum America make sense now as they certainly did then.

Being careful to keep this background in mind, we are required to examine some of these early characteristics of leadership separately in order to comprehend a richer experience later. Sticking with fundamentals at first will ultimately facilitate a wider view later. What element is more fundamental than grit?

In many cases, the generation that came of age at the time of the Civil War was still occupied in clearing land. Those already settled in rural areas, villages, towns and cities had awareness of the frontier close at hand in the direct memories of their elders. There is little that requires more tenacity than clearing land of trees and rocks to make it fit for cultivation. To be successful, you need a dream along with the energy to pursue it and these settlers could boast of both. They quietly went about the business of fulfilling these dreams by expending the energy required for completion. Abraham Lincoln's well-known story explains this nicely. But we must remember that however isolated Abe and his early family were from close neighbors, such people as there were in the vicinity contributed to the well-being of others. If one family had what it took to survive and prosper in the slightest, they helped others to do so. In our urban-centered world today, we need to remain aware and respectful of the rural roots of our nation that involved so many of our immediate ancestors.

Crossroads communities populated by these hard-working and effective people added the touch of social interaction that expanded cooperation and opportunities for all of them. Elementary specialization began in this context; blacksmithing is a great example. There you had a person uniquely fulfilling a valuable role in the community in response to local needs. The smith's labor was hard but he was also an innovator, not yet a technician, but someone who could be relied upon literally to make one-of-a-kind solutions to all sorts of problems that arose. His work was appreciated and he never tried to stiff a customer because his customers were his neighbors from whom he acquired food and with whom he lived in harmony.

This feeling is not dead in America today. I recall the time I was driving through rural Georgia on Christmas Eve when my Fiat broke down. You can imagine the stock status of a Fiat part in that neck of the woods. A small-town mechanic made a part in his shop and got me on my way without robbing me in the process. Like the mechanic in Georgia, it would never have occurred to the blacksmith to take advantage of anyone under any circumstances. The feeling of attachment to and cooperation with others was too strong to admit negative actions.

The rural village was a hub of communication with lines extending into the countryside and small towns and cities as well. Quite apart from news delivered to the village by passersby, however rare, the communication was among those who lived in the region and who visited the village even occasionally. People communicated their fears, their aspirations, their knowledge, their superstitions and their ignorance to each other. On the whole, being decent people, the outcome was positive. Thus was developed influence, some of which, no doubt, found its way into the state legislature, but more importantly, it was influence that swayed the expectations of those who listened and interpreted according to the measures of their hearts and minds. From this, the world gained everything from improved agricultural techniques to the vague dreams and wild hairs that led youths to strike out for more distant locales, often cities.

They had as much psychic baggage as clothing when they arrived in these towns and cities, large and small but inevitably larger than those from whence they came. And in these new locales they joined an approximation of the social milieu from which they had lately disengaged. The fit was not difficult; everyone was provincial. But there were important differences arising from scale and diversity and these divergences projected consequential changes. Although people could easily accommodate themselves to their new environments, they were required to deal with ideas that were different from those rooted in the comfort of their own village. Some of these ideas concerned new ways of approaching circumstances, problems and work as well as ideas themselves. Mindsets were thereby expanded by encounters with diverse values. The most fundamental values remained broadly the same across the culture. Issues of honesty and integrity, for example, while more often tested in the context of larger towns and cities, remained rooted in familiar ideals. Other ideas, such as the need for education, were boosted from a practical standpoint beyond the level associated with rural expectations. And that practical perspective itself bespoke something very new to all the recent arrivals.

Opportunities of all sorts blossomed in these growing towns and cities. Social intercourse was greatly expanded, requiring the successful citizen to navigate a wider swath of public expectation than had previously been the case in rural areas and villages. This meant greater knowledge, to be sure, but also accommodation and comfort based partly on confidence in his own abilities and partly on a lack of fear of unfamiliar ideas and people. Diversity was to be encountered in greater abundance and one had to be able to appreciate it for the value it presented instead of shrinking in fear, dogmatically clinging to the old and the familiar.

A big part of what was new concerned work. Repercussions from the trajectory of change in this essential aspect of life continue to plague us two hundred years later but for now we need only concern ourselves with what was in the process of emerging. The earliest stages of these changes progressed so slowly as to be barely distinguishable in a single lifetime and it is more than a quaintly peculiar observation that changes in the methods and context of work grew directly out of the rural values of the workers who were beginning to gravitate into cities. In fact, changes that were evident are attributable primarily to magnified scale and the impact of more profoundly changed social conditions. It was the interaction of work with the changed social circumstances associated with the move of workers into cities that set the stage for the modern future and the juncture that we must examine.

Wait a minute, you may be protesting. Movement into cities had been occurring literally for millennia. And truly it had, often on the cusp of the rise of great civilizations. But what is unique in this process underway in antebellum America is what followed: wholly unprecedented change in character as well as appearance. We can recognize many components with direct influence over existing leadership conditions in all organizations. But of immediate importance is spotting the factors as they applied at that time. Understanding these factors is important unto itself as relative and related facts, but it is also important to comprehend their moral nature and ethical application.

Morality and ethics, in fact, constituted the most fundamental strength that workers brought with them when they migrated into cities. These fundamentals meshed readily with existing urban culture and helped perpetuate a strong bond among them, both as workers and as customers. Workers were very aware that they were both and, not being removed from either category, were inclined to treat each other respectfully in all contacts including business dealings. There was a clear understanding that greed crossed the line of decency, that a just and equitable opportunity to earn a living without lordship over another was the right of every man.

Imbued with a sense of human decency and living in close proximity with each other, workers were aware of the needs of their fellow workers and sought to tailor their work to fulfill those needs. It was in this context that innovation was as much an outgrowth of a moral social order as a calculated business practice. Note the communication inherently necessary in this process. Having been conceived in this manner, innovation perpetuated itself in like process and spread, if not at Internet speed, then certainly at a pace appropriate to the age, as if "best practice" were the non-copyrighted and non-trademarked extension of decency. Power to the public domain was the commonly understood but specifically unthought means embedded in doing business and doing right. The results were simply aspects of everyday life.

With these moral and ethical precepts as a basis, life and business were apt to continue along the lines of decency and modest prosperity. With a commitment to consistently hard work, a person was virtually guaranteed to be able to live, a simple assurance that was withdrawn in later years, as will be seen. But at that point in history, workers, self-employed, self-energized and self-directed, enjoyed the ability to live and work among their fellow human beings, workers all. The specifics of that work deserve at least a quick examination.

Quality, later to become an advertising campaign as "job one," was a fundamental fact of life for antebellum workers. Those huge pieces of furniture created for rooms built without closets serve as an excellent example. This furniture was so solidly constructed that it has endured for centuries, stubbornly refusing to yield to generations of adults cramming it full and children swinging on its doors. The artistry that embellished it was also of high quality with carefully symmetrical flourishes of such extravagance that attempts to duplicate them in later years proved extremely difficult. Not that many people wanted to duplicate them. In fact, these articles of furniture, so highly prized early in the twentieth century by antique dealers and the few who could afford to furnish their homes with them, fell out of favor during the later years of the twentieth century. Lately, nobody wants these finely wrought works of artful but outdated practicality, not the mega-millionaire inhabitants of colossi, not even the merely affluent in their McMansions and certainly not the inhabitants of apartments behind McDonald's.

It's true enough that things change, but have we supplanted the quality of old with the disposability of the contemporary rush to the next great thing? To answer that, let's look a little deeper at that fine furniture maker. Chances are that he also made wooden toys for his children and those of a few close acquaintances. We've seen some of those toys in museums, the few remaining relics of what today are known as "side projects." When I was growing up, after a trip to the museum, we would go home, where that same craftsman's work was still being used as practical furniture. The same quality that went into the furniture was also evident in the toys of the era; it's just that few of them survived the testing laboratory of childhood. But the transmission of a passion for quality to children who exemplified it in their own adulthood proves the point. Now let's literally dig a little deeper.

We often think of archeologists uncovering shards of pottery as dealing with the remains of ancient cultures, forgetting that they also dig for evidence of a much more recent past. Shards of pottery are regularly found from the period under consideration here and they require us to understand that the fact that they were broken when discovered does not detract at all from their quality. It simply means that the cups and plates were considered of proportionately less importance than the cupboards where they were stored. We continue to deal with relativity today and appreciate prioritization. But those broken pieces of excellent pottery mean other things as well.

They reflect the fact that the furniture maker acquired items from other craftsmen who also projected quality in their work. He trusted that his tools were of equal quality to the furniture he made with them. He purchased food for his family from vendors he knew and trusted to provide fresh agricultural products. His wife bought cloth and all sorts of household implements from merchants who priced their goods fairly and who wanted to sell only quality items that would bring their customers back for more when the need arose. When the furniture maker transported a piece of his work, he was sure that the wheels would not fall off the wagon he bought from someone else he knew or perhaps had bartered with, quality for quality.

None of these things survive and yet they all do, numinously embedded in a psyche that both appreciates and perpetuates their principles. Our tendency at this point is to leap too far ahead into the future, even to present time. We see, for example, the concept of networking at play in a much earlier age, an abstraction we might be inclined to associate more with LinkedIn and even cocktail parties but which first routed through spit 'n' whittle clubs meeting in general stores or on the courthouse steps. Backtrack, first. Notice how that same networking produced wide-ranging benefits in its earliest days and immediately beyond. This is how credit sprang up for the benefit of known workers and extended to neighbors, with merchants faithfully marking ledgers and customers faithfully paying what was due. This same relatively primitive form of business was still being conducted at least as late as the 1980s.

All the while, much else was taking place in the background of that earlier period. Children were schooled both formally and informally with emphasis on knowledge growing steadily as its importance gradually increased both to society and individuals. Apprenticeship kicked in at an early age for boys. Although the downsides of this system were thoroughly and scornfully explored by Charles Dickens and others, not the least of whom was Benjamin Franklin, apprenticeship transmitted solid trade knowledge to continuing generations, again, up to the present time. As other forms of education more widely disseminated different types of knowledge, choices developed that helped drive innovation as youths realized greater opportunities to exercise their interests. But none of this would have been possible without the apprenticeship system and its continuation provided solid circumstances for less adventurous young men.

Whether content to survive or restless for prosperity, young men who arrived in cities either brought families with them or formed families after arrival. Or not. Remaining single for various stretches of time was typical as was eventual marriage and growth of families. However the issue played out for an individual, families formed further important context quite apart from household functions. Sharing within a family not only transmits all sorts of knowledge especially valuable to children and sets examples for future emulation or rejection, but plants the seed, literally and abstractly, for continuation that is necessarily modified as it advances from generation to generation.

An important part of the family context involves ideas, some of which are not strictly generated within the family but originate in the wider context of community uniquely accessible by families. Either when a family is formed within a community or when it joins the community fully formed, it is privy to and contributes to an amorphous, ever changing atmosphere with substantial influence over its members. No member of a family exists wholly within the family alone, but is also a part of the community and as such both contributes and accepts influence from individual members of that community and, overwhelmingly, its overall composition. No parent who wants to insulate their children from outside influences is entirely capable of that feat; their children will accumulate knowledge from everywhere. But that restrictive parent will influence other parents and their children, upon any exposure to others, will contribute something to them, as well.

Religion feeds this community consciousness and certainly impacts individuals, and those same individuals influence the transmission of religion within their community and thus hold sway over the extent to which religion has influence over others. Religion, of course, is but one example of many; politics is clearly another. Both beg for the kind of restraint and accommodation that build tolerance and respect within a community. Given that both religion and politics are sometimes key ingredients associated with immigration, they take on added significance as community context. Despite the tendency of ethnic groups to congregate separately, the very existence of multiple identities in a city, especially in a country like the United States that has historically served as home to members of numerous religions and nationalities, meant that interactions among disparate groups and individuals constituted an opportunity for growth well beyond the possibilities offered in rural areas. But they also presented the potential for problems.

It would be a tremendous disservice to ignore problems faced by workers in the antebellum period. Issues related to immigration, religion and politics constituted some of the most difficult to address. Despite the fact that the great body of the American people is composed of immigrants or those with recent immigrant forebears, nativism remains a serious problem sometimes resulting in violence. When combined with religion and politics, the mixture is especially onerous. Now, as then, when public policy issues are defined along ethnic and religious lines, even more problems are created, with majorities not infrequently abusing minorities. Despite the white-hot intensity of rhetoric that has sometimes incited violence, Americans today need to understand their passions in terms of history and should not fall victim to the belief that contemporary evidence of ethnic and religious violence is somehow excusable relative to the past. Instead, we need to examine how antebellum Americans addressed their differences apart from violence.

Politics was one way and it was a way that sometimes also led to violence. Historians occasionally rise to remind us of the extreme political positions that separated Americans well before the Civil War, pointing to a trove of evidence of incendiary behavior, much of it related to questions of immigration and religion. As bad and as periodically violent as it was, and despite occasional violence on a larger scale, the fabric of civil life was not completely torn until the Civil War. In the meantime, war entered the equation involving immigration, politics and religion in the form of a war with Mexico (1846-1848). Americans would do well to confront the background of that war to help us understand ourselves better and appreciate more constructive conduct. The United States has never lacked warmongers but has also successfully woven the fabric of peace. The fact that this country has managed generally to cohere with strength boosted by cooperation among diverse elements of society speaks to a means of turning conflict into the functioning outline of solidarity with extensive benefits.

Within the parameters of that larger antebellum society and directly related to some of the factors of its success, other problems festered. If we laud the apprentice system, for example, we must also admit enormous failures with it, keeping in mind that our critique has the benefit of hindsight. In tipping our hat toward the exposure of abuses of apprenticeship, we must also acknowledge that some of it was outright child abuse with child labor being an increasingly disgusting feature of the nineteenth and early twentieth centuries. While child labor remained a long-term problem, it should be noted that it escalated in relation to the nature of work during the nineteenth century and was less of a problem in the earlier antebellum period during which a legitimate effort was made to prepare youths for successful adult lives as opposed to crass exploitation for the immediate gain of greedy capitalists. While this fact cannot excuse abuse, it does explain the ongoing process and keeps context paramount.

Unexpected tragedies, mostly associated with work, account for another category of problems encountered during the antebellum period. It had always been that way and its nature was about to change significantly with the onset of modern workers' compensation laws, but in the meantime we witness another growing problem in the midst of a changing process. During a yet earlier period, when workers became disabled, often through some activity related to work, they were cared for by an extended family who saw to the needs of the injured man's family as their own. As workers migrated to cities unaccompanied by extended family members, tragedies became even more devastating without the presence of extended family to provide assistance. Instead, we see an awkward period of community involvement that proved inadequate to what was becoming a growing problem. A mitigating factor that gradually receded as time advanced was the fact that, as independent workers, men could often manage the terms of their work in order to minimize risk to themselves. That could take the form of acquiring cooperation in a task, refusing to perform a requested task or innovating a means of approaching a potentially dangerous task. As the terms of work changed in future years, we see more and more devastating injuries to workers who faced life with little support. Sudden dislocation was not exclusively caused by work activities, however, and all sorts of naturally occurring events such as floods and fires could suddenly throw the lives of workers and their families into chaos, requiring responses beyond their means of coping adequately. Ruined land, destroyed tools of trade, even disasters befalling numbers of their neighbors could prove devastating. Lives of the period, particularly those of workers who engaged in strenuous activities, were shorter, sometimes perhaps mercifully truncating a lifespan that was financially unprepared for elder years when extreme difficulty was impossible to bear.

The context of life in cities, that same change that was praised earlier, could itself be part of a growing process that proved problematic for workers and their families. Sanitation, for example, was notoriously difficult to secure in cities where all types of waste were deposited in streets that were little more than open sewers. Medical care, often lacking entirely and inadequate even when it was available, as yet had no developed scientific basis and also contributed to the difficulties of life in cities. And while these were problems of process that were addressed by countermeasures of other processes, including laws and scientific advances, they nonetheless represented difficulties that required constant attention from workers. The antebellum worker experienced work to some extent under his own terms but to an increasing extent he found himself obliged to accommodate those for whom the work was performed.

Everywhere a worker looked there were problems to be overcome. Even the terms of his work were onerous despite being self-directed as an entrepreneur. The hours were long and he was apt to be in demand even in off hours, responding to the needs of others as they arose whenever that might be. Without understanding the nature of the conflict, antebellum workers were caught in a complex web of change that represented both the juncture of centuries of work and further startling change that was imminent, even as they were ignorant of the momentousness of their position in history.

If an entrepreneur is one who establishes and instigates business under his own direction and a freelancer is one who performs specific work of limited scope and duration at the behest of one who pays for his time or services, then which was the antebellum worker? Direct employment was growing very slowly and there was little of it apart from apprenticeship outside the experience of the antebellum worker caught somewhere between freelancing and entrepreneurship. The antebellum worker was part of both freelancer and entrepreneur but fully neither. Some of the dilemma could be attributed to escalating specialization. Although tendency toward specialization had been increasing for many years, it rose significantly as population centers grew. Perhaps that is an obvious conclusion that might seem pointless, but it quietly indicated a profound change afoot in the character of labor and the lives of workers.

The point of departure between freelancers and entrepreneurs was the development of professionalism in the division of labor as workers and owners began to separate. While there was once an entrepreneur who might complete all tasks associated with a product or service, now there were specialists who concentrated on a single aspect. The classic example is that of cordwainers who made shoes and cobblers who repaired them. At a distance of centuries, that may seem like a distinction without a difference but it acquired additional significance when, as an entrepreneur, the cordwainer began to employ cobblers, either directly or, likely in the beginning, as freelancers, to supply specific elements of the process. Concomitant with this change was the development of professionalism wherein freelancers sought to distinguish themselves by proffering the assurance of quality tied to a specific task, the construction of soles, for example, or uppers, to the exclusion of other portions of the shoe. The entrepreneur in this process was an owner who hired workers or arranged for freelancers to supply various parts of his needs. Thus, ownership of a whole enterprise became a critical difference between entrepreneurs and workers, whether directly employed or freelancers. In time, this important distinction would become immense, not only in terms of employment and economics, but also of leadership.

The element of growing conflict in this process is obvious. If struggle always becomes a point of departure, it then was poised for epic dimensions of conflict. While the earlier point of departure might have been between survival and death, it was now between survival and prosperity. The conflict, for a while tenuously in equilibrium in antebellum America, became skewed in a desperate conflict that characterized what is sometimes considered a permanent struggle between the haves and the have-nots. That is the point at which business begins to take control of the United States in a way that we understand in twentieth-century terms, the kind of conflict that measured wealth and arms, ownership and labor, dictatorship and democracy, the classic conflict between greed and altruism in which rights become an issue as much as production and politics become subservient to ideology. Leadership, in the meanwhile, is hijacked for the use of business but it doesn't die out as an independent force, becoming merely misunderstood and overshadowed in the public view. But we are not quite to that point in the story, with two more factors of antebellum importance to consider.

The frontier has always been important in American history; it remains so today and will continue to be important, even surprisingly more important, in the twenty-first century. During the antebellum period, the frontier retained the kind of traditional importance that we anticipate. It is not simply, as we sometimes like to think in shorthand, that it was a matter of people picking up and moving west. What they did after they moved is very important to leadership because they moved not only their families but the complex panoply of ideas in which they were immersed before leaving. It's a little too simple to say that sod busting became the new land clearing because, as occurred earlier, complex associations were formed as villages were established and became towns and cities. Therein we see the whole process taken a step further to plant ideas, methods and organization as much as crops.

It is not coincidental that the presumed closing of the frontier—its actual westward closure—and the assertion of control by business occurred more or less simultaneously. Any number of events could serve as the symbolic reference to the closure of the frontier. Establishment of a long-sought transcontinental railroad in 1869 fits the bill perfectly, timed for the frontier closure, business development and changes in the practice of leadership. The United States seemed to expand territorially in leaps and starts. Vast amounts of the land acquired in the Louisiana Purchase, for example, remained wilderness for years before development, even before complete exploration. But it is both symbolically and factually important that armed conflict erupted between proponents of slavery and those of freedom over the very issue of occupation of western lands and that the conflict played out prior to the Civil War. To the extent that there was open frontier, workers took advantage, filling it with their ideas as well as their bodies.

As shelters were built on newly opened western lands, as crops were planted and as enough settlers appeared for recognizable human society, the ideas that people brought with them replicated and adapted in the new conditions. The ultimate success of these settlers and their communities proved the vaunted southern self-sufficiency to be a lie, but it also did much more. Ideas became manifest, not in rigid duplication of what had gone before, but in fluid means capable of serving the interests of people as those interests themselves adapted to change. Newspapers, for example, had always been important to Americans and reflected highly charged partisan values as well as the commonweal, all of this in a process that promoted widespread success.

Solutions bore fruit against the multitude of problems encountered by people as they first moved into incipient, then ripening urban areas before striking out again, perhaps as a new generation, carrying the ancestral memory of previous expansion. Solutions were often stressed under extreme conditions of new surroundings but overall those vexed by circumstances overcame their difficulties thereby providing stronger foundation for days ahead. As it turned out, all this work, this accomplishment, was preparation for the future. While there is an immediacy element to work in that it can be fulfilling, often obsessively, it does much more than supply the bread of the day. The nature of work is such that its overwhelming component is an embedded regard for the future. As people closed the physical frontier, they were not running away from their problems, they were acting out a solution found deep within them. They carried the terms of that solution forward and allowed them to act afresh against newly encountered problems. They were certain that they would prevail because these people saw themselves as being successful. They saw themselves as large against the backdrop of come what may. This was no idle or misplaced self-confidence.

From the beginning, as positive outcomes followed one upon another, antebellum Americans gradually built trust in their abilities and faith in their fellow human beings. Hard work was required and there were setbacks, problems aplenty from morning to night, but they survived and prospered. Then, they took that fateful step that established a path for solving the greater quandaries that awaited in the future. That path, the key to solving problems, was found in how they addressed them: through leadership, slowly at first, then rising, extending, venturing forward.

The initial step was simply one of biology: they had children. It is impossible to contend that the lone farmer acting utterly without contact with other human beings in the wilderness was a leader. He might have been disciplined and ethical but these qualities, however laudable, are not of themselves leadership. But when that farmer gained a wife and children, leadership made its first appearance in the most fundamental way possible. Without venturing to define leadership, it is advisable to take a moment to examine this most basic aspect of its manifestation.

Working alone, the farmer was simply himself. As he taught a son the essentials of farming, he was beginning to be a leader at the most elemental level. Upon seeing how it was possible for four hands to accomplish more by working together, even when two of those hands were small, the farmer was adapting his surroundings by applying intelligence, and when that process became conscious, it was leadership. The same applied to the farmer's wife as she began to lead her daughter in the execution of other tasks. This is not intended as a portrayal of gender bias but simply as an acknowledgement of historical development, one that has since begun to break the mold of gender restrictions entirely.

Having begun to provide conscious leadership, the farmer and his family, including their children, were prepared for the next step: the gradual move into village life, then into small towns and cities. It was among other people in these communities that leadership began to bloom. Again, without seeking to define leadership, apart from situations involving only two people, it is fundamental that organizations and societies are required for the manifestation and function of leadership. It does not matter whether or not there is formal acknowledgement of these organizations and societies; tacit understandings suffice. Rigid hierarchies such as those found in military organizations are completely unnecessary to the operation of leadership. In fact, it is helpful to the advancement of history that these newcomers to community life at first had no formal relationship that could impose itself and channel leadership toward specific ends. As we have seen, these migrants did not even have a boss at work, instead supplying that function from their own considerable resources that were more than adequate for the job.

If at first there were no bosses and little if any structure to other relationships, how, you ask, was there any leadership? The answer, which I hope has been made clear, is to be found in the execution of those relationships. Here we come to a critical juncture in our understanding, one that will be important for the remainder of our consideration and for the future of all people, especially workers. It involves the fact that people generally get along together and prosper as a group. Left to their own devices without external control, people will voluntarily organize their efforts to maximum efficiency with each individual contributing according to their ability and interest. If that sounds a little too sweet for today's greedy world and too idealistic to fit the conflict we encounter daily, ask yourself how it was that human beings formed associations to conceive and execute plans for community life in the beginning, during a period of prehistory. What we are talking about in antebellum America is simply a larger version of primitive association, a version governed to some extent by laws and customs but one that made progress for itself based on dynamic relationships that changed over time as needs were confronted. Think about the value of cooperation and how, on a societal scale, cooperation can be leveraged for overall success. Thus, curbing selfish tendencies among some individuals was part of the nature of leadership, functioning, if necessary, "in the wild" beyond the laboratories of laws and established custom. Keep context closely in mind: the environment and atmosphere in which leadership functioned.

Free functioning explains how leadership addressed problems and created the success and prosperity that ultimately attracted efforts to control and channel it for the benefit of a few exploiters. In the coming segments, we will see how businesses sought to co-opt the cooperation that had been established and lasso leadership for their own purposes. But as we proceed with this story, let us not lose sight of the fact that it was by leadership being exercised through cooperation that antebellum Americans confronted and overcame problems. As we will see, everyone has something to contribute to the larger good and everyone is a leader in some respect. It is when we began to listen to those who told us otherwise for their own benefit that we lost our way heading toward the twentieth century. But it is critically important to remember, as this story unfolds, that the same qualities that our forebears exemplified in such stunning, if fundamental, ways exist within us today. We will trace their existence through the dark days of outside dominance and ultimately see how leadership remains the answer for us on the cusp of an uncertain future.

Nineteenth Century: Period of Transition

It is impossible to view the nineteenth century as anything but a period of transition. The century opened with the antebellum features we have discussed but things soon began to change and the pace only quickened as time progressed. What happened is easily enough observed and explained; it simply comes down to change and the reaction to it. What had taken centuries to mature suddenly came under increasing stress from multiple directions.

The fact that reaction to the changes was inadequate and misdirected should be no surprise. What had worked so well for so long could not easily be cast aside, especially when no reason for doing so could be found. Everyone expects a degree of inconsistency and some problems along the way. It was assumed that these new difficulties could be handled through adaptation as obstacles always had been. No one could have foreseen the magnitude or rapidity of change that was afoot. Lack of prescience then should be a caution for us today.

The political sphere offers a large example, particularly because of its numerous tentacles that extended deeply into many facets of public life before the century ended. Political turmoil was certainly historic, having marked all quarters of Europe for centuries. Americans, too, had found no immunity from these conflicts despite the country's physical isolation. As recently as the fourth quarter of the eighteenth century, the fledgling country had been embroiled in a protracted war with the preeminent global power. Soon afterward, its key ally descended into social and political chaos before the United States found itself once again at war with Britain. And these were just the most obvious aspects.

Below the surface, Americans had always experienced political and social conflict. Political differences in and of themselves were less consequential than their deeper economic and social causes. Perennial differences over financial policy with a long-standing dispute over a central bank, for example, merely reflected a much more significant class conflict. Political disputes, even while Washington was president, had often been intense and bitter, sometimes monumentally so, but the bank conflict, involved as it was with finance, wealth and a fundamental direction for the country, managed to connect all the dots.

The dots were elements of financial, social, economic and class distinctions, all of which were undergoing escalating change. Although the United States remained primarily agrarian for many decades, the shift toward cities that we discussed earlier intensified. And while it is true that westward expansion, famously encouraged by newspaper publisher Horace Greeley in 1865 but pushed by others years earlier, peeled off hardy adventurers as well as the down-and-out with their backs against the wall, even more remained in cities, fueling growth and intensifying change. By mid-century, cities were populated with citizens who had lived there for decades and who had formed habits and perceptions dramatically different from those of the forebears who originally migrated to the cities. Size alone would have been a great influence, magnifying, concentrating, expanding in some respects while being restrictive in others. The whole mix of elements must be considered. Again, context is of paramount importance.

Specialization increased, at once narrowing focus and expanding opportunities. Improving technology fed the trend and, with notice of increased efficiency, the pace of these changes only multiplied. As work specialized, so did economic circumstances, so that workers gradually found themselves isolated into niches to a far greater extent than ever before. Literally capitalizing on these changes, specific businesses grew.

Whereas the entrepreneur or freelance, independent worker had navigated earlier economic circumstances on their own or in loose conjunction with others, the narrowing interests that drove life in cities caused dramatic changes in the workplace. These independent workers became business owners or employees of businesses. At this stage, most businesses were owned by individuals or families, a fact that extended the transition of American life for decades. But there were enough combinations and larger businesses to change the character of work for workers. Even small businesses gradually adopted some of the practices of larger businesses as they sought to compete more effectively. This pressure is critically important because it is a strain of business development that has misled us into twentieth century disaster, an object of attention in future pages.

Reading ahead in your mind, you might be envisioning the days of really big corporations to follow, the time of robber barons and labor strife to come during the nineteenth century. But first, consider the life of workers in social terms. Recall specialization and the concentration that it brought within businesses and think how that impacted the lives of workers, narrowing in unplanned ways the scope of their existence. It forced greater economic restraint, eliminating the elbow room in which previously independent workers had developed their own directions. Consider, too, the restrictions that once proud, expansive citizens faced as they were expected to enter specific niches and remain there. The vaunted economic and social mobility enjoyed by previous generations of Americans took a serious hit.

And these restricted circumstances were the receptacle for immigrants being funneled into American life, a far more restricted life than previously experienced. Families reacted within those constraints, living at locations in the city befitting the specialized work of breadwinners as well as ethnic divisions that were becoming more pronounced as opportunities based on initiative gave way to rigid expectations and restrictions. Life as an employee was far different from life as an independent worker.

Everything about city life changed for workers as the nineteenth century progressed. Whereas independent workers in cities often lived in quarters above their shops, employees were forced to commute from their residences to the businesses that employed them. Living conditions in many cases deteriorated with the slums typically associated with city life exploding on the scene with tremendous impact on bodies, minds and spirits of workers and their families.

Simply moving about cities in the nineteenth century was a nightmare featuring mud that combined dirt, tons of horse droppings, household waste, human excrement and foul water. In dry periods, all of this pulverized into a toxic dust that covered everything. In these circumstances, the availability of fresh, safe food was always a problem and health problems regularly erupted with only primitive medical resources to confront them. If any of this is apt to make you think longingly about agrarian pursuits or even the South, think again.

It bears repeating, if only briefly, that the South was a region apart from other sections of the country. Extensive reliance on slavery separated it not only socially, ethically and economically, but also agriculturally. Invention of the cotton gin, patented in 1794, made cotton cultivation more economically viable, especially where slavery kept labor costs low, a major incentive to continue the institution. Growing cotton without alternating with other crops ruined land and encouraged plantation owners to press westward in pursuit of fresh soil for their profitable cash crop. Expansion of cotton with its concomitant use of slavery produced political and social tensions throughout the country as northerners increasingly opposed these plans. The point here is that southern agriculture became something quite different from farming in other parts of the country. While yeomanry accounted for the majority of whites in the South, plantations were economically dominant and overshadowed mixed-use farms with their assortment of agricultural products intended for independent consumption and limited distribution. In this respect, as in many others, the South differed from the North. In addition, there were few cities of any notable size in the South; such cities as there were tended to be coastal ports like New Orleans and Charleston, where cotton was aggregated for shipment to textile mills elsewhere. The basis of the South, while once traditionally agricultural, descended into a plantation-oriented culture that intensified its peculiarities over time, increasing its separation from other sections of the nation.

Any temptation that urban dwellers might have had to return to their agricultural roots in order to escape unfavorable living conditions was stymied by practical considerations. Not only had the skills of their forebears not transferred to subsequent generations unaccustomed to farm labor; there were other compelling restrictions. Lack of land was one of them. Family farms had been sold or passed into the hands of distant cousins after migrants made their way to cities; there was no longer a place for them on the land. For immigrants without families attached to farms on their new continent, there was not even a remote chance. Purchasing farmland was typically out of the question with the development of a paycheck-to-paycheck proletariat that continues to this day. Workers required employers. Ready cash that sustained workers in cities was not grown on farms. Over time, all of these problems increased.

Looking ahead, by the time Upton Sinclair serialized The Jungle in 1905, it was impossible for starving workers in cities to find employment as day laborers on farms where northern families were barely getting by on their own land. Few other workers were needed; for the few who were required, room and board sufficed as pay, and even those positions were scarce.

Things were even worse in the South. After the Civil War, yeoman farmers continued as before, except that now they eked out a living from the soil in an economy that had suddenly become chaotic and inhospitable to absolutely everyone. Plantations were typically ruined, many of them split apart into small farms worked by tenants or sharecroppers. On those that remained intact, the labor force was vastly reduced by the migration of former slaves into southern cities. Free laborers who remained on the land had to be paid, and plantation owners had no means of paying. The resulting sharecropper system gradually bound white and black workers alike to the land under the terms of unending debt that ensured generations of utter poverty and misery. It was a situation that continued in some cases into the second half of the twentieth century and further isolated the South, leaving it outside our economic considerations for decades.

All of this necessarily leaps ahead, leaving a review of the Civil War still required. Too often viewed through the lens of military or political history, the Civil War had tremendous social and business impact that continues to be felt. Quite apart from the issue of slavery and freed African Americans, the Civil War wrought significant change to the lives of citizens and the conduct of business. From the outset, it is helpful to understand the conservative underpinnings of the Civil War, an ideological functionality easily lost amid the trappings of patriotism, the drive toward ideals and the hopes and aspirations engendered by abolitionists. Those abolitionists appeared more radical than conservative but, in the final analysis, succumbed to conservative values even as their quest shifted toward newly minted objectives that would characteristically be seen as liberal in the years ahead.

Lincoln, too, should be viewed as a conservative despite remarkable departures from precedent, not a few of which are associated in the minds of modern conservatives with the big government they so despise. Part of this stems from the fact that Lincoln, to save the union, first broke it. To salvage American ideals, he first savaged them. Consciously or unconsciously, we see this method employed with similarly conservative results at another time of crisis by another president three score and twelve years later. But Lincoln was the original and the mightiest; his chief accomplishment was the founding of modern America, and all since has been only perpetuation.

At the time of the first founding of the United States, many people, including some of the founders, didn't think the unity would hold. The states, first conceived and acting as sovereign entities, really had little unity beyond language, distance from the mother country and some hotly disputed interests. Many expected the whole shebang to fail, and it probably would have had they actually tried to maintain unity. Efforts to repair ruptures and create real unity resulted a few years later in the Constitution, whose flaws permitted the structure to glide toward a crash instead of nosediving immediately.

The brilliant, enigmatic and frequently self-contradictory Thomas Jefferson typified one set of contending views while his erstwhile buddy, the self-righteous and dogmatic John Adams, represented very nearly opposite views. Neither man was alone in the fight, and each passed the torch to another generation who continued the dispute. John Adams' son, in fact, the feisty John Quincy Adams, made sure the elements of conflict smoldered and occasionally flared, as in his famously lengthy clash with the southern brawler, Andrew Jackson. The role of banking was a major point of contention between them but their disagreement was much wider, involving context that proved as overwhelming to the South as the superior arms of the North.

That Jackson was a slaveholder and Adams an abolitionist serves to clarify rather starkly the terms of the South's favorite lie that the war was not about slavery. While northerners originally were not uniformly hospitable to abolitionists, whose fervor tended to strain the manners and patience of civil people, those same people eventually adopted comparatively greater intolerance for the shenanigans of their southern cousins who made blood sport of a captive population whose skin color rendered them prominent targets. Remaining questions, made moot on the battlefield at the behest of southern leaders who preferred the company of slaves to that of their northern cousins, centered on multiple facets of context that enriched and stretched the conflict and that supplied definition to postwar development. Consider the many elements of urban growth and the coming of employment to replace much of the previously independent work and all its attendant social aspects. The bank issue feeds into this nicely because financial development was something needed by urban interests to promote the expansion of business, transportation, communication, education and governmental structures that are required in a complex society. For its part, the South adhered to Jefferson's view of limited government enshrined in contemporary practice. The North wanted—needed—something else. How long, apart from the slavery issue, the stalemate could have held out before breaking is uncertain, but probably not long. If we follow the rule of thumb to follow the money, both where it was and where it was needed, it would soon have led to a Northern victory without the 620,000 military corpses that littered mostly southern battlefields.

Speculation, historians remind us, is dangerous and usually for naught. We need to stick with what we know, they say, and proceed to inform us. Of course, a dash of interpretation is needed, too. Beware the great man school of biography, we are further cautioned, not bad advice either, before exceptions are commenced. Lincoln is a tempting exception, bestriding four years of American history in a way that few men master even a single hour. To study the Civil War, one necessarily must study Lincoln, and to study Lincoln one is required to study the Civil War. Neither came from nowhere, and we have seen something of the origins of the Civil War. What about Lincoln? Look in exactly the same places.

Before becoming Father Abraham, Lincoln lived a life that followed precisely the course we have described, moving from rural location to tiny settlement to small town to large city. Along the way, he matured exactly as we have described many citizens progressing, encapsulating in a single lifetime hard labor, rudimentary literacy, homespun values, broadened education and a lucrative professional career. Many associate Lincoln with log cabin poverty followed by country lawyer success, overlooking the fact that he was also a highly paid legal counselor to railroads whose estate, amassed before most men even hit stride, amounted to $100,000 in 1860s money. That portion of his legacy points toward his son, Robert, who became a Harvard-educated railroad CEO.

Our concern is with the Lincoln whose other, not entirely unique qualifications and accomplishments were dwarfed by a consciousness of singularity that understood the sprawling United States, in all its diversity, to be one nation. He understood this in a spectacularly awesome way that was unique and then he dressed himself in that nation, becoming the United States for all intents and purposes. As it happened, it also meant that for the remainder of his life, he was the Civil War. That is the Lincoln we want to consider. That is the Lincoln who could look duality in its squirrely eye and bid it hither before covering it in the cloak of himself, of oneness. The South was subsumed ever so reluctantly, fitting again as uncomfortably as it had always been in the union only to find a different nation than the one it tried to flee. What happened as Father Abraham brought forth this new American republic is briefly but importantly the focus of our attention.

The elements of the future were mostly in place before the Civil War commenced, a crucible of conflict that served to combine them in a way that would otherwise have been impossible or at least excruciatingly slow. Some of these aspects, such as the infrastructure of roads, canals and railroads, were among the factors that, before the war, the South derided as unnecessary, preferring largely to rely on its own natural waterways and less complex system of railroads. The much more urbanized population of the North contributed greatly to its advantage, engaged as it was in manufacturing and in agricultural production for food and population maintenance as opposed to the vicissitudes of cash conversion. Urban development necessarily brought social, business and governmental structures that, while operating as intended for peaceful purposes, were also poised for extended development in times of emergency. Technology, for example, was in daily use in the North to an extent and in ways that eluded the South, and being in use, could be improved and developed for a variety of purposes including the production of armaments.

The entire northern scenario fed into possibilities for the future, and the whole complex organism of private business, industry, education, infrastructure, social intercourse and cohesion and all levels of government required something modern that did not exist in any meaningful way in the South. That is, a system of finance quite apart from retail banking and gold reserves. Here, the hand of government played a huge and growing role. Secretary of the Treasury Salmon P. Chase demonstrated remarkable facility in raising and managing government funds, not only enabling the financing of the war but establishing methods and precedents that extended into the future. And it was not only the government at play, because private financial institutions were employed creatively as they had never previously been. Possibilities were opened that could serve effectively in coming years and adapt as the future unfolded.

Lincoln was behind all of this and much more. His suspension of habeas corpus is often cited as an unprecedented demonstration of presidential power but it was much more besides. Lincoln was a decision maker of large and small questions, overseer of minutiae that would tire others as well as orchestrator of grand strategy. We like to think of some of these things without considering the larger context of their meaning and especially how decisions were made to pursue those outcomes. The Emancipation Proclamation is an excellent example of complex, highly speculative strategy that produced enormous consequences that swelled beyond surface indications. It had the practical effect of immediately freeing few slaves but wrought an unsurpassed depth of meaning and moral force that strengthened the present while guiding the future. This was a big decision for life, but Lincoln was charged also with administering a great deal of death. Quite beyond the bloodiest battlefields in American history, Lincoln set policy that fed those deaths, not only through conscription but through relentless pressure on his military leaders to implement ever more aggressive tactics regardless of the deadly consequences. Then again, he toiled countless hours over every one of the many applications for clemency that he received from those condemned to death by military tribunals, ultimately aware of the activities of firing squads regularly within his earshot. He was the first president to be assassinated and the last to come under direct enemy fire in battle. It is notable that of Lincoln's fifteen predecessors in the presidency, only six were northerners. Then, as now, the South held sway disproportionate to its population and access to economic leverage. It cast a pall over an otherwise vibrant nation. Its influence needed to be broken.

Lincoln's power is sometimes cast as an aberration simply because it was unique up to that point. Actually, he was signaling the direction ahead in ways that had not been previously encountered. His will filled a vacuum. His insistence prodded results that would not otherwise have been manifested. That he demonstrated possibilities should not be discounted as simply the result of wartime authority. The need, for example, for an income tax, never before instituted in the United States, foreshadowed the future. The magnitude of contracts let by the federal government for the supply of troops might reflect the war effort but it also foretold the power, not only of government, but of concentrated effort by any organized and well-funded entity. The war effort represented the coalescence not just of government and other entities, but also of business, finance, labor and social focus in addition to moral force. Just as all of this was established, all of it was also questioned, tested and sometimes found wanting in the years immediately after the war. The Civil War was a matter of power then and power later, but also of power used and power abused, of diversity and complexity capable of yielding great results, but also tragic consequences. In the years ahead, the nation would contend with itself in ways that the Civil War did not and could not.

In the aftermath of the Civil War, the federal government famously tightened then abruptly relaxed its control of the South. This is not the place to debate Reconstruction, but it is appropriate to point out that government seized the opportunity to demonstrate its ability to exercise power when it made that decision. By flexing its might apart from the military, it was demonstrating potential power after the war as it had during it. Note was made of this possibility. By relaxing its grip, it was demonstrating potential power of a different kind. Note was made of this possibility also, and it was evident that the government could ripple its muscles in some ways while simultaneously relaxing them in others. This was very important. These notes were referred to by separate interests in the future, interests that were vying for advantage from entirely different perspectives. Perhaps we should jump ahead again, more briefly this time, to observe that this played out in the 1920s to devastating effect, not necessarily as violent in its methods as before but, in some respects, more violent in its results. It was in this decade after World War I that conservative forces were most successful in synchronizing the flexing and relaxation of government control to the most concerted advantage of business. In effect, the strength of one hand of government was used to make the strength of the other hand disappear. Restoring balance would be the stuff of the "American Century" to follow.

For the moment, we need to observe what was happening between the Civil War and World War I. What was happening was business conducted as it had never been previously and labor at work as it had never been before. It was as if the curtain had rung down on one act and the actors immediately rushed without a break into the next. The size, the scope, the implements necessary were all immediately at hand, released now for peaceful purposes, especially making money, and not just some money but more than ever before. This meant that the shift, begun decades earlier, toward direct employment by businesses as opposed to independent employment as individuals became the overwhelmingly dominant means of livelihood for workers. In some cases, the business entities that employed them, railroads for example, had grown during the Civil War. In many other cases, new businesses were formed to compete in every respect, in every corner of industry and commerce.

The structure of businesses changed to better function in these new circumstances. As size mattered, so did organization; not only were businesses becoming larger, they were becoming more deftly organized to take advantage of new opportunities and of labor. Organizational charts began to reflect greater complexity as more layers of management and supervision were inserted. The concept of business management took on a whole new meaning as it became necessary to "professionalize" the previously haphazard administration of enterprise. Managers were introduced at every level, an innovation that reverberated not merely through the ranks from straw boss to CEO but through the whole idea of leadership, which began to change rapidly. If Andrew Carnegie found it expedient to have what he termed "clever partners," they, in turn, could not live without subordinate managers. With no thought given to meaning or consequences, businesses hinged the world of work and leadership on the introduction of managers; the consequences would explode in years ahead. Without knowing it, businesses were preparing themselves to live large in the future by becoming positioned to absorb, orchestrate and function on a scale previously unknown.

Consider elements of business that were developing at that time that were either entirely new or previously of such minor importance as to make them mere afterthoughts. Markets, for example, were growing in the United States and abroad. This made all sorts of global considerations locally important on a functional level. Information, knowledge, was suddenly vital which meant that rapid, comprehensive communication was essential. So was reliable, predictable transportation and the previous advances in the technologies behind these aspects of business were suddenly inadequate, calling forth new technologies that, themselves, quickly evolved into better and better, more and more. New businesses in previously unknown fields suddenly erupted resulting in new means of distribution, communication and more. These changes engendered sharper and ever-increasing competition as new ideas brought new processes, products, demands and business. Keeping up with all of this rapid change was, in itself, a new wrinkle for business. Suddenly, just keeping track of all that was happening within a business was a matter of vital concern as new accounts sprouted with advances in transportation that brought access to new markets. Changes in the concept of advertising pushed commerce forward and made new products enviable to growing numbers of consumers. Every aspect of these changes meant that new jobs were being created, upping the ante for businesses and extending the scale of outreach and potential for profit. If all of this sounds like big business, it's because much of it was big in ways that had significant impact on small local businesses that never sought great size. If you consider the extensiveness of mom and pop operations in the twentieth century, it helps to put sudden growth in perspective. Being big brought attention but it was not the whole story. 

Likewise, the response of workers to big business was to go big as well, but again, the actions of a prominent minority had significant impact on the small-time majority.

Unions began to be formed, broken and formed again, with an eye to matching the growing size of business with size of their own. Strikes represented the exercise of freedom as well as economic warfare, and war is exactly what big businesses gave the unions in return. Labor strife became both frequent and bloody, setting a new precedent for both businesses and working people that extended a mutual threat well into the next century. The famous labor conflicts of the 1870s merely foreshadowed what lay ahead.

Conflict between labor unions and big business also did something else that is often overlooked and very important. The big, bloody conflicts of the period tend to dominate our memory, but we need to consider that then, as now, most businesses were not big and most workers were not unionized. Still, the activities of these giants cast disturbing shadows over smaller businesses and ordinary, nonunionized workers. They could hardly help being aware of these major events, but for the most part, small businesses and their employees kept their heads down and pushed forward within the scope of their environments, a fact that had enormous implications for leadership in later years.

Another thing that is sometimes overlooked when considering the activities of labor and business during the period is the fact that specialization continued to grow. Just as technology helped forge increasing levels of productivity, it also encouraged businesses and workers to think differently and to compartmentalize their activities, refining processes and production methods even as the skills needed to accommodate these changes were refined and redirected. The impact of specialization was clearly evident both in larger businesses and among workers, whose unions were constantly adjusting to new processes and to the organization of workers in businesses. But again, this was having a simultaneous if less obvious impact on small businesses and ordinary workers.

There were specific qualities associated with leadership embedded in the challenges of ordinary workers, and we will address those issues in coming pages. For the moment, it is sufficient to realize that the difficulties faced by workers were multidimensional. The organization of labor into guilds had addressed what seemed to be ancient questions, answers so outdated in the period after the Civil War that it became obvious that more organization was required among workers for comprehensive solutions. To that end, it is significant that early attempts at unionization sought widely applicable answers that encompassed issues of day-to-day life instead of being confined to workplace concerns. The Knights of Labor, for example, sought to organize all workers in every industry without regard to wage status. This effort made a valiant if ineffectual attempt to improve the lives of workers where they lived, operating from a reform-minded plane that was more philosophical than practical. Successor labor organizations were trade unions with down-to-earth concerns encountered on the job: issues of wages, hours and working conditions. They found utility through organization by trade and kept their membership focused on a few widely applicable issues.

We must again be cautioned not to regard the problems confronted by workers as merely the stuff of union representation, and we must be reminded that relatively few workers were organized during this period, with most continuing to work for smaller businesses that had, with their readily available on-site ownership, the capacity to circumvent organization. Nonetheless, the fact that unions grew is important, and as they grew, their message reached increasing numbers of workers. A parallel issue is the fact that increased industrialization necessarily meant that workers were thrown together ever more closely in tightly compacted urban centers where they were close not only to the location of their employment but also to one another, making communication easier and distress more palpable among greater numbers of people. Concentration of specialists of various kinds facilitated the conduct of business, but it also exposed workers to despair as well as to opportunity. Cities grew during this period as never before, holding twenty percent of the American population in 1870 and a third only twenty years later. New arrivals from rural areas came with the negative words of religious leaders ringing in their ears, blaming city dwellers for the rise of sin and social ills. But they also arrived with a fresh understanding of how technology was changing the nature of work. Farms during this period began a gradual shift from intensive labor to increased mechanization. Suddenly farmers were part of the national and, in some cases, international agricultural market as reliable transportation made their farms accessible to the rest of the world. And just as commercialization called forth greater scale in manufacturing, so it did in farming. Rural workers had become subject to specialization and to issues of employment and unemployment just as manufacturing workers in cities had. More and more, it was one big country (except for the South, of course).

In some respects, manufacturing got there first. Machines were made in the manufacturing centers of urban areas, which were the first to be influenced by marketing, advertising and all the elements of commercialization that were changing trades and other fields of endeavor into mere jobs.

Businesses responded comparatively deftly to all the change that was afoot, crafting structures for themselves that met changing circumstances and created a solid basis for their futures. They were, after all, on the leading edge of all the decisions being made and of the changes themselves. Small businesses scrambled to keep up, but their limited size and scattered distribution left them either to adapt quickly or, through ignorance and inattention, to disappear. While expanding or even introducing specialization could be a potential liability for small businesses, it could, for the nimble among them, also serve up opportunities that larger businesses were unprepared to exploit.

Finance was another matter. Big businesses were taking advantage of new financial arrangements being engineered for the times. As capitalism assumed its modern characteristics, not the least of these was monopoly, a factor that, by definition, excluded small businesses. "Robber Barons," of course, were all over this and had been in various ways from the earliest days of change in the nineteenth century. But for our purposes, the important fact is what was happening and why, not who was doing it. And it is important to note that industrialization and all that went with it made broad changes, not simply to work and the workplace, but to all facets of life.

Government tried to respond. But as government is controlled through politics, the response was slow and inadequate, because politics mostly lags behind necessity and the needs of the greatest numbers of citizens. One reason for this, certainly, was the fact that businesses were able to insert themselves and their preferences ahead of everyone else, often pushing advantageous laws before there was any chance for ordinary people to understand their impact. This left politicians so behind the times that they continued to operate from a mindset separate from the changes sprouting rapidly around them, a fact that remains relevant today. Businesses seemed always to be a step ahead. Trusts, for example, were invented in order to sidestep disadvantageous state laws that restricted the ownership of businesses by other businesses. The federal bureaucracy grew significantly during this period, but it often simply offered business more opportunity to extend its control. Still, there were some innovative governmental responses. Harkening back in a way to Civil War era examples, the Interstate Commerce Commission was established in 1887 by a law that essentially created governmental managers to deal with businesses. But none of these responses, and certainly not the machinations of businesses themselves, were sufficient to address the needs of society.

Too many people in authority failed to realize the state of economic and spiritual privation that beset many Americans. The cities, having grown in response to business activity and workers' need for employment, were home to many who were exhausted from the pursuit of work as well as by its periodic scarcity. Unlike previous generations who found cities to be pools of opportunity and knowledge, many now foundered with neither direction nor hope. Their backs were against the wall.

In 1893, Frederick Jackson Turner, a young American historian, presented a paper to the American Historical Association entitled "The Significance of the Frontier in American History." It was freighted with meaning for workers. The escape hatch open to generations of earlier workers was now closed, the frontier, Turner proclaimed, having closed around 1890. The immediate significance for workers was that they had to make a stand where they were, it being useless to run west when the West was already full.

While it had taken generations for the frontier to close, the next, similar milestone was not far off. The 1920 census revealed that the majority of Americans were by then living in cities. Demands of all kinds, from businesses and from workers themselves, had forced a loop of reality to close around Americans. While agriculture would remain dominant in some regions for many years to come, the greater portion of the population was forced to sustain itself through employment in ever denser urban areas. To glance ahead, even the rise of automobile culture and extensive new roadways after World War II exacerbated problems rather than relieving them. But there was plenty of an immediate nature to alarm workers and occupy their full attention.

One way to think of American history during the late nineteenth and early twentieth centuries is to acknowledge two gilded ages, the first on the heels of the Civil War and the second following World War One. That is far from the complete vision, as workers of the period would attest, and the conflicting visions, each with its own evidence, are indicative of a period rife with paradox. Between these two periods of affluence swirled economic dislocation: exhausting stretches of hard daily labor, sometimes followed by forced idleness and its concomitant lack of wages. The resulting stagnation of workers' lives in neighborhoods beset by disease, malnutrition, ignorance and fear coexisted in the same cities with unparalleled affluence. It is little wonder that labor conflict also characterized the period and took on a dimension of class struggle that is frequently denied among and about Americans.

American class conflict is a prime paradox that continues to bedevil attempts at progress. Class conflict in the United States is largely subterranean, less because it is hidden by nature than because we often deny its existence. That sub rosa character cannot properly be called natural; it is imposed, an artificial insistence that the conflict does not exist. Excuses for denial are mostly contrived to suit conservative fantasies, but there is a much deeper and more important reason for the persistence of the myth. Of course, it doesn't help that workers too frequently buy into the lies told about them by their overseers among the affluent, who loosely conspire to obscure the truth.

What is intentionally hidden about class conflict in the United States is the most damning evidence of its existence. Unlike older European and Asian societies that were for centuries tied to a class structure rigidly enforced by law and custom, white Americans were long accustomed to a social and economic mobility that, not entirely but for the most part, effectively ignored class distinctions. European visitors once marveled at this aspect of American life, though it was never universal. There have always been social snobs who resisted contact with citizens lower on the economic and social scale, but the foundation of their perspective was never absolute and was subject to modification through loss of wealth and marriage into less advantageously placed families. In aggregate, this amounted to constant social churning that, although evidence indicates it has hardened in recent years, remained fluid for a very long time.

This social and economic mobility created memory in workers. We often hear of "institutional memory," which refers both to the acknowledgement of broad concepts and history evident in the past performance of organizations and to business practices and culture. Workers have a broad memory also, a corpus memory that is collective and common among them rather than institutional. For white Americans, this meant the memory of their equality among other white Americans. It was the memory of times both good and bad that were shared. It was the memory of struggle for improvement in whatever endeavor was at hand. It was a memory that reassured them of their continued equality in the face of all contradictory evidence.

In practice, this shared memory among workers meant that their class struggle was founded on expectations based on past experience and participation in both social and economic success. Thus, American class conflict was based on the expectation of reemergence into their rightful place, both socially and economically. It meant that their condition was viewed as temporary, with the expectation of eventual relief. They harkened to the memory of their participation in positive experience. Because of this benign perspective on class distinction, which they trusted to remain fluid, Americans typically did not begrudge the elevated status of the wealthy. Americans customarily wanted nothing to do with the bomb-throwing anarchists who occasionally appeared, often linked publicly to immigrants who did not share their history. But the American class conflict was a struggle, nonetheless.

The glossy reading of history preferred by modern business and conservatives misses a crucial point. They fail to understand that in seeking to fulfill the expectations of their collective memory, Americans were engaged in a kind of class conflict just as strong, if somewhat hidden, and just as imperative as the struggle of oppressed people elsewhere. And workers themselves, the vast majority of Americans, too often misunderstood the nature of the class conflict as it matured, preferring the halcyon days of their memory to the perfidy with which they were confronted. In this sense, they created the Horatio Alger myth and held fast to it for generations. Another paradox is also evident. As Americans struggled with the fact, if not the acknowledgement, of class conflict, they were tightening the grip of their oppressors while hardening themselves against the day of reckoning.

Again, we must consider who was absent from the collective memory and consciousness. First among these were African Americans. It should be evident that the white South was absent for reasons previously discussed. But the fact that African Americans were now free might raise a question about what I am contending here. We must remember that their term of freedom had been very brief at this point and that many living African Americans had direct memory of slavery. The terms of their later employment were always tenuous and subject to the caprice of white landowners. The system of sharecropping that developed in response to land-rich, cash-poor whites enforced a kind of virtual slavery even after freedom had been won. To that must be added Jim Crow and the convict lease system that kept generations of black men incarcerated long after slavery was abolished. In addition, we must remember that most African Americans lived in the South, where our discussion of collective memory and class conflict does not apply. The Great Migration had not yet begun, and African Americans were yet to experience the vicissitudes of capitalism and the travails associated with free labor that awaited them in northern cities a few years later.

Recent immigrants must also be excluded, not having this collective memory and its peculiar participation in the American class conflict. Earlier generations of immigrants easily assimilated into American culture and readily joined the movement west and the development of new states and cities on the frontier. By the time the frontier closed, later immigrants found themselves stuck in existing large cities, where they helped to swell the ranks of cheap labor without the attachment to the past that clung to resident white Americans of longer standing. In fact, assimilation under these circumstances was more difficult, and ethnic enclaves were more rigid, maintaining a mindset that sought to be American but could not fully engage with the meaning of its past. That meaning, in many instances, was what had attracted immigrants in the first place: the knowledge of opportunity and of economic and social mobility. The door slammed shut on new entrants as the factors that created openness and potential dried up. It would take at least the turn of a new generation for these immigrant white families to get the drift of things American, as the younger members of those families grew up with the language and more fully absorbed the culture.

What we are talking about is an American white working-class phenomenon. Given the dominance of whites at all levels of the economic and social strata, whites not only called the shots then, but they also defined the terms of the future for all workers, including African Americans and immigrants. By the time the blister finally rose on the burn, all workers had some sense of the collective memory and operated accordingly as confrontations manifested more insistence on renewal of their expectations. This renewal might loosely correspond, in the evaluation of many minds, to recognition of the "social compact," except that in the United States there had never been, according to the memory of white workers, a separation of classes such that one would neither deign to offer solace to another nor receive it, given that, according to this perspective, they were all of the same cloth. This renewal has sometimes been referred to, especially in reference to civil rights for African Americans, as "cashing the check" that had been issued long ago, except that in the narrow and generally accepted version of this idiom, it referred to freedom for African Americans in a strictly civil sense. Applying something of the same feeling to economic issues such as employment and the "American dream" stretches its intent and timetable into the future. And it wouldn't hurt to note in advance that the economic struggle of African Americans in the first sixty-five years of the twentieth century was fraught with extreme difficulty that has yet to be resolved. Unions, for example, sometimes resisted extending their membership, and with it, assurance of the kind of work that could lead to participation in the collective memory established long ago by white Americans.

Problems were piling up fast for workers during the latter part of the nineteenth century—all workers, including immigrants and the smaller African American population in the North. A key factor was economic dislocation as businesses began an unbridled assault on workers, running roughshod over what are now considered fundamental human rights. As business organizations refined themselves, they also began to practice the kind of arrogant control over workers that would gradually define much of the twentieth-century conflict between management and labor.

Most workers were not organized and were left to drift helplessly amid dangerous currents coursing through society and the workplace. The response of those who were organized in labor unions sometimes included use of the strike. Given the paucity of laws favorable to labor, strikes were often bloody battles that workers lost under the batons of police officers, the gunfire of private security guards hired by management, and of course, scabs who, being out of work themselves, were willing to disrespect other workers in order to find work. All of this should suggest economic privation for many, and because they lived in cities, the impact on workers and their families was horrific. All of this should also suggest the lack of adequate government response. Given the historic restriction on the role of government, it is understandable that politicians kept their hands off both the arrangements of employment and social relief in the face of malnutrition, disease and starvation, not to mention other social ills such as extensive child labor.

To be fair, or biased, as your perspective may be, some politicians tried to intervene, at least in limited ways. Too many of these were seen as cranks and demagogues regardless of the value of some of their proposals. The monetary system, for example, was grossly tilted in favor of the rich, but the remedy proposed by the populist presidential candidate William Jennings Bryan was cast aside along with his defeat on multiple occasions, and that route was effectively closed. Attempts at agrarian relief made sporadic headway but were mostly smacked aside as well. Overall, the demagogues and populists proved ineffective at towing the political process against the roiling currents of the day, which favored business and wealth.

Yet an inadequate response was not the complete response because, beyond the limited scope of government, a fundamentally new progressive vision began to percolate. Among the characteristics that distinguished progressives from populists was their recognition that tinkering with government policy was insufficient in and of itself to solve problems that had, in their perspective, become intractable. The progressives sought the kinds of change that would reconnect workers with their expectations, recognizing them as the workers saw themselves: as integral Americans, not merely the possessors of labor for sale to bidders who rigged the process and divided the spoils.

While progressives are famous for legislative achievements and political ascendance, particularly in the person of Theodore Roosevelt, renowned as a trust buster among many other things, their mark was a great deal more than a list of policies such as the establishment of the Federal Reserve System and the reintroduction of the income tax. Although such achievements as women's suffrage and the direct election of United States senators indicated the exercise of considerable political acumen as well as determination, there was something much deeper and ultimately a great deal more important than the implementation of any policy or the passage of any particular law. Progressives had a consciousness of change on behalf of their fellow Americans that extended beyond laws. On the local level, they were able to change municipal laws to some extent, with their overriding concern for honesty and responsibility to the interests of voters. But their concern showed up in other important ways beyond the confines of government. Relief of wretched conditions in urban areas, for example, signified a lasting quality of concern that helped Americans to have hope and to identify with reform's potential to revive the expectations embedded in their memory.

The single example of workers' compensation will illustrate the scope and depth of progressive accomplishment and its importance as a lingering influence. This example also happens to touch every base along the way from the preindustrial era to current controversies. Workers' compensation has never been more important and it is potentially on the verge of significant change, threatened by forces as old as greed itself.

In the early days of the United States, when rural life was the customary pattern, families took care of injured relatives, with neighbors, near or far, contributing to help the family survive until the farmer was able to return to his fields. As workers began to form villages, the same pattern continued, and when they began to move into cities, again, the same pattern continued because, as we have discussed earlier, at least some family members were likely to have moved with them to the city, where they formed new relationships among other workers that would sustain them through most of their difficulties. Where injuries created permanent disabilities so severe as to incapacitate workers entirely, they retained sufficient depth of contact and relationship with families at "home" in the environs of their origins that they could return to their former homes.

It goes without saying that injuries to these early American workers left them otherwise on their own, there being no insurance policies or significant individual monetary savings to see them through. But we must also consider substantial ameliorative factors. Because these were independent workers, they had the ability to choose the specifics of their work. They controlled the conditions of the work, the materials, the tools, the pace and the outcome. They were not forced to work beyond the level of their natural capacity or at a pace that endangered their health and well-being. They were not required to use tools with which they were unfamiliar or with which they felt unsafe. The materials were those they were accustomed to manipulating and they could acquire helpers for projects at their own discretion. Ultimately, they could elect to avoid projects they believed themselves unfit to address and could even abandon work if absolutely necessary, all by their own determination. Under these circumstances, injuries were less frequent and certainly often less debilitating when they occurred. Each worker was responsible for their own safety in every respect and obviously had the incentive to avoid injuries in order to continue to prosper and maintain their families.

All of these factors changed as the nineteenth century progressed. As technology produced more and more complicated machines, familiarity became problematic with much learning done quickly on the job as opposed to the slow, thorough process of apprenticeship. Not infrequently, these machines were dangerous under the best of circumstances and the workers had little if any control over the circumstances. As employees of businesses, they worked at the direction of bosses hired to ensure that deadlines were met and that specific actions were taken along the way without regard for the safety of workers, all the while maximizing profits through efficiency. This often pushed workers into dangerous work, risking life and limb in order to be paid at the end of the day.

When the end of the day arrived and the worker was dead or maimed or otherwise removed through incapacitation, he wasn't there to collect pay, and there would be none until such point as the worker recovered and was rehired. That is assuming that the worker recovered and that he was rehired; none of that was a given. When the worker was not working, there was no income. Employers were under no obligation whatsoever to injured workers, and that included medical treatment, let alone disability payment. By this point the supportive relationships of bygone times had been erased in a commonly tolerated urban existence characterized by poverty. Workers in previously helpful networks were just as strapped as any other worker and were unable to assist the families of injured workers beyond the very slightest degree, if at all. Worse, as time continued, the family connections that workers had once maintained in rural areas were often irreparably broken, leaving no one for an injured worker or his family to rely upon. The economic and social dislocation this caused is obvious, and the problem escalated with time as work conditions worsened and became steadily more dangerous.

A solution was badly needed, and the concept of workers' compensation began to take shape in response. The idea was simple and straightforward. Employers would pay into a state fund that would make payments to injured workers for the duration of their incapacity, subject to defined limits and compensated according to predetermined scales. In addition, death benefits were available for the wives of workers killed on the job. Boards of overseers were established to review contested cases, and an appeals process ensured that multiple officials would review applications. As refinements were made to the system, employers were required to carry insurance sufficient to cover potential claims. The financial obligation of employers was far from onerous, and the benefits extended to them as well as to the employees for whom the protection was intended. If an employer had incentive to treat employees humanely, their continued good health would reap financial rewards for the business. This factor increased in importance as time went on because training time, scant though it tended to be, was increasingly required as the complexity of machines and processes increased. Another major benefit for employers, one that grew in the twentieth century as litigiousness mounted and workers became more assertive, was protection from employee lawsuits. Acceptance of workers' compensation benefits meant that workers could not sue for damages but received immediate relief. This would prove to be a significant point of contention in the future, and it would be wrong to imply any sort of particularly congenial relationship growing between labor and management. To the contrary, every improvement had to be coerced.

It is easy to see why labor unions favored the implementation of workers' compensation laws, but these limited organizations had little impact beyond the confines of the businesses where they represented workers. Something else, something very important, was also taking place. Seeing broad need, the public at large began to take an interest and promoted the promulgation of workers' compensation programs. This, in fact, is one of the lasting achievements of the progressive movement that began in the latter part of the nineteenth century. But as helpful as it was, we must never lose sight of why there was a need in the first place. There would have been no progressive movement if there had been no repression. It is extremely important that citizens began to see that government could be turned toward broad benefit for workers and was not necessarily required to serve the narrow interests of management. Progressives, through the workers' compensation program and other achievements, were able to demonstrate positive results for large portions of society.

The idea of progressivism is simply a long-held and continuing belief that progress is possible for all segments of society together, a progressive being a person holding that belief. The fact that the idea took sufficient hold in society to become organized into a movement, however loosely formed, such that it required capital letters, indicated breadth of support among citizens. The Progressive Movement itself proved to be of brief duration, unable to withstand concerted assault from entrenched conservatism abetted by extreme wealth. But the progressive idea, though beaten back, occasionally influenced the less potent liberalism common in the United States during much of the twentieth century and has never died. In fact, during the early part of the twenty-first century, it has regained traction as a political force, especially notable in the failed presidential candidacy of Bernie Sanders, among whose supporters the progressive ideal will likely flourish more effectively in the future. The point of this digression is to distinguish the progressive idea from the early twentieth-century movement that took its name and from the liberalism that followed, as well as to point out that the idea yet lives and could become very important for workers a century after the eponymous movement.

The key factor in the success of the Progressive Movement was its ability to connect workers and their needs to their sense of expectation and to produce results in the process. This represented a coalescence of factors that had not occurred previously. Earlier in American history, as we have seen, workers were an integral part of society, but as technology and other factors separated them from their holistic sense of belonging and made them merely one part of a factionalized process, they lost their sense of participation and value. By demonstrating that workers retained their humanity, that their living conditions as well as their compensation were important to sustaining the overall progress of the nation, the Progressive Movement validated their expectations once again. Though they could not read the future into the twenty-first century, workers, having joined a progressive coalition with humanists of many types, set the stage for a revival of expectations. As they made progress in the early days of the twentieth century, they could only continue their work, hoping that it would strengthen and hold long enough to build sustained results.

The problem with a coalescence of factors is that what comes together can be rent asunder. It was one thing for rich owners to be widely dispersed among the general population, owning businesses and factories and working in those businesses with their employees as part of the community, but it was another thing for wealth to become aggregated in the hands of fewer and fewer people. The tendency of capitalism is toward monopoly, a process that by its very definition reduces the distribution of wealth and concentrates power. When wealthy people were rich in proportion to their employees and worked with them, a constancy of expectation was maintained among all people, but when the relatively rich people themselves were pushed aside by colossi of super wealth, the previously harmonious system unraveled. There, ready to reweave the threads to their advantage, was a narrowing set of oligarchs. If this sounds familiar, it should, because it is also the state of things in the early twenty-first century.

The well-heeled forces of avarice and conservatism proved too much for the Progressive Movement whose context was shrouded in a flux of social issues, emerging ideologies, the disparagement of workers, distracting politics and a false sense of growing affluence. The misbegotten ascension of a mislabeled president in the person of Woodrow Wilson betrayed core progressive principles, permitted repression and elevated racism at a key point in American history. The resulting defeat of progressivism and the Progressive Movement cast American workers into a dark age while the rich partied. The roar of the twenties drowned the anguished cries of workers as laws enabling and emboldening the rich encouraged excess for the privileged while squelching attempts of ordinary people to escape the shadow of repression if not its grasp.

Once again, the expectations of workers smacked the wall of apathy and active deterrence, contradicting progress that had been made and inhibiting the possibility of any extension of improvement. The flash of a fleeting gilded age appeared, then quickly vanished when the gold proved to be fool's gold and the economy spiraled downward faster than it had risen. With capital stagnant and unemployment ballooning to at least a quarter of the workforce, a general torpor descended over the entire country and a sense of personal devastation became widely palpable. The Depression, an incarnation of defeat, was the miserable denouement of repression and the endpoint of a significant period of American history, providing a conveniently realistic, if disreputable, close to the nineteenth century as well, albeit situated three decades off calendar.

The government at that time, unwilling to shed the conservative perspective of its leaders and their wealthy allies, refused to help its citizens. This failure was a slap in the face of workers who had witnessed a demonstration of how government could, when it was willing, act as an agent for social and economic change benefitting the whole of society. That experience had rekindled the memory of their expectations as Americans and promoted the hope of a better future. When that future collapsed, and hope with it, American workers were unwilling to forgo their dreams of realized expectations. However faint progressive ideals had become, they were not extinguished, and they provided the basis of struggle at the effective beginning of the twentieth century signaled by the New Deal in 1933.

Individualism

One particular factor—individualism—provided the key for transition to the twentieth century. In fact, individualism and its closely entwined corollaries, cooperation and collaboration, bridged successive centuries from the beginning of American history to the dawn of the twenty-first century. The three factors, lumped for convenience under the single term "individualism," functioned together; each mention of "individualism" should be read as a single, closely integrated concept of interdependent characteristics. The fact that these three aspects combined in many different ways and operated under numerous guises and conditions contributed to their concealment in plain sight. The fact that they, in turn, guided many other separate issues and determined complex outcomes with increasing influence over time contributed to their being overlooked in a rush to identify solid findings that could be more easily described. Together, individualism, cooperation and collaboration established direction, delineated paths and guided the flow of ideas and events more thoroughly than almost any other concept. Later, we will see how individualism will extend its influence well into the future in very important ways that are already evident.

By examining the origin and operation of individualism beginning with the founding of European settlement in the United States, we will be able to understand how it served as a transition to an entirely new period of history. Europeans not only accidentally discovered America; they stumbled upon a unique opportunity for unplanned experimentation with a new perspective that would offer, in turn, a fresh model to the world, a model that would complicate existing formulas and ultimately drive the engine of commerce simultaneously in new ways and off the rails. Those offended by what might seem to be the blatant effrontery of American exceptionalism in merely different terms should wait for the evaluation of a harsh outcome before jumping to judgment. Likewise, be prepared for a world inclusiveness ahead that ignores national boundaries in ways never envisioned by neoliberals of latter-day avarice.

Greed might be said to be the fundamental component of early American exploration, summarizing in a single word the old "God, gold and glory" motif used to typify European incursion. Even God gets in on the greed in a supposed quest to reach heathens, when actually the deity was simply being used, like any other handy device, as a combination tool of excuse and control to implement extraction. Glory, too, was a matter of greed, a kind of shining raiment yearned for by the exclusive few who could convince a monarch to fund an expedition. Gold is obviously the most telling emblem of greed, but, as a symbol of need and supply, it told a different story for some people.

The story of the New World often compares and contrasts the different approaches of Spain and England, with virtually every Spaniard focused on the big G with its three little G components while Englishmen pursued the benign objective of settlement composed of commonplace household affairs. We need to examine these Englishmen a little more closely. In them, we will observe the strategy of greed but with a tremendous difference that rendered a vastly different interpretation of the three elements. As opposed to explorers who sought personal aggrandizement through greed, which they shared with their monarch, English settlers were looking for just enough manifestation of greed to sustain and nurture them individually and collectively. In other words, explorers exemplified a type of individualism marked by rapacious greed while the settlers pursued ethical individualism as a core principle of their lives. This bifurcation continues to haunt us, and dealing with it will be a major issue in the twenty-first century.

European settlers didn't suddenly become individualists upon reaching American shores. The very fact of their emigration indicates stalwart individuality, hearty dissent from circumstances that bound others to established practice. This has often been linked to flight from religious persecution and, in some cases, that is an accurate portrayal of motive. But we have a tendency to treat reference to religion not merely as sacrosanct but as complete within itself; we behave, after religion is invoked, as if we must not question further or see anything else. Yet, peering deeper, we see, at least in some cases, that people with religious beliefs at variance from their neighbors maintained other individualist perspectives as well, indicating a more than ordinary willingness to stand apart from the crowd. What could be more characteristic of individualism? A great deal, in fact.

To stop with individualism expressed as separation tells only part of the story. Hermits did not emigrate. In every instance, even that of the criminals who made up part of the population of Georgia, New World immigrants were functioning members of communities in their countries of origin and did not suddenly suspend their cooperative natures in their new homes. Instead, they drew on familiar forms of social intercourse and adjusted patterns of collaboration to suit altered circumstances. The result was renewal, because changing to address rising contingencies necessarily strengthens the composition of response. Hence, individualism moved with immigrants to their new homes, complete with the aspect of cooperation. But more is indicated in all of this, which, so far, is essentially a merely obvious explanation.

Less apparent is an inherent need for many, although not all, individualists to expand territorially as well as mentally, although not necessarily for reasons related to location or space. If we stop with the conclusion that some religious people, feeling persecuted, sought to expand their physical horizons for the sake of uninhibited religious practice, we sell ourselves and them short of a full understanding. Many, though not all, and certainly anyone who was willing to uproot a family, or even just themselves, to undertake the perilous move across an ocean and transplant themselves permanently on the edge of a still unexplored continent, did so with multidimensional intent. Far more than mere motive, individualists seek to expand whatever boundaries they perceive, whether spiritual or physical, typically both, especially inasmuch as one aspect influences the other and one signifies the other. As individualists express an internal need, a physical form is created that accommodates that need, often serving its purposes. By leaving Europe, settlers were acting out in advance what would become a characteristic feature of American life for generations, that of westward expansion, rerooting themselves as their economic and psychical needs required fundamental change.

The same impulse continues to be acted out today, long after the physical frontier closed. At the time of settlement, and for long thereafter among participants in westward expansion, utilitarianism was the controlling factor in the accoutrements of daily life. Household furnishings were spare, geared to serve need and function rather than décor, and were common to all under similar circumstances in the region. With the closing of the frontier, and as people ceased their preoccupation with survival, often abetted by westward moves, individualism sought expression in the unique décor sought interminably today and employed as a means of distinguishing the inhabitants of otherwise predictable dwellings. For a while in the twentieth century, businesses moved employees around the country or the world for their own purposes, thereby creating a substitute for generations of westward expansion; by employing unique décor, we have taken the substitution a step further while satisfying the psychic impetus for physical moves.

At the time of American settlement, the physical act of moving remained one of the dominant factors in the expression of individualism, one that signaled clearly to others while addressing an internal yearning to be satisfied. There were numerous reasons for gratifying the need for individualism by emigrating from Europe, including social, economic, personal and, of course, religious ones. As we look a little closer at some of these, it will be helpful to broaden the scope of our consideration of individualism because it is not, as might be suspected, a completely introspective matter or even one completely focused on singularity. By concentrating on settlers, the vast majority of whom were not wealthy (explorers tended to be detached and were not actually settlers, for example), we might forget that virtually everyone is subject to individualistic tendencies and that wealthy people are also individualists, albeit with a very different perspective from the majority of citizens. The motives of wealthy people obviously differ from those of workers: to gain even greater levels of wealth might best summarize the attitude of many wealthy citizens, as opposed to workers whose objective might merely be to assure a minimum standard of living for their families. The gulf between these two extremes becomes even more significant when we consider the flip side of individualism, that of cooperation and collaboration.

Treating individualism, cooperation and collaboration together is an inescapable necessity. Contrasting how these factors functioned differently according to wealth status is instructive on several levels and contributes to a generally improved understanding of individualism over the course of American history. We have already seen how individual workers cooperated in families, villages and cities, moving from largely solitary agricultural work to more cooperative levels of rural and small-town occupation to cities where more complex patterns of interdependence arose. It should be evident from this that while exercising independent will and satisfying individualistic visions, workers also found greater fulfillment through various stages of collaborative endeavor. What we have not much examined is the fact that a similar process was taking place among wealthy people.

Any degree of economic improvement leads to some level of accumulation, even if extremely slight, in the sense of storing for future use. And the wealthier an individual becomes, the more is stored, until it becomes conveniently transferred into certificates that can be more easily manipulated than barns of hay, acres of land, silos of wheat or even stacks of cash. The wealthy person found, at each step of the way, that it was expedient to cooperate with others to increase their riches. Cooperation among wealthy people takes some of the forms familiar to workers who form unions, because wealthy people typically participate in numerous associations intended to protect and increase wealth and influence. But it is axiomatic that as wealth increases, so does the separation from workers. The factory owner of the past, or the shopkeeper who actually worked with employees to produce or sell goods, was clearly closer to those workers than the typical wealthy person of recent vintage, whose participation in the business is from afar; as distance increases, the wealthy person finds less in common with workers and forms closer associations with those of similar status. The outstanding thing for our purposes at this point is the fact that both workers and wealthy people found greater fulfillment of their individualism through cooperation, although the manifestations of that cooperation might be quite different. The relatively wealthy settler might arrive, for example, with a mindset of accumulating as rapidly as opportunity permitted, while the poor settler might look only to survival. It's a mistake to assume that all new arrivals in America were poor.
The Mayflower, contrary to wide misunderstanding, bore modest tradesmen as well as more established, affluent citizens, and as time progressed, the mix of immigrants continued to reveal economic diversity, just as today American immigrants range from the poverty-stricken undocumented worker to Rupert Murdoch.

Before examining some of the specific factors associated with European immigration to America, it is wise to keep in mind a few fundamental aspects of individualism. As we have just seen, for example, both workers and wealthy people were involved with individualism and both employed cooperation for their separate objectives, branching collaboration in different directions either to satisfy collective benefit or to protect and advance accumulation. One of these might be said to adhere defensively to the most basic human need of survival while the other, through offensive action, seeks to expand greed. Individualism and cooperation, however, are essential for both paths. We will return to their point of divergence as we consider the twentieth century and the future. After arriving in America and having the time to explore the opportunities it afforded, workers would mark their divergent path with a collective memory of success that yearned for fulfillment. The need for expansion is a signal aspect of that path that holds true today. But for the moment, we need to be aware that Europeans who sought to expand to America had a background featuring a rich history of expansion, albeit sometimes frustrated, that they sought to transfer to America.

Not infrequently, Europeans arrived in America with a jaundiced view of cooperation, institutions, government and agents. Many had, historically, experienced individualism channeled into greed and governments that betrayed their interests, sacrificing them for objectives that neither respected nor served them. Eventually, settlers in the North were able to replace their inherited fear and skepticism with a perspective that permitted the building of new social structures that responded to their needs. (This marks one of the failures of southerners who, under their peculiar circumstances, never saw a need to get past skepticism and never built responsive social structures, a lack of insight that we will briefly explore soon.) Whether settlers in America ever truly suffered betrayal in Europe, and whether they ever created institutions and social structures that were more responsive to their needs, is debatable and could even be myth. The important thing for our purposes here is that the myth survives, if it is mythical, which I doubt. Given wars that interrupted lives and served governments without benefiting workers, ethnic upheaval and the intrusion of law, custom and religion on the consciousness and lives of citizens, I suspect that it is correct to conclude that Europeans who sought to expand by coming to America had been betrayed in their original homelands. The fact that they sought expansion indicates that they had attained some degree of expansion in Europe, mentally if in no other respect, in order to become aware of the possibilities of even greater expansion through emigration, particularly to a place often with little existing governance or other restriction.
One category of evidence that validates the fact that Europeans were oppressed in Europe is the fact of forced association, a subversion of individualism, as opposed to freedom of association, the epitome of individualism and cooperation which they sought and created in America. And no example better serves this clear assertion than religion.

By the sixteenth century, European countries and governments were well established. Contentiousness remained, to be sure; borders were frequently reset by wars and occasionally, notably with the French Revolution, governments changed character, but the overall establishment was solidly in place. What government did not mandate, custom conspired to constrain. Social relations and structures were formulated as soft restraints that guided overall behavior, benefitting most citizens in limited ways while supporting both society and government. In the ordinary pursuit of social intercourse and commercial conduct, these constraints were benign but could become onerous, burdensome and even unacceptable as they exercised force against personal will and collective inclinations, thus thwarting individualism and group cooperation that differed from expectations and requirements. Social constraints took many forms, from mere custom to the rigors of established nongovernmental organizations, chief among which were churches. Membership was typically required either by law or by entrenched custom that essentially blocked nonparticipants from community life. For many people, this was simply the way things were and constituted no difficulty. But for those who thought differently, religious constraint was a matter of forced association that obviously violated the principles of individualism, which call for voluntary cooperation with its concomitant respect for freedom. Under these circumstances, constraint was the opposite of expansion.

When emigration solved the need to expand, Europeans newly arrived in America encountered a different environment. In some instances, particularly early in the settlement process, religious restriction was no less prevalent than in Europe, and in some cases, especially regarding people who voiced objection, governmental and social hostility could be devastating. But in general, particularly with the passage of time, the religious atmosphere in America was notably more tolerant than Europeans had experienced in their homelands. Even before the famous separation of church and state enshrined in the United States Constitution had taken effect, immigrants found a much more fluid religious environment in America. Often new arrivals realized that they had left a place of rigid control and entered a new space where they had to form their own expectations and religious governance. The case of Methodism, transferred from England after its conception by John Wesley, an autocratic leader who was an Anglican priest, illustrates this point. As it grew in the American colonies, Methodism became a much less top-down religion than in the mother country. As H. Richard Niebuhr explains, churches as they are known today in the United States often began as sects and spent many years of development before reaching recognition as religious institutions. These sects, which multiplied copiously in an environment where distance alone rendered hierarchical control difficult, were forced to engineer their own forms, rituals and governance, which meant reliance on democracy. In this new territory that respected the opinion of landed, free white men by according them a vote, the concept of democracy played out not only politically but through nongovernmental institutions as well. Additionally, any person or group could form a sect and find religious liberty as they pleased.
Democracy and freedom were the means of achieving expansion and new arrivals eagerly took advantage of an atmosphere that nurtured individualism.

Democracy, even representative democracy, fits hand in glove with individualism. It was abundant in America, both in government and in other organizations; any inhibition it experienced came mainly from within itself in the form of mob rule. There are numerous examples of this, even today, but the acknowledgement of law, provided that it could be enforced, offered protection. At every juncture, settlers were exposed to democracy, which, in turn, provided an opportunity to express themselves, a natural outgrowth of democracy itself. Freedom of expression did wonders in support of individualism by providing an extension, an expansion, into cooperation. Because settlers could freely communicate, a rich dialogue, however contentious over public policy, led to cooperation on other important economic and social issues. Freedom of communication helped to overturn ignorance and extend information throughout the layers of society.

As more and more elements enter the discussion, seemingly raising its complexity, it is important to remember that the source was individualism, a primary position that provided a touchstone for every extended idea. Individualism not only came first, it continued to be an active foundation nurturing an endless array of helpful concepts. If reticence was set aside by necessity, strength was acquired through the acknowledgement of common values and the exercise of their implementation. At no point was self-reliance compromised; it was strengthened through the nurture afforded by association. As time passed, successful execution of self-reliance and individual success provided a ticket to associated successes and cooperation with others on expanded levels, upon which people built still further. Individualism is thus associated with successful cooperation and ever widening prosperity. And all of this became embedded in collective memory, demanding manifestation.

The effectiveness of individualism and cooperation provides the overriding reason for its survival and the insistence of its manifestation. If it had never benefitted people, it would have been dropped in preference for something else. But it endures because it works; it is revived after periods of partial repression because it well serves the interests of those who seek to manifest it. As complexity of use and multiplicity of forms and examples increased, it became easy to get lost in specific instances of its success and overlook the fact that fundamental self-reliance is at the heart of individualism, even in its cooperative incarnations. We are too quick to associate self-reliance with the commonly held belief that it is best not to be in debt to anyone. This is a very narrow reading of individualism that militates against the benefits of cooperation, one that holds many back from greater success, whether by hoarding their knowledge and expertise or by insisting, against all evidence to the contrary, that it is the surest way to preserve oneself, particularly against what is seen as a vicious world that will devour anyone who stumbles into its jaws. With self-reliance, no one is to blame but oneself, and success is thought more assured when a person relies solely on their own resources. Such success is thought to be all the sweeter because of total self-reliance. It is true that self-reliance emerges from self-respect to become self-assurance when demonstrated repeatedly. But self-respect and self-confidence can be shared, and the sharing produces strength that extends the effectiveness of self-reliance, producing yet greater benefit for all involved and, by extension, for everyone who comes within reach of its impact.

Self-reliance took on a larger, deeper meaning as individualism matured in America and eventually adopted a cooperative aspect when opportunities for mutuality arose, then flourished and matured. Thus, the whole concept of individualism expanded along with the people who took advantage of its effectiveness. And as people changed, with more of them participating in every level of individualism whether or not they realized it, the context of individualism changed along with the examples, expanding the bounds of its effectiveness. When people realized that individualism created a basis for competence, more options became available as they garnered greater value through cooperation. Individualism provided insurance against the uncertainties of depending on unreliable others, and when those others also discovered its potential and became involved, thereby creating cooperation, the changeable and uncertain future became less frightening. People realized that individualism in all its manifestations was more apt to bring success and ensure the future than self-reliance alone. That became a critical feature when survival itself became an issue. By that point, individualism had been transferred into collective memory, making it available to future generations. As individualism gradually matured throughout the nineteenth century, unfettered capitalism countered its finest expression with abuse and repression. But by that point, individualism, including cooperation, was embedded in consciousness. Briefly, progressivism revealed the potential for improvement before being beaten back by unrestrained capitalism.

It is ironic that as capitalist enterprises sought greater productivity, those gains were attained through the cooperative aspect of individualism. Not only did capitalists benefit through their own manifestations of cooperation, such as associations that promoted their interests and those of compliant politicians beholden to them, they benefitted from the cooperative success of their workers, whose productivity rose with greater levels of cooperation. Unknowingly, they were building a bridge to the twentieth century, the essential structure of which, after another calamity, would serve in turn as a bridge to the twenty-first century, all based upon the ever-maturing aspects of individualism and cooperation. For the purposes of realistic explanation, the twentieth century did not begin until repressive capitalism collapsed and began to change under the newly progressive ideas expressed in Franklin Roosevelt's New Deal.

Except for the fact that financial stress generated by the Depression exacerbated pressure everywhere, there was little fundamental change in the already weak and enervated South. The maturing phases of individualism failed to penetrate the self-built wall of separatism in a region that dogmatically rejected cooperation. While the North adopted self-reliance and individualism with its cooperative aspects, the South never moved past self-reliance. Instead, it adopted an extreme version of individualism, made a creed of it, in fact, and indoctrinated successive generations with false pride in separateness that guaranteed imperviousness to outside influences or new ideas. To be sure, the physical isolation of farms, agriculture being the overwhelmingly dominant economic factor, made it easy to fall into a trap of lonely self-reliance that made cooperation seemingly unnecessary. The South had always lived separately, prided itself on its separation, and studied how it had maintained its separation, calling that study learning from the past when all it was doing was perpetuating the past. The South never tried to envision any sort of future different from the past and condemned itself to repeat that dreary mistake endlessly. By the time the twentieth century arrived on the tip of FDR's upturned cigarette holder, the South was, as Roosevelt termed it, "the nation's number one economic problem." While the South continued to shun cooperation, the rest of the country forged ahead with renewed vigor into new territory.

Twentieth Century

In many respects, the twentieth century began with the inauguration of Franklin Roosevelt in 1933. Literally up to the last minute before taking the oath of office, Roosevelt was forced to fend off attempts to tie him to the repressive policies of the previous administration, which represented the darkest forces ever assembled up to that time against workers. Conservatives wanted to taint Roosevelt by obtaining his endorsement of their policy failures and, they hoped, string him along to implement more of the same. In retrospect, it is shocking that President Hoover thought it might be possible to subvert the wily incoming president, who had shrewdly avoided most policy commitments during the campaign. It has often been observed that Roosevelt saved capitalism, an irony given its resistance to his progressive policies. But observers have often become lost while attributing success to Roosevelt's leadership. Even more lost are those who try to emulate his leadership "style," which earned as many brickbats as plaudits for allegedly being manipulative in the extreme. The accusation is true but explains nothing about leadership. Return, for a moment, to FDR's resistance to Hoover's entreaties. The political master simply made it angrily clear to the failed agent of capitalist control that he wanted room to assert his own ideas. It was in the implementation of new policies, fresh perspectives and powerful initiatives that Roosevelt exercised leadership. Forget, for a moment, the content of the ideas emanating from his administration and focus on their origins and implementation. Roosevelt clearly had his own ideas, traceable in part to his success as governor of New York during the Depression, but he understood that he didn't have all the ideas or a monopoly on thinking about policy. Instead of exercising self-delusion, he attracted innovative talent in the form of eager participants and then gave them the latitude to operate successfully.
However much he may have exerted personal charm and regardless of the canny circumlocution that characterized many of his meetings, Roosevelt found leadership success through clear, bold, direct action and the unleashed abilities of others, a trait he shared with Lincoln. And it all started by stopping conservatives in their tracks. It is a telling indictment of our present situation and the past forty-five years that we again confront the same forces of greed and failed economic policies. Equally, it is an indictment of the lack of leadership during those years.

While it's possible to pick the bones of the New Deal, finding bits and pieces of policy and leadership tissue that contain the DNA of continuous linkage, this approach makes for masters' theses in history and even books on political economy, along with not a few on leadership. Library shelves are filled with books about Roosevelt and members of his circle of advisors and administrators such as Harry Hopkins, Frances Perkins, Harold Ickes and many more. These books bulge with examples of both policy prescriptions and leadership exercised thoughtfully and helpfully for the majority of Americans who had been overwhelmed by powers contrary to common interests. But the thicket of facts in these studies is apt to obscure the most fundamental aspects of leadership and the ones most salient for our purposes today as we face an uncertain twenty-first century. Those basics grew out of a matured individualism that located and refueled the smoldering, deep-seated belief in themselves found in the collective memory of workers. Having examined this background repeatedly over the preceding pages, we find nothing new here and no surprise to offer except in the nadir that soon followed, the negative dip that the reorganized forces of greed would have us accept as denouement.

The New Deal, representing renewed progressivism, had a good run for a few decades and through multiple presidential administrations. Eisenhower is included here partly because whatever sour disposition he foisted upon the country during the fifties was relieved under Kennedy and Johnson who, LBJ especially, managed to push the policy objectives to heights FDR could only dream about. Medicare and Medicaid, for example, brought to realization long-sought goals of the New Deal. Successive improvements in civil rights laws during the Johnson years cleared new paths for viable democracy beyond any in American history. Additionally, Eisenhower recognized the value of such initiatives as Social Security and even termed any thought of abandoning it "stupid." More than most other conservatives, Eisenhower realized practical limits and identified some of the excesses of conservative policy such as unnecessary military spending being pushed by many Republicans and their business allies. We should also not overlook the fact that every president during the period, including Roosevelt himself, was required to face conservative opposition to good government. They didn't always win and they didn't always make the best decisions. The Taft-Hartley Act, for example, which emerged from Congress during the Truman Administration, was wrongly conceived and applied ill-advisedly.

Inconsistency is always a characteristic of politics, with unevenness an inevitable if unwanted feature of any governing political organization. If nothing else, economic vicissitudes ensure occasional disruption sure to be managed with the most convenient tools available even when they may not necessarily comport with the advancement of established policy objectives. This leaves the public bewildered, largely failing to understand temporary setbacks and what they perceive as waffling on the part of their leaders. The public is essentially right to be confused and correct in their assessment of inconsistent leadership. Incrementally, however, progressivism plodded ahead for about forty-five years, forging some notable successes and benefits before coming in contact with a brick wall that we know as neoliberalism, which remains ongoing after its uncertain beginning in 1969. The Nixon years must be seen as uncertain because the period was fraught with political upheaval and preoccupation with Nixon himself as well as foreign policy, especially the Vietnam War. The Carter Administration that followed Nixon and Ford was also bumpy and unfulfilled, given that it was required to administer the clean-up after Nixon and fell into a trap of giving in to a clamor for deregulation. Additionally, Carter exemplified the propensity to think simultaneously highly of social liberalism and fiscal conservatism. Besides battling a nasty bout of inflation that began in the Nixon period, the Carter Administration and the general public were also distracted by foreign policy difficulties, especially the Iran hostage situation that ultimately brought down the Carter government. It was under Reagan that neoliberalism finally took firm control of economic policy and converted both political parties to unquestioning adherence to its free market tenets.

Reagan, famously smooth, was the long-ago New Deal fan who morphed into an ardent Goldwater supporter. How quickly voters forget or ignore. That is, perhaps, the point. Reagan was the bilingual serpent who spoke pious platitudes to the public with one fork of his tongue while signaling policies favorable to the rich minority with the other. The venom immobilized the victim and temporarily anesthetized the meaning of the memory of "happy days." Political and economic policy battles today are being fought because an increasing number of Americans are coming out from under the influence of poison and seeking to reassert the remedies in their collective memory. Things went well for neoliberalism for so long that when capitalism crashed in 2008, the default assumption, once public footing was regained, was that neoliberal policies, which is to say loose-cannon capitalism, would revert to control and dominance. That is why there is debate over the Trans-Pacific Partnership Agreement and it's why it was negotiated in the first place. The Clinton Administration essentially presided over an ameliorated form of neoliberalism that was more palatable to the majority of Americans than the harder-edged version of Republicans who forever seek to overplay their hand by reaching higher and higher and deeper and deeper for fewer and fewer, stranding more and more in the process. The Democratic version of neoliberalism, while easier to swallow, is neoliberalism nonetheless, and the longer it continues the more Democratic Party nostrums sound like those of Republicans. Increasing numbers of people are waking up to find themselves on the side of the road with lots of company while the capitalist engine sputters on, ever weaker but not yet stopped.

To the extent that neoliberalism is a unifying force, even if temporarily and imperfectly, it means that any manifestation within its range acts as continuous policy that blocks true anomalies. Progressivism and the New Deal are not neoliberal, and both neoliberalism and progressivism fit within the twentieth century. New Deal progressivism might be said to be a reaction to the repressive capitalism of the nineteenth century, and neoliberalism might be interpreted as an attempt to revive that repressive capitalism, but all of it fits into the twentieth century, which we can neatly conclude with the crash of 2008. Attempts to revive neoliberalism after that crash are unsettled and less effective than those who benefitted from its previous largesse would have hoped. It is tempting to date the beginning of the twenty-first century from the Occupy Wall Street movement of 2011, partly because of the stark contrast it forms with previous periods and because it fits the expected radical reaction to the crash of capitalism. But we should move the beginning of the new century to the start of the Obama Administration in 2009, partly for convenience and partly because Obama policies tried to bridge the past and the future, a subject of much importance, delineating a point between the two centuries.

The existence of progressivism and neoliberalism within the twentieth century is important because of the differences inherent in the two perspectives, how they were formed, why they were formed and what they imply for the future. We already see, for example, that progressivism is attempting a comeback. While the future is unknown, the context of a progressive revival is a matter of historical interpretation. Before pressing forward into the twenty-first century, it is very important to understand the specifics of the previous period because they form the basis of why and how a new perspective needs to be implemented. In the course of these considerations, we will observe the differences in progressive and neoliberal policies and how they impact the lives of everyone within their extensive reach.

If this emphasis on economics, politics and public policy seems like a digression from the subject of leadership, even stooping to the level of partisanship, it is because we have reached the twentieth century in our discussion and must necessarily begin to address the twenty-first century. The apparent digression is also because the manifestation of leadership changed during the twentieth century along with much else. Recall that changes were taking place as the nineteenth century matured toward the twentieth, that the frontier closed during that time, that urbanism replaced the agricultural focus as the American norm, with the 1920 census finding that most Americans lived in cities. Businesses got bigger and government got bigger as the twentieth century approached, and the organized response to these changes reflected an evolution of leadership as well as changes among all participants. Recall that as leadership was being developed earlier in American history, government remained small, as did businesses, with aggregated wealth relatively minor and all organizations fairly subdued with limited focus. Leadership, as we observed, changed to meet new challenges along the way but remained small bore as befitted the perspective of leaders at the time. But as all organizations grew, so did government, necessarily responding to the greater size of businesses, the complexity of financial arrangements and the needs of those impacted by them, including workers who had virtually borne the role of leadership in solitary fashion in the earliest days of the American experience.

As these changes transpired, public policy became increasingly important as a reflection of leadership, whether on behalf of workers or as the tool of oppressors. Progressivism is important from a leadership perspective because it better reflected the mind of workers and better served their interests, in contrast to businesses that have always maintained more extensive ability to influence events and control resources even without the assistance of government. The tendency, of course, has been for government to become a prime tool of businesses in a struggle to further their advantage over workers. For these reasons, public policy, politics, government and economics make a huge difference in the lives of all people and therefore merit consideration regarding impact on leadership. The twentieth century happened to contain a greater concentration of these considerations than previous periods. At its close, the twentieth century was also poised to influence the coming struggle in a vastly new environment with massive changes ahead during the twenty-first century.
During the twentieth century, politics, economics and public policy had more impact on every individual than at any previous time and to the extent that these factors were organized, they also impacted other organizations, which, in turn, interacted in an altered fashion on the very forces that exercised impact upon them. This represented a new dynamic in the history of leadership, one that played out very publicly. To be sure, individuals were intensely involved and continued to play important roles as individuals but association had advanced significantly from the days of casual consensus and helpfulness determined by meager needs and restricted circumstances.

It is entirely wrong to think that there was a shift of leadership from the individual to groups or organizations because individuals have always and will always play a fundamental role in leadership. But as the twentieth century opened new problems and opportunities, much of the response was characterized by leadership through organizations, with leadership simply adding a new dimension to its activity. Part of this was related to size by multiple measurements. Population, wealth, poverty, production, scarcity, technology, disease, wellness, longevity, mortality, scientific advancement, medical treatment, war, ethnic conflict and many other maladies and improvements fluctuated, often increasing, on a historic scale, with the consequent need for a more organized response that required leadership, also on a scale disproportionate to anything in the past. This new, organized leadership was called upon to address more issues than previously, as well as greater complexity than had been experienced before. This meant that there was leadership within and of organizations that was separate and apart from the leadership that the organizations exercised. While not entirely new, the scale of organization leadership was different than at any time in the past. The response itself was different, too, not only in scope but in demand, quality and evaluation. Springing from this was a different type of continuously evolving criticism that resulted from greater media access, data compilation, information dissemination and constituent requirements. Much of this was new, not only in concept but also in scope and practice, and all of it was addressed within and by organizations as well as the individuals within those organizations and others impacted by the organizations.

One type of organization that merits special attention is that of businesses and the corresponding organizations (or lack of organization) by workers in response. Even in instances such as war, mentioned previously, business organization played a tremendously important role in the twentieth century, making pivotal decisions and course alterations at critical times, often with enormous consequences. Business organizations were an integral part of each of the factors, and many others, listed here, in the course of which they came to function differently than previous business organizations, to take liberties with their position and to behave adversely to the interests of workers and governments in ways that realized and even surpassed the wildest expectations of monopolists of the nineteenth century. Many of these business activities can be traced to leadership gone awry, also in new ways.

The twentieth century witnessed a ballooning of leadership on every front as more of it was required to meet increased demands of growth and rapid change. There was a different feeling about the twentieth century, and the response to leadership changed as well, but never ceased to reflect the importance of individualism that characterized leadership from the beginning in America. As the success of working Americans in the twentieth century swelled and ebbed and flowed differently than before, the fulfillment of shared expectations based upon collective memory took on a countenance that changed in numerous ways that are important for understanding how to apply leadership during the coming century of even greater change. Along the way, the generation that grew up under the New Deal discovered that progressivism was not about Franklin Roosevelt; instead, it was about them and all their sisters and brothers. Those who failed to comprehend that were condemned to repeat the misunderstanding well into the future, where new questions arose about what was an anomaly and what, if anything, represented a standard. It gradually became evident that the response had to be different from the past, that solutions would be different, too, that nothing would ever look the same again or be the same again. Yet, not only was leadership the key, it was more important than ever.

It might seem to be helpful at this point to define the term leadership but, as stated earlier, a strictly imposed definition will not work because definitions of dynamic concepts tend to stifle continued exploration and inhibit understanding rather than increase it. Each individual must be allowed to comprehend leadership according to the flexibility uniquely required for each person. Lack of definition permits clear, uncluttered observation and results in greater clarity. The past tends to be mired in misunderstanding and the last thing we need to do to the future is to restrict perception. Most definitions are really lists of descriptions and attributes instead of actual delineations of meaning and, in this case, noting a few descriptions and attributes will better serve our needs than precision.

The twentieth century virtually begs to be stopped in its historical tracks for reconsideration. We have seen, for example, leadership emerge into the twentieth century from a basis in individualism, and that individualism incorporated small group association and cooperation. This applied equally to workers and businesses, with size presenting a struggle for each to master for its own purposes. There were, therefore, leaders and followers that were both individuals and groups. They offered and accepted suggestions and understanding according to their abilities and circumstances. Mentorship was small bore, as befitted the modest scale of those being mentored. There was a great deal of self-guidance that, when it broke through into the larger world, tended necessarily to remain of modest scope even if of significant influence within the small scale. All of this maintained a close link to the past of individual endeavor and small association cooperation, and it applied in the workplace as well as in social relations of all types. To the extent that there was gradualism prior to the twentieth century, it ironically outstripped the even slower-to-adapt individual response to environmental growth. Circumstances enlarged before the capacity of people to cope with them. Part of that reflected the relatively faster and larger scale of growth of all organizations, but these, too, were restricted by the understanding of individuals who were unprepared to think and react larger than the organizations that surrounded them, often dominating them. Individuals reacted with leadership according to the limitations of individuals while organizations quickly scaled leadership to envelop individuals before they were aware of what was happening and even before many of the organization leaders were aware of what their organizations were doing.

Much of the ensuing discussion will focus on the workplace as a venue of leadership largely because that is where business forced the focus during the twentieth century. When we begin to look into the twenty-first century, we will see that business remains a focal point of leadership but for reasons that become much more complex than previously, reasons that entail renewed focus on individuals such that we will see that a revival is renewing leadership of both organizations and individuals in ways that drastically alter the direction of business in the twenty-first century. While it will appear that we are becoming grimy with twentieth century business preoccupation, it will eventually be seen as digging a mound upon which to build twenty-first century leadership. And at no point in the discussion of the twentieth century should we forget the contribution of individuals to leadership during that period or the fact that leadership was exercised beyond the workplace.

Restrictions imposed in workplace settings escalated the breakout of leadership into greater diversity during the latter portion of the twentieth century. Whereas opportunities to express individual leadership were more or less limited to aspects of life that evolved as traditions passed from the nineteenth century, the later period exploded with new opportunities. Beyond family life, earlier involvement with churches and clubs, perhaps with politics and a few associations such as unions or organizations supporting business interests constituted most of the outlets the ordinary individual had to experience leadership outside the workplace. But with advances, not only in technology, but also in education, transportation and communication, all sorts of opportunities began to flourish that afforded the means of leadership participation. Old style civic organizations, for example, while they continued, found themselves vying for attention with activist groups of every conceivable perspective. With greater education, citizens were able to express interest in more aspects of life all of which combined with other factors to enable widespread participation. Interest in the arts, for example, advanced as more people were prepared through education to understand, were enabled by improved transportation to visit and were empowered through communication to demonstrate new avenues of participation in leadership. Aspects combined with each other to produce a complex web years before the advent of the information superhighway and leadership was at the core of all of it occurring outside the workplace.

An up-to-date reader is likely thinking about social media after reading the previous paragraphs, but that jumps the gun and misses context that ripened during the years following World War Two. The first goal of the New Deal had been to stabilize a very precarious economy and a population that had itself become a precariat. The intervention of war, and much larger government expenditures than had been envisioned for economic stabilization alone, improved the economy faster than would have otherwise occurred. In recent years, we have seen how governments, hobbled by conservative constituents, fail to envision the spending requirements necessary to maintain prosperity. So it was in the 1940s, brought up short by an almost existential emergency. Having planted progressivism as a viable means of public policy and having countered an incomparable military threat, western nations, especially the United States, were poised to turn attention not merely to civilian concerns and a functioning economy, but to business and a progressively improved lifestyle for citizens. Evolution commenced from that point.

In order to understand the second half of the twentieth century, it is necessary to keep evolution in mind. Education, for example, neither remained the same as before the war, nor was it changed overnight. Much attention is rightly provided to the GI Bill under which returning veterans, mostly white men, were able to secure advanced educations which they parlayed into experience and income during the remainder of their lives. But that represented only a fraction of the change taking place in education, beginning with the fact that the baby boom flooded schools with greater numbers of students than ever. The curriculum they encountered initially was a mishmash of outdated information, old but proven methods of learning fundamentals and sincere effort on the part of inadequately staffed schools scrambling to update what they were teaching as well as the methods used. Add to that the 1954 Supreme Court decision in Brown v. Board of Education that was intended to terminate racial segregation in schools and you have wholesale social change in the midst of old mindsets and new facts all of which required simultaneous accommodation. This meant speedy evolution on a massive scale. Newly constructed school facilities reflected the change, not only in classrooms conducive to a curriculum that stressed science, but also in construction materials, brighter colors and an atmosphere more suggestive of optimism than before. Although none of this happened overnight, it transpired quickly, so quickly in fact, that educators were hard pressed to keep up with the change and the public lagged sorrowfully, laying the groundwork for more social upheaval in years ahead. But it still represented evolution that changed, in every conceivable way and some that were entirely new, how workers worked and lived their lives outside of work. The education example is but one of many in which tremendous change is evident in the course of a few years.

While examples of employment and people outside the workplace are illuminating to grasp what was happening in the twentieth century, there is an even more fundamental change underlying both work and other activities that must be comprehended first. These developments, while not all entirely new, evolved rapidly, producing an impact far beyond their antecedents and were so influential that they paved the way to the edge of the twenty-first century, bringing with them a host of ancillary changes that complicate circumstances of recent years. Regarding context, it is tempting to credit education and the rise of technology for many of the alterations in the workplace, but we must keep a multiplicity of factors in mind. Before what we are calling the twentieth century, a major war and an influenza pandemic thinned world population; then came a global economic depression with consequent loss of life due to lack of nutrition, if not outright starvation, and reduction of reproductive opportunity. And, of course, there was World War Two. With all of these things behind them, people begat people as never before in history to the point that population boom must be considered a salient factor underlying much of the change ahead. The fact that more of this population was better educated than previously was all the better but hardly the single aspect of change. In the workplace, we also have to look at specialization and the rise of "professionalism."

Education certainly played a part in these twin workplace issues and influenced many associated aspects of their development but cannot be considered a singular cause of either specialization or professionalism. An almost evil power of management lurking in the background of specialization and professionalism will become apparent shortly. In the meantime, we can be cognizant of sufficiently independent influences quite capable of enormous impact without the concerted control of management. We can credit, for example, other forces at play in the American culture of the emerging twentieth century that were highly impactful if not spiteful. Americans overwhelmingly accepted as received wisdom that constrictions and restrictions in the workplace were necessary. Besides leading to greater specialization, this attitude hobbled at least a whole generation of Americans who took for granted that they were incapable of thinking for themselves, and it influenced an educational system that valued objectivity at all costs, even the squelching of inquisitiveness. But it certainly had its share of impact on the spread of specialization. As time progressed, management liked to blame unions for specialization and the constriction and restriction that seemed inevitably to accompany it. This is a twin fallacy, first, because management pursued specialization as a means of cost reduction, and second, because most people believed they could navigate change and comprehend new technology best through reliance on specialists. Fear, in other words, drove people to seek the comfort of those better equipped than themselves to handle what was, they thought, surely more than any one person could manage as a matter of work. They applied this nostrum across the board, from intellectual pursuits such as history to earthy endeavors such as gardening.
But, to smite the face of consistency, they loosened their grip where personal economic conditions and convenience applied, readily, for example, ditching butcher shops and bakeries for supermarkets.

What, exactly, was this constriction and restriction newly imposed in the twentieth century workplace? A precise definition may be unfathomable, but we certainly continue to operate within its grasp and under its spell; it emanates from thought but manifests in both thought and action. Constriction and restriction in the workplace and the general culture is a negative influence, partly by definition, partly in practice and partly through the pall it casts. Perhaps it is an oddly withering dampness that insinuates without structural traces, almost as if we decided to try a chemical substitute for water only to be seized by insatiable thirst that somehow makes us think we are pleased to imbibe. We continuously seek more of the same without knowing why. Even as we approach the twenty-first century, the pace of our attraction and pursuit has just lately begun to falter with the barest perceptibility. Constriction and restriction is a rut that channels our attention and our activity ever deeper, where it is increasingly less likely that the light of diversity can penetrate, requiring less effort to remain and maintain than to expand and improve.

Once begun, and it has long since begun, inertia supplies the force behind ever greater specialization. And while the nebulous reason may be difficult to corral, evidence of its existence is abundant. Workers are laden with inertia that keeps them bound to processes and habits, bolting them into repetition until the processes they sustain become not merely outdated but often dead. On the most trivial level, it shocks the system to have a stapler moved by another worker; in a critically significant way, workers are often left without livelihood when the one skill they possessed and repeated endlessly becomes utterly useless and is cast aside along with their employment. Somewhere between the extremes, workers lost the awareness, the will and the ability to adapt, learn and expand, while the larger forces surrounding them, society, government and management, failed to provide the push to do so, the incentive for replacement skills or the willingness to exercise themselves in similar ways on behalf of their constituents, the workers. (Never mind, for the moment, that society, government and management fail to comprehend their human composition in deference to shareholder value.)

Quite apart from workplace issues, life as a personal endeavor suffers from inertia that merely reinforces every obstinate tendency and degree of lassitude that rivets people at work, with life and the workplace doubling back upon each other and pounding hapless workers with the weight of unconsciousness. In the twentieth century, we have seen the concrete structure of home life begin to crumble amid complaints from the most conservative observers that Americans were going to hell in a handbasket, what with divorce rates soaring, women misbehaving in skimpy clothing and violence on television. In some perverse reversal of expectation, at least some of these have contributed to daylight showing through the dense ceiling of private inertia. As more women entered the workforce, men were required, at least to a limited extent, to reexamine their roles in the household. To the extent that this reassessment faltered, some homes have broken apart, with the partial consequence of greater self-reliance among women who continue to labor with superior performance under the burden of inertia, imposed by men and by the past, that restricts their remuneration and career attainment. At home, we see women expressing themselves as capable managers under often adverse circumstances, and education is gradually opening to accommodate expanded horizons. Further lessening of old attitudes, such as the taboo against sexual orientations apart from an insistence on a "straight" norm, has added to change slowly occurring in private life in ways that impact the workplace. The fact that strictures outside of the workplace are crumbling makes restriction in the workplace ever more tenuous. While, at the dawn of the twenty-first century, we often regard progress as slow, we must realize that it was glacial during the previous period, even as everyone's moorings loosened and habit and other expressions of inertia forced slow departure from previous norms.
Religion in America has been dilatory in admitting change but has lately begun to give way to secular humanism, a distinctly less dogmatic perspective than the constraints of preachers and pews.

Despite the incipient breakup of prior normalization in the face of pending catastrophe reckoned by individuals on the threshold of job loss amid uncertainty of the future on every front including climate, the environment, technology, personal relationships, education and financial stability, inertia continues to control. To some extent the past holds us back and acts as a cage that must be broken through, for men as well as women, but personal inclination also plays a significant role. The lure of unproductive entertainment, for example, robs us of time and diverts energy into uselessness. Think television. But if that's oh, so twentieth century to you, think Facebook. Even if you consider that clickbait is one person's albatross and another's ticket to prosperity, ultimately, it remains the burden of stultification for both. Inertia takes many forms and has existed from time immemorial but remains a leaden weight against our better instincts for expansion.

To break the bonds of inertia, we tend to look self-defensively in all the wrong places, assuming, in our ignorance, that the answer surely lies somewhere out there, as it would be impossible for it to be within us. There you have it: a nutshell case for celebrity worship. A propensity to seek elsewhere for answers is as prominent among human beings as the tendency to avoid introspection. There is a double whammy involved wherein we not only assume that answers lie elsewhere, but we also don't like what we see when we check under our own hoods. Celebrity worship is almost as old as humanity itself, and we are likely right to assume that it began long before cave dwellers, stretching back to the time before we dropped from the trees, carrying all the adulation our primitive brains could handle for the hairy creature that was determined enough both to reach the topmost branches for the best fruit and to escape unscathed from the hairy creature that wanted to take the fruit from it. We can leave these distant ancestors picking their fleas while we zip forward to a distinctly American preoccupation, but not before we raise the question of religious association with celebrity worship.

Presumably, our most distant relatives began religion by gazing far, far from earth as the sun rose in the distance and continued its daily routine while generation after generation departed. Persistence, while an asset, is monotonous and prefigures dissatisfaction. Additionally, although noted for its warmth, the sun lacks personality; spectacular rock formations and sacred mountains aren't much better. We know for certain that it wasn't long before people projected their own calamitous foibles on unseen but surely existent gods. There. Up there on the mountain beyond our reach. We know they're there. One of them came down last night in the form of a...you know the story, just enough titillation to pique curiosity and to prompt, with disgust, a rendering of moral wholesomeness possessed of sufficient supernatural power to command both interest and fear. Voila! In Europe, then America, it was Jesus.

George Washington, perhaps skeptical but ever practical, paid periodic respects to the unseen deity while carefully plying his own brand in the flesh. One of his earliest electoral ploys was to treat voters to free liquor while reminding them of his heroic military exploits against the French and Indians. As time progressed and other heroes emerged, the savvy general and politician played a mean defensive game on multiple fronts while engineering masterly offensive maneuvers that had everyone standing aside for his passage through their hearts into the sanitized pages of history. On top of all this, he was a great president. One thing he knew was to give people what they wanted. As his second term wound down, the aging warrior-statesman made a farewell trip through New England, traveling in a carriage behind which his famous white horse walked, ready to be mounted for equestrian entry into towns and villages and the esteem of his countrymen. Washington became—in his lifetime—an authentic American celebrity, an early version of what many others would covet and fall short of, and in the process he became something he doubtless never intended: a repository of reflected honor so bright that its true source was unknown to most who contributed their light, and a ready excuse for their shortcomings. It is in reaching for celebrities to worship that others fail to develop themselves, ultimately selling short their own abilities and depriving everyone else of potential benefits stillborn or wasted. Such is the nature of celebrity worship and, in the case of Washington, of misplaced American attention, a habit now ingrained and exceedingly detrimental.

Despite the fact that the first American president inaugurated celebrity worship in the United States, it did not subsume politics until Obama appeared at the beginning of the twenty-first century. How that happened is instructive. Presidents have always had admirers and have typically dealt with them in a businesslike manner as an unavoidable part of the job. In the case of Lincoln, many of his admirers appear to have been sincerely deferential and motivated by respect. He responded with humility and repaid the deference shown him with exceptional thoughtfulness in the quality of his statecraft. How else could he have maintained support for the nation's bloodiest war? As the nation grew, and with it the importance of government, presidents tended to acquire sycophants, would-be advisors vying for dominance, which, despite the trappings of power that attracted them, fell short of celebrity worship. A brief exception was Kennedy, whose personal charisma and position in the historical timeline, energetically following the essence of stodginess in a breakout period of national identity, lifted him by acclamation even before assassination sealed an exalted regard. Although Theodore Roosevelt was noted for rousing crowds and Adlai Stevenson drew the kind of attention sometimes accorded literary icons, it remained for Obama to bring rock star celebrity to the White House. The reasons, the timing and the portents for the future are important.

Attempts to date the existence of an "imperial" presidency go all the way back to Washington, who, despite his ego, held it at arm's length, and John Adams, who relished royal treatment without convincing anyone else of its applicability. The imperial label, often thrust upon FDR, is excusable only when understood in the context of tremendous power wielded during national emergency. In that limited sense, similar power devolved on Truman, who, with Lincoln among all presidents, resisted the idea of pomp along with the circumstance. The whole imperial package did not manifest until Nixon, who craved power and was much criticized for flaunting it, even mimicking Papal Swiss guards at the White House. For better or worse, Nixon changed the presidency forever with a version of forced celebrity through which he foisted a self-portrait on the rest of the nation. Even those who would rather believe that he failed must admit that, to one extent or another, every subsequent president has followed suit, achieving in the process a kind of superficial crust of celebrity that could not survive loss of the office. Even the one president who had been a poorly regarded movie star and who is often called "the great communicator" ultimately fared no better as he grubbed among politicians and acquired the stench of partisanship.

Much happened in the world and to the United States after World War Two. The "greatest generation" that suffered the Depression, fought the war, survived the Cold War, engineered civil rights and built the most prosperous middle class in history had retired or died by the time of Obama. Younger generations, now in charge, saw things differently. They had been raised differently, too, largely in crowded cities but everywhere among larger numbers of people than ever. They rewarded distinctiveness, not necessarily distinction, and they elevated entertainers of all kinds, bestowing affluence in addition to adulation and creating cults of cultural influence that focused on individuals and celebrated them in almost reverential fashion as saviors of segments of highly artificial worlds. A generation that bore no responsibility for the world it inherited, and that often complains about its condition, tacitly abdicated its stewardship to others. Unfortunately, it was ill prepared to evaluate those to whom responsibility was entrusted, and when the lure of more and better under easier terms was offered, that was the route often pursued. As it happens, Obama was accorded rock star status while proffering substance and quality. But already, with Donald Trump, we see the disaster that can occur when entertainment celebrity replaces public policy credibility. Even so, at the dawn of the twenty-first century, Americans seem to want the appeal, not merely of a special story known to characterize many presidential backgrounds, but of the ability to energize mass response, awe and extraordinary engagement such as we find with Obama, Sanders and Trump, as opposed to the comparatively cold technocratic fluency of Hillary Clinton. Bill Clinton, by comparison, was elected with the special story and the charisma but not the celebrity that seems valued today, having acquired genuine celebrity status well after leaving office.

By reviewing the ins and outs of alleged presidential celebrity, we gain an overview of how the nation regarded leaders, work and policy, and a flyover of cultural issues. We see, for example, that before the twentieth century, Americans viewed the exercise of executive authority fundamentally as a job, in many ways like other managerial positions most of the time. It seemed to require the same nuts and bolts knowledge and competency that any decent man could acquire and exercise if he were disposed to devote attention to that end. Had not Lincoln and Truman demonstrated this? And hadn't so many presidents through the end of the nineteenth century and into the twentieth, including Harding and Coolidge, reflected ordinary Americans, their values, abilities and aspirations?

Trying to answer that question for the twentieth century leaves us almost completely without presidential examples apart from Truman and, very briefly, Ford. LBJ was the heir to New Deal governance and a skilled political tactician; Carter, with a degree in nuclear engineering, boasted multiple and scandal-free careers as a naval officer, farmer and governor; Reagan traded on fame and ideology; the Bushes were wealthy patricians who cashed in on public service; and Bill Clinton deployed exceptional intelligence, perseverance and the keenest political acumen. None of that typifies the average American. Attempting to explain American leadership by examining twentieth century presidents opens a wiggly can of worms. Acknowledging that something new was happening and looking deeper, we find a great deal that is pertinent to us at the beginning of the twenty-first century.

Just glancing over the list of twentieth century presidents, we observe an increasing inclination toward polished performance, carefully crafted persona and expertly staged events. Consider the change that occurred in presidential press conferences over a short period. FDR was in the habit of inviting a handful of reporters into his office, where they literally crowded around his desk as he fed them tidbits of news and inside perspective. Kennedy, in contrast, quickly mastered with aplomb and humor the formidable televised news conference format that furthered his celebrity. Plodders like Johnson could handle the questions but failed to deliver an increasingly expected level of panache. The same characteristics are evident in business and other walks of life. Plain, sometimes grumpy businessmen such as Henry Ford perforce yielded to dynamic performers in television ads as well as the boardroom. Lee Iacocca was thus prepared for leadership in tough times, as well as for enjoying sweet success, simply because of his adaptability to the times and to changing expectations. Religious hucksters who knew how to prance behind the pulpit might collect enough change for a small-time good life, but it was Billy Graham who demonstrated how to command a massive audience, as others emulated his style to rake in folding money.

Lest all of this reek too much of showmanship as opposed to the daily life of the middle and lower classes, consider what bottom dwellers encountered. Burger joints gave way to sleek fast food outlets like McDonald's, where standardization gleamed not just with golden arches but gold itself. Greasy spoon diners began to disappear as chain restaurants pushed them aside with predictable menus and greater certainty of the clean standards the public was learning to expect. Clumsy downtowns relinquished customers to exciting malls where the stores offered folding-board-perfect fashions to eager young shoppers. Crisp was not just a look; it was an attitude adopted by successive generations who applied it in myriad ways to their surroundings. Everyone who made the cut in any endeavor whatsoever was expected to conform. It is most telling that all the cultural outliers, be they beatniks of the fifties or hippies of the sixties, were rounded up and forced to comply on pain of being shunned, not merely ignored. Advertising and media, increasingly national and less scattered among well-meaning but comparatively amateurish locals, enforced conformity at every turn. Television may have simultaneously glorified violence and sustained morality, but the lead characters were always prescriptively attired in the commonly accepted best. Television also helped standardize approved national speech and smoothed regional edges, sometimes with good humor, sometimes with ridicule. Sheriff Taylor was southern and spoke like it, but he was also smart. We laughed, however, at Kennedy, who dealt with the intractable problem of "Cuber." Race was serious, too, and no laughing matter. Television sent Amos 'n' Andy packing.

Some of these changes were demanded by a public conscience that, thanks in part to the media, evinced a certain amount of cohesion, national unity that was impossible in fragmented previous generations. The South, as always, carved out an exception for itself, but the overall tenor was one of amalgamation, a uniformity that threatened individualism to such an extent that not even southern culture was completely immune. Purveyors of this change were not only television images and entertainers who dropped Vaudeville for TV variety shows or top executives who could speak inoffensively; some of the leaders of twentieth century change held commonplace positions of trust in communities of all sizes and descriptions across the country.

The dominance of national media winnowed the number of prominent leaders and highlighted them with special focus, a situation in which celebrities rose to the top as examples of their own creation, having been propelled by means of their own invention. Counterintuitively, it was from within this exceptionally artificial environment that leaders emerged at the grassroots level. To be sure, these mid-century mid and lower rung leaders had plenty of innate talent, but they blossomed by taking their cue from the paradigm of high visibility leaders. This development may seem only mildly interesting at a glance, with persons of ability attaining success by consciously adopting the traits of others, something that had been occurring to a lesser extent for many years; the real curiosity is the magnitude of this change and the enormous impact created through its replication across many fronts. In addition to mere standardization, the process seized not just the imagination but also the actualization of leadership in cultural, business and government spheres, a private and public remaking of how everything functioned. Consider the faces of this change: communications specialists, educators, consultants, advisors, financial gurus, salespersons, religious leaders, commentators, managers of all sorts and, lately, lots of positions previously unknown.

During the latter part of the twentieth century, we stumbled into a situation where these leaders both reflected the change and participated in its creation, not by definition different from what happened during previous periods, but in its execution, vastly dissimilar. Earlier, we considered specialization and professionalization, for example. As the twentieth century progressed, specialization and professionalization acquired new dimensions, partly through proliferation, the sheer quantity of jobs and people involved, and partly through a process that swung around and fundamentally changed the change. Shortly, we will trace specifically how this happened, but the most direct route to understanding it is to observe how the larger categories changed. Earlier, there were owners, then workers and owners together, then workers and owners separated by managers.

To assume that is where we are now and stop misses significant issues and perspectives, because, in addition to the new wrinkle of standardization and celebrity, we must include an element of hyper-elitism. While all of this took place in business as well as non-workplace situations, for a variety of reasons we will dwell mostly on work, although the overall parameters are more readily explained by addressing culture instead of the workplace exclusively. Consider, for example, that "fashion" became applicable beyond the style of clothing, that people in all sorts of positions became "trendsetters," that consultants cropped up everywhere, that communication became an aspect of every industry, every business, every endeavor and that influence could not only be peddled but was regularly created, engineered, transferred and magnified by skilled operatives empowered to craft improved results for their stupendously wealthy bosses. In a word, elitism, drawing from preoccupation with celebrity and the proliferation of newly anointed grassroots leaders, rose along with income inequality to historic levels.

Community leaders had always been important, but they mostly reflected their surroundings and the people they led rather than being stratospheric entities who associated exclusively with other celebrities and were known only through the media. Merchants spiffed up themselves and their stores, losing the stockroom apron and making their wares more appealing. Preachers pranced less behind their pulpits in an effort to appear more telegenic, even if no cameras were present. Car mechanics found a pleasing presence for the front counter of their shops and kept the grease behind the scenes. Even the most representative grassroots elites, such as the iconic local physician, adopted a more clinical image and dropped forever the concept of a country doctor. All of this trickled down to the supervisory level, too, with smartness of appearance and crispness of style replacing the slovenly, casual and haphazard ways of the past.

Enter the age of technicians. Jokes made the rounds of mid-twentieth century Americans amused by all sorts of workers taking highfalutin titles to mask mundane jobs. Garbage men became sanitation engineers, but there was little time to linger in amusement; everyone was too busy doing something that was a cut above the past. In those days it was a matter of adapting an ordinary public school education to readily comprehensible new productivity techniques, and the drive for increased productivity pressed ever harder on workers for even more. All of this, of course, ratcheted up specialization. As technology advanced, the more complex levels required separately trained technicians and, voila, instantly enhanced elitism. Consider, too, that this extended well beyond the workplace. Job status joins income and race as a great source of social division in America, and the workplace of the latter twentieth century churned overtime to produce a nation fracturing along lines drawn by elitism. Divisions in the workplace are replicated in social situations, all of which have leaders. Soon enough the leaders of all sorts of social endeavors could be identified with specific job strata. Backyard barbecues don't typically feature millionaire investment bankers chowing down alongside factory workers or store employees. Neither do they belong to the same clubs or worship in the same churches, except occasionally as recognizably separate cliques within large memberships. In extremely large churches, for example, small groups conform to demarcations of social status. You will observe industrialists huddled with bankers and middle managers with others of their ilk, while supervisors and entry level executives form their own circle; they rarely mix but acknowledge each other with strict formality. Never would you see any ordinary worker approach any of these groups, and for the most part each of the groups gravitates into separate clubs and churches so that they avoid meeting altogether.
In small towns, where there are fewer organizations and where elites are widely dispersed, meetings in a church environment, for example, produce an awkward response, like magnets repelling each other. Even in these less urban settings, elites tend to spin themselves apart from every social encounter with subordinates, traveling miles to avoid them by attending services elsewhere. Country clubs are helpful in abetting this process, with formal entry procedures calculated to ensure separation and preserve the dignity of workers of lower station who presumably don't want to associate with their betters any more than they are wanted for that purpose on the golf course or in the clubhouse.

Divisions that were ineffective or nonexistent in an earlier period of American history, and that long blended smoothly into social situations, broke cultural cohesion apart in the twentieth century. Fractures were evident not only in the common occurrences of everyday life but in the workplace as well. This is highly significant because, while elitism in social and cultural circumstances is easily identifiable as an efficacious tool of change, in the twentieth century workplace elitism wrought change so complete that it expanded its nefarious influence in all directions, including turning back upon society itself. It is for that reason that we need to consider elitism in the workplace before examining leadership in the context of business and management.

Leadership in the mid and latter twentieth century workplace struggled mightily to follow the top-down example of grassroots growth that characterized leadership outside the workplace. The extent to which it was successful will be examined when considering management and business, but the initial overview finds many similarities between elitism in the workplace and elitism outside it. The question of first cause, whether elitism in social situations predated elitism in the workplace, is of little consequence. What is overwhelmingly significant is that elitism outside the workplace influenced its development in the workplace, or at least acted as a mirror.

Leaders in the workplace began to look to national trends and examples to emulate, just as did leaders in other types of organizations. Workplace leaders of all kinds, from owners to managers, began to sport a more polished appearance than in the past. What might have begun as an effort to upgrade wardrobes ended with a sense of separation from workers that was deeper than ever before. Vocabularies changed, too, along with the manner of speech. Communication in the greater world both impelled and reflected this change. Because of improvements in communications, leaders across the entire spectrum became more accessible through mass market readership that, again, both fed and reflected change. Conferences sprouted where all kinds of nationally recognized leaders had an opportunity to influence leaders at the community level and lower rung managers in large corporations. From these venues sprang more extensive networking than had ever taken place, and the more that all of these leaders at every stratum communicated with each other, the more reinforcement of national leadership trends was achieved. Standardization and expectation accompanied communication like members of an elite chain gang working the highways. It became a secular religion that either seeped into the grassroots of businesses or was crammed down the throats of dubious managers and some owners who, with peculiar justification, fretted that they were losing what made their companies unique.

Elitism in the workplace took another peculiar turn with the addition of management stars, a perverted type of grassroots celebrity whereby stardom replaces leadership as the central qualification. Instead of rising from the grassroots on thorough job knowledge, efficiency, effectiveness and potential, these job stars were typically brought into a company from the outside. In recent years, CEOs brought in from other industries come to mind, but they are a false comparison; the phenomenon at issue is lower ranked management inserted by top management to reign over existing employees. (The reason that outsider stars as CEOs fall outside this discussion is the strategic nature of decisions by boards of directors.) This, of course, causes massive resentment and fear, with lower ranking management members wondering what other adverse personnel decisions will be forthcoming. Planted stars come with the authority to create their own staffs, naturally enough composed of company new hires with little familiarity with the specific business and sometimes with the industry itself, making already employed, knowledgeable employees even more resentful and apprehensive. Sometimes these star managers create inane changes that do real damage to the company, but their insertion into the grassroots level of company management causes problems simply by virtue of their existence, with resignations and retirements of especially productive managers frequently resulting. Sometimes, too, these star managers crash and burn after their usually older senior management patrons move on, retire or get fired, because they invariably lose influence once the executive responsible for their emplacement is gone. Whatever the exact outcome, celebrity stars seek to emulate at one swoop the prominence of nationally known leaders by scaling their stardom to the company level, but what they engender is more akin to havoc than reverence.
Besides, the whole star manager syndrome smacks of elitism gone wild, with officially recognized class distinctions among already stressed workers. While it reeks of politics within management, it essentially represents a perversion of change imposed from above that is felt throughout the entire workforce.

Top management saw to it that subordinate managers in their companies were inculcated with the specific degree and content of change deemed appropriate for them and their position. Ordinary workers were omitted except as observers of change taking place around and above them and as the butt of that change. Factory workers had never been on par with owners but the type of change taking place in the twentieth century was very different. Whereas mill hands of the past, for example, were individually known to owners who saw them each day and knew their families, the new workplace admitted of no such familiarity. Condescending smiles and nods were the most recognition that these now lowest level workers could reasonably anticipate.

The meat cleaver that descended forcefully between management and worker was nothing compared to the change ignited by elitism among owners and managers. The concept of management in the modern sense was relatively new. Until the twentieth century, most businesses had been so small that owners were among the workers and provided much of the management themselves. As population increased, so did the size of businesses, the concentration of wealth and the perceived need for a full complement of managers. This left owners to associate almost exclusively with other owners, the topmost level of their own management and that of a few very large corporations. The result was a tremendous infusion of elitism administered in a short period. Those already in management positions when the surge of elitism suddenly appeared were ill-prepared for the consequences. They found themselves cut off from owners they had worked with for many years and were soon reporting to newly inserted layers of management, often hired from outside the company. These long-term managers, who had experienced the growth of their companies from early stages and who typically participated in decision-making, now found themselves locked out of decisions, blocked from input and made underlings of bosses they neither knew nor understood. Familiarity suddenly reversed to alienation.

In a rush to empathy, it is easy to overlook that the newly imposed top management also felt alienated, both from the owners they served and from the subordinates they found already in place. To be sure, top management identified most directly with the owners, whose interests were parallel to their own, but the separation from actual ownership, stock options notwithstanding, made for an awkward gap that no country club membership could bridge. Here we have a glimpse of millionaires estranged from mega-millionaires and billionaires, a restless cohesion of similar interests set apart by a mental acknowledgment of inferiority as well as a gulf of unprecedented riches and unattainable status. Nothing represents this comical plight more than the resentment generated between old line wealth, with their stately, historically secure vacation retreats, and newly hatched billionaires with plans to buy up multiple old homes and replace them with colossal structures all for themselves.

We are, as yet, not ready to brush the whole scenario aside with the adage that "what goes around comes around." We have to give further consideration to what was happening at work, this time with a downward focus on elitism. Top management seemed to find endless ways to fracture the tenuous bonds between themselves and their subordinates. Where contact with management had previously been positive, reinforcing common goals, if not camaraderie, newly elitist management intentionally drove wedges between themselves and their subordinates. Much of this conduct will be explored in our examination of management in the twentieth century, but at this point the salient topic is less exactly what happened than the fact of its elitist motivation.

In the earliest of those early days, a trend was established that proved viable for top management for decades. Having made older, existing managers uncomfortable, top management found reasons to deem them inadequate for renewed assignment as technologies, processes and objectives changed. These older people were, and still are, shown the door on pretexts of unsuitability and inability. But as long as they were in place, older managers were made to feel the lash of elitism that belittled them publicly, questioned their judgment endlessly and imposed demeaning duties and absurd requirements. It is not only a matter of what happened but the fact that it was based on elitism that is important, because, as these older managers were replaced with a new generation unaccustomed to questioning its place in the hierarchy, elitism was allowed to run rampant, recklessly damaging the whole structure of businesses.

Newly impaneled middle management, minted from younger people unaccustomed to questioning and reliably disposed to accept whatever came, was largely and unwittingly made the butt of elitist whims that coursed unrestrained through upper management. This level of management, now completely shut out of the decision-making process, was reduced to taking dictation from above and flawlessly executing inanities prescribed by superiors who distrusted each other and jealously sought personal advantage. What characteristics are more fitting of elitism than mistrust and jealousy? How could businesses possibly prosper under these conditions? There are two answers; one will be addressed as we consider management in the twentieth century.

The other answer is demonstrated in yet another, lower manifestation of elitism and the competence with which it coexisted. It is too easy and completely inaccurate to write off these middle managers as ticket takers and robots. They are human beings and they have a tremendous amount of potential talent. Their latent abilities cannot be entirely suppressed and irrepressibly bubble upward in even the darkest of companies. If nothing else, the quality of their execution would be sufficient to keep many businesses afloat despite sabotage from above. Like their counterparts in the larger culture and other organizations in society, these middle managers, the grassroots of management, are capable people and, for the most part, would have held these positions in an enlightened corporate environment as well, although they would have performed even better. And, like other leaders in the larger society, middle managers sought role models from higher positions of public prominence and adapted themselves to the paradigm provided by these highly recognized and credentialed leaders. Once again, the point here is elitism.

Middle management is held in place by fearful superiors who restrict and block their subordinates at every turn, almost as if there is a top management playbook with suggested maneuvers that can be employed to mislead. The abundance of information and guidance foisted upon middle managers by their superiors suggests not merely conspiracy, but a whole industry devoted to keeping middle managers distracted with busywork and worse. Certainly, a tremendous amount of time is wasted in meetings and the intake of information that is either superfluous or intentionally misdirected. All of this is based upon elitism that is designed to protect a ranking class that outperforms idiots in destructiveness.

Worse yet, and the most damning criticism of middle management, is how it, in turn, applies elitism to subordinates. That happens because management at the grassroots is in the position of employing and supervising the lower strata of business organizations. The fact that these lower ranking company officials would take out their frustration on hapless and typically underpaid employees is deplorable, especially considering that it often involves racism and sexism as well as mental and physical abuse, not to mention outright criminality. The fact that wage theft lops percentages off the incomes of entry level workers speaks volumes, as does the fact that wage theft far exceeds the amount stolen by employees. Perhaps more hideous is the psychological pain inflicted on countless workers who are helpless in the face of petty tyrants. Much of this emanates from frustration at their own treatment by top management, which typically evades responsibility, but it is elitism reacting to elitism nonetheless. The lowest level managers realize that they are near the bottom rung, and they are resentful because of that, but they are also mean and abusive simply because they can be, and, to small minds, exercising abuse elevates their egos if not their status. In addition, the example has been firmly set for them by their own abusive bosses.

Taken together, elitism in the workplace has not only divided the American work environment into classes of workers but has fractured any possible sense of cohesion and cooperation except that which is forced. It is the weakness engendered by these small cracks that spells both the final doom of the system as it is known and offers a solution to the whole range of workplace problems. We have been so focused, rightly and understandably, on income inequality and the rise of the mega rich against a backdrop of deterioration of middle-class living standards, that we often overlook the role that elitism plays in our culture and our workplaces. The fact that a remedy is at hand is encouraging, but we should remember that it took decades to reach the bottom, and it will require time to achieve renewal, time that is running woefully short.

In the nineteenth century, when work started shifting from individuals to businesses, a kind of accommodation was reached among workers and owners, who were then still involved in the operation of their businesses and, to a remarkable extent, remain so today. There was a kind of understanding, though not always agreement, an acknowledgment of the meaning of position and where everyone stood in the conduct of the business. There was respect, too, because the owners in those earlier days remained not merely part of the businesses but of the community as well. Workers might have been understandably resentful of their meager pay, but periodically seeing the owner outside the workplace (and not on television) made for a sense of shared space if not exactly collegial relations. This goodwill began to shift and was in deplorable condition before the advent of Franklin Roosevelt's presidency.

Unions are made the scapegoat for much of the deterioration of workplace relations in the twentieth century simply because conservatives are quick to blame unions for everything. The fact that unions never represented more than about a third of the workforce somehow falls out of sight when they are being blamed for workplace strife. Even counting those workplaces where union drives were fought and ultimately defeated, it is impossible to credit unions with the strength or breadth of coverage needed to drive unhappiness at work (which is the opposite of their actual impact where they exist).

Other forces were operating in the workplace, and, unlike unions, these things, aspects of elitism, were endemic. In this regard, income inequality is connected to elitism and became so egregious by the 1920s that it could be reasonably credited as a source of discontent among workers. But in a larger sense, income inequality was a symptom of elitism, not its cause. Far more important is the fact that workers split into types of workers such that they became castes, not classes. As castes, they bore the full affliction of elitism, brought down upon the heads of workers unaccustomed to imperious treatment.

Depression, followed by the struggle to recover, followed by war, followed by a suddenly surging economy left workers indisposed to countenance more ill treatment at the insufferable hands of old-style elitists as they had before the war. That is why there was some labor strife in the 1950s, but that was also the period when the more complex, if less brutally obvious, elitist assault commenced in the workplace. New top management brought in kinds of workers—lots of them—that had never punched the clock in many workplaces. These were often entirely new positions performing work that was alien to existing workers, and they did much to expand workplace elitism and foster discontent. The HR industry flourished in this atmosphere and forced its presumed indispensability upon the ranks of management. Intensification of nearly every aspect of work took place simultaneously, culminating in the pressure cooker environment that typified work later in the twentieth century. New measurements of success were generated to gauge all this activity, evaluations that were foreign to many workers and which were often tailored with the intention of creating stress that supported elitist objectives. Add to this a new stream of urbanization that escalated in consonance with new freeways that exacerbated elitism through hideous forms of exclusion, and you have a new America, a nation permeated with stress cracks.

Because productivity and profits were increasing significantly, management believed it was on the right path to further success. Like any caged rat in an experiment, managers continued to hit the button that deposited food. Alterations in behavior that they could correlate with improved results were immediately promulgated. The result was a haphazard patchwork of policies implemented by an army of minions unqualified to tote the hand tools of their workers. The sheer quantity of people designated as leaders and suddenly set above competent employees was mind-boggling. Uneasy in these positions of authority but assertive on the theory that it is best to bluff accomplishment and impose command rather than seek help, cooperation and consensus, these young fools wrought a full generation of havoc on American business. The engine of prosperity rolled on in spite of them, not because of them.

Keep in mind that appointed leaders were taking their cue from nationally approved trends and recognizable public figures, within an atmosphere that rewarded consistency over innovation, innovation being a trait of thinking people that rapidly became suspect to top management and owners who preferred no threat to their hegemony in the workplace. Another feature of the business environment focused on monetary aggrandizement over any other value, including humanity, a proposition deeply rooted in blatant greed with exquisite manifestation in the "Gilded Age" right through the crash of its golden gates. This newly extensive horde of managers drew upon all the public preachments of every recognized public speaker willing (and many were) to spout endorsement of the business machinery deemed responsible for the American way.

Seeing one successful example after another rise and march forth on the motivational speaker circuit, many others rose in imitation. They're still out there, speaking, writing books, exhorting, making videos and doing what they believe to be leadership of the leaders. If ever there was contortion to justify an existence, it is that of these misled creatures who prey on the middle managers who seek their guidance and the top management that is enamored of the prestige they bring to annual meetings and the message they present to managers who are expected to trundle it back to the lower legions. The gratitude with which these people are accepted by middle managers and the eagerness expressed by underlings anticipating crumbs returned to them by their bosses is amazing. What needs to be kept closely in mind is the deep desire on the part of managers to emulate what they see valued by their superiors and the worry with which they pursue anything that will perpetuate their employment.

What was not readily apparent to the audience being entertained by charlatan motivational experts is that these speakers were, themselves, being manipulated by the same forces that worked on the managers. It is questionable to what extent the guest speakers were aware of their plight. They gravitated to the entertainment circuit from all kinds of backgrounds, including business, academia and show business itself. Some were comedians who happened into more lucrative employment collecting directly from the repository of big bucks instead of depending on penny-ante ticket sales. Some were preachers who merely swapped flocks, but who often combined both, and who found an unusually remunerative retirement. Some were physicians who traded on the need to keep workers healthy to reduce the expense of medical insurance. When the banquet is set before you, do you quibble over the source of the food? It's likely that many motivational speakers paid no heed beyond their hefty checks. We can hardly expect more of them than we expect of ourselves, can we? Perhaps we should. Anyone who sets themselves up as an authority and takes upon themselves the responsibility of teaching or advising has voluntarily bitten off a huge chunk of sacred duty that must be executed with diligent respect, upholding the needs of those being taught. So, yes, we should expect more from motivational speakers or any other advisor called upon to render guidance to others. And when they are no more informed than their audience, or when they choose to ignore their innate suspicions and overlook evidence of their own perfidy, everyone is dealt a tragic disservice.

The fundamental reason that these speakers, entertainers and advisors are in their position is that they provide what employers want to push onto their managers and fuse into their minds and behaviors. This is obvious, or the whole program would not exist. But we tend to pay little attention to the fact that many managers actually want what is being provided through business entertainers. They want it so much that they are willing to shell out their own money for books and tapes offered by these motivators. These days we have widely accessed podcasts and YouTube videos, and the Internet is the prime conveyor, offering tweets, emailed newsletters, online seminars and chats. All of this extends the reach and coverage of business entertainers and encourages more to enter the field. Managers are lapping this stuff up at an incredible rate. To what end?

When it is regurgitated by managers onto workers, it is apt to be suffocating, diverting any potential initiative for innovation into resentment and channeling distress into the workplace. It may also produce ridiculous effects. It's one thing for Seattle's Pike Place Fish Market to fling fish through the air; it's quite another for wads of paper to sail above cubicles. What may be energizing for one is a distraction to another. The workplace is beset by forces of counterproductivity and ill will; adding to them from outside sources merely increases the problem.

The ease with which entertainment slipped in the back door and now permeates business communication and influence is itself a subject of concern. Like so many things, it was unnoticed and creeping before it became inundating. That it was able to create a swamp of misinformation and problems illuminates the fact that the ground had already been prepared. At least it was the bottom with nowhere to go but up. But that same ease with which the nefarious was communicated is also capable of communicating positive change.

Some business communicators had been positive and helpful all along; modern communication methods simply provided them the tools to be more effective. That the sincere among them tended to be misguided in some critical respects should not detract from a sense of their intended worth or, in many cases, partial value. Particularly in the early stages of business and leadership critiques in the second half of the twentieth century, these positive influencers simply did not have the advantage of sufficient hindsight to understand the magnitude or even the nature of the problem. That they were, in many cases, able to contribute positive insight to the temporary repair of specific problems is creditable in itself, and they should not be faulted for failure to get the big picture before the horizon was in sight.

A service that these positive influencers rendered that is often overlooked is that they acted as a transition, lifting the gaze of concerned leaders and businesspeople from mimicry and petty entertainment to the possibility of beneficial change based upon a critical analysis of changed conditions. Admission that things had gone terribly wrong somewhere in the past few decades was the necessary precursor to reform, the basis of being able to make adjustments for the future. Much of management, however, persists in its blind adherence to the ways of leadership and management in the past. How, exactly, acknowledgment of failure came about is the story of management in the twentieth century in relation to leadership, a tale we will come to very shortly. But for the moment, we need merely to recognize that, arising from the muck of elitism and swamp of error associated with management, a new perspective is taking root. At the beginning of the twenty-first century, we can begin to shift from the old concept of elites in the workplace to "thought leaders," an updated version of elitism with which we are saddled for the time being. Hopefully, these very thought leaders will help us transition further and very quickly to a renewal of true leadership at the grassroots of all organizations.

The prospect of immediate change coming through thought leaders is negligible. Short-term pessimism is warranted because elitism is entrenched in class-riven workplaces and because the current crop of thought leaders has risen from the very elitism that created the workplace mess, a condition that will persist for a time even though emerging thought leaders eschew elitism in their quest for a better future. We see, too, that the overwhelming expectation is that thought leaders will be found elsewhere instead of everywhere. We are so accustomed to looking to national trends and famous people for guidance that the ability of thought leaders to emerge from the surrounding environment is limited for the time being.

The media pushes the expectation of finding thought leaders in the same distant places that earlier leader models were found. That happens to validate the role of the media in everyone's eyes while making life easier for journalists, who are too few in number and, like workers everywhere, are pushed too hard to produce material too rapidly to do anything but take shortcuts. For thought leaders who are content to follow this path, the immediate payoff is enhanced status and more opportunities to make more money and extend more influence, a sort of incestuous relationship that produces income at every turn, just as with the old-style leader models. The interdependency of leader, follower and enabler badly needs to be disrupted, to borrow a term from contemporary start-ups. And that is exactly how to tell the difference between legitimate, newly emerging thought leaders and old-style hacks. The good ones are too busy doing real work to climb aboard the gravy train of the mashed potato circuit.

More importantly, as time progresses, thought leaders will be shown to be indigenous to virtually every business. Attempts to evaluate the thought leaders of the near future by artificial means such as education, publication credits and speaking engagements will prove meaningless, and such measures will evaporate on their own. During the process of reestablishing a connection to true thought leadership, elitism in the workplace will crumble and individual workers in any capacity will be identified for the value they bring to their organization and to others in their workplace. The reason this will happen is that we have bottomed out; things can't get much worse in the workplace of the early twenty-first century. Already, businesses are beginning to open themselves to new thinking, and those who respond by enriching their colleagues, as well as the organization, will be identified and supported in their workplace roles. The workplace is evolving, becoming more open to thought leader influence than ever before, and many of those leaders will be workers immediately at hand who have all the prerequisites of authentic leadership.

Workers and workplaces are hungry for new guidance and they are beginning to find it where they stand. The hallmarks of new leadership are sharing, cooperation and communication along with a commitment to value. We have seen how contemporary and developing technology and other enabling factors can ease and extend the influence of leaders. The old mindset was to find leaders outside the organization and grow leaders within the organization based on outside models or top-down prescriptions. That antiquated thinking is being washed away by the cleansing power of truly grassroots leaders.

New thought leaders will have much more influence than will be directly attributable to them, but that influence will nonetheless exhibit solid leadership. Perhaps the best news of all, anywhere in this book, is that new thought leaders will influence quietly from the bottom up through example, mentorship, coaching, cooperation and sharing instead of the top-down approach favored by the celebrity chasers and control manipulators typical of management. Much of the influence of new thought leaders circles back to the workplace from a perspective that is frightening to the adherents of twentieth-century management style and control. The reason is simply sincerity, and the willingness of new leaders to engage instead of dictate. Workers will rub shoulders with thought leaders on a daily basis instead of necessarily looking them up online or reading their books.

The truth is that education has never been of directional benefit to leaders, people who would emerge regardless of their formal training. This is curious, given the set-in-stone belief typically shared by managers and educators that a college education is a minimal requirement, with MBAs now seen as a near prerequisite for success in business. I agree that extended education is important and ever more so in the future, but the reasons have more to do with information, practice and communication than anything else. Leadership actually has little to do with education, although a polished communications performance is helpful to advance the ideas being disseminated by thought leaders. Stumbling through a presentation or exhibiting frightful grammar in ubiquitous written communications is more off-putting than the finest thinking can overcome.

Social media has a role with thought leaders, also, but the direction may make a surprising turn. Today, we tend to think of social media examples when the subject is mentioned. Who, now, upon hearing "social media" would do otherwise than think of Facebook, Twitter, LinkedIn, Snapchat or maybe Instagram with an undifferentiated understanding of blogs upon blogs plus email lurking in the background? The reason is that our association with thought leaders and social media is something we receive from untouchable celebrities on the virtual plane. If we happen to have any interaction with these people, it is from the distance of digits.

Think, instead, about texting. Teens were originally ridiculed by older generations for an endless stream of communication that fundamentally could not be comprehended. Why would anybody want to do that, older people wondered, before they, too, began texting almost as incessantly. Instant messaging in the workplace began to undermine the sensation of separation associated with social media. Think of it this way: all those instant messages flying back and forth are a form of social media even though they presumably pertain strictly to work. The fact that the technology crept up on us says many things, but in this instance, it's enough to identify a means of undermining old doctrine. Social media, including the new computer applications with many sharing and archiving options that are increasingly embedded in the workplace toolkit, offer thought leaders an opportunity to extend their influence quietly and unobtrusively in every workplace.

They have an opportunity to elevate social media beyond its original best intention, far above the swamp of trolls and even mediocre marketing ploys and throwaway opinion. LinkedIn made a sincere attempt in this direction. As a somewhat closed entity, LinkedIn played with its unique positioning, supportive of professional restraint, but ultimately many of its users succumbed to the obvious temptation of self-aggrandizement. All of that is understandable and does not preclude a certain amount of value, but it fails to reach the heights of potential for communication by people uninterested in self-promotion who are willing to offer links to valuable data, helpful resources, intriguing material and, most of all, the accessibility of their own insight.

These new thought leaders are in the next cubicle or down the aisle, or they are at least somewhere in a building we visit now and then. We see them often, and even more often we interact with them, if fleetingly, accepting as well as making suggestions, providing as well as absorbing information. This adds an entirely new dimension to leadership, one that is independent of interference and control by bosses and one that nourishes and sustains from within rather than reacting to external stimulus. The gradual development of gold-standard leadership will be a mutually satisfying and enriching relationship based on sharing instead of seeking and bestowing. The Internet is now full of people writing for what they hope will be remuneration of some sort instead of writing for the value that enhances their own lives, their co-workers and all their endeavors. The influence of these new thought leaders will then be deeper than ever, if less flashy.

Organizations that have begun revamping their internal interactive structures already benefit from the emergence of new thought leaders. Other organizations will soon follow because, in the opinion of some of the best thinkers such as Umair Haque, old style organizations of the Industrial Age management mindset type will not survive long into the twenty-first century. The reasons for this are connected directly to what happened in the latter half of the twentieth century and to new perspectives on the horizon. Before we look closer at what's ahead, we need to understand the quagmire that resulted in the twentieth century from the Industrial Age management mindset. Unlike my earlier book, Own Your Employment: The Challenge for Twenty-First Century Workers, we will examine the subject from the perspective of leadership because that is the route that will free everyone in the years ahead, even during the time of unparalleled technological advances that threaten jobs and question the very meaning of work in the twenty-first century.

# Management

Glancing backward very briefly, we see management in a quagmire of its own making in the period leading up to the twentieth century. Even as the dirt walls of the hole it dug caved in upon them, those responsible for the debacle dug deeper as time progressed. Although we can now regard this dedication to tunnel vision completely without sympathy, we must admit that its persistence was understandable. Greed, after all, is one of the most compelling influences, and while it provides no legitimate excuse, avarice explains a great deal. It is as if an evil spell captivated capitalists and led them around by the nose seeking more money. No senses, and certainly no sense, mattered. What they created to assist their mindless pursuit of riches was the idea of professional management. Historical neophytes, they adhered blindly to a concept that, at first blush, seemed productive, certainly of money; their intentions might be overlooked were it not for the grotesque suffering caused by their single-minded immorality. Besides profit for a few, management, as conceived and executed during the twentieth century, produced primarily immense misery and loss. Committed to their course, capitalists did not relent even under circumstances that, by the 1930s and the Franklin Roosevelt Administration, were extreme. Most businessmen (in those days, almost all businesspeople were men) fought Roosevelt's New Deal without ever backing down from their intransigence. Thus committed, capitalists stayed their course even when contradicted by empirical evidence. In fact, in a twist of fate worthy of grand opera, capitalists turned what should have been liberating modernity into antique repression during the second half of the twentieth century.

After World War Two, when economic activity blossomed across the globe, providing employment for most Americans, the titans of industry maintained nineteenth-century attitudes and the petty bourgeoisie happily followed suit because they, also, were becoming rich in comparison to the workers. It was in this sanguine cauldron of misbegotten fantasy that the bloody stew of twentieth-century capitalism simmered. The question we will address here is not how the suffering of workers or the malfeasance of management should have been corrected in a physical sense, but how all of it plays out in the metaphysical realm of leadership, the seat of perspective and guidance that failed so many for so long, including businesses themselves and that class of worker known as managers.

In the beginning, everyone was a worker and everyone worked for themselves, a fact later twisted by history into the belief that everyone was entirely self-made, forgetting the interdependence that arose naturally in the course of livelihood and society. But initially, everyone worked. It was only after excessive accumulation by a few that anyone had time to consider what they would do if they did not work. This fortunate minority concluded that they could be more productive—richer—if they left the day-to-day activities of their business enterprises to someone else, retaining oversight while pursuing new opportunities. Thus, management was conceived and a new era of capitalism was initiated. Having discovered something that worked, these new capitalists returned again and again to the well for more until, eventually, large businesses were built upon massive capital aggregations with managers becoming increasingly distant from the manipulators of the wealth itself. There is no news in any of this. Everyone is aware of it even if they never think about it directly. What is overlooked is the control embedded in the process of constructing management and the exercise of its duties.

The control aspect rightly receives a great deal of attention related to the execution of management functions but it is helpful to identify control at its point of origin and its importance in the founding process because control is the most vital element of modern management and capitalism, the single point on which early success was based and the reason for its failure in the future. Early businesspeople, tradesmen, manufacturers and service providers realized that control was the most essential ingredient that could assure their success. The fact that haphazard conduct of affairs led to ruin was easily observable and the solution, control, was simple in the most basic stages of business. If a tanner, for example, participated in every step of processing hides, he assured himself of a quality product that would be easy to sell. To the extent that he permitted inexperienced assistance or unsupervised conduct of neophytes, he opened himself to loss. The successful tanner, finding an excess amount of capital at his disposal after years of concentrated effort, might then relinquish some direct involvement with his tasks if he had the foresight to train competent assistants who could be trusted to operate with his own level of interest in successful outcomes. Thus, time was freed to pursue other lucrative endeavors. The point of control at this stage is vital, not merely because it occurred but because it was at this juncture that control became integral to the conduct of the businessman, not simply as a person performing tasks, but as an entrepreneur with ambition for extended success. It was at this point that control was baked into every consideration undertaken by the businessman. It was at this point that control became a conscious feature governing every decision. 
It was at this point, as new opportunities began to present themselves, that businessmen gauged the possibility of participation primarily on the basis of whether or not they could maintain control, or, lacking complete control, determining the extent of risk they were willing to accept. Control, then, became part of the DNA of modern capitalism and provided the motive underpinning everything that lay ahead.

The motive would seem to be entirely clear as the means to increased wealth, dragging with it the creation of an enormous differential of income and wealth. In other words, the purely capitalist aspect of the motive behind control is self-evident and monumentally important. It would be folly to disregard cultivation of any means of extending the magnitude of wealth. But the motive underlying control is much more complicated than the tally of dollars and cents.

The motive of control is not an end within the bounds of its initial conception. There is a second basis for the motive behind control that quickly became, not a junior partner with its progenitor in the creation of wealth, but its full equal, responsible for maintaining wealth and extending it still further by means of specific purview that did not exist prior to management.

This second branch of the motive behind control is the control of management itself. And it metastasizes thereafter throughout the entire organization. It was in the DNA. No one purposefully thought out the ramifications; control simply manifested throughout the structure as a matter of obvious course, springing from "like father, like son" because DNA, even of organizations, is begotten, not created independently. Control cannot be said to have occurred throughout management on the basis of followers tracing the route of leaders because leaders, as we will see a little later, do not inculcate control. Instead, the control that spread quickly throughout early and subsequent organizations happened deliberately as one layer upon another required it to occur. While later generations of managers might seem to have embraced the copycat method of emulating their superiors, brown-nosing and all, what they were really doing was adhering to the requirements of control imposed from above. Those who deviate from these requirements are inevitably disposed of like the trash they are deemed to be.

The importance of this virtually coequal branch of control can hardly be overstated. It is the control of management more than the control through management that is most familiar to workers and outside observers. It is also the aspect of control that is directly responsible for undermining, damaging and in many cases destroying leadership. Less directly, of course, the initial reason for control remains and both must be rooted out for leadership to flourish. Our attention here is directed primarily to that characteristic of control that is devoted to management because, except in a few unusually enlightening circumstances, it is management and its control that will change first. We need a quick overview of what is entailed in management control before moving to the numerous facets of control and their manifestations.

The temptation to view management as "neither fish nor fowl" is not wholly incorrect, although distinctions have blurred over the years as business ownership has increasingly dispersed, with many senior managers being rewarded with blocks of stock. In fact, the practice of awarding stock grants to those placed among the least of management ranks has also become common. Typically, these lowly positioned managers, and a great many senior managers, do not hold their stock long enough or ever have enough of it to be properly regarded as owners in any meaningful sense. The few managers who own significant holdings are a minuscule minority, and even these have little, if any, decision-making authority apart from their management positions. All managers can be realistically regarded as hired hands, the same as an ordinary wage earner, except that they exercise varying degrees of control.

They also exercise control of other managers depending on their rank in the management lineup. It is at this point that we consider a chimera, a divergence in management that is no departure from itself at all. The fact that managers control other managers, with the lowest among them directly supervising entry-level workers, makes it appear that their mission or activities are divided in either of two ways. First, it appears that controlling other managers is different in some significant way from controlling wage workers. Second, it might appear that the fact that managers control people on one hand and processes on the other creates division or difference. In fact, all management control is really one seamless activity. Procedures may differ, but control is the one undifferentiated existence that characterizes management.

On some level of haunting familiarity, it is widely believed that there is a brotherhood among managers regardless of rank. This is ridiculous on its face given the tremendous gulf that exists between the highest and lowest ranks of management workers, not to mention differences in income and wealth. There are numerous tactics that support this system that strings the lowest managers along. Much of it depends on the Horatio Alger myth that continues to plague American workers at all levels. All of the lowest managers see themselves as rising to the top, and all of the top managers believe they worked their way up from the bottom. None of this is true, and the fact that it is being statistically revealed in multiple mobility studies of income, wealth and education, as well as social position, makes the untenable ultimately impossible to maintain, an example of how knowledge can undermine the sorry story we are telling here.

The fact that managers control both processes and workers is ultimately also shown to be part of a single mechanism. Anyone who is familiar with businesses of any size knows that processes have been designed to incorporate workers rather than treating them as separate from process and procedure. It might be idealistic to view workers as something greater than their work, as sentient beings above their activities, but it is not true that they are thus regarded by management. A line supervisor might prefer to think they are working with people but, if they would step back a moment and look at the situation objectively, they would see that their superiors in management have designed a process that incorporates both workers and procedures into a single, unified flow. Larger businesses and higher management levels make no pretense otherwise except when it's time for a photo op to demonstrate their supposed humanity and the high regard the company has for its workers.

How management operates to control itself—all managers—as well as processes and workers is best exposed through the consideration of some of the many elements and examples of management control in action. This examination will not be in vain because, just as all bricks that contribute to the building of a house are important for its structure regardless of size or placement, so would the removal of any one of these bricks contribute to its demise, no matter how seemingly trivial a specific brick might appear. Make no mistake: dismantling management control brick by brick must occur before leadership can be fully restored. Apart from stemming global warming, it is the single most important task ahead for all of us in the twenty-first century, and it is necessary to have an understanding of the elements of control in order to be successful in its demolition.

Imagine that you were suddenly placed in charge of a large, multi-faceted organization. Do you know what you would do? Answering this question with a resounding, "Hell, yes, I know what I would do!" is the conceit of managers who believe—not necessarily incorrectly—that if a person can manage one organization, he or she can manage any organization. The question is how and the answer is control.

Exactly how control would be asserted could begin in any number of places. Keeping in mind that management is hierarchical, it makes sense to ensure that the plan points upward in the chain of command for guidance. To diffuse instruction is to lose the thing you are seeking because control rests on the premise that those being controlled must look somewhere for instructions; to have guidance come from a lateral direction means that upper stations of the system do not have oversight, and therefore control, of what is happening beneath them. If you look at a complex organizational chart, you will see many seemingly spread-out offices. Look closer and you will see that this is a reflection of size and that reporting inevitably is consolidated in an upward direction regardless of how large the organization is seen to be. The flow of authority is upward while contributions to power are diffuse.

Because this large organization is diffuse and aggregates tremendous amounts of information to accomplish a wide variety of tasks, you would want to streamline reporting. That means that you would want to keep things as simple, direct and understandable as possible. For that, you impose as much uniformity as possible throughout the organization. Not only does that make information and communication comprehensible for levels of management without abstruse knowledge of minutiae, it sends a message to everyone down the chain of command that you are in charge here. It also happens to make reverse communication easier, which comes in handy when you want to impose difficult or unwanted change. This particular aspect is extremely relevant to pulling the financial strings of an organization, especially when neither you nor your financial advisors have technical expertise, a growing (and reversible) trend in businesses run by outsiders, especially lawyers. As the complexity of financial arrangements underpinning businesses grows, financial advisors often recommend measures antithetical to the interests of numerous segments or interests within the organization. Layoffs, for example, may make no sense to workers down the line, and content adulteration prescribed as a cost-saving measure may rankle purists in the production arm. The belief is solid in top management that, to be successful, a business must be run primarily with an eye toward increasing profit, and financial managers are quick to demand satisfaction. They are most easily and quickly mollified with rapid receipt of useful information and the means of issuing directives that are certain to be implemented without delay.

Never forget that control stems from the willingness to assert power. Timidity, fearfulness and hesitation weaken not merely the one initially afflicted but all others in the management chain, even those above. Clearly, any manager below the weak link would be left in a quandary, experiencing fresh reluctance and passing it downward to a confused lower management. Those above are also impacted by irresolution on the part of a subordinate because the governing intent of instruction has been weakened, whether or not consciously, rendering suspect the motive and resolve of management as a whole and making future initiatives dubious in the eyes of workers at all levels. It is for this reason that top management brooks no dissent, tolerates no performance failure or independent initiative. Management is nothing without control. Control effectively, and you might be able to run General Motors, maybe even Apple and certainly Microsoft.

Glancing back at the incipient elements of control mentioned above, imposition of uniformity seems fundamental. From there the explanation easily expands to include many related empirical characteristics. Uniformity, for example, necessarily suggests standardization and the demand to meet the newly established requirements. Incorporated in these elements is coercion that extends well beyond mere restriction. Embedded in these are yet more instructions, rules and regulations. In establishing itself, for example, management imposes its will and inserts various requirements, forcing their implementation against any hesitation on the part of subordinate managers or workers. This level of power is second-tier to that inflicted by commonly acknowledged rights of ownership. But it doesn't stop there because, having gained overall control, management creates third-tier coercion in the form of requirements for uniform reporting, standardized procedures and so forth. Some employees may be familiar with standard operating procedures (SOP), typically volumes of organized details about how things are to be done. These written requirements are mere guidelines until force is added to them through the agency of audits.

It is through audits that management puts teeth into its third tier of force. Audits, however, are selective, with so much wiggle room at every step that they ultimately constitute the capricious will of management to any degree it chooses. All the way through an investigation, at each step, in fact, management has an opportunity to guide what is examined and even what is discovered, sifting, emphasizing, avoiding, eliminating and even locating what it desires to find and illuminate. And this is the case for every audit, including those that may be instituted without malice or preconceived outcome. Imagine how perverted the results might be from those audits that are governed by intentional malevolence. Auditors are left to interpret the preferences of management, accepting hints that will make their own lives easier and producing reports that justify whatever management wants to find. Ultimate use of the audit, of course, rests with management, which may decide to ignore the investigation altogether, select parts only or pursue other, highly selective findings. It follows naturally enough that results are subject entirely to the decision of management, which may elect any course of action it chooses or no action at all. Thus, any thought of justice within the realm of management is inconceivable, there always being factors of management control that determine outcomes.

Operating through manipulation and imposition, uniformity, standardization and conformity are only a step away from form and process, the means whereby management leaches all life from its minions. Conformity includes much more than adhering to physical specifications for production; it also involves obedience, a much more subjective expectation that permeates the mindset of management to such a degree that it is virtually indistinguishable from management itself, as if the concept of control were itself controlling. With this understanding, it may be possible to view form and process from a more fundamental level than merely one of paperwork and regulations, even more basic to and essential for management than appearance and machination. Form and process are the air and water of management, life itself without which there is only death. No tinkering is allowed. It is at this precise point that leadership must assert itself, thereby replacing management for the benefit of generations to come.

It helps now to have a look at specifics, all of which are familiar to everyone who has ever worked in almost any organization, especially for-profit businesses, but certainly not excluding nonprofit organizations, where governance is largely derived from the Industrial Age management mindset revered by most businesses. The examples will also be well known to those in government, which, again, is primarily influenced by afflictions in the private sector.

Always expect to see an abundance of rules, regulations and minute instructions, many of which are contradictory and inapplicable but forced upon subordinates anyway. By the time you add inconsistent implementation and capricious dispensation of guidance, itself often flawed and irrelevant, you have a very confused and resentful workforce. To make all this even worse, much of the communication is thoroughly obnoxious if not completely perverse. One of the all-time best standbys for uncertain managers is to appear gruff and petulant, even saturated in anger. It tends to scare the hell out of subordinates, who want nothing more than to leave their boss alone. It can also engender a feeling of uselessness among workers, leaving them with the impression that they are inadequate, a feeling that is generated within the worker and cannot be directly attributed to the boss, thus reinforcing a sense of shame and inferiority that many bosses cultivate to cow their employees. The worker ultimately blames himself for failure, and the boss gets a free pass, as it were.

In the interest of even-handedness, it should be pointed out that this may or may not be the fault of a particular boss, directly anyway. It is proper to keep in mind that each manager reacts to similar ploys exercised upon them by their own bosses. As far as I am aware, there is no school for any of this, but it is certainly learned behavior for most managers, who observe their superiors and mimic the behavior toward their own subordinates. I once knew a manager who toured a facility with an underling while frequently pointing to a wide assortment of things and repeatedly uttering a single acerbic word that made no sense in context but was sufficiently distressing that the subordinate was left completely dazed, having been made the object of a management show of force, a kind of shock and awe for poorly paid, unsuspecting and fearful managers. The superior manager in that situation doubtless thought it had been a good show, delivered with perfectly stern visage and an unstated but always present threat of consequences. But that manager was probably unaware of having fallen into the perfectly executed trap laid by an even higher manager who knew that emulation would follow, offering a technique that was sure to be replicated on even less prepared, less aware, more distantly subordinate managers. It was a good, if predictable, show based on the monkey-see, monkey-do school of management training.

But some managers are natural-born asses. Of these, a few are irremediably evil and far too many are instinctively predatory. Having a surfeit of obnoxiousness, however, is no barrier to management success or even business stardom. One of the acknowledged worst bosses in the world, a CEO, was proud of his acid temperament to the point of boasting about how intentionally mean he was with the verbal wrath to prove it, not to mention memos memorializing his historic mistreatment of workers in a sort of archive of management infamy.

Questions arise about apparently natural-born abusive managers. When, for example, do they discover that they are abusive? The range of plausible answers is staggering. Some never realize that they are reprobates, a fact that is shocking because it indicates an almost complete lack of connection to humanity. It also means that no one, no senior manager, no subordinate, ever disclosed this to them. Friends and social acquaintances who might observe untoward behavior are apt to dismiss their observations as petty, inconsequential, irrelevant, or simply none of their business. But the fact that organizational associates at some level never disclose the problem reveals a depth of indifference that undermines the myths that organizations like to maintain about themselves, whether of paternalism or modern "enlightened self-interest." Have these managers never noticed the agony, the disruption, the failure of productivity they cause? This is one of those damned-if-they-do, damned-if-they-don't questions. If they noticed the adverse reactions they cause but failed to address the problem, they're guilty of a cascade of wrongs that will follow them until they are stopped. If they never paid attention to the problems they spread around the organization, how can they be said to be good managers? That particular question, the answer to which would seem obvious, is yet more devastating because, in most instances, these people are highly regarded by other managers, a situation that will be discussed shortly.

If these negative managers are aware of their abusive proclivities, do they try to curb them? Clearly, some seem to, but if you look closely, you will see that what they are really doing is putting a cap on their extreme behavior instead of attempting to reduce it, let alone eliminate it. To the extent that half-measures constitute curbing, that may be commendable, but the greater issue revolves around the fact that they are simply marking the boundaries of extremes that are unacceptable, then regularly running up to the edge of them. Physical assault on employees, for example, is forbidden both in law and common sense. The manager knows that they cannot bash a worker with a fist or a coffee pot or whatever happens to be handy, but they can loudly berate the intelligence of their subordinates or threaten all manner of punishment and follow through on the threat. It is a certainty that in the run-up there was an endless stream of invective that will resume after the larger consequences are administered.

What happened here is that the manager, by locating the outermost point of tolerance of bad behavior, intentionally fills the space of possibilities with maliciousness. Instead of constituting real restraint, they have effectively enlarged the sphere of operation for cruelty, viciousness and malevolence of all descriptions. They have honed and optimized their evil, almost criminal behavior, both taking advantage of subordinates and teaching them the finer points of malevolence along with a practical guide to implementation. They have succeeded in codifying an acceptable measure of monstrous conduct in the workplace that more easily becomes the norm, not only for them, but for others who mimic their iniquity.

Their reprehensible behavior is rooted in authoritarianism. More often identified with fascist politics than organizational management, authoritarianism is the special mindset of nearly a quarter of the American population, an astonishing number of people, far too many to run government and more than adequate to populate the ranks of management. Many are leftovers spilling into follower slots, dogmatic, unthinking, noncreative, intransigent, blinkered souls who will do everything they are told, nothing more and nothing different. This cadre of reliable lackeys serves their bosses well and helps glue together reluctant workers who sometimes hesitate on the brink of noncompliance.

Frequently narcissistic, authoritarians are duplicitous and amoral, without discernible conscience. They accord themselves the highest regard and hold non-authoritarian workers in contempt even if they are peers of those workers. They utterly lack respect for others. They say absolutely anything one moment, believing it for an instant, before saying something entirely different a minute or less later, believing that, also. Their contradictions are legion; they spread confusion everywhere, compounding fear with uncertainty. In my hearing, for example, an authoritarian started a sentence asserting one thing and concluded it with opposite instructions. Reason is out the window with these people. Their minds gravitate to control; given a chance, they exercise it. Denied the opportunity of directing others, they fall staunchly into line when confronted with senior authority. Occupying subordinate positions, authoritarian followers adapt themselves with unquestioning fealty. While their sometimes unpredictable, erratic and often disconcerting behavior might seem to be a recipe for chaos, authoritarians are masters of the management universe for reasons of control and hierarchy. In superior positions, they control; in subordinate positions, they are controlled, with unswerving loyalty and equally fierce determination.

The reason for the stolidness of authoritarians is no secret among the rest of us but completely eludes their comprehension. Authoritarians are not the least bit introspective. Unable to understand themselves, they cannot fathom the qualities of others and fail to grasp the role of nuance and respect. The density of their self-centeredness precludes appreciation for diversity and diminishes their capacity for collaboration. Authoritarians want to be liked, to receive accolades and to observe the appreciation expressed for them by others, but they do not reciprocate except to the authoritarian bosses with whom they identify. For authoritarians, work is about compliance, either issuing directives and expecting them to be followed, or fulfilling the expectations of their superiors. Simply performing a job becomes a matter of obsessive adherence to authority. Amorality facilitates following dubious, unethical instructions, and the absence of conscience makes them strangers to equality, leaving them to regard peers and subordinates with disdain. Authoritarians are not creative people; they react rather than generate; they connive and maneuver with Machiavellian expertise, but they implement schemes rather than originate plans. Because they do not think, even about themselves, and cannot relate to others, they are cut off from quality and separated from reality. While not all managers are authoritarian, many are, and their association with similar people reinforces their deplorable tendencies. Management is filled with the worst of this lot, and it is the fervent mission of top management to encourage its spread and deepen its hold on workers.

If management is so awful and there are so many bad managers, many with perverse intent, questions must be answered. "Why?" is a good place to start. And the answer circles back to control. Since control is the key, it matters little to top management how control is attained. If it happens that perversity is an effective means, that's fine with the big bosses, many of whom practice it as well. It's wise to remember the monkey-see, monkey-do school of management learning. Like some university professors who delight in seeing their students follow in their footsteps, bad managers enjoy watching their underlings use the methods they have learned from them. After all, and perhaps more to the point of fake leadership, these managers practice follow-the-leader no matter how little sense it makes. It is this follow-the-leader, control-driven, one-minded mindlessness that makes a mockery of what comes next.

This being management and all, what is supposed to come next is effectiveness because, presumably, the whole point of management is to produce profitable results, outcomes they define, conveniently, as being composed of whatever management claims to need to be effective and produce desired results. It's as if they confuse goals with methods, topping the confusion with the implication of permission granted by default of position. This is why active boards of directors are sorely needed and also why these same boards seem to be utterly blind, being composed of members who behave in the same way in their own organizations.

Digging a little deeper, we see the appearance of effectiveness, of successful management, achieved through perverse methods and authoritarianism. Abracadabra. It's easy to manufacture your made-up wonderful result when you control all the appearances that compose it and all the means of combining those appearances. Management marshals all the tools at its disposal in a fictitious display of power that overwhelms all observers, subordinates, shareholders and the public. Controlling the tools, it names the goals, commands the methods and defines the results. By restricting everything, management coerces everything. Insularity means ease of control, and management uses specialization and compartmentalization to name all the elements and declare outcomes consistent with its intent. It commands specified production and, through rules, regulations and instructions, achieves predetermined results, arrived at not through creativity but through all the force necessary for implementation. It imposes effectiveness through the creation of false expectations. Under these circumstances, how can management not be successful? Management declares itself effective and efficient using manipulation and imposition when, actually, it has done nothing but invent the criteria by which it judges itself. It's something like a war unilaterally declared won regardless of the outcomes of battles. Often, management is not effective or efficient at all. Many of the components of its methods reduce effectiveness while elevating the appearance of efficiency. Take meetings, for example. They are designed for predetermined purposes and to make everyone view those who engineer them as active and clever. Management is even able to use inertia to its advantage, slowing progress for a variety of reasons, including the possibility that some cunning manager can swoop in and save the day with a flurry of instructions. Mostly, management processes are wasteful. Inevitably, they block results against a tide of untapped potential. They cause dependence instead of independence and interdependence. They can produce tangible products as specified, but those products lack value.

Everything that management does is with a "sense of urgency," a beloved phrase that imbues its activities with elevated purpose and focus. It implies that what is directed is done with such dramatic necessity that all speed is required. Don't think about it, don't delay with reservations, don't give it any consideration whatsoever, just do it and do it quickly. Partly, the concentrated determination and need for immediate action stem from a belief, dearly held by management, that everything is better accomplished with a sense of urgency; in other words, they have come to believe their own lie. Partly, the emphasis on speed is merely to cover tracks and obscure the meaninglessness of what they are doing while keeping everyone very, very busy with an eye on an object instead of the route to the object, or even why they are going there in the first place. Primarily, management knows that a sense of urgency breeds more complete control, its overall most important mission. Management has been so successful in this effort that it has inculcated a fear of anything opposite of control, or anything that would even question control itself. The reason management wants to foster fear is that it is fearful of an atmosphere that is not charged with fear.

Leadership Hijacked by Management

Along the nineteenth- and early twentieth-century route of using brute force to produce fear, management realized that a better, presumably more humane means—certainly less bloody—would be to hijack leadership. Controlling leadership, management subsequently discovered, is as easy as channeling it. That is why we have all the elements of control, the selling of management as the epitome of efficiency and effectiveness. That is why we have all the motivational speakers, the pop psychology, the conferences, the meetings, the endless memos, the bluster, the invented criteria, the inertia.

Management engineered an elaborate ruse designed to make everyone think that they, management, are synonymous with leadership when, in fact, they merely hijacked it from its rightful owners, workers, and claimed it for themselves. Because management controls employment, they can perpetuate their claim of ownership with little objection to the contrary. Until lately. But they have done a remarkably good job of making everyone think that management means leadership, such a good job that most employees, even managers themselves, believe the lie.

Proof of mind control extended over generations, overwhelming the lives of workers who are kept endlessly busy, comes in the step just beyond all the conferences, speakers and meetings. Take a look at the literature that pours continuously from print and digital spouts. Management flacks constantly want us to do things that control others, explaining how we can be better managers (of more efficient use to our overlords) if we learn to gain specific results by coercing them from our underlings. There! Now, that's leadership, they want us to believe, when we boost productivity through the wretched conduct called management.

Maintaining the appearance of leadership and of wholesomeness required management to perform a few real functions thought to be generally beneficial. The bold stroke of pure genius that accomplished this feat was to accept all responsibility for everything within its purview. More accurately, it was assuming responsibility for everything. It was muscling in, shouldering itself forward and taking over. The effort carried off both an expansion of control and the prize of seeming to have done it altruistically. It was a PR bonanza that simultaneously handed them the bank.

Among the most notable of management endeavors was to declare itself in charge of establishing remuneration for employees. This was entirely logical, management having already established its control over employment, but its reach extended well beyond initial perception. It spawned two distinct and critically important strategic advantages for management.

First, control of employment and remuneration meant that a new level of bureaucracy was required and we all know that there is nothing management loves more than additional bureaucracy. Thus was formed what was previously known as the "personnel department," later dubbed "human resources" and more recently misidentified with numerous terms intended to obscure its real mission.

There is no doubt that most human resources workers believe they are doing good things, both for other employees and for the businesses that employ them. But they should re-examine what they are doing and the basis for their activities because they enable management mischief as an official organization devoted to enforcing and expanding control. Human resources workers use the happy face of presumed employee benefit to undermine the independence of workers and to subject them to the calibrated pressure of conformity of thought as well as action. Under the guise of process and with assurances of affection for their welfare, human resources offices carry out the will of management against the best interests of workers. Often, the subterfuge works without a hitch, but frequently employees suspect they are being had while acquiescing without objection in the hope that they will be left alone with no greater disturbance. Little by little, human resources has tightened the grip of management.

Form and process are principal tools of the human resources trade. Give employees more regulations, require more responses, exert greater presence, engender increased dependence and, they conclude with ample justification, workers are rendered utterly compliant. If these ruses fail, there is always a bagful of coercion that can be deployed, all of it nicely formatted and rationalized by premises kept handy for just such occasions. Rules and routes, requirements and necessities. No deviation allowed. A circuitous way of saying and enforcing the old dictum, "my way or the highway."

It always has to be management's way. That's why human resources offices are there in the first place. And having discovered its usefulness, management heaps on more and more responsibilities and expectations for an ever-widening scope of duties. This is far from accidental. Human resources workers long ago discovered the symbiotic nature of their relationship with management and take full advantage. Human resources executives feed top management with endless means of increasing control that we will explore shortly, but one of the obvious results is to increase the clout of HR officers and employment for a swollen sub-bureaucracy.

The fact that unions see through the whole human resources chicanery and pretense is of little consequence. Partly due to the cunning and steadfastness of HR workers, unions represent far fewer employees than at the height of postwar prosperity, meaning that the vast majority of workers have no organization or guidance to help them navigate the obstacles erected by the HR machine. The deeper that we go into the twenty-first century, the fewer workers there are in a unionized workforce. The alternative for them is not merely lack of organization, but a cultural emphasis on presumed independence that plays directly into the hands of management. As it happens, this insubstantial atmosphere—one might call it gaseous—significantly fuels the second strategic advantage for management.

With control of employment and with an enforcement mechanism (HR) in place, management found itself free to pursue expansive activities that not only impact employees but the entire society. The key, control of remuneration, is deceptively simple but ripples through virtually every aspect of the whole culture, sustaining and expanding management control in its wake.

As businesses grew and acquired human resources offices of whatever size, the growth indicated market positioning, if not outright dominance, sufficient to influence the course of smaller organizations. This was true not only of monopolies the dimension of Standard Oil, but also of smaller businesses operating exclusively in small towns and sparsely populated rural areas across the country. And as these businesses began to hire employees to perform an increasing number of their functions without reliance on sub-contractors, more workers fell within the confines of their direct control. Similar businesses in the vicinity naturally looked to larger ones for direction. Freelancers who performed periodic tasks for these larger businesses found the terms of their work arrangements increasingly dictated to them.

Compensation was a central issue being decided by businesses with progressively less input from workers as larger businesses used their dominance to set terms and conditions. What larger businesses were willing to pay became the standard practice on a regional basis. This was a responsibility that businesses gladly shouldered because it gave them distinct advantages in both the marketplace of products and employment. Larger businesses were able to manipulate wages to suit their needs. They were able to pay more for specialized skills, thus attracting superior workers. They could also suppress wages for common skills, at times luring workers with other inducements unavailable to smaller employers.

Despite the necessity of navigating currents generated by larger businesses, smaller concerns also found benefits to the changes of evolving compensation terms. Partly, these benefits arose from the near-universal application of human resources as a management tool. Partly, they emerged as community and industrial standards were established and partly benefits accrued to businesses through cultural changes that they readily guided for their further advantage. If wages could be established instead of negotiated, workers could be placed in a position of accepting them or moving on; and when uniformity was achieved regionally, the answer was rarely in doubt. It was subsequent issues that grew out of setting compensation that forged extensive new opportunities for management.

While all new avenues of control arising from power over compensation could be lumped under the broad and often noted infatuation of management with social control, it is helpful to realize that many facets grew directly from the issue of remuneration. Since they were dispensing cash to workers on payday, and because of the HR bureaucracy, it made sense for management to evaluate each employee, an increasingly formalized process that gradually facilitated far-ranging control. Let's look at how this played out because its implications, often misinterpreted or overlooked entirely, had a major impact on the twentieth century and our position at the outset of the twenty-first century.

By removing employee evaluations from the casual hands of straw bosses and low-end supervisors, management claimed for itself the right, not merely to reward or withhold favor, but also to guide the career trajectory of every worker. While most, it is true, remained in poorly paid entry-level jobs, some were selected for specialized training or positions of greater authority. We take this for granted, often without considering further ramifications.

While being selected for enhanced position or training meant that some workers were paid more than others, it also meant elevated status at work and in the community. Dosed in small increments over time, these changes may not seem important, but every change in a workplace where most people spend much of their waking lives is important, much as rank is important in military organizations. Interactions with other workers are impacted on a continuous basis and the ever-evolving company culture is required to accommodate a shifting subset of low and middle management workers.

Consider, also, the impact of formalized evaluations and compensation processes in society. Changes were especially felt in small towns where many of America's manufacturing facilities were located. Increased economic status at work led directly to enhanced social status in the community with influence filtering through churches and voluntary organizations and much, much more besides. The same impact existed in larger cities but it was felt and observed somewhat differently. Housing, for example, was frequently scattered from an economic perspective in small towns, with rich families living cheek by jowl with low-income residents, while economic segregation was pronounced in cities.

Early in the period, training initiated by management during its evaluation process did not rise to the level of education. Members of management tended to have real, formal education that increasingly included colleges and universities. But the specific training provided to lower level workers nonetheless had the effect of identifying individual employees for economic and social distinction, albeit at a lower level. When management realized the potential for even greater control through the manipulation of education, it seized it with a vengeance.

Education is one of those factors that are often cited as part of social control but with little awareness of how it operates. As with much that management does, form and process are important, no less so in the realm of education. Being in the catbird seat, management makes the most of its influence over education and the workers who receive it. Education is often associated with income and it is management that directs both aspects along with the economic and social status of workers.

Education is considered part of the private sphere and individual responsibility with the best educations generally going to members of already affluent families, thus perpetuating the inheritance of social status from one generation to the next. In recent years, progressives have frequently noted the fact that economic mobility has stagnated in the United States to the point that it ranks low among industrialized nations in this important category. Less often mentioned is the fact that management, notoriously shortsighted and focused on profit, is the principal driver of this result, and, it wrongly believes, its primary beneficiary after wealthy individuals themselves. Management sees benefit for itself in perpetuating excellent education within a relatively small circle of families, blind to the fact that merit, if ever it existed, sometimes filters away and that shutting out diversity blocks world-class adaptability and fresh ideas. That's not all.

By controlling education, management extends its grip over society in manifold ways that are not entirely obvious. Take, for example, what happens when multiple members of elite families attend correspondingly elite universities. They marry other elites and all of them have a shot at top jobs. Not all of them prove worthy of these elite positions and fewer still rise to the topmost ranks of business America. But virtually every one of them is employed somewhere within the realm of these select organizations. So far in this example, we have only a tiny handful of people with superior educations functioning near the very top of powerful companies.

What about the other members of their families? Here is where management control exercises octopus-like dominance, partly because extended family members of upper echelon management work for the same or similar businesses in the tight network that proves the adage that "it's not what you know but who you know." Again, filling the lower ranks of businesses with faithful family members subverts diversity and cuts the business off from many essential qualities that are becoming increasingly important. But from the management perspective, there are numerous short-term benefits. These inbred workers are reliable in ways that appeal to management. Their political support, for example, is accorded to candidates who favor management. On a more basic level, family members of top-tier executives, from the friend of the nephew of a distant cousin to direct siblings, will not be union members. In an age in which unions have been largely shut out and which features jobs typically unassociated with unions, this is valuable insurance for management. It provides a cushion or maneuvering room for management, the flexibility they need and believe is only available with an unorganized workforce. By spreading the influence of economically secure employees, management strengthens its grip. These well-educated, elite workers dilute dissatisfaction at all levels. Management can rely on this elite cadre because connected workers always relate up the scale instead of down. It works a bit like primogeniture in an aristocracy where the firstborn, in this case, the best positioned, may reap the greatest abundance but other family members remain aristocrats, poor perhaps, by comparison, but securely fastened to the system that recognizes and benefits them.

Recognizing the economic incentives for management associated with education is virtually another way of admitting the class consciousness of the entire process closely tied to education. The social mechanism is evident as are the reverberations produced throughout all aspects of society. Control. The concept is inextricably identified with management and for a long time, education serviced the processes and form necessary for its maintenance. Performance evaluations connected to compensation, we have seen, were part of this. Along the way, management found it expedient to invent criteria required for top positions. The system was quite visibly rigged to support the control outcomes desired by management. But as technology developed rapidly during the twentieth century, management was required to alter the process and form associated with education.

As technological complexity increased, management was forced to expand in-house training for workers but was also required to consider other educational requirements that included college degrees; graduate school became almost standard with a premium on MBAs. Some workers could be trained in electrical skills sufficient to perform specific tasks admirably, but electrical engineers with college degrees were also required. But management began to cut sharp distinctions among better-educated managers, particularly with the rise of financialization that remade the face of executive suites in the image of lawyers and bankers. Think of it as a progression from tool box to briefcase to laptop. As needs changed, form and process also changed along with other expectations.

Insidiously, management wove another thread along the entire route of change. As compensation, evaluation and education changed, management was able to simultaneously reconfigure its claim on leadership. Start with the fact that management wrote job requirements and that subjectivity constituted a large part of its decision-making process. When you control the requirements of membership, you control the membership list, bobbing and weaving as needed to preserve your intent, your staff and, above all, your control. But exclusive reliance on university degrees that included exposure to alien concepts is risky. Management wanted yet more control.

Having already claimed responsibility for the definition and execution of leadership, management proceeded to institute leadership education, an indoctrination designed to tie specifics of industry and business to the nebulous function of propaganda and mind control, the nadir to which management pulled the concept of leadership. Under the direction of management, a barrage of educational initiatives launched that harass and hinder workers today. Everyone is familiar with them. Some take the form of high-priced "retreats" during which Ph.D. management consultants subject top executives to all manner of esoteric lectures and experiences contrived to demonstrate and strengthen the control psyche of convinced adherents. You might know these top management students simply as "suits," but they're stuffed with toxins. Management also has numerous incarnations of their leadership education intended for every level of management right down to the lowly line supervisor. You know these as "workshops" led by traveling mountebanks, local professors of business and psychology, and, on the cheap, company executives who excel in group deception. There are endless courses that can be purchased, not to mention books and tapes. HR directors are typically in charge of activities directed at middle managers and below. And all larger companies have a variety of indoctrination schemes aimed at new hires and entry level workers who remain close to the door their entire lives, or, in HR parlance, "careers."

Those who emerge from the morass of management directed leadership education are, without a doubt, better prepared to exert control over their subordinates and render mindless deference to top management. Often, a single exposure will suffice, but, unwilling to chance failure in this all-important endeavor, management provides periodic boosts. These intermittent "leadership" shots in the arm (or ass) always include generalist material to inoculate against any doubts that might have arisen among workers in any facet of their work. This is done in recognition of the fact that people sometimes slip, much as religious people sometimes have doubts that must be buttressed against the headwinds generated by unbelievers, and, gasp, facts they may encounter in the wild.

Booster programs always have a specific topic receiving special focus. This bit of subterfuge slips into the agenda boldly as an excuse for the delivery session but innocuously into the mind variously as training, resources knowledge, heads-up, tips, utilities, law and competitive advantage. Who wouldn't want a little assistance that might save you some trouble and even make you some extra bucks here and there? The main point is made with blunt force and subtle attack and repeated in various ways as the session leader may deem necessary by reading the participants' responses. There are lots of tricks for deciphering who among the workers is getting it and who may be lost or even resistant. Enthusiastic smiles, aggressive nods and a willingness to speak up positively will virtually ensure that a worker will not be singled out for further consideration during the session or, worse, later. Games are often played. HR people and their traveling instructors love games. They take up time, presumably make a point over and over again as they are replayed to run out the clock, give everyone a sense of participation and dread of being called upon and let the leader evaluate each worker's sense of acceptance of what they are being fed, often more important than belief that they are really learning anything. Go along to get along is always the best strategy for workers wanting to avoid complications in their employment.

Instructors of booster sessions have a keen sense of the information it is their goal to impart, but also a sixth sense about who may be reluctant to receive the message affirmatively. Still, the information they present is important because, more than mere knowledge, it is intended to alter behavior in some predetermined way. Sexual harassment courses are a good example because they contain all sorts of legitimately good information but are primarily intended to secure compliance with conduct that does not get the organization sued. Needless to say, the participant who attends the session with porn playing on his laptop will be singled out for further training. But this example misses the point that more often than not, the behavior that top management wants to alter is something far more understated. Management would want an employee to leave the session with an improved vision of top managers and the direction they have taken, as well as a commitment to act on the information they have been fed. For this, there must be acceptance and for acceptance there must be assimilation, one reason for the repetitiousness encountered during the sessions. If an HR director, for example, decides that managers throughout the company need to make appearance the key determinant of hiring selection, it may be necessary to overcome the objections of those rugged veterans of past folly and followers of proven results. Specific exercises would need to be introduced that undermine the tendency of some to gravitate toward effectiveness and proven job histories as opposed to a smooth voice or a great pair of legs. Management wants everyone to leave these training boosters with a determination to carry out a plan in which they have come to believe. HR managers want to glean some glory from the event along with continued job security and the instructors, particularly traveling ones, will want to leave with their pockets stuffed.

A word remains to be said about low wage jobs and how they relate to the takeover of education by management and specifically how engineered leadership education impacts entry-level employment. For this, it is helpful to keep in mind that the goal of capitalism as it relates to education has forever been control. This is widely acknowledged even within the close confines of delusion and American history that often intersect. Anyone whose thinking is the least bit fuzzy on this point should consider the foundation of educational control in slavery. In many locations, it was illegal to provide any educational instruction to slaves because the labor supply was intended exclusively for manual occupations that did not require education. Slaveholders did not want their workforce distracted by something that would lead to dissatisfaction and worse. They knew full well what would happen if someone like Frederick Douglass learned to read.

For whites, options were free but constrained by illiteracy; few would learn to read without being taught and there were few with knowledge. Additional natural restriction occurred through the lack of time available for people at subsistence level to apply toward more than rudimentary learning; and the fact that no more was necessary also limited expansion of education. In those days, of course, there was no management because we were in the independent worker period. When technology increased and the need for greater doses of education became evident, management was on the scene and took over control from independent workers and the limited class of capitalists. It is at this point that the often mentioned aspect of social control began to be attributed to business, its foundation having been well built, both intentionally and naturally, from the ground up.

By limiting education, management was able to enforce its social control, a commonly noted assumption. But how, exactly, and to what end? These questions are often overlooked in the rush to congratulate ourselves on having noticed social control through education; too often, we fail to realize exactly what was going on and how we pay the price today. For a long time, it was easy to hide what they were doing in plain sight because managers sent their children to public schools like everybody else, except, of course, the topmost elites who always sent their children to private schools and the most prestigious universities. For decades, before education became complicated, public schools could be counted on to deliver a good basic education. Most kids stopped somewhere during the public school experience with increasing emphasis placed on high school graduation. Children of managers then attended public universities. But even in this process, social control steered official education down forks in the road. Segregation was one, with the force of law in many states, by custom in others and by circumvention where all else failed. The lowest ranks of workers were thus sidetracked, leaving white kids to be gradually guided into trade pathways of various kinds where they were provided training for skills to earn a decent living according to the standards of their time and race. And when technology advanced to the point that a larger dose of training was needed, post-secondary trade schools and community colleges were instituted. Meanwhile, public university students were being taught more subjective material that would help them control their subordinates in the workplace and elites floated above all of this. So far, so good, right? Everything here is just as we learned—predictable enough social control. So, what?

In an ominous reminder that perpetual motion machines don't exist, the United States Supreme Court struck down racial segregation in public schools. The 1954 Brown v. Board of Education decision was years ahead of its implementation but time soon caught up with a vengeance. In the space of just a few years, the whole country was alive with new demands for social justice and equality. Failure of society to adjust was met with wide-ranging protests, including many against a distant war that was keenly felt throughout the United States. We sometimes forget that there were almost twenty years between the Brown decision and the resignation of Richard Nixon. Things happened fast during that time, things that were rare or even unprecedented in American history within a time frame that permitted a new generation to be born and rise to the edge of adulthood. As this generation filtered through the public educational system, they were aware that its structure was under stress from within and without and they were aware that their parents had lost confidence in the educational system. No signs of failure were more insufferable than the decision to allow college students to avoid Vietnam and the resurfacing of segregation.

Both of these were high-magnitude failures of public policy and social amity as well as management vision. Like many individuals and organizations during the period, management scrambled to make sense of what was happening and regain control. When it finally determined a course of action, management, hobbled by uncharacteristic indecisiveness, made a default judgment on behalf of conservative claimants to its ear. Lip service, they decided, would preserve the form of racial equality while private schooling would be allowed to resegregate classrooms. Trade schools and increasingly common college matriculation among whites would dependably supply businesses with workers who could receive additional job specific training after employment. Under the bus went African American and poor students. It was understood that insufficient funding would choke public schools, a strategy that proved correct, particularly when augmented by vouchers and charter schools, a process that is currently playing out and playing directly into disaster.

Management, typically unable to see beyond its greed, got the social control it wanted but at the cost of long-term advantage for itself as well as workers. What they failed to see was a developing need for workers with higher skills and better education, not merely for the pleasure of knowledge, but for the benefit of production required in more complex job environments. At the opening of the twenty-first century, management faces an under-educated workforce and a large pool of potential workers who are incarcerated. Their solution? Double down on school vouchers, charter schools and homeschooling. This up-and-coming crop of workers may be sufficiently docile, management having largely sidelined unions, but there will not be enough of them. For some years, management saw a solution in off-shoring and in temporary work visas for foreign workers already equipped with the skills needed for complex assignments, educated workers generally prepared to accept wages significantly below American standards. A wave of anti-immigration measures and a tide of hysteria and fear now threatens these plans but management shows no sign of relenting. Their control remains complete even as it locks out all flexibility and curtails the possibility of naturally occurring economic growth. Having allowed conservative social prescriptions to thwart common sense and trounce what could have become widespread benefit, management has brought the nation to the precipice of disaster. But it is still in control and low wage workers are still under the bus.

Social control achieves much more for management against the interests of all workers, particularly those in low-wage positions, than might be inferred through the careful dispensation of limited education. Partly, this is simply a matter of wrongful focus on turning quick profits, a drug to which elites have become addicted. Management sees every expense as something to be squeezed as dry as possible with all the juice collected for investors. After all these years, management and its capitalist overlords fail to recognize that in a consumer economy that they claim to support, everyone is a potential consumer and that the widest possible distribution of income will correspondingly increase consumption with benefit accruing to everyone. It is startling that, in an age in which data is increasingly valued, management, formed for the purpose of control, would not be realistic enough to open the floodgates behind which are proven resources that can distribute greater prosperity. It's enough to leave a critical observer flabbergasted and wondering what else is going on in the background.

Low-wage workers are not merely under the bus for no good reason, at least in the opinion of management. The exploitation of workers fulfills an age-old myth that there must be a small class of owner-exploiters and a large underclass of exploited workers. This does not need to be the case but it has been that way for so many centuries that most people seem to believe it is necessary and may even be in our DNA. But things change, even DNA. For decades, we have felt the management class interposed between the two ancient rivals, an invention engineered for better control of a worker population of a size grown unwieldy through the action of nature. Owners, capitalists, are so arrogant that they believe workers have no right to engineer their own redemption, also through natural means, the use of their brains.

By accepting responsibility for the historic mission entrusted to it by owners, management, above all, protects and extends the wealth of its masters and itself through the subterfuge of control, a trick the mind plays upon the willingly deceived and the coerced alike. Many are deciding that it is high time to break with the past and launch themselves in a new direction. But wait. They have to demolish another brick wall elaborately constructed in the process of control. That additional barrier is the fact that management marshaled control not merely of the workplace, but of society, utilizing their control of work and compensation to determine not only strata, but also the composition of social acceptance, of everyone's place in the world.

The belief that management is synonymous with leadership is a deception now so commonly accepted that it has become integral, enabling and overwhelming in the intent of its deceit. One of the deepest sorrows is that many believe following a management route is necessary for success and advancement. It is misguidance that costs the economy, shutters creativity and breeds misery to the present moment. Those who believe they must accept management if they crave leadership suffer from a world-encompassing delusion. They bring the misunderstanding home, where it roosts, broods and spreads like a disease.

Those who acknowledge leadership as life, as creativity, as the animator of value, don't waste time with chicken and egg questions. Management cheerleaders, having likely pondered whether leaders are born or made, whether they arise naturally or are coaxed and trained, will devote attention to the conundrum of their genesis. Thereafter, management zombies will credit training and specialized leadership education for their exalted position. Leaders, meanwhile, will do what leaders do. Managers have false confidence produced by engineering for specific descriptions that carefully exclude the real basis of leadership and creativity.

Entombed in the ignorance of false knowledge, management wreaks harm on the organizations it claims to serve. True leaders and non-management workers understand this, as do many managers themselves. But the range of reaction is both wide and deep. Consider, for example, that most workers do not believe their jobs are necessary. Consider, further, that the realization applies far less in manufacturing than in office environments, where the spillover excess of authoritarians has freer rein. They know, but it would be a mistake to think that they soldier on in disregard. The simple knowledge is a kind of honesty and liberation that elevates them above those who refuse the evidence, especially those who double down on deceit. The moral high ground is elevating in itself, but the temptation to do something about it impels contravening action.

For leaders, enlightenment often means disengagement that can take numerous forms. Some revert to intense devotion to study, sometimes meaning academia but frequently private delving into esoteric realms unavailable to management. The ranks of freelancers also grow through realization. Impetus for labor organization, often frustrated on the twin horns of management and government, simmers unrequited. Yet, sabotage is rare despite fabled bottle caps sealed inside car doors by disgruntled autoworkers.

Passive aggression is one of the chief byproducts of the realization that management coopted only the claim of leadership, not its reality. Passive aggressive behavior is exhibited by managers and other workers throughout the workplace and takes myriad forms with cascading consequences. There are specific reasons that evolve with changing circumstances but they all grow out of the worker's realization of having been had. Resentment? Aplenty. And disagreements of all sorts, including the certainty of being able to do a better job by doing it differently or even doing another job altogether that will yield superior results. It may also be that the job is not done at all, given that management habitually demands more than it can monitor and that results are often easily faked.

Passive aggressive behavior is everywhere; no catalog can possibly record it all. Passive aggression flies over, under, around and through clueless managers who swallow the bait from top management, "hook, line and sinker," as the saying goes. Other managers can't see passive aggression among subordinates because they're concentrating all their energies on their own passive-aggressive strategies. Still others observe some things and miss many simply because they're too busy or no longer care.

Examples, however incomplete, serve not only to indicate scope, but also intensity while barely representing variety. There is virtually no limit to the representations of passive aggression that can be dreamed up and implemented by unhappy, dissatisfied, job-weary workers. Passive aggression can be as little as refusal to follow the exact wording of a script. A fast food worker might resent being told exactly what to say and substitute a few words of their own. In and of itself, that's a little thing but it's nonetheless a form of rebellion that may go unnoticed until a professional shopper hired by the company flags the error on a report. At that point, the violation is a matter of minor interest because a worker acknowledging the presumed failure (it's always initially presumed to be inadvertent) knows they have tweaked the company and will likely continue to do so and even broaden the forms of their misconduct.

It is unlikely that such a trivial transgression will be noticed again because companies rarely target a worker twice if they are guilty of a minor infraction. But, should they be caught again during a given period, say, six months or a year, they could be cited for more serious consequences. Also, businesses typically claim that they purge employee files of minor infractions at least annually but actually never get around to it. Turnover is such that often employees leave the company in a few months anyway, but budgets are rarely sufficient for dedicated file maintenance and a manager may ultimately use the "error" against the employee at a later date, including during an annual performance review.

This tiny issue can be blown up majorly if, as some fast food and other low-wage industry jobs are structured, a supervisor is close by frequently enough to personally and repeatedly witness the "error." The apparent outcome could well depend on how authoritarian the supervisor leans. It could be ignored or it could be a minimum wage job blown as a matter of principle, with the offender bounced onto the street or around the corner where another fast food joint needs workers, however recalcitrant.

Consider the principle involved in this incident. For strict management, the principle could be a commitment to courteous service or a requirement to follow the rules regardless of reasonableness. For the employee, the principle could be as casual as ordinary friendliness, or an individualistic code of maverick comportment or an indication of something more serious. If the employee takes a cue from principles of disruption, whether class, economic or otherwise, there is certain to be an outbreak of manifestations of passive aggression.

Most employee theft is the result of passive aggression, a fact that management fails to recognize, thereby losing billions of dollars in profit every year. Occasionally, employees will steal because of need and sometimes casually, without thought. Before the advent of cell phones, some workers would spontaneously make long distance phone calls on company lines, costing their employer a few cents per minute; some were hunted down and fired while most went undetected. But most intentional theft by low wage earners results from a simmering desire to stick it to the employer because of, well, because of what management does to them. Businesses decry this as false justification but it is a principle, as well, that of a finger in the eye. These misbehaving workers have no illusion that they are evening the score over poor wages, a claim that management prefers to make from sanctimonious righteousness, but they're getting what they can with the added pleasure of insulting the system the best way they believe they are able to do.

Would theft by managers also be classified as passive aggression? Here, we tend to consider the relative size of the offense, apt to be much larger than petty theft at lower levels, and the assumption that greed comes into play when amounts are significant. While that may be true, particularly when spectacular sums are at stake, leaders are not even tempted, being steadily in pursuit of higher aims. Managers who succumb at whatever target, regardless of magnitude, do so with a snide intent to disparage and take particular pride in racking up larger kills as unadvertised proof of their own cunning as opposed to dumb bosses above them. It is a myth that white collar crime is all about cheating on travel expenses, not to diminish its frequency because it remains very common. Managers have a much broader field of choices when playing with passive aggression. Side arrangements of all sorts are made at company expense and sometimes whole deals are diverted; winks, nods and understandings can be as potent as contracts. These transgressions are hard to pin down, especially when they are treated as sport and denied by participants at every juncture.

It shouldn't be surprising that the basis of passive aggression is the same for executives as entry-level workers but the fact goes unremarked because it is widely assumed that managers cannot be afflicted with passive aggression. Management takes it a step further with concern for morale which is exclusively centered on suspect lower echelons in the belief that managers cannot suffer from low morale. All of this is mixed up together in the minds and actions of top management. Some top executives obsess over morale to the point that it becomes a fetish expressing self-doubt erased with self-congratulatory submission to imaginary issues quelled by the superior wisdom of omniscient management. That sucks. And everyone knows it except the self-deceived manager.

The real answer, according to those presumably enlightened, is to inculcate a positive company culture. Read this as top management dictating the terms of a happy plantation. It's one of those buzz terms gone awry, pie in the sky that can come true if only you believe and make it so. The business landscape is littered with companies subjected to directive-driven contentment. Often, hiring is accompanied by explicit warnings of expectations of satisfactory behavior, including requirements for obeisance and expressions of appreciation. At a low level, top managers may throw "them" a pizza party. Sure. That always does the trick, doesn't it?

While pizza is not known to hurt anything except a diet, prescriptive company culture misses the party entirely. Apparently known only to a few, the real answer lies in company culture that grows from shared values in a climate of collaboration and cooperation. That takes us into a future where company culture, as shared experience, escapes top-down requirements and operates of its own volition to the true satisfaction of participants. If you think about it, that's only reasonable.

Increasingly, the people who own and operate organizations are making the pursuit of reasonableness a fundamental aspect of their business and they are finding unexpected benefits along the way toward total liberation from the constraints of management. It is not an easy path to follow; its markers are more intuitively than intellectually derived. There are not only obstacles along the way, but also crippled wayfarers sidelined by indecision, timorousness, and failure. These will resume their journey or quit the course entirely for the simple reason that the Industrial Age management mindset doesn't work and efforts to ameliorate its worst abuses are inadequate. Organizations of the twentieth-century mode apply only protections for workers that are required, set aside conscience for obedience and miss the point of leadership altogether. The task at hand is to identify key elements of an effective future and use them to activate benefits for everyone.

# Part Two: Administration

Administration, one of the widest avenues toward leadership and away from management, has been potentially available all along. But its access was blocked and its effectiveness closed by management, anxious to restrict benefits exclusively for itself and fearful of having them unleashed for service among others. That sounds like what it is, control, the typical management shtick, except that it is especially urgent for management to subdue administration because it is the closest living relative to what management purports to do. They can't afford to have administration roaming about freely, lest it directly undermine control, not only through its manifestation in the workplace but simply through common acceptance and mere definition. Administration gets stuff done, precisely what management claims to do, but the all-important difference is how it approaches its mission.

Managers are manipulators and controllers. Administrators are creators and leaders. Both seek to sustain, with one directing and prescribing while the other permits exploration, facilitates cooperation, locates commonalities, guides and assists. An administrator is a leader with lift. An administrator is a leader whose creative wellspring supplies moment-to-moment cohesion with compassionate guidance. Administrators exude patience and inspiration. Managers are intolerant and impatient; they are demanding, even coercive.

An administrator might sometimes be confused with a coach. Like virtually everything else it touches, human resources offices perverted the concept behind coaching, contorting it into another means of control. They have also popularized it as a measure of a manager's effectiveness; coaching now appears on annual evaluation forms, a sure sign of the institutionalization of a concept best left in the wild. What management means by coaching is nudging toward a predetermined outcome. Use of the coach label is intended to sugarcoat control and make directives seem like fun and games, often duping workers into conceiving of execution as something they connived to implement themselves.

A leader views coaching as an opportunity for service that benefits both workers and employers. A leader, through the position of administration, seeks to enable workers to locate and release their potential. They do not attempt to force an outcome; instead, they encourage and facilitate the opening of pathways with the implicit understanding that workers will reach their own conclusions, whatever they may be. Coaching encourages workers to think independently. The result is stronger workers, a freedom-oriented workplace and an organization that benefits from what workers create as well as from the atmosphere that continues to present opportunities.

Coaching should not be confused with mentoring, another wellspring of leadership when not diverted by management. Again, management has largely hijacked the concept of mentoring and institutionalized it as a virtual requirement, making it yet another avenue of control, ensuring that workers behave as perfectly regulated, dependent, mindless cogs in a machine. Unlike coaching, mentoring establishes a long-term relationship between workers that is more general in nature than specific coaching and offers the benefits of wisdom to younger workers. Ultimately, a well-mentored worker has a greater sense of perspective, having had the advantage of counseling from a non-manipulative older worker with no ulterior motive. In management scenarios that make mentoring a quasi-official policy, mentoring is reduced to prestige coaching, with mentor and mentee mentioning each other for plaudits within the organization and, with public relations engineering, for outside attention. It's just more falsity masquerading as quality and is really intended to keep younger workers in line.

Administration is much more than coaching. An administrator has group responsibilities that, to some extent, involve guiding, which is somewhat akin to coaching except that it involves diplomacy. Administrators are also involved dynamically and integrally with the members of their group instead of remaining entirely at the arm's length needed for diplomatic maneuvering room. If this sounds something like management, that's because it is, but the differences are fundamentally important.

Managers dictate instead of guide. They demand answers instead of asking questions. They specify remedies instead of cultivating the curiosity that grows solutions. Managers bombard, berate and insist instead of offering assistance and opening channels. Managers hoard and snoop instead of providing resources and releasing information. Administrators respect their colleagues, listen attentively, and help coordinate results that are beneficial to individual workers, the group, the entire organization and the larger world of customers and interested parties.

Realizing the differences posed by the dichotomy of control and development, management does everything it can to demean and trivialize administration. It cannot afford to allow the value of administration to become evident, lest they lose control and their positions and wealth along with it. Take secretaries, for example. Long ago, these all-purpose, all-knowing, all-resourceful employees began to take upon themselves wide-ranging responsibilities far beyond the typing, dictation, coffee brewing and other petty tasks demanded by their resident tyrant. Partly, this was because increased education prepared them for greater service. Partly, these secretaries became the go-to source sought by all other employees. And partly, enhanced position evolved in response to payroll reductions that eliminated jobs while technology made the tasks of secretaries and bosses easier and quicker to perform. To this extent, management did it to themselves and they had to alter the title of secretaries to acknowledge their larger role and to provide that perennial favorite of cost-free job enhancements in place of remuneration, thus, administrative assistants.

But the appellation was awarded with a wink and a smirk that served to encourage others to view the process as humoring female employees. In that manner, the denigration was reinforced with time-honored sexism that continues to buttress little minds in big offices. It was sort of like calling garbage men sanitation engineers. It was the kind of joke everyone could be in on but presumably hurt no one. Hold the phone.

To preserve their rarefied status, management eventually realized they needed to distinguish their vaunted positions from ordinary task-driven managers. By that point, the concept of administration as construed by and limited by top management had pervaded the whole atmosphere of middle management such that managers conceived of themselves as administrators whether or not they used the term. This played right into the hands of top management, who were able to convey the sense of inferiority without employing the word. A "director of personnel," for example, was clearly ahead of a mere "personnel manager," but just as clearly did not rank as high as a "VP of personnel." The director was sure to be an administrator while the VP presumably was calling the shots.

Curiously, the term "administrator" has acquired a certain odium in the public sector, ironically in opposition to the term "secretary" at the higher level. Secretaries and deputy secretaries, even assistant secretaries, seem to rate in the public mind while directors and administrators are reviled. There is a peculiar departure from this in the world of education that is quite revealing.

District level public school administrators are often viewed askance by the general citizenry, who seem to have the impression that they are the root cause of all the problems in public education. If they would simply let teachers teach, the sentiment seems to be, then things would be okay. In colleges and universities, administrators are regarded with mixed feelings and are thought to absorb too much of a school's budget. But university administrators who also teach classes are apparently respected more than anyone; perhaps they are thought to be earning their keep.

Consider that peculiar milieu. Administrators engaged in teaching are okay while those who guard offices are not. This thought stems not simply from experience and activity, but also from a connection with the fundamental value of the workplace. Whether completely understood or merely intuitively believed, the meaning here is that teachers are getting the job done, some better than others, to be sure, but they're working. And if administrators of the management variety insist, demand and control, that job will not realize the value that could be achieved with an administrator of the leadership persuasion who guides and assists. You can teach two plus two equals four until you have consistent "correct" responses, but only true teaching can help students figure out what that four really means.

With dysfunction abounding at every turn, it's no wonder that organizations are beginning to cast about for alternatives to management, at least as it has been known in the context of the Industrial Age management mindset. Google, famous for being willing to try lots of different things and unafraid of failure (of course, it can afford to be), decided that the solution might rest with doing away with management among its software engineers. The experiment was a failure, resulting in reconstituted regular old management. Dissatisfaction among engineers was cited along with performance failure of the new initiative. The fact that highly educated, competitive individuals do not necessarily play well together when left unattended should not have been surprising. Individualism tends to run amok when left to its own devices. Think capitalism without regulation. But what we're talking about here is lack of administration, not lack of regulation. What those Google engineers needed was an administrator, not a dictator. Of course, those super-urgent, hard-charging types wouldn't go for anyone, including one of their own, pushing management down their throats. They had that already. What they needed was a leader. If no one among them liked being a manager, none was given the opportunity to be an administrator. And if not a single one among them wanted even that role, they could have tried rotating administration among the whole group. That winnowing process might have shaken out a person who decided they liked leadership. But the option wasn't provided. The other mistake Google made was in not dropping management completely throughout the organization. Gulp.

A few organizations have kind of tried it with various levels of satisfaction and others have actually eliminated management altogether with success. There are a number of programs designed to accomplish the extinguishing of management but some are cumbersome and time consuming. Organizations tend to abandon the effort when it loses effectiveness in a morass of minutiae and wasted time. Some, however, find success and those organizations are richly rewarded. As a first step, read Umair Haque and Charles Blakeman. The evidence is overwhelming that the Industrial Age management mindset is counterproductive and unnecessary. Blakeman provides many examples and explains how it can be accomplished.

For the purposes of this discussion, it is more pertinent to consider how excellent people fall prey to the management trap. First, realize that trap is exactly what it is. Management seeks to co-opt true leaders, diverting them with lures tailored to their strengths or weaknesses, whatever works to tempt them. A leader, after all, is a creator and as such takes extreme interest in what they initiate and build. They want to see their creation flourish and if an organization comes along with offers of support, many leaders are vulnerable, succumbing to blandishments, to money or to positions from which they can continue to develop and perfect their creations. For these purposes, management flaunts itself as the ideal route.

The process by which leaders degenerate into managers is a pathetic tale of the ignorance, greed and arrogant wastefulness of management that prompts them to sacrifice potential greatness for the lure of an immediate buck. Having created something, many are then tempted to tinker with success in order to perfect and perpetuate it. That is the role of administration assumed by management when leaders delude themselves into becoming managers; they are lost in the process. The sense of gratification achieved through leadership is heady. Failure to release it binds leaders to form and substance and they degenerate into management.

Many leaders see management as an opportunity to extend their effectiveness, to accomplish more. Often, leaders see management as a means of avoiding interference from meddlesome bosses, only to discover that the opposite is true. The promise is autonomy, rank and their own domain to dominate. With the prize of more accomplishment in sight, leaders sometimes fall into the trap and management soon subsumes them, burying them under the weight of meaninglessness. As leaders disengage from creativity and become absorbed into management, they concentrate on maintenance and lose the valuable characteristics of leadership, replacing them with elements of control.

Lots of people see nothing wrong with that, mostly because they are subject to the mesmerizing influence that management exerts over most people in society. The drain on productivity caused by management control is often slow and invisible; only when utter incompetence brings down a business does it become discernible. But the malaise induced by years of wearing away worker vitality eventually leaches away productivity and causes business failures hand over fist, with reasons attributed elsewhere. Still, a common objection to exclusive reliance on leadership, with rejection of management control, stems from the claim that entry-level employees (more and more of the workforce) need strong supervision because they are not prepared to function with sufficient responsibility unless subject to intense governance. This has the effect of making workers dissatisfied, sometimes recalcitrant and definitely under-performing. By subjecting young workers to harsh oversight, they are being set up for a lifetime of diminished performance that hurts workers, the organizations that employ them and anyone in the larger world that does business with those organizations.

This morass of fatigue and failure could be avoided if the organization would turn from supervision, a guise of management, to administration, a guise of leadership. Administration can extend the effectiveness of leadership. Potentially, this means that leaders who, under the present Industrial Age management mindset mode of operation, shed their creative instincts for control and dominance, can, instead, retain the qualities that made them rise in the first place, while enhancing the effectiveness of those attributes in the practice of administration. Each administrator, acting within the spirit of leadership that animates administration, will find the appropriate steps that best apply in the situation at hand. Because of this, there is no prescription available to apply, no formula to follow invariably for success, but there are some common threads that seem to be woven through successful administration wherever it is found.

When their concerns are examined separately, administrators seem to juggle numerous roles. But seasoned administrators, having achieved a reckoning within themselves that renders their judgment as pure and genuine as humanly possible, realize the unified flow of all their activities. We've already discussed how guidance is better than diktat. That is because the guidance offered through administration arrives filtered of greed, ulterior motive, control and desire of any sort apart from pristine best wishes for the well-being of everyone. Having conquered their own temptations, administrators are free to support others within whatever context of need arises. But make no mistake about identification. The slightest push for specific outcomes, the smallest inclination to nudge, turns an otherwise acceptable human being into a manager. No matter how many seemingly separate facets of engagement are deployed, an administrator bundles everything into a single flow of energy on behalf of others and all.

Administration does not demand allegiance or insist on being obeyed. Administration is the practical execution of leadership by identifying connections, encouraging cooperation and extending unified comprehension. Observers will notice what seems to be compartmentalization being built by administrators. Closer examination finds tight linkage and symbiosis between what appears as independent elements. For the administrator, it's all one project, but those being served by administration enjoy work in the relative isolation of independent domains. These workers are not pulled and pushed into areas of discomfort or expected to meet artificial goals. Freedom is the key. The administrator freely ranges without borders but co-workers who prefer compartmentalization can remain sectionalized without fear of interference. Respect for individuals is fundamental and willingness to seek diversity along with the absence of uniformity is essential, attributes that cannot be said of management or work conditions under the Industrial Age management mindset.

Management has done an outstanding job of imposing contrived requirements on workers who would prefer to avoid a phony world constructed for insincere purposes. Management has leveraged its self-appointed positions as both arbiter and controller of organizations and workplaces to establish artificial criteria for entry into its exclusive ranks and ensures that meeting its requirements is the only threshold to financial success. By making it so that no remunerative comfort can be attained without pursuing a management career, countless individuals are lured into the management trap where they are debased, broken and safely shelved away from possible effectiveness. Again, the power of wage control is devastating to creativity and productivity.

The reflexive response to wages is that management should always receive the juiciest rewards despite their role in restricting progress and inhibiting, even crippling, those who could, under the circumstances of leadership, achieve much higher value for themselves, their organizations and the world at large. We have been conditioned to think that managers should always and necessarily receive the largest pay. And we automatically assume this to be true, the proverbial received wisdom of ages, without ever thinking through its meaning in the workplace or as part of the larger dimension of life.

Spend a little time in a fast food restaurant and you will spot someone with a clipboard checking off requirements as workers meet them and performing their own duties according to rigid plans that must be certified and reported. Spend a little more time and you will see that person simply nosing around, perhaps disappearing into a closet-sized office for a while. But watch carefully and you will notice that the same person darts about frenetically, actually performing tasks with lightning speed. That's the manager. Other workers, in the meantime, have been busily performing their assignments, relentlessly, doggedly assembling burgers, salting fries and running a cash register.

The question is not whether the manager's work is of a presumably higher nature than that of other employees. We know that the manager has been busy ordering supplies, maintaining records, ensuring compliance with laws and company directives. The skills to perform these functions are greater than those required to slap mayo on a bun. But why should the use of more complex but not uncommon skills merit an increased level of compensation? The fry cook and the grill cook have the more hazardous duties; why shouldn't they be paid more than the manager? Aside from the fact that the manager is likely required to work an unconscionable number of hours beyond hourly paid workers, and obscure secrets of wage theft and corner cutting that would embarrass the company, there is no justification. The skills necessary for the manager to master his job are those that, apart from an absence of ethics, most people would like to acquire. There are also less frenetic periods during the manager's day, along with the ability to step aside from the hurly-burly that ordinary workers do not enjoy. The manager, then, is a worker who finds satisfaction in the variety of his work experience and the execution of more interesting aspects. The manager would naturally gravitate to a job where those skills could be utilized. Why should the manager be paid more than a burger flipper simply because of being a manager? Yet, we persist in thinking that the manager should receive more compensation.

We make similar mistakes when thinking about virtually any profession. Take physicians, for example. The number of years of rigorous study is invariably cited as a reason to pay doctors more than others. But do we seriously think that we would have difficulty finding excellent candidates willing to learn all that is necessary to be of such fundamental help to others? There is also the matter of the expense involved with education. But if that expense were set aside, would it be difficult to locate individuals willing to work as doctors for a lower rate of compensation in a society where everyone had adequate funds for a meaningful life?

Workers who take satisfaction in administration would pursue outlets for their abilities and inclinations without the need for additional compensation simply because they are administrators. Under the current system inspired and maintained by the Industrial Age management mindset, workers who want sufficient remuneration for an adequate life must embark upon a management career with all the consequent frustration and absence of satisfaction that goes with it. We should adequately compensate all those who work in any manner whatsoever. No one should be required to relinquish their ethics, ideals and willing endeavors in order to acquire enough money to live decently. Creativity and productivity would jump if simply being a worker in a field of interest were compensated sufficiently. It goes without saying that monstrous income inequalities that presently characterize most organizations are morally and ethically repugnant in addition to being counterproductive.

Yet, management is accustomed to defending the indefensible, much of the time successfully delaying an inevitable reckoning. It would be wise to keep the multifaceted capabilities of administration in mind as we explore some of the key issues percolating in the modern workplace. These issues are piling up quickly against the ability of management to cope. If better days are ahead for workers, we need to understand how to reach them.

# Part Three: Issues

Through consideration of issues alone, we can make reasonable inferences about the immediate future. This review is important because the decisions that workers and management make over the next few years will impact their lives enormously while guiding the trajectory of even more momentous changes. Part Four will address some overarching, critical questions with the potential to supercharge progress, freeze immediate issues or even roll back achievements. While taking stock of current instabilities, however, we can and must evaluate present issues with an eye toward guiding them forward with reasonable certainty of their outcome.

Subjects selected for examination do not constitute a complete list; far from it, the issues under review here represent only a few of the important topics vying for attention. Hopefully, the selection will highlight some of the most significant and establish a basis for readers to delve into others independently. In fact, the issues discussed are merely indicative of a broad field requiring deeper study and are intended to provoke further thought rather than establish solutions. It is only through an ongoing process of changing understanding that workers can benefit fully. Regard these issues as alerts with more to be uncovered.

Once a thought surfaces, we can never completely recede from its potential. We now have many ideas straining against and out of the past, into the present and demanding a future. Our tendency is to skip too far ahead without fully analyzing the present. Time, of course, is a challenge. It's moving ahead rapidly with scant attention and increasing urgency. Surveying exactly where to begin while sensing a need for establishing priorities is a luxury we can ill afford. As the future beckons furiously, all is immediate, everything is urgent.

It makes sense to begin with the most fundamental element. People are inescapably the reason for all of our effort here, a fact that, odd as it appears to some of us, flies over the heads of those who see some sort of ultimate outcome as the utmost concern. Remember that it is typical of management to have a predetermined destination in mind, exactly the opposite of scientific or even historical research, which seeks facts before conceiving a conclusion. The needs of people change and our investigation has to change with them. And it is the needs of people that we seek to satisfy, not the aggrandizement of business or management.

If we're thinking about people, we must consider demographics. Right away, minds leap to an assortment of specific categories and we lose sight of the overall picture, becoming distracted by details that inhibit comprehensive understanding. Age is one of the prime classifications that sidetrack our perspective. Because of the hype, much of it justified, about millennials in the workplace, it is easy to devote excessive attention to that single issue even though demographics is a much larger study than age alone. For our purposes at the moment, let's acknowledge that generational differences impact workforces but recognize, also, that these changes will work themselves out. Generational conflict by this point in the twenty-first century is old hat, with senior workers often seeking the leadership of millennials whose skills and awareness frequently make them natural assets and allies for seasoned workers with historical memory and useful experience. We should look deeper.

Applying any amount of thought whatsoever to people in the workplace identifies elitism as a persistent problem. How elitism should be considered in conjunction with demographics is at once curious and illuminating, with age itself becoming woven into the complex tapestry in recent years. The ancient foundation of elitism on accumulation, ownership and social snobbery exists alongside the often parodied "old money" versus "nouveau riche." There is nothing really new here except the vast and growing wealth gap between the two groups with the influence of the former being rapidly diminished commensurate with its loss of financial resources. Status is another matter. What is new and shiny typically overshadows what is worn and faded; the nouveau crowd definitely has the advantage there. Whatever the patina of antique culture and pride is worth, people tend to leap for the money when there is a choice.

The concept of meritocracy, presumably operating hand in glove with democracy, muddles the perspective providing optimism for some and skepticism for others with neither group examining carefully. Look closely and you will see an element of elitism based on emerging meritocracy. On its face, the concept of elite meritocracy seems like an oxymoron. But look deeper.

The romanticism of twentieth century idealism began to die when flower children took a bath and got a job. They're on Social Security now and any notion that what is right will necessarily prevail simply because it is right is, well, wrong. And dead. But the desire to believe in meritocracy is too strong to let its memory die. It desperately needs to be revived, but at the moment, it is a zombie corpse animated only by elitism.

Optimists like to point out encouraging signs. Younger people, they claim, pay no attention to race, for example, and play and work together without regard to social strictures. Acceptance of gender diversity and reduction of gender bias is common. Major universities are opening their doors to more financially disadvantaged students. All of these things are true, but only if you accept anecdotes instead of data. Snapshots of kids at an integrated playground mean little when you consider that they likely attend segregated schools. The proof of gender bias is in the paycheck. Universities, especially smaller ones, are ill-prepared to lend a hand given their expense structure and the really big ones are only beginning to respond after mounting outcries.

Where is the evidence of incipient meritocracy in the workplace where optimists insist it will be found? Ask the titans of Silicon Valley, the vaunted future of America, where it's still a white male dominated world that just can't seem to recruit enough of those other people. Still, they claim to be trying and that they really do believe in merit above all else. Let's give them the benefit of reasonable doubt, at least for the moment, and assume that they are sincere in their belief. What, then, is wrong? How does elitism taint meritocracy in even the best of circumstances?

Wealth underpins the attainment of merit. And when wealth is not equally available, meritocracy is flawed. Roll out all the anecdotes you want about starving ghetto children making straight A school records, gaining acceptance to Ivy League universities and rising out of poverty to acclaim in their chosen fields simply because they are the best at everything. All these examples accomplish is to highlight the fact that there is limitless potential being wasted because potential is inherent in every individual but usually requires more resources than bootstrap determination to flourish.

To highlight the role of money in the cultivation and attainment of merit, look at what happens when there are no resources. Look at what happens when individuals are left to drift as strong currents sweep them away. If outcomes with the advantage of wealth are evidence, so are outcomes where there are scant resources to encourage attainment and merit. Look, then, at the gap between the extremes to find inequality where merit could be. Without providing equal opportunity, we condemn potential to be forever unrealized and elevate the meritorious individuals who happen to have the benefit of resources. That is protecting the past instead of building the future. Such merit as may exist in these circumstances is only a shadow of what could and should be.

The worst of it is an even deeper flaw in the general conception of meritocracy. When resources are applied to individuals who are advantaged by birth and wealth instead of ability, we not only squander the potential inherent in other, more capable lives, we lavish praise where it is not due and mistakenly congratulate ourselves for justice and a meritorious society that does not truly exist. To believe that we have a meritocracy is only misleading and suffocating self-satisfaction.

True merit is not feel-good, mumbo-jumbo humanism but merely a statement of existence that is valid, true and effective. It is status in the sense of state of existence, not artificial position and certainly not imposed position over or under others. We have wrongly imposed a concept of hierarchy on status that does not belong there. We must get past the idea that leadership is imposed from elsewhere. There is no divine revelation or even principles designed by superior human beings. Each person has a status that is organic, not a position that is contrived.

Each of us has a status of leader and a status of follower in different circumstances. Status is merely reflective of what is and it is natural. To elevate one status above another is to diminish the humanity of all while failing to recognize the independent validity of all. There is no disrepute in being a follower, unlike the position hungry false leaders designated by management. We should be giving emphasis to strengthening and developing followers. Preparing and strengthening them will improve the natural working relationship with leaders.

Concern about name, fame and position doesn't square with concern about teamwork except as being leader of a team. Concentration on being a leader diminishes teamwork. As a society, we tend to value individual position to the exclusion of other qualities. That has the effect of glorifying management while doing nothing of substance and value for the organization, let alone the individuals involved, including managers, who are robbed of an opportunity to excel through further development that is elusive when they are busy being managers. The management line is that management is the arbiter of leadership with the sole right of ordaining leaders. We have told generations of workers that aspiration to management is required to be considered successful. That is wrong, oh so wrong.

Everyone is a leader in some way, a fact that should be humbling to everyone—to every leader—and it is a fact that should make them better leaders, not less effective. The measurement used by management is power of control while the effective gauge of leadership is teamwork. Listen carefully and you will often hear managers say that they "direct" a team doing this or that. In fact, they well may be issuing "directives," one of their favorite words, but the workers adhering to those instructions respond out of necessity or fear but definitely not because of teamwork.

Leadership, characterized by a talent for benefit, begins with the acceptance of humility and constitutes an offering to others, a gift of valuable, sustaining work. Teamwork is the natural expression of leadership because workers appreciate and gravitate toward leadership that provides an opportunity for collaborative fulfillment. Elitism, in contrast, feeds the artificial separation between workers and management, destroying, in the process, connection of workers to their labor as well as among themselves. The future is with teamwork.

But not without a fight. There are many forms of elitism, including cultural, educational, occupational and economic, all of its guises prepared for battle by means subtle, bombastic and brutal. Perhaps the most hideous and cynical is the approach adopted by elites who consciously regard their activities as fraud perpetrated on those they regard as "little" people. As presented by amoral organizational elites, what they call leadership merely replaces management with elites educated and skilled in their own form of skullduggery. It's a twist from behind with a knife in the back of the unsuspecting, naive and defenseless. In the public as well as private spheres, they practice cruel repression with a smile while picking pockets, shaming and restraining. These elites see themselves in charge of both their own worlds and those of others, the very reason that laws are required but abused, that justice evaded is a trophy of the rich and the exact reason that labor unions are needed today at least as much as ever in history.

The longer our journey along the course charted by elites, the more forks we will face and the more wrong paths we will take to accommodate the expediency of our "betters." With each misdirection, we will stray further and further, as the crow flies, from our own best interests until the bird at last falls in exhaustion. That the unthinkable could not happen is the fantasy of elites, for it happens periodically, only to be concealed by a new rise of elitism on the wings of malevolence and misconception. Elites have a peculiar propensity to misunderstand the needs and legitimate aspirations of subordinates and then to believe their own misapprehension, doubling the distance between right and wrong. Failing to understand their own history and their own depravity, elites eventually drive everyone to the tipping point of the misery of those they scorn. When lack of hope peaks, generational retrograde commences. That is where we are now, and it is where workers will flail for yet a while, a near future that shrieks for relief and must be answered promptly. But elites this time are overplaying their hand. The time is fast approaching when universal basic income will be part of a necessary long-term solution. But there are also contemporary answers for temporary relief that we are outlining here.

The faster we can make a significant impact on critical issues, the better our future will be, both immediately and in the long term. The most fundamental issues involve people, and while all of them are important, the deepest require special attention. That is why elitism is so toxic: it attacks the lives of what should be free people, binding them to someone else's measure of what they appear to be. Equally important is the recognition of diversity as a key fact of life, one that should not be ignored or refuted through the practice of artificial restraint.

It makes absolutely no sense whatsoever to object to another human being on the basis of any of a number of characteristics including race, ethnicity, gender, sexual orientation, national origin, allegations of physical impairment or preferences such as religion and political or social affiliation. Problems arising from such prejudice are evident on both the extremely personal level and extremely broad national and international levels. While race is a major issue across the globe generating enormous tension, so are all of these categories that should be regarded as happenstance not delegitimizing circumscription.

Studies have shown that the greater the diversity in the workplace, the better the quality of work outcomes. Even small companies need to embrace diversity. The horizons of every co-worker would be broadened and the result would be more expansive outreach, greater input and ultimately higher satisfaction. Visible, superficial differences are only the tip of the diversity issue, but unfortunately what should be a nonexistent concern is often turned into something of overwhelming, if unreal, significance. More important—and very real—is the perspective that diverse minds can bring to bear on behalf of problem solving and creativity.

It should have occurred to everyone by now that national borders are more porous than ever. People from all parts of the world visit all other parts with increasing frequency, often permanently changing their residence or citizenship. This change in proximity alone should be sufficient to convince skeptics that they need to embrace diversity and accept differences among people. Mere tolerance is no longer enough; yet we persist with unreasoned prejudice that causes problems rather than creating the solutions available to open minds.

Diversity is unstoppable despite the persistent effort of individuals and organizations, including government, to squelch it. People of all sorts are active throughout the world and coming into contact with each other more intensely and more frequently every day. Their activity will eventually overcome all attempts to restrict them, but in the meantime, the inertia typical of management and of individuals ensnared in it will reinforce the status quo, creating a drag on progress. This tends to freeze inequality in place rather than allow it to dissipate through the positive forces generated by diversity, meaning that we will contend with the impact of inequality, not only of income but also of opportunity, for years ahead. This is the stew that constitutes the transition to the future as it presently stands.

The upshot is disheartening, especially being aware that enormous improvement could easily be made if people would only permit it. Future generations of management are already established in the pipeline; these people have every interest in maintaining the status quo, protecting the past and obstructing progress. While they will not readily yield to better ideas, it is inescapably true that they cannot do everything themselves and they certainly cannot control everyone's mind. This makes the expansion of progressive knowledge more important than ever and means that workers hired in the future will play a large role in creating systemic change. It also means that existing managers who are responsible for hiring will have outsize influence as they struggle to maintain present workplace conditions of the Industrial Age management mindset. This, in turn, makes them vulnerable to those exposed to progressive ideas and creates an opportunity for eventual improvement.

As larger numbers of young people enter a workforce already being altered by existing workers exposed to progressive ideas about leadership and work, conditions will improve. A critical factor will be the collision between what workers want and expect from work and workplaces and the old assumptions of management control. Currently, managers designate certain employees as "weak," despite the fact that those employees may possess, and yearn to exhibit, latent abilities that would be prized in an atmosphere open to leadership. Management loathes these "weak" workers and either terminates their employment or assigns them for redevelopment by HR specialists in redemptive control. Saving and redeveloping employees is sometimes cited as a cost-effective way of preventing expensive job turnover; actually, it's more on par with conversion therapy aimed at gay people. Often, these "weak" employees simply do not respond to control, do not want to be managers themselves and seem odd to their superiors.

Real redemption of "weak" workers will come through the realization of leadership in the workplace instead of management, with the understanding that everyone is a leader somehow, that cooperation is better than control and that what appears as weakness resulting in job change is actually dissatisfaction seeking relief through growth that is often restricted on the job. Workers often job hop for slightly better pay or working conditions. Leaders often make career changes as they grow, but they also frequently pivot within their own organization or business. These altered circumstances benefit not only the leader who is making the change but also his co-workers and the organization, which reap the availability of his wisdom and guidance.

The more frequent job changes and career changes often cited as characteristic of millennials in the workforce are actually indicative of a deeper change in the consciousness of workers, one that is beginning to impact the transition to a leadership future. Talent will find an outlet, and capable workers are locating more venues that appreciate their abilities and value their contribution. Management, in other words, is beginning to fall behind and even lose as the transition to leadership expands. What is now seen as an awkward, sometimes distressing job mobility will, in the future, be regarded as the natural elision of growth and opportunity.

Underlying the change is a difference in what people want from work now as opposed to the recent past and the twentieth century Industrial Age management mindset period. Ironically, one of the forces driving the change is management itself whose backwardness and repressive attitude toward employees drove them to search for options and consider alternative approaches. Things became—and still are—so bad that workers began to insist on improvement. What workers see is that a convergence of work and value is possible; they want to believe in what they do; they want work to be worthwhile and they're beginning to insist that their work and workplaces meet their ideals. Their efforts are starting to bear fruit.

There is obviously a long way to go, but as workers apply their ideals to work and workplaces, alterations are coming to contemporary work life and even more dynamic change is on tap for the future. One of the lessons now being learned may ultimately make a much larger difference than is typically realized. The very changes that workers demand make them more productive. In recent years, productivity has fallen, leaving experts to scratch their heads and wonder why this important factor in economic considerations and profit faltered. They are learning that workers themselves, if loosed with their innate faculties, make all the difference while, at the same time, becoming happier at work.

But "at work" doesn't always mean in a particular place of business or with a given set of co-workers and it certainly doesn't mean being employed in a regime of scripted outcomes and rigid conditions. The growth of freelancers in the workplace provides sufficient evidence to indicate a more flexible future. In a few years, the majority of "professional" workers will be freelancers, partly because of pressures within businesses to maintain very lean staffs, but partly also because workers often see more interesting, remunerative futures in a varied landscape that offers challenges and rewards apart from traditional expectations.

There are problems inherent in advancing this vision far into the future. Presently, for example, service jobs are typically conducted in precisely the rigid circumstances that professionals are fleeing and it is often thought that there is no way that those jobs will or can change much, given that they are necessarily restricted to locations frequented by customers seeking products and services limited by the very nature of what is desired by those customers. Amazon is one company that offers a glimpse at how some of that may change with the eventual elimination of cashier jobs, currently one of the most plentiful types of employment in service industries. But that is peering over the rim a little too far for our present consideration.

In the meantime, we need to apply the principle of value at work to existing hourly wage earners and salaried workers. There is plenty of work to be done to meet the potential that exists for improvement, and we must concentrate on these first steps before worrying about the distant future. All of this means that there is a place for labor organizations of many types to address specific issues such as wage theft as well as broad improvements in the workplace, including wages, best handled by labor unions. As we address these immediate considerations, we must keep freelance workers in mind because they are an increasing presence in the workplace and their very existence means that they must receive attention along with other workers. The rising but still unrealized potential of robotics and artificial intelligence poses issues for years ahead, but there is much to do now. The decisions we make and the directions we take now will influence the outcome for all of us in the future. Capitalism, for example, tends naturally toward monopoly and is attempting to gain strength exactly at a time when independent workers and truly empowered employees are also making themselves and their needs and aspirations known. How we handle this tension is critically important.

For established businesses, the question of transition away from the Industrial Age management mindset is a matter of life and death. All around us, companies that are not adapting are going away. Because of market value built during a period of anything-goes capitalism, many disappear through a sale into the folds of a larger company where they are dismembered, their intrinsic worth leached and their remnants, along with numerous employees, discarded. The thing that should make existing Industrial Age management mindset businesses quake is knowing that none of them will survive the growing surge of change unless they are able to adapt. And they cannot adapt unless they make fundamental changes.

The most important thing existing businesses need to do to survive is to scuttle management and allow naturally occurring leadership to replace it. Many formats and beta trials are underway and the exact route to a leadership future must be tailored to each separate organization. The process for some is smooth while others, typically holding onto cherished notions from the past, have a more difficult journey. Some fail but the risk is necessary because the alternative is certain.

In the meantime, until a thoughtful path is ascertained and begun, temporary measures intended to relieve the pressure built by changing expectations are necessary. A variety of labor organizations are helpful for this purpose, easing the way forward with targeted measures that lift workers and working conditions in specific circumstances while benefiting the organization itself. It's not easy revising attitudes of hierarchy, control and subjugation while building a sense that everyone is both a worker and a leader and that creation of value is the goal. But if initial steps are not taken, and the sooner the better, it will be impossible to rescue the organization intact later. Size matters little. Some very large organizations have already remade themselves while other behemoths are foundering; likewise with small companies. Being small doesn't necessarily mean being nimble, especially when entrenched founders and owners see themselves as bigshots and everyone else as peons.

The gradual rise, sudden recognition and escalation of freelancing come in part to the rescue and in part as a ticking time bomb. In part, too, freelancing is a symptom, not a solution, with potential consequences that increase as time passes without permanent resolution of employment quandaries that can only be resolved years into the future, when all elements are in place and ripe for long-term remedy. It is ironic that established businesses brought freelancing to the fore and now contribute to its spread, making use of its resources while fearing the competition it generates.

It is impossible to discuss freelancers apart from the "gig" economy, the economic milieu that accompanies freelancers as wet accompanies water, featuring both fear and desire in an unstable combination that afflicts its practitioners and their clients alike. Because profit is the mindless motive of Industrial Age management mindset businesses, they pursued cost-cutting as a direct and rapid route to immediate bottom line enhancement with no view of future growth, no strategy for new products and certainly no consideration for the lives propelling the jobs being trimmed as fat. Had they listened, businesses might have heard intriguing ideas from their terminated workers, along with solid means of achieving them, all for the increase of profit and the benefit of the organization.

These personnel excisions were followed at once by hiring those same workers on a temporary basis to fulfill ongoing projects and by creating the basis for competition from aggregations of experienced workers pursuing their mission under new guises. That competition also served the interest of existing and established competitors of the profit-conscious companies, who released talent upon the land and created new pools of problems for everyone besides, problems confronting individual workers and now the whole of society, including the businesses that created them in the first place. Freelancing, then, is a creature of existing and preexisting businesses in addition to the freelancers themselves and the clients and customers of all of them. It is increasingly woven into the fabric of organizations and their methods of operation to such an extent that it will be a significant factor for decades to come, that near future with which we concern ourselves here.

Unfortunately, many of those in charge of public policy fail to recognize the implications of freelancing at the present time; they can hardly be expected to understand its importance in the years just ahead, years for which we should be planning now. Nothing illustrates this better than the healthcare debate that has roiled public discourse for years. President Obama's Affordable Care Act, aka the ACA and Obamacare, recognized the necessity of providing portable health insurance in the wake of a system almost entirely dependent on employer provision. Wiser heads realized that employees, caught in an employment environment much different from the past and changing rapidly, needed the flexibility of healthcare that could follow them when they changed jobs or struck out on their own. Conservatives, ever bound by the past, continue to resist fiercely.

Public policy must change in other ways to protect the ever-increasing number of freelancers who do ever larger portions of the nation's work. Freelancers badly need a means of compelling payment for their work from employers who play the game of loophole law to stiff these workers. Freelancers also need a strong wage floor that will provide a means of surviving emergencies such as incapacity and illness, their own as well as their families'. Many freelancers, thanks to endless competition and bottomless wage offers combined with an absence of regulation, work ridiculous hours for subminimum wages. Freelancers need to be able to save for retirement, and the retirement system of the nation needs to respond to them as well as to other workers. Freelancers need protection in their temporary work assignments, often alongside employees in an organization's workplace. Freelancers need healthcare insurance, a provision best attained in a single-payer system. Notice that none of this provides a cushy lifestyle; these are merely the human life requirements typically available as employment benefits in the bygone era of industrial employment. To ignore these needs is to ignore many citizens as well as the organizations benefiting from the work they perform. How blind can we be?

Apparently, blindness is rampant among public officials who seem not to understand that the world and the way it works are changing. Many of these elected representatives are wealthy and insulated from the problems of ordinary citizens, whom they seem to regard as votes rather than people. Most workers scramble to earn a living, and the route to earning, formerly a consistent, often lifetime job with a single employer, is already virtually nonexistent. That doesn't mean that everyone will become a freelancer, but it is a certainty that almost everyone will be employed by several different organizations during their working lives and that the majority will have multiple entirely different careers during that time. Soon, nearly half of all "professional" workers in the United States will be freelancers, a fact that should be a wake-up call, not only to workers themselves, but to everyone, including those responsible for formulating public policy.

The implications of extensive freelancing for the economy itself are enormous, transfiguring it into something known by a large assortment of names including the sharing economy and the gig economy. "Sharing" is a misnomer when applied to what is happening because freelancers are left to fend for themselves, vying for work, in some cases fiercely, as animals might dispute access to food. This "gig" aspect is startling in a world that most of us have been raised to believe was orderly and sufficient. Nimbleness is required to navigate the now rocky shoals of a previously calm harbor. For many, there is no refuge at all, no rest whatsoever but anxiety and uncertainty aplenty.

Despite the fact that work will increasingly be handled on a disjointed basis, in which parts created by some are connected by separate oversight to parts created by others, the gig aspect will exist collaterally with traditional employment for a very long time, perhaps forever. But the overwhelming spread of gig production will color the entire employment picture and influence every thought within every worker and every action those workers take or fail to take. The rise of numerous options also means the rise of much uncertainty, fear and doubt. The tendency of management is to manipulate these factors to its advantage against all workers, both freelancers and traditional employees. This situation begs for relief, and workers look to government for guidance, but the sheer scope and variety of the circumstances involved make effective oversight almost impossible to expect. The expectation of ethics is pie in the proverbial sky, a beat so large that no police force can monitor it. The mixed nature of gig and traditional work offers employers, particularly large employers (here, large meaning dollar valuation, not necessarily number of employees), a limitless playground on which to bully. To some extent, technology beyond the control of employers serves as an offset to corporate power. Excellent tools, the latest, in fact, are often available to everyone, including small players, through a type of rental when they are inordinately expensive to purchase. This helps, but the largest recourse lies in the more distant future, when years of persistent resistance to management control are able to engender an interest in ethics while shutting down, one after another, the opportunities and temptations to resort to malfeasance. The most important tool available to all workers in bringing about an ethical atmosphere happens to be embedded in a fundamental aspect of the gig economy that lies outside the complete control of management.

Collaboration is the one unavoidable component that enables the hybrid economic system encompassing the gig economy, organizations and government to work. Collaboration is beyond the reach of any organization, including government. Because all organizations seek to do everything possible to reduce expenses, jobs have been eliminated in all quarters. The work previously accomplished by employees must now be done on a contract basis with other organizations or freelancers. Often, freelancers temporarily staff organizations, and it is not uncommon for freelancers temporarily embedded in one organization to interact with freelancers embedded in another organization, all acting on behalf of their temporary employers. In addition, there is extensive collaboration between employees of an organization and the freelancers working on a contract basis in that organization. And all of these people, freelancers and direct employees alike, collaborate with people who are not paid by any of the organizations involved in a project.

There is obviously no way that any management of any organization can control the extensive collaborative ties that buzz with continuous activity, a fact that is exceedingly uncomfortable to management, more for some than others, and one that stands to the potential fright of all top management. If any work is to be accomplished, collaboration is necessary under the constraints of profit requirements and expense control. Collaboration is unstoppable, not that management doesn't do its utmost to control it. Take, for example, the recent decisions of the Environmental Protection Agency (EPA). The new EPA administrator, averse to scientific findings contrary to his political ideology, removed parts of the agency website that served information to the public and to scientists and other professionals. Jobs were axed and directives restricting contact with individuals outside the EPA were issued. None of this could halt collaboration among any of these people where it was necessary to accomplish an objective. Nor could any individual be prohibited from publishing, studying and transmitting information on their own. What was previously known as "teamwork" when activities were almost exclusively conducted by direct employees of an organization is now collaborative in the sense that diverse individuals with separate perspectives fuse their efforts into a single outcome.

Communication is at the heart of collaboration and it will continue regardless of efforts to stymie its reach. In some situations, in fact, the more closely guarded communications are, the more focused is attention on uncovering them, circumstances that produce the purest leaks because the information revealed has experienced less alteration in the course of traffic. But the nature of communication is openness; that is, after all, what communication is about and it is the potential foil for management control. The twentieth century concerned process and production; the twenty-first century focuses on information and knowledge which are nothing if not communicated. Having earlier begun to minimize the elements of input and refine resulting products, today, we expand knowledge on all fronts and transmit it, not for edification, but for the further extension possible through collaboration, perhaps in the more distant future of our lives or in some quiet, thoughtful way that will, in turn, yield explosive possibilities.

In the near term, communication is that ingredient of modern work that permits everything else to happen. That is why now, and for the foreseeable future, communication skills are so important and why all organizations concentrate on their improvement. We tend these days to think of communication in terms of technical processes, and, to some extent, that is the case. But these are processes with content, and the processes themselves, at best, serve only to enable, to provide, to facilitate, to smooth. The technical processes, including the hardware involved, are important, with the software taking center stage by providing ease of communication through means that are intuitive, direct and impactful. Messaging applications, for example, gain much attention because of their ability to serve a network of interested individuals, with the capability of archiving and, when necessary, making information available to a wider audience than the workplace network. Refinements of these technical aspects can be expected to adapt to changes in hardware and to accompany workers wherever they are, functioning effectively.

However intriguing the technical aspects of communication may be, especially to those of us who remember the IBM Selectric typewriter as a major advance, all the bells and whistles of computer wizardry pale in comparison to content. The creation of communication requires skill and education regardless of subject matter or purpose. Here, we open a link between communication and the future that is integral to every facet of our lives and our work. While education encompasses many aspects, its necessity as one of the underpinnings of communication carries special weight. No present or future endeavor will be successful without effective communication, making every job and every pastime dependent on the type of education that stretches the mind quite apart from rote application of facts no matter how esoteric or complicated they may be. Creativity, in other words, is the key to communication, more than the knowledge being transmitted. Think of education as the hinge upon which the door of communication swings open or shut.

Despite the fact that communication as a concept is incalculably important, specific communications are not necessarily sacrosanct. Human beings have been communicating with each other for scores of centuries while we are only a couple of hundred years into the Industrial Age; but a primitive grunt warning of danger far exceeds the importance of most of the chatter and imagery being bounced around among us today. We must realize that the cumulative weight of modern communication carries importance in and of itself because it is partly upon extensiveness that commerce moves, a preponderance composed of aggregated bits and pieces that, apart from the rest, carry little significance, but which, together, push forward both ideas and products.

Regardless of what else it does, social media had to be invented to fill the communications needs of twenty-first century workers. Not only are they time-starved well beyond generations with the leisure to pen lengthy letters, their business communication is much more intense and their private communication often dwells on matters tied to their portion of the world of business. Social media will influence the future for a long time to come, touching everyone and every conceivable endeavor. What older people may not realize is that while workers are keeping intercourse alive on Facebook, they likely are also propelling the wheels of commerce, a quandary for anyone with an Industrial Age management mindset and a nose-to-the-grindstone mentality, not to mention blinders.

It is education that threads the needle of communication and pulls taut the effectiveness of everything else. But the more important education becomes in the lives of larger numbers of people, the more it is assailed by elites who want to restrict it, making available only such amounts as are necessary for the completion of specific tasks by their employees. They think they're being smart to withhold what they assume to be a ticket to prosperity for others, but they are really punching their own ticket on a journey to failure. With every passing day, education becomes more important in ways that diminish those who try to restrict it. From the outset of young lives, the application of restriction to education, whether through religious indoctrination, guidance by predetermined outcomes or other means, is cruel, with consequences for everyone, including those who perpetrate the restriction. At a time when education needs to be open to everyone, a callous effort is being made to make it available to only a few. When the determining factors are race and affluence, we're headed for serious trouble.

Besides the ill-advised results stemming from restriction, education at all levels is facing a remarkable crossroads and doesn't seem to be aware of it. We have already arrived at the point where education must be continuous, yet very few schools seem to be aware that they have a critically important role to play in the development of continuous education, or even of what it entails. The concepts of beginnings and endings, of initiation and graduation, are anathema to continuousness, but we seem bound as much by these past formalities as by the requirement that only students who can afford expensive regimens should imbibe the secret elixir. Successful people of all ages are now exploring life on the terms of curiosity, seeking education from every possible source, including formal university courses, but hardly restricted to them or to books. A successful worker now readily finds sources of information everywhere and devises means of assimilating them in ways that are most advantageous and personally applicable. In an age in which jobs are held only briefly before workers move on, and in which people more and more frequently enjoy multiple careers during their lifetimes, education must be continuous in order to satisfy interest and need. The more we restrict education, the harder we make life for everyone.

Education, both its continuous pursuit and the manifestation of its effectiveness, is what will distinguish workers as we continue deeper into the twenty-first century. The more educated a worker becomes (measured not in number of degrees, but in pursuit and effectiveness), the more that worker will expect to be able to determine their workplace and the terms of their workspaces based on their own needs and preferences. Workers will earn the right to make these determinations with little, if any, input from management. In recent years, a great deal of fuss has been made about working at home or elsewhere as opposed to a traditionally defined office setting. Some studies have shown that workers who congregate in regular offices are more productive because they have interaction, planned and casual, with other workers, and that the resulting communication is sufficiently valuable to require workers to attend specific office hours every day as workers have done for generations. These studies fly in the face of other findings that tout the value of allowing workers to work at home or elsewhere, with obvious advantages to workers but also to employers, who no longer need to maintain as much expensive office space. Marissa Mayer famously upset many workers when she moved from Google, where massive perks were intended to make workers want to spend endless hours on campus, to Yahoo, where working from home had become customary for many workers. Her dictum requiring workers to be on-site stirred considerable negative comment.

Another much criticized tendency in recent years is to design offices without walls, such that workers must find a spot in a vast open space in which to work, again forcing interaction that leads to collaboration. Criticism of this type of office space is justified, often by studies that show noise levels too high for concentration and that confirm the inherent need of workers for privacy. Distraction is also an issue. Many workers, attuned to the needs of their bodies, want specific spaces where facilities such as a stand-up or treadmill desk can be personalized. The list of pros and cons goes on and on.

What is missed in the debate is that truly educated workers, those with commitment who follow their curiosity continuously, will possess the discrimination necessary to know when to engage and when to retreat. They will know how to do it, too, both in the performance and in the preparation. They will demand and receive the right to work where and how they choose because they will refuse to be contained by any organization that does not recognize the value they engender and respect the terms upon which that value is rendered.

Those workers still struggling to attain sufficient understanding of education to reach the point of self-propulsion need room to experiment. The ideal workplace should feature both the means of privacy and the opportunity for casual contact. Selection of comfortable, customized furnishings is necessary to ensure that workers can move about as they please while working. Note that many like the option of standing or sitting and change these positions frequently. All of that should be encouraged, with no expectation of predetermined arrangements. As for whether to work in an office, at home or elsewhere, those expectations should be approached flexibly. It is in the employer's best interest to permit latitude in these determinations, with the anticipation of evolving results as workers grow and increase their education.

It should be noted here that the foregoing discussion primarily applies to office workers, especially those involved with creating. Mostly, these people are not low wage workers. Societal expectations for low wage workplaces are considerably different, and employers take advantage of that fact to pile on neglect and abuse. Many low wage workers are in service industries where workplace accommodations are minimal, and the excuse that service workers need to be on site to perform their services often forces all other considerations to conform. Yet today, much of the work done in service industries can be done literally from anywhere there is Wi-Fi. In addition, if low wage employers maintained HR departments that were the crack outfits they claim to be, arrangements would be possible that permit off-site work. It can be done if employers are willing to cede an element of control and if employees form labor organizations. Factory and food service work, as well as numerous other activities, are necessarily location bound, but the number of jobs in these fields is diminishing as automation increases. Medical employment, a growing field, is also restricted by location. In all of these exceptions, workers and organizations must exercise ingenuity to find ways to make these occupations amenable to their operators as well as productive and enticing to customers.

Workers and organizations alike are on the precipice of challenge. Each will either rise to meet the challenge or fall away. None are too important to fail, and none will survive on Industrial Age management mindset standards. If twentieth century thinking is your guide, you may as well retire now, get out of the way and let others have the field, because you will be unable to play the new game. The financial crisis of 2008 should have signaled that greater changes lay ahead, and it should have marked a time for change in the minds of everyone. Instead, the old mindset organizations are fighting tenaciously to save their antiquated forms for a future that will reject them outright. Politicians reacted with governmental measures designed, not to make serious changes, but to salvage what could be identified as possibly worthwhile from the old system. Now, that old, broken, sick system declares itself repaired and returns with a vengeance to wreak another round of damage against a growing force of enlightened workers and organizations that want a better outcome for themselves and everyone else.

As we move into the near future, workers and the organizations that employ them, whether businesses or non-profits, will gradually become indistinguishable in a way that highlights the paternalism characteristic of old-line businesses that liked to brag that they were "families." Note that families today can be most any assortment of associated individuals; as long as the voluntary will exists, they function fruitfully and cohesively, occasionally altering membership and changing directions, qualities that were unknown in the past except as symptomatic of what was considered to be dysfunction. What the future will hold for failed organizations will be a shock to those of the old mindset. They will abruptly learn that their most valuable assets were people, not products, and they will find that the dollar value of their products will be swept away as debris in the wake of exiting workers who disperse into other opportunities.

The nature of organizations in the future will be determined by the affiliations of the people who constitute those organizations. That is true today, except that the control embedded in management requires that everyone examining those organizations look top down. As we cease to yield to the will of management and look at organizations from the bottom up, we will see the people first, and we will see that they compose those organizations. Management will begin to vanish as managers, realizing this and seeing it for themselves, give way to the recognition of leaders. As this new vision develops more fully, it will be understood that affiliations among individuals change as they willingly link themselves in organizations, and those organizations will be described, not through organizational charts, but through symbiotic linkages impossible to portray in the format of management.

With a change in the way individuals in an organization relate to each other and to the organization itself, the tools that they use will be seen in a new light. Whereas technology and other tools, even the creative impetus itself, had been seen as a process for the purpose of production, these, including the spark initiating creativity, will be realized as complete within the cocoon of creativity residing with each individual and within the relationship of all. What emerges is not a product but value. There being no controlled process for a predetermined outcome, creativity will be the recipient of all tools, all willing and associated minds. If this sounds like so much woo-woo feel-good nonsense to minds conditioned to accept the blandishments of management, consider, instead, failure, because failure will follow upon refusal to adopt leadership as the rejection of management becomes necessary. Consider, too, that what is beginning to be the nature of work could never achieve realization within the framework of the Industrial Age management mindset.

Unfortunately, we are years away from attaining the improvements leadership can provide once the old mindset is swept away; the existing management structure will fight hard to preserve itself. That is why labor organizations are more needed today than ever. The fact that unions today, at less than eleven percent of the workforce, have fewer members and less influence speaks not to a reduction of their necessity and role but to failures and depredations that call for redress. As more workers engage in less physical exertion, the expectations that attached to labor unions have begun to apply less obviously. New workplace roles that keep workers in offices, for example, or working independently in any location, seem less suited for collective labor representation. Nothing could be further from the truth. Sweatshops today may be equipped with computers instead of looms, but they are nonetheless sweatshops, even if everyone operating those computers has an engineering degree from a university. Suffice it to say, all employees, including managers, are workers under the rule of a repressive management system and should be members of at least one labor organization actively representing their interests.

Another very regrettable tendency is that contemporary urban Americans, focused as many are on office work, fail to include the manufacturing sector in their economic appraisals, let alone low wage workers and agricultural workers. Not everything is about city life and offices. Often, we fail to realize that many factory workers are not unionized and that they face many of the same challenges as office workers. Wage theft, for example, is not exclusively a problem for low wage and service workers but afflicts all types of work and income levels. When workers of all types realize that they have much in common and exert more influence together (presumably the unifying principle behind unions in the first place), more can be accomplished for everyone. It is important to understand that managers, being workers themselves, should be included in unions in order to broaden benefits, participation and inclusiveness as organizations transition to a leadership format instead of management.

Unions themselves must also change. It is not necessary to think of every labor organization as a union, but specific circumstances can engender differing types of labor organizations tailored to particular needs. Just as importantly, labor unions need to do far more to adapt to contemporary working conditions and find ways to represent those who now fall outside the scope of consideration, workers such as office workers, managers, engineers and professionals of all types. The recent interest of graduate students in forming unions is a positive indication that this is happening but the pace is far too slow and the field of consideration far too narrow. Every worker in every organization should be in a labor organization and the goal should be to transition to a leadership model in every organization, setting aside the Industrial Age management mindset entirely.

Professional associations could help bring about the change needed to implement leadership in organizations. Where these associations currently exist with the mission of including workers as members, they tend to be effete, more social than business and thoroughly fearful of offending management in any manner. The excuse for them is often merely association itself along with occasional doses of professional education in the form of tips, self-promotion and luncheon speakers. Presently saturated with old thinking, they are unlikely to emerge in the future reoriented toward leadership. That is, they are unlikely to change until workers who compose their membership engender change.

Altogether, a tremendous amount of responsibility rests with workers to foment the change they need without waiting for existing organizations of any type to show the way. In some cases, management has seen the light and instigated the change themselves. It is unfortunate that workers have been lulled into timidity over the course of recent decades, too fearful of change itself, not to mention management, to lead effectively as a common body of workers. That is why it is necessary for individuals, small groups of workers, managers and management to awaken across the spectrum and assert themselves.

Two huge obstacles exist. First, of course, is the delusion entrenched in our culture, making it difficult to break out of a mindset so firmly established that it is usually accepted without question and around which all activities swirl, careful to avoid crashing into it. At best, we arrive in this world as if dropped in the middle of an enormous viscous pool where we thrash about, making slow progress toward the shore only with tremendous effort. Not only is progress difficult in these circumstances, but so is reaching others to join a common struggle. Yet the more individuals who perceive the nature of what is happening, the greater the response, both quantitatively and qualitatively.

Second, the next generation of management is already being educated in the Industrial Age management mindset and established in place to succeed a mostly mindless, already atrophied blob that is entrenched in ignorance against all forces of change. The lure of management is too strong for most of the next generation to resist. Make no mistake that they are dragging all the baggage of previous generations with them as they prepare to settle into place to control their domains and channel their goods and services into their own coffers.

A few glimmers of hope break through. Increasingly, younger entrepreneurs are aware of better possibilities and are moving toward them. Some older, established business figures also see that the past isn't working for the future any longer, and they have the wisdom, the foresight, the determination and the ability to steer toward a better future than the past predicted. In this sense, some of the Silicon Valley entrepreneurs are an improvement, taking their consciousness of disruption with them into their organizations and applying it to the workplace. But do not make the mistake of believing wholeheartedly in these latter-day capitalists; they continue to trend toward accumulation, command and control; they tend to make demands on workers as rapacious as those of earlier tycoons and could ultimately find a legacy only marginally better and quite possibly worse. Some apparently have in mind creating a legacy featuring all the bells and whistles associated with their software, in the belief that wide use of their products indicates personal approval of a digitally enhanced version of themselves. Trickle this false premise down to front line "managers" and you have a replication of the current system. Only those up-and-comers who have the sense and presence of mind associated with value and with the lives of their collaborators can cut through this stinky fog of greed-tinged control. A few are doing it, but they need the support of workers who broadly understand, not the herds of employees willing to accept every proffer of hay and hey as decent remuneration and recognition.

The faithful may object at this point that, left to its own natural, organic development, management will eventually become synonymous with leadership, just as a few entrepreneurs indicate. The problem with reliance on this distant speck in the future is that there is not enough time left to get there without a shove from those who possess an understanding of what lies ahead as well as behind. The faithful also need to consider that management is never allowed to develop organically and naturally, because economic elites turn it into control for their own selfish purposes, and they have lots of help from managers attempting to curry favor. Unable to see further than next month's paycheck and unwilling to gamble on the future, most managers see short-term advantage in kissing ass. Economic elites see no reason to behave differently because, let's face it, if grubbing money is your only objective, it probably is better to grab what you can while you can, leaving chance to those who want to groom their minds and spirits. Only as organizations become more collaborative through the actions of workers will their cultures change enough to alter the future. Failure is the final option.

Assuming that a failing organization has sufficient assets to perpetuate an image of itself for a period of time into the future, it can project a sense of viability to those on the outside and to many on the inside who prefer to miss the signals of doom. Enron is a good example of this, feeding on public misperception and employee gullibility to the very end. An image of success as understood in Industrial Age management mindset terms can be achieved through a number of means, including hollow philanthropy and public engagement. Outside management consultants can also dab enough makeup on the pig to make it presentable for a while. One major league consultant discovered that the organization employing him was simultaneously employing another consultant, both tasked with different approaches to making the business appear solid until the CEO was able to parachute out.

Many times, however, it is enough for the failing organization to use in-house resources to maintain appearances. Many organizations have sufficient public relations talent on tap to churn favorable public perception. A cooperative financial staff can also be a key player in dubious plans made all the easier when those financial manipulators are able to pave their own street with gold. Sticking around to clean up business leavings can be profitable, especially when those engaged in the effort are aware of what they are doing, preparing their own futures, sometimes using debris from the past but always with compensated knowledge.

Typically, mere employees of failed organizations are the last to know. Keeping them blind but onboard is an assignment for the human resources department, an all-purpose utility tasked with many types of seemingly contradictory and manipulative schemes. As blatant coercion became less fashionable with the rising concern for appearances, some aspect of management was needed that specialized in deception while, at the same time, making itself the enforcer for the most unsavory as well as mundane undertakings of top management. There is no task too large or too small, too benign or too obnoxious, too upright or too criminal for HR to perform with aplomb. As the twentieth century progressed, HR became the go-to center for getting things done. Clothed in self-righteousness and a specialized education combining public relations and pseudoscience, and reporting directly to the highest level of management, HR became the engine of corporate mischief. Human resources offices are much like the financial arm of organizations except that one uses numbers to deceive and brutalize while the other employs words. Together, these two forces chronicle the build-up, crack-up and breakdown of organizations, all hidden behind a cloak of letters and digits.

There is no small irony in the whole human resources department focus, especially in large companies where value may not exist at all but there is cash flow sufficient to pay salaries for people who construct the appearance that something worthwhile, something worthy of investment, is taking place. All the while, it is merely their own jobs that people are trying to perpetuate, and they will do anything, absolutely anything, toward that end. The irony doubles back upon itself constantly. HR specialists are cheek by jowl with management, joined to it capillary, artery and vein. Knowing nothing more of leadership than anyone else, and understanding it less than other workers do, they nonetheless propose to "teach" it, to inculcate it and project it. This rendition of mumbo jumbo makes them quacks by definition, purveyors of both meaningless nostrums and lethal malpractice. Employees are often deceived by recognition programs that distract them and flatter them into compliance but that also serve to elevate HR in the eyes of employees and make them think that what HR does is the most sanctified of leadership responsibilities. Human resources workers have made themselves into the priesthood of leadership when, actually, nothing of the sort exists because leadership is for all equally and is equally available without need of an intercessor. All the while, these HR people are merely scrambling to save their own jobs, much more concerned with being employed themselves than with assisting other workers toward fruitful employment.

If evil genius deserves recognition—and it certainly should be exposed—HR should have an award for ingenuity, persistence and effectiveness. The problem with the particular productivity of HR is its perverseness. Consider, for example, the number of programs and quantity of required reading that HR vomits on workers. Some of it is distraction and some is intended to absorb the volcanic passive aggression generated by disaffected employees, almost like bait upon which workers can vent their frustration harmlessly away from targets more important to management.

No duty is more important for human resources to perform than wage suppression, and HR has exercised this responsibility spectacularly well, taking only a single generation to bring down the middle class. Those who would object that a mere facet of management cannot possibly be held responsible for toppling one of the most vibrant middle classes in the world need to examine the complicit nature of human resources with top management; it is more than just an appendage. It is embedded in the brain of business itself, and it is linked body and evil soul with lobbyists and financial manipulators. But it has gone much further.

When wages are so low that a worker cobbling together enough part-time jobs to work forty hours a week cannot afford nutritious food, safe shelter, effective healthcare, adequate transportation, acceptable childcare or sufficient education, in short, when wages are below living standards, the gravest injustice has been perpetrated against those workers, surely, but also against the entire society. The business model of low wage companies is to throw their workers onto public assistance programs for a significant amount of their needs, raking in profits while expecting taxpayers to pick up the tab for maintenance of their workforce. But public assistance programs, meanwhile, are under assault by lobbyists for those same businesses with the object of diminishing the amount of public help offered in order to cut taxes for those profiting from paying workers inadequate wages in the first place, a vicious circle if ever there was one. Citizens should be outraged. Only a few people profit from those businesses that abuse workers, first with low pay and second by pushing government changes that reduce public assistance programs. But it only takes a few profiteers when the share of profits they take constitutes the overwhelming portion. What's left of the middle class is standing by while low wage workers don't make enough to eat, get sick and can't get well because they can't afford food or healthcare. There are two classes of citizens, the middle and the top, watching the bottom class of low wage workers twist slowly to death in the wind of indifference. And when low wage workers are gone, who takes their place? It's formerly middle-class workers who've fallen from affluence under the same pressures that destroyed the lives of low wage workers. So, what is the middle class waiting for? Its members are the middle managers who have been sucked into doing the bidding of top management, the working connection to the very richest people in the world. They are the human resources managers who damage other workers while ironically awaiting their own fate.

This sounds grim, and it portends not merely a bleak future but a dystopia of dark age gloom despite LED lights monitored by computers and serviced by robots. As we will see in the last part of this book, there is ample reason to believe it could be even worse. How could it be better when the domination of wealth outweighs all constraint? In previous ages, national, regional and tribal identity served to limit individual wealth through the imposition of land boundaries that one rich warrior dared not cross without provoking another rich warrior. Intermarriage tried to bridge feudal borders, sometimes with perilous consequences. Neoliberal aggregations of wealth, however, succeeded where blood lust failed and produced a class of super rich who float above all conventional demarcations with power exceeding any mere state no matter how militarily pretentious. Having engineered their rise to power, the mega wealthy must resort to the only familiar device available to maintain it: management.

Therein lies vulnerability, because management is best suited to the Industrial Age management mindset of the twentieth century; it strains to cover the emergent demands of a workforce familiar with disruption and unafraid of trying something new. Leadership is the antidote to management. It is capable of widespread human fulfillment in the quest to earn our rightful bread, whereas management merely controls on behalf of the shameful self-interest of a greedy few. With change roiling organizations and creating new expectations among mobile workers ripe to establish a workplace foundation responsive to modern needs, the old formulation of management cannot long endure.

The immediate future is unfolding now. It is tentative. It is leadership just beginning to bud, rarely but occasionally opening into a bloom of promise ahead. What is foretold is the story already revealed in previous pages, the story of workers, each a leader in some way, collaborating to create value that will call livelihood unto itself and provide for its own future, morphing along the way without fear of deviating from established patterns that once stultified the mind and restrained its gratification. All this without homage to an ism.

But the days ahead are fraught with danger for our budding future. Management, having acquired might in the past, will deploy coercion against changes to its future, not realizing that resistance is death and that change is renewal. Managers are so accustomed to conflict and the fight that their obnoxiousness and will to force must be expended absent the reformulation of effort that only leadership can provide. Management will thrash and slash, and some buds will fall before blooms overtake the whole with certainty of regeneration.

We see it now in the public sphere. Having grown out of all just proportion and having wrapped itself around the public space, invaded it as well, and gorged itself on content meant for all to share, management has come to think itself the boss of everyone, with rights beyond the moral code of man. We must not overlook the fact that management is but an artificial tool, and that which wields the tool is ownership. It is an old lesson from earlier days: power over self, once sold to another, relinquished leadership to be subsumed in the greed of that other, and eventuated in management, a tool whereby aggregations could be organized for optimal control, for the maintenance and extension of yet more greed. Thus implanted in public consciousness, ownership through management now overreaches to stake claim to everything, as if its ism were writ large in law and custom when, in fact, it appears not at all except as a superimposition that can be thrown off by the recognition of leadership. But ownership and management do not see that and will refuse to acknowledge its factual basis. To better defend their position, they have sought to incorporate even governments into their cause in a neoliberal spasm irrespective of party affiliation and, at last, thrown back upon desperation, they push these governments they now control, as with all of management, into fascist expressions, the better to make the public the same as the private in defense of the totality of greed. The struggle as it appears in public, if it be political at all, is not the politics we thought we knew, but an existential battle between the greed of ownership and the expression of value, with combatants designated as management and leadership. One is artificial and benefits but few; the other is organic and available to all.

Skirmishes at the micro level can be bitter, disappointments intense, reversals not unfamiliar. Indeed, this is already so and will continue through the near future and beyond. The likely fate of every worker now alive will be to witness a workplace landscape pocked by smoldering discontent and flares of self-consuming destruction amid the slowly extending lushness of verdant growth. Toward the future, small organizations will have the advantage and will remain organizations even as the transition to leadership takes place, more smoothly when workers realize commonality, yield to each other and cooperate in variable ways without conceiving an inclination to fight.

Where hierarchy now counts the most is where combat will be fiercest. Size is problematic. Although some large businesses have demonstrated an ability to plan and execute careful reconfiguration, it remains for the more recently risen dinosaurs to successfully navigate the jungle path without falling into the proverbial tar pits they currently disdain with mocking contempt for both workers and history. There is hope, especially where smaller units embedded in these new behemoths can wiggle free of the monster at least to the extent of nesting separately within it and establishing a precedent and an inspiration that can be adapted by other observant segments, enough to change the whole. Colossal atrophy and certain death may be the only alternative. Where magnitude alone stands defiant, its pylons will eventually be pyres.

This process will take years, more years than is seemly among intelligent people. During this time, we are left with two primary considerations. First, as always, there is the possibility of some unforeseen event intervening with long-term consequences, a potentiality that will be addressed in the next part of the book. The second central consideration for this lengthy period of transition concerns what will happen in the lives of people, not merely organizations.

It should be clear that both joy and sorrow, victory and frustration, are expected to course through the lives of workers as they strive in coming years to construct workplaces and circumstances favorable to leadership. Sometimes the most obvious things seem trite, masking what is truly profound. That leadership is infinitely superior to management and will ultimately triumph does not make its attainment any easier. For many, the way forward will be a slow slog characterized by more of the imposition of ignorance famous in organizations of yesteryear and today. But obtuse businesses cannot avoid the fact that enlightened concerns are popping up more frequently on the landscape with unavoidable interaction. They cannot be ignored, and their influence will penetrate even the most resistant armor. Individuals caught in this tug-of-war between ideals and reaction will need to seek accommodation to survive, retrenching periodically before reviving the struggle. Workers' organizations can help tremendously, but only if they adapt to changes more adroitly than they have up to now.

The swirl of conflicted circumstances and perspectives forces a greater burden on individuals, an ironic twist given that workers are increasingly fleeing direct employment for freelancing or are being forced into it as organizations shed payroll. From their new outside-looking-in position, they have the perspective to comprehend what they had too little time to observe while employed directly. (It's amazing how much time is wasted in organizations because workers are busy defending themselves, looking over their shoulders, guarding their integrity and pacifying bosses who exhibit the same unsure and frightened behavior.) With the gift of time and perspective, workers can see the benefits of affiliation, cooperation, collaboration and participation. They might wonder, too, why labor organizations cannot see the same benefits and why directly employed workers cannot understand their value. It will come eventually, and it will bring freedom and strength with it as leadership is established in place of management.

# Part Four: Future

To be successful in establishing leadership we must be mindful of the extended future. Issues that are certain in the present and likely in the near future, with all their predictable patterns, combinations, modifications, miscalculations, corrections and ramifications, cannot alone steer us successfully into the farther reaches of the future. There is only so far that we can plausibly peer ahead before accuracy fails completely; still, as rational beings, we must plan for possibilities that have a basis in fact, and we must speculate with the educated degree of guesswork available to reasonable minds. To do otherwise would violate our nature, disregard our future well-being and ignore the possibility of securing enduring value. Certainty is no sure thing and failure is always an option, and although the future may prove more perilous and unpredictable than we prefer to believe, the slightly distant future, closer than in previous times, is close enough that we should attempt to map the known possibilities.

Untangling the knot of possibilities requires continuous effort. Each of the issues previously discussed and each of the categories about to be addressed will change and will impact each of the others as they change. As if these are not enough concepts to juggle simultaneously, add the fact that the pace of change is quickening and will accelerate in the future, and the fact that each of these issues and categories will impact the others more in the future than now. This means that the failure of any issue to be amenable to humanity, or at least benign, matters even more in the future than it does now. Failure, already ripe on the lips of naysayers, raises the specter of dystopia.

Because we are not talking about some vague notion of discomfort, as the squirmers would prefer, we must face the fact that dystopia is a thorough if not utterly complete malfunction of society. Such a condition necessarily involves public emergency and public policy that only government is equipped to address. If we avert our attention for even a moment, we will be lost, and the possibility of being overwhelmed remains even if we observe squarely. Briefly considering the topics that could lead to dystopia should be sufficient, public discourse having already been adequate to provide detail.

Overpopulation is one of the prime areas where humanity typically ignores imminent threat. If for no other reason than that warnings have been around for centuries without the manifestation of dire consequences, human beings seem generally to believe that something will be worked out to make population growth sustainable. Cynics point to the probability that some natural disaster could suddenly decimate the human population, thereby relieving the earth of the burden of caring for unwanted excesses of people. However heartless that assumption may be, it is a reasonable possibility. But what we often fail to realize is that an overpopulated planet could fall prey to a combination of overwhelming factors, each of which independently could be responsible for dystopian conditions. Population could easily be subjected to rampant disease caused by lack of sanitary conditions caused by lack of water caused by climate change, all of which result in wars as restive migrants seek relief in lands defended against an onslaught of unwelcome hordes. Given current conditions and tendencies, this might actually be more likely than a single-cause scenario.

The natural elimination of excess population tends to presuppose a pandemic that sweeps aside millions of people. The mental comfort of scientific advance is a crutch against facing up to the possibility, and the collective memory of humanity naturally seeks a way around confronting the fact that it happens periodically, including a mere hundred years ago, leaving more tombstones engraved with 1918 than any other combination of digits. Thinking that it could not happen again is folly. Reliance on global watchfulness and the cooperation of virtually all governments whose interests are served by collaboration is thought to be sufficient to prevent the spread of diseases noticed in the early stages of contagion. Swift response by alert organizations across the globe is wonderful, but the often-noted speed of international travel puts the most diligent barriers at risk of penetration. Complacency also ignores the fact that new strains of old diseases may not necessarily respond to existing treatment, and while new approaches are developed, a particularly virulent form of one of these diseases could run amok, claiming millions of lives.

Pandemics threaten much more than death by disease. Consider the resulting economic dislocation that reverberates well beyond dollar-denominated losses. Even if the capacity to produce goods and harvest crops is maintained, interruption of their distribution would put lives at risk for lack of nutrition and access to medication and life-saving products and services. Ultimately, loss of time at work impacts income, with a consequent reduction in the ability to acquire necessities even when they are available. In a complex society focused on the value of stock markets as one of the pillars of financial intercourse, a pandemic is capable of disruption well beyond the magnitude of anything that has previously occurred. The possibility also exists for civil unrest if people are unable to access food, water and electricity, a situation that modern governments typically do not face and that industrial societies vainly want to believe is impossible for them. It remains a fact that desperate populations can wreak havoc no matter how many guns a government points at them, a prospect for which there is little consideration and less appetite for belief but which everyone ignores at their peril.

Vicissitudes of climate change could combine with a pandemic to multiply the impact of a catastrophe. Enough is being written about the potential horrors of climate change that anything I add would be like peanut hulls in a very strong wind, but it must be listed as a prime suspect among contributors to dystopian developments, and it must be considered subject to help through leadership if we would but listen. Climate change must also be obvious as one of the combining elements gluing problems together to make larger problems because, as we have discussed, issues act upon each other and in combination. If left unmodified, climate change, as a stand-alone issue, would be sufficient to cause horrendous problems across the globe. Changes in rainfall patterns, for example, are expected to cause floods in some locations and droughts in others, both potentially devastating for populations forced to contend with them. But that is only the beginning of the problem as populations react to circumstances. As people flee excessive heat and drought, for example, armed conflict could result from migration and could, in turn, spark larger wars. The fact that war has been one of the principal scourges of humanity from the very beginning is not an isolated subject. War is almost inevitably the result of issues that twist themselves inexorably through the conflict, rendering piecemeal extraction impossible. It is so much more the case with climate change, but in this muck of cause and effect are reasons for clarity, perchance yielding positive results.

War is a prime example of how emergencies and public policy could impact the future of work and leadership because the role of government in the conduct of war is obvious. And while pandemics, climate change and population patterns share elements of unpredictability with war, they also reflect areas of broad concern for humanity that are subject to modification through public policy. As such, all of these concerns demonstrate stark vulnerability to the whims of authoritarianism and autocratic rule. Democratic guidance of public policy is by no means assured, but recognition of that fact is the beginning of its protection and cultivation. Democracy is the soil in which leadership can flourish in terms of public policy, there being no room for anything but management control when autocratic forces govern the public sector. Pessimism is healthy if it helps preserve democracy, husband its strength and extend its benefits. Intelligence is always skeptical and exercises itself through the trial and error of public discourse and policy. Unquestioning acceptance yields the blind faith that authoritarianism requires to nurture control, which is why resistance must be perpetual and skepticism must be persistent even during seemingly quiescent periods of benign tolerance, that very span of time in which weeds take root that will eventually threaten to choke the intended crop.

Management, meanwhile, is betting on itself, the weeds, to win the turf war. Doing a little looking ahead on its own, management whispers that all these jobs of such concern now are a recent phenomenon of history, dating only from the nineteenth century onward, and will disappear as the current concept of employment becomes obsolete. Managers claim that the one-to-one worker-with-work scenario is the natural state that will return in the future, a concept akin to freelancing before it was known as such. But while contemporary workers realize that what is needed is a new way for independent workers to deal with employers, management visualizes triumph over workers, subduing them rather than collaborating with them. In this perspective, management sees itself fulfilled, becoming the very control it now seeks to exercise, so that, in this victory, management is the fact of control, not merely the act of controlling.

Leaders see the situation differently. Embedded in the interconnectedness of all issues, leaders find the potential for solutions, a critical distinction of perspective that makes all the positive difference for everyone, not only workers. It is a difference sorely needed throughout the world as we face the future together. In reviewing all of the issues portrayed here, and countless others not mentioned, it is clear that at every juncture, while complete failure is possible, public policy can be applied in such a way as to adjust at least some of the contentious aspects, leading to an improved outcome. This observation means that the development of public policy is one of the areas where leadership instead of management is absolutely essential for the well-being, even survival, of humanity. The only scenario where this would not apply would be all-out nuclear war with its inherent termination of human life on the planet. But even in that most dire of all circumstances, there is always the possibility of adjustment all the way up to the point of fateful decision, and leadership is the only reliable means of enabling solutions in the realm of crucial public policy.

Three categories remain that demonstrate the fundamentally important role of public policy as we advance into the future. All of these topics deal both with the private and public sectors, demonstrating how essential good public policy is in setting parameters for private conduct in a public world. Determining a course of action in these three categories of public-private endeavor is fraught with uncertainty. What we know now is limited, and in some cases, we don't even know how much we don't know. Take automation and artificial intelligence, for example.

Everyone is aware of the running dispute among experts and ordinary observers about the potential impact of automation and artificial intelligence, with some arguing that it will result in disaster for humanity as machines displace people in a ruthless march to world domination. That may or may not be just a few decades away, but the future we are examining stops before that stage is reached, if it ever is. The ruthless domination that concerns us in this book deals with the impact of robotics on near-term employment and the parameters of change that artificial intelligence will bring to the workplace over the course of the next ten years or so, when public policy will be established for dealing with the consequences, whatever they may be.

Experts who believe that no jobs will ultimately be lost to robots point out that fears of machines displacing workers in the past have been unfounded because scientific advances bring more employment in the long run. This time will be no different, they say. But it will be different because, while robots as they are currently known have limited capabilities that mesh with human needs for physical assistance, the application of artificial intelligence will open vast new possibilities for machines. It is true that, in the past, fairly simple, predictable machines displacing workers in repetitive activities merely meant that displaced workers needed to find different avenues of endeavor, with plenty of them available. The future will be different. As intelligence is applied to machines, they can perform tasks that now require human beings, including programming themselves, and in a few years they will be able to assume decision-making authority over their duties, gradually operating in an increasing number of realms to the point that...well, that's too far into the future for us in this book.

The point is that eventually humans will not need to perform many tasks that occupy them today, and societies, through their governments, must soon confront the need to compose public policies that address this eventuality. Just as with climate change, it may be uncomfortable to think about, but the sooner we begin to plan, the better the future will be for all humanity. As with all else in a world deluded by aggregate wealth, we must simultaneously face the fact that everyone has a valid place in the world and must be accommodated as human beings, not cast aside as obsolete machinery, a habit acquired under the Industrial Age management mindset.

As workers moved from independence into jobs and workplaces, they were treated as expendable, an employer attitude that failed to adjust to changing times. At the beginning of this benighted period, many things were simpler. Education, for example, was thought adequate if a few fundamentals were acquired. Even life and death were comparatively simple. It was a matter of being alive or dead, with the concept of health, or ill health, a netherworld of quasi-medical relief and hope against hope gradually filling the space between, sometimes leading to recovery, often merely to delayed death. This time and treatment became more expensive as science blundered ahead, holding the hand of the sick as they traversed an unknown and increasingly costly future, one that has become, mostly, recognized as a human right, especially in industrialized countries fortunate enough to have modern facilities. During these same early days, job injuries suffered the same neglect and followed the same costly path as sickness until, as time progressed, practical and sometimes legally mandated assistance became available. Workers' compensation was insufficient as the attitude of management hardened with the rise of medical expense. Owners and managers have never, to this day, recognized human rights in the workplace to the extent that society now perceives them. Modern management has made it absolutely clear that it is content to shove disabled workers out the door, to be replaced from the shelf of the waiting unemployed. The conflicting perspectives of the public and management must be resolved in favor of humanity, but to date, this has not happened.

In place of resolution, we currently have low wage industries where undereducated workers languish without the means to acquire expanded skills because the educational system favors management and a niggardly, tax-stingy middle class. Low wage workers are also abandoned to the vicissitudes of public assistance administered by political representatives hostile to workers unable to climb higher in a corporate structure erected against their interests at every turn. As a sop to the physiological requirements of the human organism, some earlier low wage industries such as textile manufacturing provided housing to workers whose whole families signed onto minuscule compensation from a single life-draining employer. Today's low wage workers often cobble together insufficient income from multiple pittance-providing sources, grasp whatever public assistance is available for the remainder, if any is, and then anguish through hunger, disease and homelessness. At every point in this sorry history, workers have been expected to function more like machines than human beings and were thus seen as expendable, replaceable. Little has changed over time in this regard, and the resulting attitude is now virtually fossilized. Today, we expect low wage workers, cashiers, for example, to function with machine-like precision; if they fail, there is apt to be some monster manager ready to pounce with revenge.

While we concentrate on the woeful fate of low wage workers, we often overlook what hardened attitudes toward them mean for the whole society. Many in the United States behave as if repressive capitalism were written into the Constitution. Proponents of neoliberalism and modern capitalism hold themselves in such glowing regard that they believe they are not merely rich, but smart, too. Actually, they're stupid. How is it possible to claim intelligence for an economic system that refuses some of its dependents a living wage? What is smart about pushing workers onto public assistance instead of requiring their employers to pay them enough to live in exchange for their labor? Managers and owners believe they are brainy because they run businesses that consistently make profits, but they fail to acknowledge that in many cases those profits are acquired partly at the expense of undercompensated workers. There is an old retail saying reminding managers that any fool can sell something if the price is low enough. In a different light, we should note that any fool can make a profit if he doesn't pay for the work required to create a product and deliver it.

Low wage industries represent the critically important issue that we have so far failed to address in a book chiefly concerned with workers who participate dynamically in the national economy. Most of the workplace issues addressed in this book are faced by workers who have options, often very limited ones, but who nonetheless possess the capability of making choices, for example, to pursue additional education to sharpen their marketable skills. When we speak of low wage industries, we are talking about workers who have few options, if any, workers whose energy is focused entirely on survival. How to accommodate low wage workers successfully in the economy is the dilemma that our society now faces and that must be solved if the future is to work for all of us, not merely those fortunate enough to navigate complex technological advances and relentlessly demanding cultural expectations. Low wage workers have been written out of the script of the Great American Movie. Through no fault of their own, having been deprived of opportunity, they have forfeited the American dream and are relegated to stagnant supporting roles that admit no mobility, the opposite of the principles upon which American ideals are founded. As opportunity narrows to a pinhole, vacuous bootstrap protestations of individual responsibility for success increasingly make their proponents appear ridiculously out of touch with reality. Too often, objections to workers and to provisions for opportunity are based on gender, race and ethnicity. The United States will be a failed experiment if it cannot rise above petty, superficial prejudice and open opportunity for everyone. Through that single cultural aperture will flow the destiny, success or failure of the future.
It is terribly ironic that race and ethnicity should be a prime consideration, even ostensibly, in the preservation of low wage jobs because white people constitute a huge portion of low wage workers. Most are ignorant of this fact; race is, in some ways, the window dressing that sells the proposition to whites eager to grasp anything that makes them presumably superior to others. Financial elites, meanwhile, make out like the bandits they are, caring for nothing except the superiority they can measure in dollars and cents. Low wage industries are thus perpetuated as a hoax upon the gullible and a pox upon all mankind.

The urgent question being blithely ignored is what to do. It is a question for everyone and it is a question both for the present and the future, a critically important question whose answer will play an outsized role in determining what our future will be. We first need to acknowledge the historical circumstances of low wage industries: they were created by happenstance as the Industrial Age management mindset spun out of control. The controllers, after all, rejected control of themselves by any entity, especially such a dirty, alien thing as government. Having accidentally birthed this bastard child, they decided to keep it as a servant for themselves, accepting its obeisance represented by a continuous flow of profit. They weren't about to change the arrangement and workers, mostly accepting their fate with resignation, acquiesced as if divine will prescribed their plight as in days of yore when some people were mighty rulers and the rest belonged to them: serfs, dignified in the latter day as wage workers. Low wage workers.

Besides the obvious, what must be emphasized here is the accidental and wholly unplanned nature of this economic development. The why of it is critically important at this juncture: since the problem was unplanned, its solution is to devise a plan that remedies the error. Carelessness on the part of capitalists and government can be easily overcome if sufficient will is mustered. Public policies designed to address deficiencies inherent in low wage industries can surmount the problems faced not only by low wage workers but by all others as well, the problem being larger than any single element of society. The longer we wait to compose and implement a solution, the worse will be our collective future. Specifics of what to do are easy to recognize, things like a living wage, obliteration of racial and other prejudices, provision for health care, child care, transportation, communication and education. And there is a bonus for beginning the implementation of these changes now.

Urgency is important for reasons subtle and bold. The more time elapses, the faster time runs, with accumulating problems pushing us ever more rapidly toward a future that will not get out of the way. The weight of our problems means not only that momentum increases as their complexity and seriousness increase, but that, left unresolved, their impact on our lives will be potentially disastrous and could even result in the dystopia described earlier. It cannot be overemphasized that the problems of low wage workers will become problems for all workers in the future if they are not resolved. The existence of low wage workers and the system that perpetuates them are a problem for everyone already, despite the fact that comparatively few recognize it. Eventually, the walls of dystopia will fall upon those who seek them as protection from the vast majority. We have a window, albeit a small one, in which to make necessary changes to avoid catastrophe.

The broadly outlined solution of a living wage and sufficient health care, child care, transportation, communication and education is only a first step: remedies that can assist workers immediately upon implementation but that are not an ultimate or complete solution. They are, however, extremely important. In a society with a long history of repression and an unwillingness to undertake bold measures on behalf of economically distressed citizens, baby steps forward are better than none at all and will eventually lead to a permanent solution.

Ultimately, it is absolutely necessary to institute universal basic income at the level of a living wage. As artificial intelligence becomes widely functional, lack of remunerative employment will require this historic solution as the attributes of one milestone of humanity solve the inadequacies of another. In the meantime, we need to transition incrementally from the temporary steps listed above into progressively increased levels of basic income before attaining the final, complete version. It is reasonable to believe, based on current evidence available in the workplace and from scientific advances, that jobs will become fewer in number and require fewer hours to execute, leaving workers with substantially diminished means of earning income. It does not take a genius to understand that such circumstances, left unmodified, are untenable for everyone, including the wealthy.

The fact that entrenched elites are not merely reluctant to acknowledge their dependence upon workers but deny workers' validity and vigorously resist them with all available resources indicates an intransigence that can only be overcome by methodically demonstrating superior results through leadership. It will be a tough but imperative slog to emerge from a centuries-old morass of misunderstanding that misguided workers while empowering their oppressors. The only antidote sufficiently powerful for the job is leadership.

Having experienced management as control inimical to their interests, workers are simultaneously discovering that all of them are leaders in some aspects of their endeavors. This realization is liberating, leading to deeper understanding and expanded expectations of effectiveness for the future. In some respects, it all started with the tools of resentment, anger and disgust inadvertently handed to them by owners and managers who demonstrated their own invalidity through ignobility. By contrast, workers are learning that a leader is someone who guides toward a forward, elevated goal rather than directing and demanding, driving followers into the clutches of selfish manipulators. And workers, subdued for generations by management insistence on keeping them ignorant, are realizing that effective leaders are impossible without knowledgeable followers. Workers are seeing that they follow, not through requirement and control, but by attraction to and acknowledgement of valid goals.

Leadership is established through recognition of validity. Its success will spell the benign demise of repressive capitalism.

The End

# About the Author

Michael Driver shares his perspective of over four decades of leadership and management. While most of that time was spent in the retail industry, Driver also acquired political experience and a wide-ranging communications background. In addition, he contributed short stories and satire to numerous publications. His essays and plays are available on Medium, where he also maintains a Forward Communication Line blog. A previous book, *Own Your Employment: The Challenge for Twenty-First Century Workers*, is available on Amazon. Driver is originally from Atlanta and now lives in Montgomery, Alabama. For more information, visit www.ForwardCommunicationLine.wordpress.com. Follow him on Twitter @mdMichaelDriver or email: MichaelDriver@mac.com.

