Wired America
The Social, Economic, and Legal Parallels Between the Electrical Revolution and the Information Age

By Lassor Feasley
Adviser Maura Farrelly
Fall Semester 2014
Brandeis University

Abstract


It is difficult to pin down exactly when a social or technological revolution begins. Some historians assume that the Industrial Revolution took place entirely within the 1800s, while others contend it started a hundred years earlier and did not end until the 1930s. I predict that future historians will have the same trouble defining the Information Age that we are now living through. Did it start in World War Two, with the first sophisticated applications of digital technology, or in the early 1990s, when the Internet first became popular? Obviously, this classification is a largely subjective matter for debate. In both ages, however, there was a defining moment when innovators invented products that would finally bring the full promise of new technologies into the lives of everyday Americans. The final realization of the Industrial Revolution, in that context, I would argue, was the introduction of electric light and appliances to the home. Similarly, the introduction of digital connectivity to the home through computers and the Internet represents the culmination of a technological revolution that has been brewing for the past half century.

These two popular revolutions have striking similarities. David Nye observed that soon after the adoption of electric power, its users did “not merely use electricity. Rather, the self and the electrified world intertwined.” This concept should be familiar to most Americans today, who spend hours each day producing, consuming, and communicating through cyberspace.

This paper does not compare the aftermath of these technologies. To do so would be premature. Instead, it compares the construction and adoption of electrical infrastructure from 1890 to 1940 to that of networked computer technology from 1950 to today. This paper is not a comprehensive history of the technologies, but instead a series of observations on social, economic, and legal similarities.

The fundamental argument this paper makes is that the Information Revolution is not unprecedented in American history. We forget that just a few short generations ago, a similar transformation took place. The adoption of digital technology in the home is in many ways analogous to the adoption of domestic electrical technology in the twentieth century.

Until the invention of the lightbulb and the electrical network, the effect of the previous century’s technological progress on the daily lives of most Americans was relatively superficial. The same can be said of the personal computer and the Internet. As this paper will argue, we can use our own experience with the Information Age to contextualize the Electrical Revolution. More importantly, the legal, social, and economic upheaval that came with electrical networks can help inform how we approach analogous issues of the Information Age.

The American lives in a land of wonders, everything around him is in constant movement, and every movement seems an advance. Consequently, in his mind the idea of newness is intimately bound up with that of improvement. Nowhere does he see any limit placed by nature on human endeavor; in his eyes something that does not exist is just something that has not been tried.

ALEXIS DE TOCQUEVILLE

Rhetoric v. Reality

Introduction

Both the Electrical Revolution and the Information Age heralded progressive social and economic changes. While change is always inevitable with such massive breakthroughs, many observers and pundits attempted to guide society in its embrace of new technologies. By examining the rhetoric that became common around electrification in the early 20th century, we can better understand the language that frames our perception of the Information Age even today.

Large corporations have been key in directing the narrative of technological change in both the Electrical Revolution and the Information Age. Sometimes this is for competitive reasons, as described in Part One, in which companies use World’s Fairs as venues to publicly best their corporate adversaries. However, just as often large conglomerates have used their considerable marketing power in order to attain political ends, as described in Part Three.

The narrative of technology is influenced by the ideology of the times. For example, Part Two discusses how both the Information and Electrical Revolutions resulted in dramatic changes to the organization of American households and families, and the efforts of various political movements to control the trajectory of that change. Finally, Part Four observes the “Neo-Jeffersonian” tendency that first came into vogue with the Electrical Revolution, when politicians espoused electrification’s potential to restore the primacy of rural life, and again with the Information Revolution, when many pundits predicted that network technology would enable a back-to-the-land movement.

One common theme that emerged through the research for this chapter was how thoroughly our perceptions of the future are shaded by our experiences of the past; new technologies often seem to embed traditional values deeper within society rather than drive social progress. This is most apparent in Part Two, in which fantasy visions of “The Home of the Future” thoroughly embrace the most traditional family values, gender roles, and lifestyles instead of driving social innovation. This irony shines again in Part Four, in which rural communities hope the technology of the future will resurrect a Jeffersonian ideal of the past.

In retrospect, some of the rhetoric will seem absurd, some idealistic, and some blatantly political and opportunistic. This chapter will explore how public perceptions of a coming era were manufactured to meet the needs of different communities, and how the claims of the 20th century would mirror those of the 19th.

World's Fair

Part One

Both the old trusts of the electrical industry of the late 19th century and the ascendant companies driving the Information Age today are notable for the disciplined and comprehensive marketing strategies they employed. Often, the purest form of a corporate vision would manifest itself in large public gatherings like the World’s Fairs, which were popular in America from the birth of the Electrical Revolution until the first tremors of the Information Age. These events offered companies a place to demonstrate their latest technologies and, more importantly, a platform from which to proclaim their visions for the future.

Throughout the late 19th and much of the 20th century, the most prestigious displays of electric light happened in demonstrations presented at the World’s Fairs. Though often patronized by corporate sponsors and deeply subsidized by government, these were not simply demonstrations of national prestige or commercial and cultural power. As historian Jill Jonnes put it, “The stakes were not immediately about money, but about the unparalleled opportunity to display to an unsuspecting world for six months the true glories and possibilities of electricity”. At a time when the harsh Dickensian reality of the industrial revolution was entering social consciousness, governments were eager to invest in ambitious, if unprofitable, displays of electrical technology. As historian David Nye explains:

They helped to impose a middle-class progressive order on the world, and they helped to give the visitor an explanatory blueprint of social existence… providing a vision of order during a convulsive period characterized by political corruption, violent strikes, and rapid industrialization.

The optimistic spirit of the World’s Fairs made them popular arenas for large corporations to manage public relations and pitch new products. In order to help foot the bill, fairs would allow corporate sponsors to open their own dedicated pavilions. At the 1893 Columbian Exposition, General Electric and Westinghouse engaged in an acrimonious bidding war for the right to illuminate the fair’s vast campus. Both companies, though short on cash, were willing to take a loss for the privilege. Upon his victory, industrialist George Westinghouse justified the loss on the competitive bid by the promise of good publicity: “There is not much money in the work at the figures I have made, but the advertisement will be a valuable one and I want it”. When Westinghouse won, GE retaliated, installing “an eighty-foot-tall, half-ton Edison incandescent light bulb, shimmering gorgeously through five thousand laboriously installed prisms”, in addition to “A tasteful arena of palms displaying 2,500 different kinds of Edison incandescent bulbs”.

By 1893, GE, Westinghouse, and several regional power companies had illuminated the main promenades of several towns and industrial complexes. However, electrification was by no means common, and some of the less cosmopolitan visitors to the fair would never yet have experienced electric light. Yet despite the nascent state of the industry, bankers had already begun to attempt to consolidate it into a handful of large trusts. General Electric, for example, had recently been maliciously pried from Thomas Edison’s control and incorporated into a larger trust led by a ruthless syndicate of banks. Despite its bad blood with Edison, GE was savvy enough to market “nostalgia for Edison’s historic lighting breakthrough”, a marketing coup successful in “establishing the primacy of all Edison inventions”.

By the 1920s, this type of marketing was typical and even overdone. According to public relations historian Roland Marchand, “the conventional wisdom was that the way to humanize a large and distant corporation was to conjure a popular, folksy image of its leaders.” In the 1890s, however, General Electric was a pioneer in the process of systematically personifying a corporation by leveraging the popularity of its management and founders.

In reality, by this point Edison had been largely sidelined by his contemporaries in the field of electric light. Having failed to comprehend the advantages of alternating current, Edison had reluctantly passed the baton to more disciplined thinkers like Nikola Tesla and his corporate patron George Westinghouse. Edison instead moved on to other fields like recorded sound and the moving image. Yet a casual observer in 1893 would likely fall for GE’s corporate subterfuge, taking little note of Westinghouse’s role in providing 30,000 AC lamps and attributing the brave new era instead to the genius of Edison’s ostensible collaboration with General Electric, as evidenced by its impressive pavilion.

Of course, the elaborate displays presented at these events were not created simply in acts of corporate brinksmanship between rivals GE and Westinghouse. In fact, the dazzling exhibitions had a far more substantive aim: to capture the hearts and minds of the American public in order to ensure the continuity of broad political support. This would prove a commodity as important to early conglomerates as the commercial benefits that World’s Fair sponsorship also promised. Unlike the pleasure palaces of prior World’s Fairs, many of the exhibitions took on a museum-like quality; dioramas of factories or scale models of industrial equipment were used to “teach” visitors about new technologies while demonstrating the host company’s electrical aptitude. This, it was hoped, would cement the public view of the corporation as an “Institution” operating without needless regulation. As Roland Marchand further observed:

Relying on the persistence of republican values in American society, the industrial exhibitors of the turn of the century still sought to give visual pleasure to displays that would enable the layperson to enjoy a sense of technical competence… they would perceive the wisdom of defending business autonomy from deliberate interference.

In reality, the business practices of the corporate giants were far from unimpeachable, regardless of their engineering skill. In 1893, for example, many of GE’s and Westinghouse’s clients were small towns aiming to boost their status by illuminating their Main Streets in the image of New York’s Great White Way. Little notice was paid to the fact that hardly any public money went into Broadway; in 1890 as now, almost all the illumination in the Theatre District was provided by advertisers. “Every Business developed its own displays and advertising, creating lively but discordant streets… As early as 1903 Chicago, New York, and Boston had five times as many electric lights per inhabitant as Paris, London, or Berlin”, largely as a result of corporate America’s enthusiastic embrace of electrified billboards.

To some, the copious displays had become a nuisance; “As early as the 1890s some New York churches felt compelled to install electric advertising because they were being ignored”, according to David Nye’s Electrifying America. If local officials knew of such excesses, they were undeterred. Most new municipal systems had little practical value beyond projecting a vain and often misleading sheen of prosperity on small towns that were increasingly subjugated by growing urban metropolises.

In addition, many municipal power systems were rife with patronage, no-show jobs, and political quid pro quo. An audit of the chronically underperforming system bringing light to Muncie in the late 19th century found that electrical utilities were particularly prone to mismanagement. “Hiring decisions at utilities were often based more on political patronage than engineering knowledge, and municipal utilities’ contracts were often negotiated with graft and kickbacks”. If a municipal contract with an electric light company brought any of the splendors of city life to small-town Main Streets, it also brought opportunities for waste, greed, and corruption.

As will be discussed in Chapter Two, the nefarious antics of early electrical conglomerates would be directly involved in the establishment of the Sherman Antitrust Act and the Securities and Exchange Commission.

Of course, seeing the great illuminated classical structures in Chicago that year, one might have taken the new innovations to represent a renaissance unadulterated by politics and blind corporate ambition. Yet the displays at the Columbian Exposition were not designed to present a holistic demonstration of the ascendancy of a new and exciting age of prosperity and innovation. They were designed to cement confidence and trust in the new corporate establishment in the eyes of the public and to encourage municipalities to invest, sometimes unwisely, in their products.

Seventy-one years later, in 1964, New York would hold its own World’s Fair. Like the 1893 fair, this one would be a mecca for speculation about future technological development, and a sharp-eyed visitor might have felt tremors of the then-nascent Information Age. For example, the de facto telecom monopoly, AT&T, was demonstrating its new teleconference device, the Picturephone. Slated for mass adoption by 1970, “Each Picturephone circuit needed the equivalent of a hundred voice lines, so the network would need plenty of new capacity”, which AT&T planned to meet with its experimental waveguide technology. So certain was the telecom that the Picturephone would catch on that it began adding the pound-sign key, meant to signal video calls, to its telephones, an archaic holdover to this day.

The breathless optimism of the 1964 World’s Fair echoed that of the Columbian Exposition, but the atmosphere was similarly misleading. At some level, the promise of its glowing facades was built on a foundation of corporate politics rather than a realistic assessment of the future. The waveguide, the missing link that could make the Picturephone practical, would never emerge from experimentation as a mainstream product, a fact that, despite billions in research spending, some of the savviest engineers already knew in 1964. The technology had become something of a Rube Goldberg device, evolving to incorporate thousands of focusing lenses, light amplifiers, and technical workarounds, rendering it expensive and impractical. Yet other, more promising technologies like radio, satellite, and glass optical fiber were sidelined as AT&T hyped waveguides. Despite these problems, according to optical technology trade journalist and historian Jeffery Hecht, “Thanks to regulations that assured the company a return on its investment, AT&T had ample money to spend” on the abortive waveguide system.

AT&T had attained government-protected monopoly status in the mid-1930s. Like electric utilities, early telecommunications companies were considered to be “natural monopolies”. Early in the history of telephones, customers would sometimes be forced to own two or more phones and pay for multiple services. This is because telephone lines owned by different companies were incompatible; an AT&T subscriber would be unable to call someone using a regional carrier. Thus, customers were naturally attracted to the largest phone company with the most subscribers. Rather than break up the Bell AT&T conglomerate once it attained its natural monopoly, legislators decided to allow it to exist, albeit with increased public oversight, in a corporate form not unlike the electric utilities formed in the 1890s.

AT&T, the holding company for Bell and other subsidiaries, knew that in order to justify its monopoly status, it would need to impress its government patrons. Even though interest in scientific innovation was rampant during the Cold War, far greater than the earlier fascination with celebrity industrial engineers like Thomas Edison, the telecom industry had failed to capture the popular imagination with its carefully rendered plans. By the time of the fair, the unchanging Bell Model 500 phone had been present in most American homes for over a decade, so the company could confidently map adoption of the Picturephone, which it predicted would reach mainstream status within several decades.

The telecom monopoly had come under intense public scrutiny several years earlier, when AT&T brought the manufacturer of the ‘Hush-A-Phone’ to court. The Hush-A-Phone was a plastic attachment that could snap onto a standard-issue AT&T podium telephone so that customers could speak in privacy (the less perceptive microphones of the time normally required telephone users to shout). The FCC’s decision to ban this device resulted in a federal appeals court case which overturned the policy of prohibiting phone attachments that did not interfere with the telecom’s own network. While the Hush-A-Phone case was perceived as a modest victory for deregulatory forces, the petty nature of the case had seemingly inspired a cynicism in the courts that was reflected in a subsequent series of deregulatory decisions. As one appellate judge on the case complained in response to an FCC argument:

To say that a telephone subscriber may produce the result in question by cupping his hand and speaking into it, but may not do so by using a device which leaves his hand free to write or do whatever else he wishes, is neither just nor reasonable.

The Picturephone was only mildly successful in distracting public attention from the crushing bureaucracy that had plagued AT&T’s service for years. Before the first system could be installed, “New York State regulators blocked the new service in Manhattan until the phone company improved its regular service”. Chapter Two notes how Thomas Edison aggressively developed the very first commercial electrical network in downtown Manhattan in order to win the confidence of key institutions, so AT&T’s failure to enter that market was a particularly hard blow.

Of course, the AT&T Picturephone never came to fruition as a commercial product. The company had planned for the Picturephone to be key to its future operations, so its ultimate failure was disastrous for both AT&T’s business projections and its internal morale. Several dozen of the devices were installed across the country as novelties in office parks, but they were rarely used. Because AT&T failed to find a high-bandwidth medium in the waveguide, use of Picturephones turned out to be prohibitively expensive. The AT&T Telephone Pavilion at the 1964 World’s Fair, it turned out, was to remain a fantasy. Nowhere in the exhibit, or in marketing materials used until the late sixties, was it mentioned that adoption of the devices depended on the development of high-bandwidth technology, a field that was largely stagnant.

Rather than a representation of a plausible and imminent future, the pavilion was a corporate ploy designed to distract the public from AT&T’s inadequacy as a monopoly and redirect attention from the spoils of its state-sanctioned success. This delusion was not just a facade presented to the public, but one which was taken seriously even by management. Planning 30 years into the future, “The company expected the Picturephone to spread steadily… reaching 100,000 sets in 1975 and a million in 1980… AT&T expected to be given access to the tremendous cable capacity needed to support the optical waveguides once Picturephones became commonplace, probably after 1990.” These projections were wildly overstated, and the company’s inability to meet them would contribute to its slow demise and the deregulation of the telecom industry.

Although they were often the focal points for public speculation, the displays technology companies created at World’s Fairs presented a skewed vision of reality. World’s Fairs, as historian Robert Rydell argued, have traditionally served as a proving ground for the status of establishment elites. Section Two will discuss how engineers sought to formalize their profession and form an occupational class, much as doctors and lawyers had, so their participation might confirm Rydell’s assessment. In General Electric’s case, the Columbian Exposition presented an opportunity to steal Westinghouse’s thunder, sell municipal power systems of dubious worth, and capitalize on nostalgic misperceptions of Edison’s affiliation with the company.

AT&T’s 1964 pavilion implied that the company was a torchbearer of innovation in the space race (an association with overtly patriotic undertones), and that it was on the cusp of a revolution unrivaled since the telephone’s invention. As a company publication exuberantly stated:

Just as the telephone has revolutionized human habits of communicating and made a major contribution to the quality of modern life, many of us at Bell Labs believe that PICTUREPHONE service, the service that lets people see as well as hear each other, offers potential benefits to mankind of the same magnitude.

Of course, the truth was that telephone service had stagnated in AT&T’s hands; once the courts agreed that the Hush-A-Phone was beyond FCC regulation, a host of new aftermarket products like switchboards and handsets emerged. These products used the AT&T network but were designed and manufactured by small entrepreneurial companies. Their quick ascendance demonstrated the power of competition in AT&T’s previously exclusive market. This would play into a larger narrative about the nature of government regulation in business, especially telecommunications. Just as fiber optic technology became marketable in the late 1970s, the FCC eliminated AT&T’s monopoly on long-distance calling, allowing a national high-speed data network to bloom within several years. As Jeffery Hecht explained:

The fiber-optics market took off in the mid-1980s. The deregulation of long-distance telephone service in America created a market for long-distance transmission. MCI, Sprint, and smaller carriers spread tendrils of fiber networks across the country, along railroad lines, gas pipelines, and other rights of way.

These networks would serve as the backbone of the early internet, and their deregulatory origins would inspire the political rhetoric and academic theory around the Telecommunications Act of 1996, to be discussed in Chapter Three. In this sense, part of the origins of the Internet might be traced back to the Hush-A-Phone decision of the late 1950s.

In creating impressive and costly pavilions, large corporations of the Electrical Revolution and the Information Age attempted to validate their self-assumed roles as quasi-public institutions. In order to operate without interference, these companies had to equate themselves with government agencies, both in scale and in competence. The Hush-A-Phone debacle called into question the validity of AT&T’s privileged status and created political strains that the company attempted to ease by investing in grand publicity stunts, and by pushing flamboyantly optimistic rhetoric, like that surrounding the abortive Picturephone.

The maintenance of a corporation’s political mandate would turn out to be nearly as important as its technical competence and products. As one corporate sponsor of the Columbian Exposition put it, the point of massive investment in breathtaking pavilions was “to educate the public regarding the company as an institution”, a source of public good rather than the soulless profit-seeking machine the public sometimes made such companies out to be.

The Home of the Future

Part Two

As previously noted, World’s Fair displays are, by their nature, highly politicized events. However, they are not the only venues in which corporations pitch their comprehensive visions for the future. Another commonality between the rhetoric of the Information Age and that of the Electrical Revolution is a public obsession with technology’s place in the home. Through both ages, corporations have constructed images of “the home of the future” both to market existing products and to generate hype for their brands. In some cases, these displays were necessary for the adoption of new technologies by domestic consumers.

From its earliest years, electricity was subject to pervasive misunderstandings that threatened to slow consumer acceptance. By 1885, “some thought that, like water, electricity obeyed the law of gravity and would flow down unaided but had to be pumped up”, according to David Nye. “Success, said one analyst, required many people to wire their houses, overcome fear of electrocution, discard those whale oil lanterns, and stay up after sundown”. Superstitions ran so high that President Benjamin Harrison refused to touch the White House’s newly installed light switches for fear of electrocution. Clearly, these obstacles were immense, and overcoming them would take great marketing ingenuity.

Resistance to adopting electricity was not just predicated on simple technical ignorance. Arc lamps of years past had occasionally made headlines when improperly installed wires gruesomely electrocuted hapless line workers. The first execution by electric chair, a device that was hailed as a humanitarian innovation over firing squads or hanging, failed so abysmally that the victim’s heart continued to beat even as his limbs were slowly carbonized by the high voltage. It was thus understandable that some consumers were made apprehensive by the notion of bringing this violent force into their homes.

Similarly, the general public could not be expected to understand the true implications of the Information Age for their everyday lives. Before home computers were economical, computing was an enigma to most, something that scientists and engineers might occasionally depend on but that was completely foreign to the average American. Even to those privileged enough to make the technology’s acquaintance, computing seemed mysterious and inscrutable. As Information Age historian John Markoff recalls, in the late seventies and early eighties:

Mainstream computing was an exercise in remoteness: You took your problem, captured it in a stack of cards, surrendered it to the priesthood guarding the glass-encased computing machine, and then came back the next day to get the answer on reams of computer-printout paper.

Electricity in the 1890s might have seemed similarly incomprehensible. By then, it was primarily used for municipal and industrial purposes, just as computing in 1990 was still primarily a corporate endeavor. While many large companies had private digital information-sharing networks, there were no major consumer applications for the web. In both cases, however, engineers felt a sense of manifest destiny: that their inventions would one day permeate even the most mundane aspects of everyday life.

This inevitably meant that the engineers hoped to bring their creations into the home. To create a sense of immediacy for these plans, large corporations would often create model homes, with every potential benefit of their product, real or imagined, on brilliant display. Predictably, the products on offer would be primarily directed at women, who were often responsible for home purchasing decisions.

By the late 1890s, electrical appliances were of perennial interest to women’s associations and suffrage clubs. Some began to apply the tenets of Frederick Winslow Taylor’s Scientific Management (discussed further in Section Two), seeking to liberate themselves through increased efficiency. Though Scientific Management was written for industrial managers, its principles were adopted in broader contexts to suit social and political needs. The book demonstrated how some factories could be made exponentially more efficient if every moment of a worker’s day was strictly regimented. But Taylorism was reappropriated as a cultural manifesto: the regimentation of social life could be a moral imperative.

The use of home appliances symbiotically fueled a popular notion that women could formalize homemaking into a legitimate profession, in part by adopting industrial principles like Scientific Management. One article in the Ladies’ Home Journal, a publication largely responsible for planting that ambition, described how baking a cake could be “Taylorized” through the purchase and careful arrangement of new appliances, reducing the steps it took from 281 to 45. These obsessively methodical homemaking guides were common in popular women’s magazines of the 1920s. Proponents of this “Domestic Science” envisioned a world where average housewives would require rigorous education in order to hone their craft. Adopting the new industrial discipline would require women, as David Nye recalled, “to study anatomy, chemistry, physiology, hygiene, art, and literature”, such that they could fully realize the opportunities that Domestic Science promised.

It is easy to mistake the early home economics movement for a form of self-empowerment for the repressed domestic housewife. The assumption comes intuitively: with all the time and exhaustion saved now that the washboard was obsolete and once time-consuming tasks were automated, wouldn’t average women inevitably achieve some degree of previously unknown independence? Sadly, this opportunity was ignored by the early home economics movement.

Rather than challenging the domestic role of women, the technologies of the home of the future promised only to further embed women in home life. This is because mechanization played into the growing school of “home economics” that cast “the home as a management site controlled by women who, through the use of applied science, would free themselves from drudgery and raise the quality of family life.” In other words, the ambition of the rhetoric behind the home of the future was never gender equality, but rather to make women more effective wives and mothers.

The fact that most early innovations focused on the kitchen was telling of women’s role in the electrification of America. One of the first displays of the electrified home took place at the Columbian Exposition. Although it would be decades before the average home was equipped to use them, one display featured “electric stoves and hot plates, saucepans, water heaters, washing and ironing machines, dishwashers, fans, and carpet sweepers.” The vision of a resurrected Victorian ideal, the poised domestic woman embracing modern technology, continued as a theme in marketing the future. In 1905, still decades before electric household appliances became common, one GE marketing demonstration showed a woman in a formal Victorian dress operating an early hot plate.

By 1922, when electrical appliances had finally become feasible for the millions of homes already wired for light, hundreds of demonstration homes were built by local utilities hoping to increase electricity billings by encouraging the sale of more household appliances. Usually the homes would operate as showplaces where potential customers could test new gadgets and contract installation. After several weeks, they would normally be dismantled and sold. These exhibits were generally successful, and they had exactly the effect that home economists had expected. The result was not to liberate women, but to further confine them in the home. As David Nye explained:

Long work hours in the home persisted as a result of rising expectations for middle class women, who were exhorted to prepare more varied meals, vacuum the house more often, maintain a larger wardrobe, do laundry more frequently, and spend more time with the children.

While the onerous and time-consuming work of washing linens by hand and pumping and carrying water was often eliminated, women were not thereby afforded the luxury of independence. Saddled with a host of new domestic responsibilities, women generally found that the adoption of home appliances was not rewarded with increased leisure time or the pursuit of non-wifely interests. Advertisements of the 1920s articulated this with acuity. One proclaimed, “A man’s castle is a woman’s factory”.

Through a modern lens, it is easy to castigate the portrayal of the electrified home of the future as a missed opportunity for women. But at the time, the home economics revolution was seen by progressives as a cause not far out of line with the feminist movement. The modern industrial woman was, to some extent, elevated to the role of a “manager”, using the home “factory” to “produce” healthy citizens. Thomas Edison, in an interview with Good Housekeeping magazine, would cast the new role of the modern woman as “rather a domestic engineer than a domestic laborer.” By equating the domestic work of women with the industrial work of men, the home economics movement sought to frame women as active participants in and beneficiaries of the industrial revolution and, as Part Three will note, to elevate women to a higher degree of citizenship.

Manifestations of the home of the future have continued to be a rich source of public fascination and marketing ingenuity. As early as 1966, the concept had been appropriated to market the coming computer-aided Information Revolution. That year, a Westinghouse engineer installed an 800-pound computer in his basement using excess parts he found on the job. Wires ran to binary displays and teletypes near every door, the living room, and the kitchen. The company soon heard about this novelty and began marketing the house and family in a series of newspaper articles. The engineer had rigged the device to store his wife’s recipes and make shopping lists, and he also had plans to have it track her grocery inventory.

The ambitious home computer, and other abortive attempts to introduce digital technology to the kitchen in the sixties, harkened directly back to the home economics movement. As Ted Friedman noted in his book Electric Dreams:

They were a logical attempt to model the home computer along the lines of most of the new machines successfully marketed to American families in the twentieth century: “labor-saving” household appliances such as washing machines, dishwashers, and refrigerators.

However, engineers now hoped to draw domestic women into the Information Age rather than the industrial one. In the early sixties, when books like The Feminine Mystique highlighted the increasingly formulaic roles of middle-class men and women, this vision of the digital revolution as an extension of the electric one was predictable. But the Information Age would not leave its mark on the household for several decades, and it has yet to embed itself in ordinary home appliances and fixtures.

By the late 1990s, many of the tenets of home economics, that a woman could rise to the level of her husband through domestic management, for example, seemed archaic if not downright sexist. But the rhetoric around marketing new domestic technology had not changed. One Microsoft video imagining an imminent home of the future depicted a housewife using a kitchen-counter-mounted computer to make shopping lists, keep track of ingredients, and read recipes. She is also able to control other aspects of the house, like the TV and air conditioning, from a “home office”, a relatively new term that would have been music to the ears of proponents of “Domestic Science”. Now, like a factory, the home would also have an office from which women could manage the cultivation of their children and home lives. In the video, the wife’s husband does take on a macho supporting role in managing the digital household; he activates the security system from the “web phone” on his bedside table before he goes to sleep.

In this case too, the fundamental purpose of new home technology was not to create leisure time for the inhabitants of the home of the future. Instead, the family continues to operate just as it had before, only more efficiently. The wife still cooks, the husband still works 9 to 5, the children still watch television, and the nuclear family carries on. The promise of the home of the future was never to subvert the family structure and allow alternate lifestyles, for women or otherwise. Instead, the home of the future has traditionally served to show how technology could be used to embed the most traditional societal norms into the very walls of the home.

Media and Advertising

Part Three

If the implicit assertions that corporations made in public venues like World’s Fairs and Homes of the Future were premature, they would seem mild compared to those made in many of the advertisements that accompanied them. Widely disseminated copy would frame these corporations not as profit-seeking appendages of billionaire heirs and robber barons, but as benevolent leviathans advancing a humanitarian cause. Advertisements angled to enshroud their subjects in an air of patriotic ascendancy, dubiously framing electrical companies as the instigators of political achievements and technological breakthroughs.

In one advertisement run in 1923, General Electric celebrated its role in enabling the woman suffrage movement. Entitled “The Suffrage and the Switch”, it depicted a woman turning on a light superimposed behind a woman’s gloved hand submitting a ballot. The copy read:

Millions of American women voted for president in 1920 and are finding the time to take active interest in civic affairs. Woman suffrage made the American woman the political equal of her man. The little switch which commands the great servant Electricity is making her workshop the equal of her man’s.

Note how the advertisement did not just implicitly claim that GE had made woman suffrage possible through labor saved. Instead, through her mastery of the “servant Electricity”, the American woman had elevated herself to worthiness of the vote. This would square with Edison’s social Darwinist tendency; as David Nye recalled, “He believed that appliances in the home would literally force the housewife’s brain and nervous system to evolve to be the ‘equal’ of her husband’s.” To modern ears, this assertion might seem ignorant or even crass, but compared to claims made in other corporate copy, responsibility for woman suffrage would seem a modest exaggeration.

Future advertisements would be even more aggressive in their assertions of social good. At a 1939 General Electric demonstration, a presenter would articulate a corporate tactic that had come to characterize electrical advertising:

General Electric, as one of the leading scientific organizations of the world, recognizes its responsibility to promote progress: constantly to produce new comforts and conveniences; to raise the living standards of everyone; to make this world a better place to live… It is to this service for humanity that General Electric is dedicated.

To endeavor to make the world a better place for humanity is certainly an ambitious corporate goal, and arguably one that General Electric could credibly claim to have striven to accomplish. But if it had done so, it was through the production and sale of consumer products to what was at that point still a relatively privileged class of urban professionals. GE would emphasize its farming innovations, but, as David Nye noted, it “simply ignored the blight of actual farmers, and made no effort to explain their relation to the landless migrants described in The Grapes of Wrath”. What worth were these fantastic new inventions to farmers who had no access to electricity and no means of getting it, as was true of 90 percent of farmers in 1930?

In fact, General Electric and the broader electricity industry had long been deeply embedded in many of the heated public controversies of the day: the rise of the industrial robber baron and the concentration of wealth in the 1880s and 1890s, the vast system of patronage and excess in the 1920s, and the decline of rural America in the 1930s, to name a few. Yet by emphasizing their more illusory connection to the betterment of the human condition, these companies were able to save face and avert public enmity. As large electrical conglomerates came to resemble public trusts rather than private institutions, maintaining broad support among the populace became increasingly important to General Electric and the other companies leading this transition.

Often the claims made by the electric industry verged on the ludicrous. When the federal government pushed for greater development of public power in the sixties, “privately owned” utilities ran a vicious campaign to erode support for such a move. One ad depicted an armed guard stopping an elderly couple at the Berlin Wall. “Freedom is Not Lost by Guns Alone, when government owns business, it can control both goods and jobs… Then freedom has quietly slipped away. A quiet threat can be the deadliest. You may not know it is there until it is too late”, the ad ominously warned. Chapter Two will discuss the rise of the electric utility as the ultimate realization of the quasi-public institution.

This ad reflected a long tradition of cloaking private electrical trusts in the American flag in a perpetual effort to consolidate public support (further discussed in Chapter Two). The campaign’s audience could be forgiven for forgetting that private utilities were beneficiaries of anticompetitive government complicity at every level; they were guaranteed profit regardless of waste and granted a legislatively enforced monopoly, and so could hardly claim the grounds from which to champion free markets. The campaign, however, was successful, driving support for privately owned power from 30 to 70 percent over the course of the 1960s.

But identifying with broad national causes to their own advantage was not the most aggressive tactic industry advertising would pursue. Another strategy, which GE utilized with precision, was to subtly equate the work of electrical companies with the humanitarian “world’s work”. “ACHIEVEMENT”, one 1910s ad proudly proclaimed, superimposed against an imagined corporate utopian legacy: an image of a vast industrial metropolis of towering skyscrapers, generating plants, transmission lines, and train cars, with workers and engineers relentlessly managing it all. Above the cacophony of industrial activity, a massive GE monogram replaced the sun. Some of the copy read, “By the achievements which this company has already recorded may best be judged the greater ends its future shall attain.”

Another campaign from the period advertised an improved light bulb superimposed in front of an image of the earth and sun. The catchphrase: “His only rival”. Although the “his” in the ad referred to a personification of the sun, it is hard not to imagine that GE was casting itself as the arbiter of a force rivaled only by God. Another ad from the period was even more overt. It reproduced an 1816 argument against street lighting, apparently to highlight the absurdity of not purchasing GE products: “Artificial lighting is an attempt to interfere with the divine plan which has preordained darkness during the nighttime.”

In the dramatic rise of the Information Age over the past two decades, the claims made in corporate advertising have been no less incredible. The rhetoric glorifying companies like Apple, Google, and Microsoft has become so intense that social commentators regularly skewer the companies for playing up their humanitarian roles until their commercial ones are eclipsed. Commenting on a scene in which inventor-entrepreneurs at a startup competition successively declare their innovation’s ability “to make the world a better place,” Mike Judge, the creator of the recent sitcom Silicon Valley, noted:

There seems to be this obligatory thing that (Information Age companies) have to throw in there... I mean, there are some people making the world a better place, some maybe aren't, but it's just funny that most of it, it's just capitalism.

Many corporations adopt aspirational slogans that they use outwardly to project their roles as corporate stewards of humanity. Google famously adopted the motto “Don’t be evil,” which turned out to be a potent philosophy among its employees and users alike. “There is a shared, and perhaps blinding, belief on the Google campus that Google was altruistic, an attitude reflected in ‘Don’t be evil’”.

This slogan persevered even through Google’s IPO, normally a time of ideological moderation as companies court investors and submit to a battery of new regulation. In the midst of a boilerplate SEC disclosure, the founders placed an “Owner’s Manual” for Google’s shareholders, including the subheadings “Don’t Be Evil” and “Making the World a Better Place”, which cast its advertising and e-mail programs as tools to “bridge the digital divide” by connecting consumers and producers, advertisers and their targets.

One television ad Google recently aired depicted the grandchildren of two elderly men divided for decades by the India-Pakistan Partition. The two technologically savvy grandchildren use a host of Google products to rekindle their grandfathers’ friendship. Compared to the celestial terms in which GE portrayed itself, the Google ad was relatively modest, and even plausible.

Yet Google and other information technology companies will inevitably become tangled in the contentious social, economic, and political issues by which history books will one day judge us. Google, for instance, played its part in promoting the irrational exuberance of the late 1990s. Its IPO created hundreds of millionaires, and it is widely blamed for promoting social stratification and gentrification in the San Francisco area. In the past two years, for example, frustrated protestors have taken to blocking and vandalizing buses contracted to shuttle employees to Silicon Valley. Google has also been accused of brash and legally grey competitive techniques, such as offering news aggregation services without the permission of the sites it indexed. In other words, despite the humanitarian and altruistic shroud, Google faces the same conflicts as many of its competitors.

While many technology companies cast themselves and their products in a similar humanitarian light, the only one that can match General Electric’s rhetorical prowess is Apple Computer. For one of the most renowned television ads of the century, Apple hired the acclaimed director Ridley Scott to film the commercial known as “1984”. In it, a lone female athlete, adorned in an Apple-branded tank top, runs through a dystopian metropolis where uniformed men march in unison, watching a close-up of “Big Brother” angrily gushing inscrutable propaganda. The woman, clearly a human embodiment of the company, hurls a sledgehammer into the screen, creating an awesome explosion. A solemn voice is heard: “On January 24th, Apple Computer will introduce the Macintosh. And you’ll see why 1984 won’t be like 1984.”

This commercial is important for two reasons. The first is that it announced a truly revolutionary product, the first mass-market computer with a mouse-driven, point-and-click interface. But the format of the ad was equally noteworthy. As Ted Friedman pointed out:

The schema of the “1984” ad allowed Apple to harness the visual fascination of a high-tech future behind it; the ad became a touchstone for the dot-com hype of the 1990s, anchoring the images of technology corporations.

In other words, the “1984” ad gave technology companies the vocabulary with which they could claim their legacy as the new arbiters, and in this case even the protectors, of the American spirit of innovation and social progress. This would influence the rhetoric around the entire technology industry for the following decades.

Apple did not stop at marketing itself as a lone hero averting a dystopian future; it also depicted itself as the heir to a rich tradition of innovation, craftsmanship, civil disobedience, and virtually anything and anyone inspiring. In the ad “Here’s to the crazy ones”, a wistful narrator celebrates “the round pegs in square holes”:

They push the human race forward. And while some may see them as the crazy ones, we see genius. Because the people who are crazy enough to think they can change the world, are the ones who do.

Meanwhile, black-and-white footage of historical American counterculture icons like Martin Luther King, Bob Dylan, and Thomas Edison stoically cuts from one to the next. Finally, an Apple logo and its slogan “Think Different” appear on the screen.

Before “1984”, technology companies had been oddly coy about advertising their products, and they never attempted to equate themselves with past historical figures, as GE had done with its ostensible founder. As Ted Friedman noted, until then, computers were seen as cold and foreboding, stigmatized by Darth Vader, HAL 9000, and Star Trek. Some companies responded with topical and hokey ads, like IBM, which ironically appropriated Charlie Chaplin’s technology-averse Modern Times to market its products. “Rewriting the story of the dystopian Modern Times to give it a Desk Set–style happy ending was an act of astounding gall, but judging from the IBM PC sales, it was quite successful,” says Friedman.

Like the electricity industry before it, the information and computing industry now had the vocabulary it needed to establish the foundation of a new era. Computer companies began to realize that in order to justify their growing influence and stature, they would need to be more than just competent at their technical craft. Like the electrical companies before them, they would need to adopt a corporate persona and grow to more closely resemble public institutions. Some of the mechanics and implications of this are discussed in Chapter Three.

Neo-Jeffersonian Dreams

Part Four

At the dawn of both the Electrical Revolution and the Information Age, speculation abounded on how new technologies would affect rural areas. Politicians and social commentators often predicted that advances in technology would result in a resurrection of the declining Jeffersonian ideal: newly empowered people in rural and farming communities would leverage technology to reestablish economic and cultural primacy, as they had held in the nation’s early years.

By 1930, rural areas had experienced dramatic depopulation, and farming communities had fallen victim to economic stagnation, especially compared with the growing political capital accumulating in urban areas. As Gifford Pinchot, a well-known political and conservation spokesman of the time, noted:

Only electric service can put the farmer on an equality with the townsmen and preserve the farm as the nursery of men and leaders of men that it has been in the past.

Even from the invention of the very first domestic and agricultural applications of electrical current, engineers and politicians were acutely aware of the drudgery their technologies might avert. According to his rather self-aggrandizing autobiography, Nikola Tesla was one of the first to realize the potential of electricity to vastly improve the quality of life of the laboring class. Immediately upon imagining how an electric motor might work, and the benefits it might create, he dreamed that it would be used ubiquitously: “No more will men be slaves to hard tasks. My motor will set them free, it will do the work of the world.”

In fact, even before electricity was widely available, engineers developed an abundance of applications. In 1913, General Electric published a fully illustrated, 66-page catalogue of available farm equipment, including corn huskers, vacuum milkers, and even ozonators for improved mushroom growth. Yet two long decades later, only 1 in 9 farms was equipped to take advantage of those technologies.

The labor-saving aspect of electrification was only one driver of the optimism around rural cooperatives. Evidence also indicates that many early efforts to electrify were meant to combat the cultural decline that many rural communities experienced in the early twentieth century. By the early 1930s, most city dwellings were electrified, while only a small minority of rural towns had even basic service. The stratifying effect of this discrepancy was overt; one 1931 cartoon showed a couple considering the purchase of a country dwelling, only to be put off by the lack of electrical service. “Without electricity we would have to use kerosene lamps”, the wife scornfully explained. “Think of the bother of trying to get ice”.

In essence, America had been divided into two nations, one that took electricity for granted and could laugh at the backward ways of those without it, and one where the ubiquitous signs of urban modernity were completely absent. The contrast was stark enough to make each nation so alien to the other that their living conditions became fundamentally incompatible, as the cartoon humorously demonstrated.

Weary of the archaic stigma that the lack of electrification conferred on them, many small towns apparently first opted to electrify not for economic benefits but to boost their own social status. Envious of the mystical “Great White Way” in New York, several small towns through the twenties, like Nyack, New York, and Bellefontaine, Ohio, financed their own miniature equivalents. This vain tradition was chronicled in Sinclair Lewis’s Main Street. Upon the illumination of Gopher Prairie’s main thoroughfare, a local speaker declared, “Now the point of this is: I’m not only insisting Gopher Prairie is going to be Minnesota’s pride, the brightest ray in the glory of the north star state, but furthermore that it is right now.”

This made for potent rhetoric during the rural electrification movement of the 1930s. Local politicians would often foment popular support by claiming credit for federally enacted electrification programs. The Rural Electrification Administration, for example, encouraged the creation of cooperative utilities by offering ultra-low-interest loans and logistical support to towns that volunteered to build them. “Rural Electrification was popular among congressmen, for they could easily present themselves as the initiators of REA services in their districts”, according to historian D. Clayton Brown:

They espoused the Jeffersonian ideal of agrarian values, seeking electrification as a means of restoring rural life and replenishing, so to speak, the nursery stock of American leaders and source of democratic values.

The program immediately stimulated community building and camaraderie, as farmers and other residents came together to facilitate cooperative construction and management. “In the early years, (cooperative) business meetings had been akin to church socials or harvest festivals”, according to one REA participant. There was a sense that country citizens would genuinely be able to better themselves through technology, and evidence exists to substantiate that impression; the ability to read at night, for example, helped the Muncie library multiply its lending ninefold from 1890 to 1925.

Thus, “by the 1920s the rural use of electricity was no longer thought of as a luxury, but as a technological advancement necessary to fulfill the promise of American life”. In other words, the drive to electrify was motivated just as much by a desire to attain the full advantages of citizenship that people in urban areas had as it was by the economic benefits electrification could bring farmers.

More than half a century later, politicians and commentators would once again predict that the adoption of new technologies could facilitate a resurrection of Jeffersonian ideals and a resurgence in the primacy of rural areas. By the mid-nineties, it was widely speculated that technologies like e-mail and video conferencing would allow future workers to “telecommute” from home, which they hoped might reduce the relevance of large cities in the execution of white collar work. Further, they hoped that more widely disseminated and accessible information would level the intellectual divide between urban and rural areas.

Bill Clinton articulated these hopes at the 1996 signing ceremony of the Telecommunications Act, which was self-consciously staged in the Jefferson Building of the Library of Congress.

Most of you know Jefferson deeded his books to the Library… Today the information revolution is spreading light, the light Jefferson spoke about, all across our land and across the world. It will allow every American child to bring the ideas stored in this reading room into his or her own living room.

One important aspect of the Telecommunications Act was that it required subsidized Internet connections for some schools and libraries. Just as electric light gave the children of Muncie (and presumably other parts of the country) more time to read loaned books in the 1920s, Clinton predicted that the educational benefits of widely disseminated information would ultimately yield a better informed and more active rural citizenry.

People in rural areas did not hope only to improve educational opportunities with Information Age tools. Some communities hoped that aggressive adoption of new technologies would reverse the tide of the urban migration that by then had conspicuously eroded the populations of some rural towns.

One Microsoft ad attempted to cash in on this bootstrapping mentality. Entitled “Anthem”, the ad cuts from shot to shot of Rockwellian utopia in the small town of Lusk, Wyoming: a police officer washing his cruiser, smiling children outfitted in cowboy hats at a rodeo, and so on. A somber voice pontificates on the newly connected town’s ingenuity:

The schools have 320 computers for 500 kids. Home businesses are common. Why? They’re practical people. They want to talk to the outside world using technology. They wanna save their ranches with technology. They want to talk to the kids who have left and keep more kids from leaving by having the technology. They want to save their small town and keep it exactly the way it is.

It is hard not to draw parallels between Microsoft’s portrayal of Lusk, Wyoming and Sinclair Lewis’s depiction of Gopher Prairie, Minnesota. In both the Information Age and the Electrical Revolution, nostalgic boosters speculated that technology could help restore small towns to their former prominence. Both communities cast the use of technology not as an overture into a progressive future, but as a retreat to traditional values in order to resurrect a faded glory. Just as Gopher Prairie proclaimed itself “Minnesota’s pride” even before electrification, Lusk would both save itself and stay “exactly the way it is”.

This has a certain kinship with the previously discussed “home economics” movement, which also commandeered new technology to help embed the traditions of the past into a technologically and socially “progressive” future.

Hopes for the resurrection of rural communities through information technology were not predicated only on the bootstrapping betterment of current residents. Many thought that the Internet would allow telecommuting white-collar workers to escape their dreary city lives and relocate to the vibrant countryside.

The 1970s saw a “back to the land” movement in which many affluent city dwellers purchased second homes and retired to rural regions. Starting in the early 1990s, many commentators predicted that the Internet would facilitate a resurgence of that trend. Bruce R. Hull, an environmental advocate at the University of Chicago, exuberantly put it this way:

The cultural currents of Jeffersonian pastoralism continue to thrive as a wave of “neo” pastoralists trade the business suit blues for their blue-jean dreams… Technologies such as telecommuting, flexible work hours, and the information superhighway fuel the counterculture urban to rural migration.

Hull observed a niche trend in which tech-savvy urban professionals used the Internet to free themselves from geographic constraints, abandoning congested cities for what he presumably saw as the more desirable countryside, not just because there was an economic incentive to do so, but because of the superior moral fiber of rural areas.

At the very dawn of the home computer, Alvin Toffler wrote a utopian book predicting that a new “electronic cottage” would replace the office, resulting in every splendor a neo-Jeffersonian could hope for: greater community stability, a renaissance among volunteer organizations, the decentralization of industry, and the resurgence of the family unit.

So far, the results of both the Electrical Revolution and the Information Age have run counter to the predictions of most Jeffersonian hopefuls. Electrification failed to slow urban migration and arguably enhanced the allure of city life to rural households. The advent of the radio, one of the most prolifically adopted innovations of early electrification, audibly illustrated the city’s temptations, as Clayton Brown put it:

In heightening the awareness of the cultural differences, radio encouraged youth to acquire a more sophisticated mode of life, and they flocked to the cities in search of satisfaction. In this sense the new instrument intensified migration to the city, the antithesis of what the early proponents of electrification had hoped would happen.

This is not to say that electrification was a failure; it vastly enhanced the quality of life for many Americans living in the country. Once finally achieved, however, it did not succeed in “boosting” small towns to prominence or in stopping the erosion of the rural farming populace. Instead, farms became vastly more productive while employing far fewer people, a result many commentators might have found counterintuitive. As David Nye put it, “Successful rural electrification both improved farm life and helped to depopulate the farms… as productivity soared, thousands of farmers left for other kinds of work.”

Without electricity, a single family could cultivate only several acres of land; with the tools widely available in the 1930s, one family could manage a farm of over 40 acres. Unlike the “home of the future”, which helped to embed traditional values into new technology, the rural farm of the future was something completely new. Farmers had no choice but to live fundamentally altered lifestyles predicated on the efficient adoption of new technology. The ultimate effect was not a revival of an older order but entry into a new one.

While rural cooperatives yielded enhanced community relations at first, the effect quickly faded. The festive environment that the REA participant described soon lost its luster, and cooperatives became drab, faceless entities, not much different from a private utility. “With the decline of farm membership and the rise of suburban areas within its jurisdiction”, the participant recounted, “the cooperative lost much of this spirit and became to most of its members just ‘the electric company’.”

Similarly, many observers have been disappointed by the Information Age’s failure to revive stagnating communities like Lusk. Walter Isaacson recently observed that, “Among the myths of the Digital Age is that we would all be able to telecommute and collaborate electronically. Instead, the greatest innovations have come from people gathered in the flesh, on beanbag chairs rather than in chat rooms”.

Although a small counterculture of people who use new technology to escape cities may exist, no mainstream migration to rural areas has transformed American life. This is not to say that predictions of ubiquitous telecommuting were incorrect. Instead, as many have observed, in addition to holding a steady job, many Americans now telecommute back to work upon arriving home. Slate Magazine, a journal of note in the Information Age, put it this way:

Telecommuters, the majority of whom still go to the office, even if less frequently than their non-telecommuting peers—are in some sort of Catch-22 here: They want to use technology to become more productive and spend more time with their families, but the availability of productivity-boosting technology also makes their managers believe that the employees will get more work done, on weekends or after dinner… Somehow, what was supposed to be an “electronic cottage” has become an “electronic sweatshop.”

Rather than living more holistic, family-focused lives in rural areas, many American workers find that the advent of telecommuting further embeds them in the working world, even at times traditionally reserved for domestic family life. One Pew survey found that, as a result of networked telecommuting, “working Americans have become more likely to check their work-related email on weekends, on vacation and before and after they go to work for the day.”

The emergence of Neo-Jeffersonians seeking to capture the benefits of electricity for their own ideological cause is typical of the rhetoric that transformative new technologies invite. Although ideologically shaded predictions of the future use of technology influence public hopes and understanding, they are often erroneous, informed more by narrow political interests than by a comprehensive understanding of the technology. As this chapter has discussed, corporations, feminists, and politicians have all been guilty of disseminating such rhetoric.

Is it a fact, or have I dreamt it, that by means of electricity, the world of matter has become a great nerve, vibrating thousands of miles in a breathless point of time? Rather, the round globe is a vast head, a brain, instinct with intelligence! Or, shall we say, it is itself a thought, nothing but a thought, and no longer the substance which we deemed it?

NATHANIEL HAWTHORNE

Science Finds, Industry Applies, Man Conforms

Introduction

One of the more enduring legacies of the Electrical Revolution and the Information Age is often overlooked. The development of those technologies required a new set of relationships between the visionaries of scientific creativity, such as engineers and inventors, and the guardians of corporate capital, such as bankers and managers. Today, a vibrant venture capital industry exists solely to incubate the development and adoption of new technologies. But before the 1990s, little formal corporate infrastructure existed to finance innovative new ideas. Instead, most inventors and entrepreneurs collaborated with established industrial companies to improve existing goods and services. An independent entrepreneur would have been hard-pressed to find institutional financing that would let his product compete with the industrial behemoths that, to a large degree, governed the pace of innovation in America.

Today the words “engineer” and “inventor” are generally considered loosely interchangeable. Engineers and inventors frequently start their careers studying at universities and might expect to be handed substantial responsibilities as managers at large corporations. More ambitious engineers might dream of becoming entrepreneurs by marketing their inventions, and many have become very prominent in business.

However, prior to the late Industrial Revolution, an engineer had none of those connotations. Engineering was a military discipline. According to Merritt Roe Smith’s essay Becoming Engineers in Early Industrial America, “at that time West Point was the only institution of formal engineering in America”, turning out “not just officers of the line, but ‘soldier technologists’ who could master the level as well as the sword”. Noah Webster’s 1828 dictionary confirmed that narrow military understanding of the role of an engineer:

ENGINEER, n. In the military art, a person skilled in mathematics and mechanics, who forms plans of works for offense or defense, and marks out the ground for fortifications. Engineers are also employed in delineating plans and superintending the construction of other public works, as aqueducts and canals.

In other words, engineers were not deeply involved in economic decisions or management. They were facilitators of broader plans set forth by industrial capitalists. The occupation was thought of as more prestigious than that of a foreman or craftsman, and carried more weight with corporate managers. However, unlike bona fide professionals, engineers were paid hourly wages, a fact that testified to the relatively subservient status an engineer could expect.

Today, one becomes an engineer after years of university instruction. But in the past, the discipline was far less exclusive. Lacking formal structure, one became an engineer by choice, not by training. According to Smith, “It brought a degree of respect from merchants, bankers, and other more highly placed members of the business world… Being an engineer meant being a gentleman, and gentility counted for much in class recognitions of 19th century America”. He continued that when they “identified themselves as engineers they thought and acted as if doing so were a matter of personal choice”, a mark of aspirational gentility, free from the formal constraints of the structured discipline which the Electrical and Information Ages would ultimately demand.

By the 1930s, when government investment fueled optimism that industrial electrification would lead the United States out of the Depression, engineers found themselves thrust into the social and political dialogues surrounding the relationship between government and commerce. Engineers and their discipline seemed to have cultivated a kind of free agency: no longer the servants of politicians and businessmen, they were now equal partners. These three distinct realms, engineering, commerce, and politics, seemed to share a symbiotic relationship, a fact not lost on the futurists and pundits of the day. As historian Carroll Pursell recalled:

As the formula had it, scientific research discovered laws of nature that were then converted into technologies that were consumed by the nation’s citizens. An early and defining statement of the formula was adopted as the official motto of the 1933 Chicago Century of Progress World’s Fair: ‘Science Finds- Industry Applies- Man Conforms’

This motto defined a previously unobserved dynamic between the three disciplines. Science, and the engineers who make it useful, are positioned as the instigators of social and political change. Conversely, humanity, the ‘conforming man’, is positioned as a pawn in the hands of science and industry. The integration of the three realms, and the transformation of the engineer, would continue deep into the Information Age. The 2006 census used a far more holistic definition of an engineer, which had by then shed its military connotation and emphasized ties with economic and social needs:

Engineers apply the principles of science and mathematics to develop economical solutions to technical problems. Their work is the link between perceived social needs and commercial applications.

The Census Bureau’s definition completely eliminated the condition of servitude that Noah Webster’s dictionary seemed to imply. Instead, it positioned the engineer at the nexus of “private” corporate capital and “public” social needs. The census goes on to list 18 different kinds of engineers, with elaborate definitions and references to trade associations for each, and counts over 1.4 million professional engineers working in the United States. The story of how engineering transformed from an aspirational title of social class and military ability into a carefully disciplined occupation fully integrated with politics and industry is one told in close tandem with the Electrical and Information Revolutions.

Scientific Creativity and Corporate Capital

Chapter One

Thomas Edison was arguably the first engineer to transcend the constraints of his discipline and actively identify, cultivate, and capture the full benefit of broad corporate support for his inventions. Through deft public relations he was able to court J.P. Morgan as his first corporate partner, a feat that was extremely uncommon at the time. Morgan, one of the world’s wealthiest and most powerful men, was known for his fiscal conservatism, rarely investing in unstable companies, let alone new and uncertain ventures like early electric light.

Morgan was not alone in his aversion to fiscal adventurism; there was little access to corporate capital for creative inventors and entrepreneurs, and their relevance to the vitality of the broader economy, today taken for granted, was written off as trivial. As the journalist and historian Spencer E. Ante would recall:

In the nineteenth century, entrepreneurs virtually disappeared from classic economic and political thought thanks to the lack of financial support for their undertakings and awareness of their importance to the economy.

Inventors and entrepreneurs are terms that we today understand as nearly synonymous. But in the 19th century, inventors and engineers were rarely entrusted with corporate capital or managerial control of the companies they worked for. Conversely, entrepreneurs were not expected to have deep engineering knowledge or experience. The convergence of these disciplines would be a narrative perpetuated both in the Electrical Revolution and the Information Age.

By the second half of the 19th century, some engineers were starting to escape their corporate confines, gaining a degree of financial and creative independence. George Westinghouse, for example, got his start patenting and licensing his seminal invention, air brakes for trains, rather than working on the payroll of a locomotive company. Yet no engineer had yet ascended to America’s managerial elite, which was then dominated almost exclusively by a class of well-pedigreed industrial factory owners. Bankers, and other gatekeepers of corporate capital, were generally too conservative to invest in engineers, fostering a dependence on large corporations to drive innovation. When Edison obtained the collaboration and financial support of J.P. Morgan in the financing of his Electric Light Company, in the words of Jill Jonnes, he forged “a new kind of relationship, however prickly and difficult, between corporate capital and scientific creativity”.

Edison would do so by making his endeavors literally impossible to ignore. He deliberately placed his first network of electric lights in the financial district of downtown Manhattan, no easy task, since it required him to dig eight miles of trenches to lay his wire under the pavement of the bustling New York streets. Not only did this make Edison’s work the source of perpetual tabloid fodder, it also allowed him to take various law firms, banks, newspapers, and other established institutions of note as customers. With his corporate infrastructure literally embedded in the foundations of America’s elite institutions, Edison would have an unprecedented source of corporate sponsorship for his electrical adventures.

Edison was not the only engineer to recognize that, in order to fully realize the dream of national utilization of electric power, a new partnership with corporate gatekeepers would have to be forged. This is evidenced by the fact that he and the two other major electrical entrepreneurs of the time, George Westinghouse and Charles Coffin, were headquartered where they could easily court the favor and financial support of America’s banking elite: New York, Pittsburgh, and Boston, respectively.

By 1900, these firms and their banking partners had a first-hand awareness of the benefits of electricity, which convinced the financial community that electrification would become a vastly profitable enterprise. Associated electrical industries “absorbed more capital than any other type of industry and rivaled railroad investment of the late 19th century in its relative share of the gross national product”.

Engineers did not immediately win the hearts of all bankers with their mastery of electricity. Some early skeptics saw electric light as a limited “service”, rather than a potentially transformative force that would fundamentally change the United States economy. It was easy, for example, to mistake Edison’s Electric Light Company for merely a competitor to candlestick makers and gas light utilities. From that perspective, the hype around the incredible versatility of electricity could have sounded hollow, since the vast constellation of electric appliances that would revolutionize American life had yet to emerge. However, entrepreneurs were able to break this stigma by billing electricity as a ‘commodity’ with multiple uses. As noted, by 1900 there were hundreds of working prototypes of electrical equipment that promised to change nearly every aspect of American industry and life. By then, doubters would be laughed at; “businessmen understood electricity as a ‘commodity’ which ‘might be useful in almost any branch of commerce’”.

Because of electrification, for the first time, engineers would not be second-class citizens on the technical fringes of the economic revolution, as had largely been the case in other major infrastructure projects in the early Industrial Revolution. Instead, they would play a key role managing the investment decisions of financiers. Large electric conglomerates, like Westinghouse and GE, would be run in large part by engineers with deeply rooted technical backgrounds. With the rationalization of corporate infrastructure through expensive and complex technical equipment, engineers became increasingly important in management. “As electrification assisted oligopolistic concentration, it became a keystone of an ideology of progress, uniting engineering and commerce”.

As such, Edison and Westinghouse began paying their engineers salaries, and the field formalized its community through journals and professional organizations. MIT became one of the first institutions to offer formal engineering training outside of West Point, opening an Electrical Engineering Department. According to David Nye:

By 1900 this group, which was largely male, had professional organizations, journals, and positions of social power and responsibility. They knew the most about electricity and expressed few doubts about it or other new technologies, which they regarded as tools of social progress.

One of the virtues of the new engineering community, I think, is that as it adopted a formal structure, it did not become more selective or elite, as other professions had. In part, this was because of the vast potential that electrification offered prospective engineers. Although conglomerates like Westinghouse held most of the patents to build generators and transmission systems, “once a home was wired, anyone might build a device that could be plugged into the system”. This resulted in a vibrant community of engineers who tinkered with electrical apparatus and designed, then marketed, improved variations on washing machines, electric ranges, toasters, and so on, to be sold regionally like other crafts.

But the formation of a formal engineering profession did serve a purpose; it furthered the role of its members as equals to the bankers and managers of America’s corporate apparatus. The life of Samuel Insull was perhaps the purest expression of that trend. Insull, a protégé of Edison’s, would go on to create a truly massive electric utility conglomerate by raising money from the public, marketing stocks and bonds to small individual investors. By 1921, over one million individuals had invested in Insull companies, making them the first corporate securities widely owned by Americans; by 1938, over five million Americans would own them. Richard Munson recalls:

Although originally designed to obtain political support from customers who became shareholders and bondholders, the public relations effort also represented a major innovation in corporate finance and weakened the stranglehold of New York bankers.

Even by today’s standards, Sam Insull was an aggressive marketer. He would often sponsor middle school programs to teach students about the benefits of electric light and other appliances. Although he hoped that the children would return home to pontificate on the benefits of electrification to their parents, he was also aware that kids would, in his words, “be the customers, the investors, the voters, and the lawmakers of the future”. He would often say, “Mr. Edison taught me all that I know about electricity, but I owe to one of Mr. Barnum’s men all that I know about publicity”, referring to P.T. Barnum’s famous circus.

The original intent of “customer owned” stocks, it seems, was not to distribute profits and losses among investors or to raise capital (which is how we typically think about stocks and bonds in the context of personal investing today). Instead, the hope was that by linking the well-being of ordinary Americans directly to the success of the company, Insull would garner increased political support, since his success in business would necessarily coincide with their success as investors. This was especially important, considering the political nature of electric utilities noted in Part One.

It seems incredible that the creation of widely owned commercial securities started as a political ploy. But regardless of its intent, “customer ownership” of corporate securities did more than serve Insull’s political interests; it would effectively position him as the first of a now familiar “managerial class running giant corporations that were owned by thousands of investors”, less accountable to the notoriously ruthless and micromanaging Wall Street bankers who had governed most facets of American industry.

By the dawn of the Information Age, corporate capital was still relatively difficult for an entrepreneur to obtain. Despite the fact that late electrical industrialists like Edison, Westinghouse, and Insull had eroded dependence on banks in the day-to-day management of their companies, it would still be considered highly irregular for institutional capital to be expended on an uncertain venture. As one pundit recalled:

Young, unproven companies were still the stepchildren of capital markets, overlooked and neglected. That left entrepreneurs with the same old minuscule set of options: raising money from friends, family, or rich individuals.

Venture capital, the formal technology incubators that would ultimately rise to fill this void, had yet to be invented. When small, innovative companies wanted to finance research and development while still growing their customer base and revenues, they were often forced to merge with one another or with larger companies.

In the post-WWII period, the capital desert was especially parched. At a time when the proverbial “Organization Man” promised to rationalize business through risk aversion and dependable returns, there was little corporate tolerance for disruptive innovations that might deviate from management’s long-term plans.

For much of the 20th century, IBM was considered the corporate embodiment of the Organization Man. In 1957, for example, the engineer Ken Olsen presented corporate management with a prototype computer that used transistors rather than vacuum tubes. Both devices are switching elements, the fundamental building blocks of all computers, which preserve and transmit binary signals. But transistors, built from semiconductors, were cheaper, faster, and more reliable than traditional vacuum tubes, and would ultimately prove capable of following Moore’s law, becoming exponentially cheaper and smaller over time. This realization would be a key element of the Information Age. But, Ante recounted:

Olsen developed the computer, unequivocally demonstrating the superiority of transistors. For his accomplishment, he expected businessmen to shower him with praise. But that was hardly the response of the organizational men running corporate America.

Olsen’s invention was ignored and written off as a vain academic project with little relevance to IBM’s broader plans. Like AT&T, IBM operated on sacrosanct multi-year plans that did not account for the invention of the transistor, whose existence threatened to render virtually all existing IBM devices obsolete. Realizing the potential of his invention, Olsen began quietly seeking financing to produce his product. Naturally, his first instinct was to market his company to another large multinational, General Dynamics. Lacking business experience, he was immediately turned down. Fortunately, General Dynamics directed Olsen to a company called ARD, then America’s first and only venture capital firm. Within a month of a new presentation, ARD would offer Olsen 100,000 dollars in financing and name the new company DEC, the Digital Equipment Corporation.

DEC would transform the computer industry, successfully competing with IBM for decades. However, its more enduring contribution came from its collaboration with ARD, which created a new class of engineer-entrepreneurs who, through the support of venture financing, would remain the CEOs of the companies they founded despite a lack of business experience. Other companies to follow this model include Microsoft with Bill Gates, Apple with Steve Jobs, Facebook with Mark Zuckerberg, and Google with Larry Page and Sergey Brin, as well as countless other venture-backed corporations.
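The economics behind Olsen’s transistor bet reward a moment of arithmetic. As a minimal sketch, written in Python with a purely hypothetical starting price (none of these figures come from Ante or any other source cited here), the following fragment projects a component’s unit cost under the Moore’s-law-style assumption that cost halves roughly every two years:

# Illustrative only: projects unit cost under the hedged assumption that
# cost halves every two years, the cadence popularly associated with Moore's law.
def projected_cost(initial_cost, years_elapsed, halving_period=2.0):
    """Return the projected cost after years_elapsed years."""
    return initial_cost * 0.5 ** (years_elapsed / halving_period)

# Hypothetical: a switching element priced at one dollar in 1957.
for year in (1957, 1967, 1977, 1987, 1997):
    print(year, round(projected_cost(1.00, year - 1957), 8))

By this admittedly stylized arithmetic, a one dollar component in 1957 would cost well under a thousandth of a cent forty years later, which suggests why a technology able to ride that curve could render an incumbent product line obsolete.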

Another parallel between the Electrical Revolution and the Information Age is the financial innovation that took place to accommodate them. Just as Sam Insull had popularized wide ownership of his electrical company’s equity and bonds, the rise of venture capital precipitated a large increase in stock ownership by households. The Dallas Federal Reserve cites a symbiotic relationship between Information Age technology and publicly traded markets.

According to the Fed, low-cost information technology democratized investing by making it cheaper and easier to participate in the markets:

Changes have stemmed from better information technology that has cut the costs of investing... Partly as a result, the share of U.S. households owning stock has risen from less than 25 percent in the early 1960s to about 50 percent by the late 1990s.

At the same time, new financial technology gave venture capitalists the ability to sell ownership in their companies before they fully matured, creating riskier but more rewarding investment opportunities for individual investors:

Investors’ access to capital markets has increased, and this democratization of America’s capital markets helped fuel the economic boom of the 1990s, the longest economic expansion on record… (new financial technologies) made it easier and less costly for firms to arrange IPOs through investment banks, venture capital investing saw boosted returns and volumes in the 1980s and 1990s.

In other words, just as Edison forged a new kind of relationship between corporate capital and inventor-entrepreneurs, venture capitalists created the financial machinery through which they would secure their ascent into the most elite realms of the financial establishment. By the mid-1990s, Initial Public Offerings by companies selling their own stock for the first time became “the most prevalent form of security issues by firms wanting to raise capital in the United States”, as opposed to taking on new debt from bankers, who sometimes demanded managerial control. This helped contribute to the ascendancy of today’s ruling corporate class of owners and managers, whose ranks frequently include inventors and entrepreneurs.

Scientific Creativity and Political Ideology

Introduction

The first fully articulated declaration of the philosophy of the ascending engineering class was Frederick Taylor’s 1911 book The Principles of Scientific Management. Taylor’s work was intended for an audience of industrial employers, who he believed could exponentially increase the productivity of their employees and machines by applying scientific rigor to their work processes. However, Taylor would occasionally allude to a greater national cause that he felt adherence to his principles could advance. Although the bulk of Scientific Management reads like a rather unremarkable business management book, a close reading of the afterword reveals far deeper social and political ambitions. Taylor summed up his theory with the following slogan:

Science, not rule of thumb. Harmony, not discord. Cooperation, not individualism. Maximum output, in place of restricted output. The development of each man to his greatest efficiency and prosperity.

Read narrowly, the slogan might refer only to managerial styles, but it would soon be interpreted more holistically and applied to numerous social disciplines. Taylorism, as his beliefs came to be called, would ultimately gain influence far beyond its target audience of wealthy industrialists. It became almost a philosophical worldview, far more potent than its origins in the comparatively limited genre of managerial self-help would suggest. As noted in Chapter One, forms of scientific management could be applied outside of the industrial factory. Political adherents of Taylorism, like the technocrats, attempted to apply scientific rigor to bureaucratic governance. Technocracy, as Neil Postman would note, differentiated itself from mainstream political thought by emphasizing a rigorous philosophical framework:

These include the beliefs that the primary, if not the only goal of human labor and thought is efficiency; that technological calculation is in all respects superior to human judgment; that in fact human judgment cannot be trusted because it is plagued by laxity, ambiguity, and unnecessary complexity, that subjectivity is an obstacle to clear thinking; that what cannot be measured either does not exist or is of no value; and that the affairs of citizens are best guided and conducted by experts.

In other words, the betterment of the human condition was determined not by philosophical considerations or the leadership of great men; only efficiency and productivity could hope to deliver greater prosperity and freedom to society. Mastery of the laws of nature, rather than mastery of the laws of humanity, would lead to cultural advancement. To progressive engineers of the time, Scientific Management was the missing link that would finally make engineers relevant to the social and political well-being of the nation, elevating them to a social echelon previously occupied primarily by doctors and lawyers.

Although the Taylorist mode of thought was advanced by engineers and crafted for industrial capitalists, it had a surprisingly evangelical streak. In the conclusion of Scientific Management, Taylor extolled the incredible benefits mankind would attain in the event that his principles were widely adopted:

The larger profit would come to the whole world in general… This means increase in prosperity and diminution in poverty, not only for their men but for the whole community around them…. Is it not the duty of those who are acquainted with these facts to exert themselves to make the whole community realize this importance?

William Akin has framed Taylor’s aversion to lost productivity as quasi-religious: “Taylor possessed a moralistic view towards waste and inefficiency, viewing the idleness of a productive plant as the social equivalent of the protestant attitude toward individual waste.” As George S. Morison, then the president of the American Society of Civil Engineers, put it, “We are the priests of material development”, a designation which presumably implied a moral responsibility to society, despite the fact that engineers remained economically accountable to bankers and industrialists.

As previously noted, engineers had often speculated on the humanitarian benefits of their work. Remember how Tesla claimed that his first thought upon conceiving the induction motor was how it would “free men from drudgery”. However, before the advent of Taylorism, engineers did not possess an ideology in the way that capitalists or politicians did.

By the early 1930s, the technocratic movement had morphed into a niche culture. Yet rather than acknowledging their roots in commercial Taylorism, many technocratic commentators and journals of note were condescending and even combative towards the movement’s well-established political and capitalist origins. The Columbia professor Walter B. Pitkin’s book, A Short Introduction to the History of Human Stupidity (574 pages), which was widely quoted in periodicals like Popular Science, Popular Mechanics, and The Technocrat, was relentless in its invective:

If America wants a five-year plan that will put her ahead five centuries, let her close the White House and kick every banker and broker and manufacturer out of every pontifical conference... while a few thousand genuine scientists who are not Yes-Men for corporations ascertain which unexploited inventions and discoveries might be quickly turned to account.

A small contingent of faculty at Columbia University’s Engineering Department began to assemble the foundation of a formal movement with a unified cause and strategy. Soon, many technocrats came to share a derision of the “price system” of capitalism, believing that the automation of the Industrial Revolution was bound to put tens of millions of Americans permanently out of work, a theory they felt was validated by the Great Depression. Despite the abundance that mechanized processes promised, the reasoning went, under a market capitalist system the masses of newly unemployed would be unable to share in it, an intuitively appalling paradox. Though the details have been largely forgotten today, from 1932 to 1933, periodicals wrote more features exploring the technocratic movement than they did FDR’s administration.

Howard Scott, the most prominent voice of the Columbia movement and one of the fathers of Technocratic Economics, traced the inequities of the late Industrial Revolution to the abundance of electricity. Because electricity was, in his opinion, the basic unit of all industrial output, he proposed that “energy certificates” replace normal currency and that the total amount of energy in production be divided into equal rations, distributed to the people and spent like cash.
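Scott never specified the mechanics, but the division he proposed is simple to sketch. The following fragment, a hypothetical illustration in Python whose figures are invented rather than drawn from Scott’s writings, computes the equal ration each citizen would receive:

# Hypothetical sketch of Scott's "energy certificate" scheme: the energy
# allotted to consumption in a period is split into equal rations, one per
# citizen, to be spent like cash. All numbers are invented for illustration.
def energy_ration(total_energy_kwh, population):
    """Return each citizen's equal share of the period's energy budget."""
    return total_energy_kwh / population

total_kwh = 120_000_000_000  # invented national energy budget for the period
population = 120_000_000     # roughly the U.S. population of the early 1930s
print(energy_ration(total_kwh, population), "kWh per citizen")

However crude, the sketch captures what made the proposal radical: purchasing power would be fixed by an engineering measurement rather than set by markets or prices.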

Scott, like many of his colleagues, was influenced by the transgressive bohemian outlook embodied in his home neighborhood, Manhattan’s West Village. Like the bohemians, the technocrats shared a derision of the dominant “leisure class”, who were the primary managers of American industry at the time (discussed further in Chapter Three). The technocrats resented the cultural lag of the capitalists, which they felt was inconsistent with the commitment to productivity and workmanship that industry required to best serve the public good.

While the technocrats were taken seriously by academics and politicians in the early 1930s, they rapidly lost credibility after their opponents highlighted instances of intellectual dishonesty in many of their primary documents. One oft-cited economic survey that attempted to measure the increase in industrial productivity and technological unemployment across America during the Industrial Revolution was found to contain grossly exaggerated conclusions. This document was the primary source for so many of the technocrats’ claims that the movement was permanently stigmatized, and its proponents lost nearly all their influence in political and economic matters.

The dawn of technocracy was celebrated as the path to prosperity in utopian visions of the future like Chicago’s Century of Progress World’s Fair, which seemed to finally give credence to the literary fantasies of the late 19th century, such as Edward Bellamy’s Looking Backward. But by the mid-to-late forties, the popular image of technocratic progressivism was tempered by darker visions like Aldous Huxley’s Brave New World and George Orwell’s 1984.

By the late 1930s, as previously noted, it was clear that electrification was not going to create the hoped-for resurgence of the Jeffersonian country citizen; in fact, rural populations were only further depleted by the increase in farm productivity. The continuing ascendancy of big agribusiness, as depicted in The Grapes of Wrath, was salt in the wounds of betrayed agrarians. Urban communities were no less offended by the apparent hard-heartedness of industrial Taylorism, as evidenced by Charlie Chaplin’s deeply cynical 1936 Modern Times and other darkly Dickensian depictions of late industrial suppression of the individual.

However, the good intentions of the engineers whose ideology informed technocracy were not lost on all social commentators. The novelist Eleanor Buckle captured the growing skepticism of technocracy by comparing overzealous electrical engineers to military generals, fighting not for the sake of the people they protect, but for the love of their craft:

You now know everything there is to know about your specialty… But unless you concern yourself with why, and especially with for what, your construction exists, you’re like a general with a passion for military tactics and an abysmal ignorance of the nature of the human beings who fight the wars, but yearn for peace and freedom.

This was perhaps an unwitting tribute to the military heritage of engineers, but it was also an assault on the very notion of technocratic progressivism: that every advance in technology must necessarily result in the betterment of humanity. Compared to the seemingly inevitable collision between humanity and technology which Huxley and Orwell predicted would result from technological change, Buckle’s criticism was optimistic. Technology, like war, was a tool that, despite the excesses of the Industrial Revolution and the age of electricity, could be used in moderation to the benefit of humanity.

While the economic and philosophical experimentation of the Great Depression may have created a sympathetic audience for the ambitious aims of the academic technocratic movement, the post-war environment of the late forties and fifties was far less tolerant. The gray-flannel-suited Organization Men who came to characterize the economic and social conservatism of the time were unwilling to risk political stability and security for unproven technocratic ideals. In addition, the alarmist message of economic Armageddon due to mechanization lost its potency without the fresh specter of perpetual depression.

But the technocrats’ anti-progressive message, that advancing technology would not necessarily spell human betterment, was not entirely lost to the post-war decades, as William Akin explained:

The technocrats made a believable case for a kind of technological utopia, but their asking price was too high. The idea of political democracy still represented a stronger ideal than technological elitism. In the end, critics believed that the socially desirable goals that technology made possible could be achieved without the sacrifice of existing institutions and values and without incurring the apocalypse that technocracy predicted.

Yet some would argue that, although formally vanquished, the technocrats had planted the seeds of revolution and were silently accumulating power without attracting political or academic attention. In 1941, for example, James Burnham published a popularly read successor to Scientific Management, The Managerial Revolution. In it, he noted that government administration and corporate regulation were increasingly placed in the hands of technical experts, and that corporations were increasingly led not by their owners but by managerial experts.

The American economic and political structure easily withstood the deeply subversive fantasies that academic technocrats such as Howard Scott proposed. Yet technocratic engineers willing to operate within the establishment boundaries of the Industrial Revolution were slowly growing in power and influence. Burnham believed that the seeds of technocracy had been planted within the industrial establishment and that the movement was secretly thriving despite its public fall. Another way to interpret this coup is that the establishment hijacked the useful and profitable parts of technocracy, like a carefully cultivated trust in managerial capitalism and in experts over democratically elected politicians, while discarding the movement’s more idealistic foundations, such as the redistribution of wealth and income that technocrats felt was necessary to prevent automation from creating mass unemployment and poverty.

This is the philosophical environment in which the Information Age came into being. Computing, an industry dominated by IBM for the bulk of the 20th century, was widely associated with faceless corporate and government entities. The late fifties saw an increase in government-sponsored venture financing by entities like the Defense Department’s Advanced Research Projects Agency (ARPA, later DARPA) as a result of Cold War tensions. This helped spur some innovation in the sixties and seventies, but in the public imagination the computer continued to denote inscrutable, faceless organizations. The optimistic connotations of technocracy linguistically succumbed to unflattering associations with bureaucracy in general. As Neil Postman cynically explained:

Naturally, bureaucrats can be expected to embrace a technology that helps create the illusion that decisions are not under their control. Because of its seeming intelligence and impartiality, a computer has an almost magical tendency to direct attention away from the people in charge of bureaucratic functions and towards itself, as if the computer were the true source of authority.

Academic engineers had alienated themselves from mainstream economic thought through the technocrat movement. Some felt resentment toward their practicing peers, whom they accused of preserving the industrial establishment with Orwellian efficiency. But by the dawn of the Information Revolution, modern engineers seemed to have achieved their predecessors’ ambitions. No longer were engineers facilitators working on behalf of the pioneers of the industrial order. Like medicine or law, engineering had become a profession, with all of the social and economic benefits that came with it.

Dormant for several decades, countercultural ideologies not far removed from those of the early technocrats’ bohemian tendencies became popular in the engineering community. In fact, many social commentators associate the personal computing revolution, and especially the rise of the Internet, with the social upheaval of American culture in the 1960s.

Stewart Brand, the founder of the Whole Earth Catalog series and, later, an inspiration for Wired Magazine (widely considered the most important journal of note in the Information Age), epitomized this association. The cultural historian Fred Turner cast Brand as “a countercultural entrepreneur, but in a deeply technocratic mold” (recognizing the original meaning of the word). Brand was instrumental in casting the computer in a light that the counterculture movement could connect with. Of the Whole Earth Catalog, Walter Isaacson recalls:

The underlying premise was that a love of the earth and a love of technology could coexist, that hippies should make common cause with engineers, and that the future should be a festival where A.C. outlets would be provided.

These were the beginnings of “hacker” culture, a term whose usage in many ways resembles the pre-industrial definition of an engineer. As previously noted, early engineers donned that title as a mark of technical skill and social distinction, despite the lack of a formal engineering establishment. Hackers likewise voluntarily chose the designation to denote not just skill at programming computers, but a set of loosely agreed-upon social and professional goals. Richard Stallman, a forefather of the hacker movement, articulated those goals this way:

The hacker ethic refers to the feelings of right and wrong, to the ethical ideas this community of people had, that knowledge should be shared with other people who can benefit from it, and that important resources should be utilized rather than wasted.

This ethic bears a particular resemblance to the early Taylorist notion that engineers and entrepreneurs have a moral obligation to prevent waste on behalf of society. However, where early engineers hoped to elevate their status to equal that of prestigious members of the establishment, hackers distanced themselves from mainstream professional recognition. Reflecting in a 1995 Time magazine article, Stewart Brand recalled:

Hippie communalism and libertarian politics formed the roots of the modern cyber revolution… In the 1960s and early '70s, the first generation of hackers emerged in university computer-science departments. “Hackers” embraced computers and set about transforming them into tools of liberation. "The Hacker Ethic," articulated by (Steven) Levy, offered a distinctly countercultural set of tenets. Among them: ‘Access to computers should be unlimited and total, All information should be free, Mistrust authority, Promote decentralization.’

These tenets would not have been unfamiliar to the technocrats, who felt that electrical power should be “decentralized” away from industrialists and given to the masses. We might even imagine that the technocrats would have been sympathetic to the vaguely communal undertones of the Whole Earth Catalog as an alternative to the ‘price system’. Stewart Brand, like the technocrats, felt that society needed to adapt to new technology to become more inclusive. But unlike the technocrats, who enjoyed the formal structure and publicity that the Columbia engineers brought them, hackers by definition lacked such centralized academic support. As Steven Levy noted, “The precepts of this revolutionary hacker ethic were not so much debated and discussed as silently agreed upon. No manifestos were issued. No missionaries tried to gather converts.”

Of course, in the early 1970s, the Internet as we know it was still decades away and the personal computer was a distant fantasy. Mainframe computers were only beginning to reach even medium-sized companies and universities. Even in an environment in which computing structurally lent itself to elitism by virtue of its scarcity, hackers were developing a populist “ethic” and a clear set of goals, as set out in Steven Levy’s book. This ethic is arguably at the root of much of the consumer hardware and software that emerged after the 1980s, and especially the Internet. The net might be considered the ultimate articulation of hacker values in that it invites its users to contribute to it and is arguably intolerant of organizational hierarchy.

Lee Felsenstein, who designed the Osborne 1, the first mass-produced portable computer, in 1981, was deeply embedded in various strains of the countercultural movement of the 1960s. By the late 1970s, Felsenstein was adapting his growing technological skills to serve his carefully honed countercultural ideals:

The Free Speech Movement was about bringing down the barriers to people-to-people communications and thus allowing the formation of connections and communities that were not handed down by powerful institutions. It laid the ground for a true revolt against the corporations and governments that were dominating our lives.

Steve Jobs briefly lived on a communal apple farm before naming his company after it. Apple sold the first popular computer to use a point-and-click graphical user interface, the Macintosh, announced by the previously mentioned “1984” advertisement, a deeply countercultural critique that Jobs was involved in commissioning. Like the technocrats who despised the leisure class of the industrial order, Jobs and much of his generation of personal computer engineers felt a similar aversion to the inbred complacency that blue-chip corporate computing companies like IBM or Xerox seemed to entertain. He was influenced by the Whole Earth Catalog, and even as Jobs tried to balance the reality of his own growing corporate influence, Stewart Brand would endorse him as “the nexus of counterculture and technology.”

Liberals often credit the personal computer and the Internet to big government and the military-industrial complex, which arguably financed their invention. Meanwhile, conservatives claim the Internet as a victory of free enterprise, since the dynamism of today’s web has a distinctly laissez-faire quality. The countercultural narrative advanced by Isaacson, Brand, and Markoff lends credence to neither camp, instead pinning the rise of the personal computer and the creation of an Internet community on the LSD-fueled social currents of the 1960s.

As John Markoff noted, the epicenter of the West Coast military-industrial boom happened to lie near the heart of a constituency more comfortable at Burning Man than in the halls of buttoned-up firms like Lockheed Martin or Hewlett Packard. Markoff argues that this cultural dissonance was responsible for the realization that computers had applications beyond managing large corporations or calculating missile trajectories:

What separated the isolated experiments with small computers from the full-blown birth of personal computing was the West Coast realization that computing was a new medium, like books, records, movies, radios, and television. The personal computer had the ability to encompass all of the media that had come before it and had the additional benefit of appearing at a time and place where all the old rules were being questioned.

While engineers in the West may have been culturally predisposed to understand the incredible potential of computer technology, computer firms on the East Coast had adopted the careerist mentality of corporate America from a previous era. Some speculate that these cultural influences impaired the vision and foresight of technologists on the East Coast while stimulating it in the West. In Tracy Kidder’s famous chronicle of the New England computer industry of the late 1970s, Ivy League-credentialed engineers jockey for administrative power, hoping to attain managerial status. In other words, many engineers had begun to look and act like mainstream establishment professionals, creating the impression that their industry had matured from its innovative but chaotic early roots.

Perhaps this contributed to the deep cynicism that began to emerge in the 1980s, which framed the Information Age as merely an echo of the old industrial order. Alvin Toffler, for example, complained that the new white-collar jobs it brought were no more engaging and no less intellectually subjugating than the blue-collar factory jobs of the Industrial Revolution: the factory style of work had been transferred into the office, each person doing a tiny part of a repetitive task, without any sense of its relationship to the whole, without any pride or skill or craft, without any opportunity for discretion or creativity.

It was not until the late 1980s that personal computing, led by Apple, began to demonstrate how computers might ultimately shed the ominous HAL 9000 stigma that had been attached to them. In part this was because the graphical user interface made the computer relatively intuitive to use, and in part it was because computers were finally cheap enough for millions of individuals to purchase. But if countercultural undercurrents were built into the Macintosh and its imitators, the fact was by and large lost on the general public. It was not until the rise of the Internet that the roots of the personal computer industry became a part of public dialogue.

David Porter captured many of these themes in his 1997 compilation of essays, Internet Culture. Among other things, he drew analogies between various historical political venues and their Internet counterparts. One potent comparison was that of an Internet chat room to a coffeehouse. In both, there is little risk of offending bystanders with antiestablishment opinions, and participants are invited, even obliged, to form new ideas and opinions with the advice, rather than under the scrutiny, of bystanders.

Porter’s work, however, did not detail how the Internet would help foment political change. In 1997, arguably as now, it was too early to observe such effects in a rigorous way. Instead, Porter observed that many of the venues (coffee houses, public squares) and tools (the essay, the printing press) that had historically served as catalysts of political unrest and revolution had direct analogs in cyberspace.

In the following decade, the degree to which these tools have been used in a revolutionary capacity is unclear. It seems as though the purest expressions of the "Hacker Ethic" in a political context have been the subversive acts of "hacktivists" like Edward Snowden and Julian Assange. Incidentally, both Snowden and Assange left records in chat rooms that visibly evidence how Internet culture imparted to them aspects of the hacker ethic, which include, as previously mentioned, "All information should be free", "Mistrust authority", and "Promote decentralization", principles deeply represented in their lives' work.

The Hacker Ethic and the residual countercultural spirit that informed the rise of personal computing do not seem to have penetrated mainstream political thought in a substantive way. Just as the Technocrats failed to articulate a comprehensive mission and antagonized establishment forces, the hacker culture that informed the political ideology of many computer engineers has proven to lack potency outside the profession. Although it is too early to know for sure, it seems that hacker culture is doomed to suffer the same fate as the Technocrats: some beneficial aspects of its ethic will be subsumed by commercial and political forces while its ideological core is forgotten.

Corporate Ideology and Political Power

Part Three

American corporate regulation in the Electrical Era from the 1890s to the 1930s was formulated, in large part, as a response to the massive electrical trusts that had consolidated market power at that time. Electric utilities, streetcar companies, and electric component manufacturers presented unique challenges that had never been confronted before. Corporate interests responded to the growing public pressure first by resisting their changing role in the political economy, and then by accepting increased accountability and public obligations.

From the very beginning, electrical interests were saddled with complicated political considerations. Thomas Edison's Electric Light Company was forced to pay off Tammany Hall hacks who threatened to stall his very first network in the heart of the financial district, and Edison regularly wined and dined Tammany leadership in order to secure approval for his construction. Subsequent lighting and electric companies won contracts through similar acts of patronage, often installing no-show political appointees in ostensible roles as plant managers.

As technologies advanced and electric lighting began to permeate the homes of millions of Americans, electric companies began to realize the commercial genius of electricity. Electric light was easy to market, since its only real competition was the comparatively expensive, dangerously flammable, and often noxious gas lamp. However, unlike gas, once a home was wired it could not receive service from a competing generator; electrical concerns owned the generator, the wires, and in some cases even the lamps. As David Nye noted:

The advantage in marketing electricity, in comparison to other products, was that the wires permanently linked the consumer to the producer. To sell electricity one did not have to go out into the market and compete with other brands. All electricity was invisibly the same, and within any given market area had a ‘natural monopoly’.

As communities began to recognize the excesses of businesses with such deeply embedded advantages, they turned to an economic entity that had been formalized years earlier: the municipal utility. By the time electric light companies were consolidated into utilities after the turn of the 20th century, many electric traction car companies had already attained that status.

The New York City subway system was the first major example of a municipal utility in the United States. Because private industry could not finance the construction of underground trains, the city government agreed to pay for construction and allow private ownership in exchange for limited managerial control, which was most frequently used to ensure reasonable fares. While today such a flagrantly one-sided deal might not meet with such a positive response, the proposal passed referendum and was widely supported. One problem which emerged early with quasi-public utilities was that they were simultaneously considered "too big to fail" and often lacked the ability to operate profitably on their own. When their "private" owners were unable to manage them, the municipal government was usually compelled to step in, purchase the utility's assets, and continue service, often at great financial loss. Proponents of the deal nevertheless developed a potent argument for utilities, which contextualized them as the 19th-century version of the great public works projects of the early industrial revolution:

The city’s businessmen worked out a theory of municipal ownership that did not appeal to any brand of socialism, but rather to history. The state had once built the Erie Canal, and it provided other essential public services… It did so not in order to compete with private business, but rather to improve business conditions and to raise property values, which in turn enlarged the tax base.

The corporate form of the utility was widely used across the United States to build and manage electric streetcar and subway service. But the auspicious beginnings of the utility would quickly deteriorate; many were chronically unprofitable and had to be taken into receivership by municipalities after several years of service. Further, utilities created opportunities for white-collar malfeasance that would ultimately necessitate drastic legislative action:

The Thomson-Houston firm and Henry Villard, the railroad organizer interested in selling arc lamp equipment, agreed to fix prices and split the business of supplying power to city streetcars. Their maneuvers prompted Congress to pass the Sherman Antitrust Act and stifle such arrangements.

Despite the questionable performance of early utilities, by the era of home electrification the seeds of the modern utility had been planted. Utilities would nevertheless continue to play an important role in facilitating partnerships between corporate capital and government power.

There was a reason that the United States was so welcoming to utilities despite their early shortcomings. Other Western nations were also electrifying, yet did not see the proliferation of publicly funded, privately owned utilities. Unlike the United States, countries like Holland, Denmark, and France were prone to consolidate large industrial projects under federal control as public works. The American federal government had neither the inclination nor the resources to fund and organize such projects, especially given the powerful advocacy groups American utilities established to ensure that opposition to private ownership would be politically untenable (discussed in Part Three).

Instead, corporations like General Electric and the interurban railway concerns contracted with municipalities across the country to form local utilities without centralized federal control. This system had inherent flaws that fragmented the American industry. Expensive electrical appliances were often compatible only with the specifications maintained by the utilities that installed them; a family could move across the street and find that their toaster was useless.

Further, the arrangement did not encourage growth. Most American utilities would fully electrify urban areas and then slowly increase billings by encouraging existing customers to buy more energy-consuming appliances. Ideally, a successful utility would invest in acquiring new customers by expanding its service area. Instead, utilities were generally confined to the municipalities within their jurisdiction and showed little interest in expanding their range, which would have required the construction of expensive lines and generators.

Thus, by the early 1930s, while other developed countries had federalized electrical systems serving the vast majority of rural farms, the United States lagged far behind, with only one in nine farms electrified.

Although utilities were generally averse to building new electrical networks beyond their customer base, the industry was prone to consolidation far beyond what other industries experienced in the late 19th century. In order to maximize the utilization of power generators, large networks serving distant consumers were needed to balance nighttime and daytime loads. For example, a generator which powered only streetlights would remain idle through most of the day, while a generator that serviced only streetcars might have excess capacity at night. In the 1880s, some streetcar companies built America's early power-intensive amusement parks at the end of their routes to try to profit off their otherwise idling generators.
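The arithmetic behind this pressure to consolidate can be made concrete with a load-factor calculation (a hypothetical illustration using round numbers, not figures drawn from the sources cited here):

\[ \text{load factor} = \frac{\text{average load}}{\text{peak load}} \]

A 1,000-kilowatt plant serving only streetlights, which burn roughly twelve hours a day, averages about 500 kilowatts against a 1,000-kilowatt peak, a load factor of \(500/1{,}000 = 0.5\). Pair the same plant with a daytime streetcar load of comparable size and it runs near capacity around the clock, approaching \(1{,}000/1{,}000 = 1.0\), spreading the fixed cost of the machinery over nearly twice the billable kilowatt-hours without building a second generator.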

Sam Insull initially made his fortune by recognizing this inefficiency and working to correct it. By purchasing an electric light utility in one town and a streetcar utility in another, for example, he could close the generators in one and run the other's day and night, cutting overhead and creating enormous returns for his Chicago conglomerate. In addition, he successfully lobbied for state governments to take regulatory control of his infrastructure away from towns and municipalities. Where the regulatory decisions of local municipalities were by and large governed by democratic referenda involving contentious political lobbying for and against utilities, state agencies were staffed by tenured appointees who would predictably render a uniform regulatory policy. The arrangement greatly benefited utilities, but it was also a product of the political concerns of the era:

This mixture of private ownership and public supervision satisfied the trust busting mood of the time, serving both as a form of progressive reform in which the private sector was formally controlled by the public, and as a kind of technocratic bureaucracy, in which decision making… was lodged in the hands of experts.

After the Standard Oil breakup in 1911, Insull and his concerns became increasingly aware of their vulnerability to antitrust law. General Electric was also successfully prosecuted for antitrust violations in 1911, but was somewhat counterintuitively subject to a consent decree that compelled it to merge with the competitors it had been convicted of fixing prices with. Although the power companies had evaded the full brunt of antitrust litigation, and of political suspicion of large corporations in general, they were vigilant through the 1920s in maintaining amicable government relations.

The subsequent public relations effort by utilities was so massive that it managed to permeate America's popular imagination. In Sinclair Lewis's Babbitt, for example, a customer complains about the service of the town streetcar company. In response, George Babbitt lectures him on the virtues of the company and the onerous tribulations it faces:

But of course, it won’t do to just keep knocking the Traction Company and not realize the difficulties they’re operating under, like those cranks that want municipal ownership. The way these workmen hold up the company for high wages is simply a crime, and of course the burden falls on you and me that have to pay a seven cent fare!

General Electric would pioneer the corporate public relations industry through the efforts of the legendary ad man Bruce Barton, who is credited with creating the notion of a "corporate consciousness". Recognizing that large corporations like General Electric were affecting the daily experiences of ordinary people across the entire country, Barton believed that companies were coming to resemble governments. An inevitable consequence, he believed, was that corporations would be held accountable for the public's well-being. Rather than fight this perception, Barton advised GE to embrace it, resulting in flamboyant ads like "The Suffrage and the Switch", discussed in Part One.

Leveraging the credibility and goodwill created by their "corporate consciousness", companies began to grow into ever larger and more powerful entities. Although electrical utilities may have been prone to consolidation for systemic reasons, electrical equipment manufacturers had consolidated as well. By 1900, "General Electric and Westinghouse had swallowed all the other competitors to become a duopoly controlling the market for electrical generating equipment, transformers, meters, motors, and lighting apparatus." By 1925, Barton predicted an impending clash between private ambition and public ideology, and proposed a merging of the two. As he observed at a marketing convention:

"The one danger… is that your growth will outrun public appreciation of the necessity for that growth." A business as immense as GE… could not remain a "purely private" business if it expected to endure and grow. It had to become an institution, with its leaders serving as trustees.

As previously noted, Sam Insull was even more aggressive, marketing securities like stocks and bonds to millions of Americans so that they became economically linked to his companies: "If the light shines, you know your money is safe", as many of his securities sales pitches read. Further, he supplied elementary schools with pro-utility reading materials with names like "The Ohm Queen", hoping that his efforts would eventually yield political support from both the parents of the children and the children themselves once they grew into voters and politicians.

Whatever goodwill Insull and the associated electrical industries had accumulated by 1930 was quickly eroded after the stock market crash of 1929. Many utilities were unable to honor debt obligations and most plummeted in stock value; Insull's concerns, which were aggressively leveraged, were severely affected, erasing the investments of 600,000 stockholders and 500,000 bondholders in 1932. Insull's middle-class investors were pioneers in a new age of personal finance, yet they were often left penniless. As one federal official recalled:

They were just the average run of the mill people - clerks and schoolteachers there in Chicago, small shopkeepers in Illinois, farmers from Wisconsin - and what they brought in, of course, was worth nothing.

The Insull debacle was highly publicized and pounced on by progressives as evidence of the need for what would eventually become FDR's New Deal. On the campaign trail, Governor Roosevelt seized upon the tragedy as an example of the corrosive influence of industrial robber barons on the American public. He described electric utilities as "a kind of private empire within a nation… challenging in their power the very government itself." Once elected, much of FDR's agenda revolved around the reform of electric utilities, including financial regulation like the establishment of the Securities and Exchange Commission. Electrification, and disappointment in electrical companies, predicated many New Deal programs. As one historian noted:

Though Insull and his crash are today largely forgotten, we are still subject to its regulatory aftermath - measures that were adopted in direct response to Insull's real or alleged doings. This legislation, a substantial part of the New Deal regulatory agenda, includes the Securities Act of 1933 and the Securities Exchange Act of 1934.

The popular derision of Insull's electric utility conglomerate ensured caustic relations between government and private industry that only sharpened Franklin Roosevelt's conception of a New Deal, which many titans of industry believed was an overt attempt to pit business and government interests against one another. The unique economic qualities of electrical service, which lent themselves to massive consolidation and the creation of natural monopolies, provided a potent demonstration of how free enterprise could seemingly subvert the power of government and create public hazards. In the end, Insull's campaign to promote wide ownership of his securities backfired; by making the economic link between private industry and the general population tangible, he invited popular support for government incursion into ever larger spheres of private concern, not vice versa.

Like the electrical industry, the corporations responsible for shepherding the Information Revolution spurred a controversial debate about the nature of government in its relationship to the corporation. Many of these battles were fought along the same lines as those that had taken place nearly a century earlier. For example, brash new mega-companies like Microsoft in the 1990s stoked a deep paranoia about the potentially corrosive nature of big business, resulting in one of the most prominent antitrust cases in American history. The Information Age also spurred a deep contemplation of the nature of utilities, reflected in the Telecommunications Act of 1996. Finally, the consolidation of information-industry companies, like software firms and data telecoms, bears a remarkable resemblance to the consolidation of electrical component manufacturers and electric utilities.

The Microsoft antitrust case, for example, began before anyone could credibly claim that the personal computer and the Internet would fundamentally transform American society. It was predicated on claims that Microsoft was unfairly preventing startup software vendors, specifically Netscape, from competing through potentially illegal agreements, complaints first filed with the Federal Trade Commission in 1994. The case was unusual in that Microsoft was accused of restraining trade by giving away a product (Internet Explorer) for free; as Stephen Breyer had stated in an influential antitrust case in 1984, while a federal appeals court judge, "The Congress that enacted the Sherman Antitrust Act saw it as a way of protecting consumers against prices that were too high, not too low." But had the Information Revolution changed things?

As noted, it was not until May of 1995, when he distributed the now famous internal memorandum "The Internet Tidal Wave", that Bill Gates began to recognize the advantage of the World Wide Web over the far more ambiguous "information superhighway", a term that encompassed any long-distance high-speed network. Despite the case's perceived prematurity, federal prosecutors were eager for a test case that would determine whether the Information Revolution had already changed the nature of antitrust. Their willingness to litigate might indicate a heightened awareness of corporate control in the Information Age. Ken Auletta posed the question like this:

In the Information Age, is the notion of a monopoly rendered antique because classic monopoly characteristics, rising prices, control of finite resources, distribution barriers to entry, availability of capital, and choke-holds on innovations, are not as apparent, even if the allegations of coercive tactics are familiar?... New technologies inevitably invite new questions about government’s proper role, and the Microsoft trial promised to be the nation’s laboratory for choosing between old law and new.

It did not help Microsoft's case that the company's culture was fueled by a form of corporate supremacy stemming from its intensely competitive founder. Bill Gates often came across as arrogant, publicly ranting against federal prosecutors and professing a complete incomprehension of how Microsoft could possibly have broken the law. "You're saying that Microsoft shouldn't compete hard!", he once exclaimed in disgust when asked if his company should consider public opinion in its marketing decisions.

Gates would come to resemble early utility executives in his public stance against regulation. He would often rail against the political forces interfering with Microsoft, which he felt was fulfilling a sacrosanct destiny to bring America into the Information Age. But an impartial trustee for a quasi-public institution in the tradition of Barton's GE he was not. His caustic public statements read almost like the words of a modern-day Babbitt, always on the defensive and never attempting to display empathy toward his detractors. For example, at one press conference, he sneered:

How ironic that in the United States, where freedom and innovation are core values, these regulators are trying to punish an American company that has worked hard and successfully to deliver on those values.

Unlike in the twenties, however, the 1990s did not see an increase in popular opposition to large new corporations. In fact, public opinion generally accepted Microsoft as a positive force in the American economy and Bill Gates as its able captain. Mark Penn, the pollster credited with advising Bill Clinton through his impeachment hearings, was retained by Microsoft and confirmed general public support. His advice was that the best defense is a good offense: "Talk constantly about the plot by competitors to induce the government to sue Microsoft... Don't bow to the press ayatollahs. Don't show weakness."

Unfortunately for Gates, the case was not to be decided on the merits of public opinion. Instead, the noted trial lawyer David Boies would successfully redirect attention toward academic literature arguing the existence of a "network effect": that Microsoft could hold its customers captive, effectively conferring upon it illegal monopoly status. Although the argument had never been used before, and the conservative Judge Thomas Penfield Jackson was skeptical of antitrust innovation, the reasoning was successful, and Microsoft was ultimately required to make several concessions in a resulting consent decree.

The result of the Microsoft trial represented a major coup for the notion that, like the Electrical Revolution, the Information Age would require a fundamental shift in the political and legal stance of government with regard to corporate regulation. So overwhelming was the sense of change that even the court-appointed mediator Richard Posner, who had written books on the need for judicial restraint in the administration of antitrust law, proposed a state-appointed technical committee to oversee Microsoft's software design.

But the legal tides of the changing corporate climate were not confined to the judiciary. By the early 1990s, it had become clear to many technologists that the potent combination of rapidly advancing personal computing and telecommunication technologies would herald a new age in the American economy. In part, this was due to the deregulation of long-distance telecommunications in the late 1980s, which allowed a network of small carriers to install a massive web of high-bandwidth fiber optic cables across the country, competing with the dominant AT&T. But Congress was also affected by the same "network effect" thinking that was influencing the Microsoft case.

Proponents of the network effect framework argued that networked companies in the Information Age enjoyed protection from market forces exceeding even the industrial economies of scale and natural monopolies of the 20th century. Every new customer of a digital networked service, like e-mail or a web browser, cost the provider almost nothing, and, as in the telephone industry, the quality of the product increased as it grew more popular. Observers therefore worried that established companies would achieve a permanent state of extra-market industry control. As one pundit argued:

Once the Internet reached critical mass, it could rely on the network effects phenomenon to keep it - and its most successful applications such as e-mail and the Web - from fragmenting into mutually unintelligible systems... In that respect, the development of the Internet, e-mail, and the Web is somewhat like the development of spoken language... As words change meaning and whole new words come into use, individuals adjust their own linguistic practices to ensure that they are understood by others.

The fear was that popular services on the Information Superhighway would inevitably create insurmountably high barriers to entry for potential competitors. Like language, a corporate protocol might attain a degree of power and influence that would be nearly impossible to reverse. This notion was vividly illustrated by Microsoft’s anti-trust woes.
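One common formalization of this intuition, standard in the network-effect literature though not named in the sources quoted above, is Metcalfe's law, which values a network in proportion to the number of possible connections among its users:

\[ V(n) \propto \binom{n}{2} = \frac{n(n-1)}{2} \approx \frac{n^{2}}{2} \]

By this rough measure, a service with 1,000 users offers on the order of one hundred times the connective value of a service with 100 users, while costing its provider little more to operate; a challenger starting from zero faces a value deficit that grows quadratically with the incumbent's size, which is precisely the barrier to entry critics feared.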

Among the first to recognize this trend was Vice President Al Gore. Gore is credited with popularizing the term "Information Superhighway" to describe this new network, first at the National Press Club, and then in numerous public events and statements. Technically, the Information Superhighway was indistinct from the legacy telephone network, which was regulated under the framework established in the Communications Act of 1934. In other words, the signals carrying telephone conversations were sent through the same wires as the signals carrying digital network information such as the Internet.

Al Gore's enthusiastic embrace of the Superhighway language was calculated to emphasize that telecommunications companies would soon bear the burden of disseminating high-speed information to the masses. This would require a massive infrastructure initiative that Gore felt the century-old, highly regulated, and historically complacent telephone utilities were ill equipped to take on. Though he never specifically voiced his frustration with legacy carriers, some of his comments implied an awareness of their shortsightedness; in a candid foreword to a 1994 online textbook, he chided that when, "as a House member in the early 1980s I called for creation of a national network of 'Information Superhighways,' the only people interested were the manufacturers of optical fiber."

Although telecommunication regulation is generally considered an arcane subject beyond the scope of regular public interest, Al Gore was determined to sound the alarm on the importance of reform in the mid 1990s. Prefacing a speech on the need for reform, he lectured the audience on how smarter telecom regulation could have prevented the Titanic disaster by clearing radio channels. Despite this dramatic illustration of the importance of government in telecommunications, his proposed Telecommunications Act of 1996 had a decidedly deregulatory bent. Gore would often discuss how, in the Information Age, commerce enhances democratic values:

In the last few years, we have witnessed the democratization and commercialization of the Internet. Today, the network connects not only the top research laboratories and universities but also small colleges, businesses, libraries, and schools throughout the world. The growth of commercial networks has enabled much broader access to the government-subsidized portions of the Internet.

In order to encourage the construction of "on ramps" to the Superhighway, the Telecom Act's intended effect was to eliminate "many of the cross market barriers that had prohibited large corporations in one communications industry (such as the telephone industry) from providing services in another related industry (like cable or broadcasting)." The act correspondingly established a lower legal standard for determining a telecommunication company's status as a monopoly, including a provision for retroactive prosecution of monopolies in certain situations. The idea, it seemed, was both to break down the market protections telecoms had enjoyed, in recognition of the specter of the network effect, and to allow cross-market investment (like an Internet company merging with a cable provider) in order to facilitate the natural convergence of these industries.

The resulting legislation has generally been considered an inscrutable and directionless labyrinth of byzantine regulation understood only by the companies it was meant to control. As Justice Antonin Scalia put it, "It would be a gross understatement to say that the 1996 Act is not a model of clarity. It is in many important respects a model of ambiguity or indeed even self-contradiction." Other industry observers argue the act resulted in an unmitigated calamity of corporate consolidation that dangerously empowered companies like telecom conglomerates and media distributors. As a leading Cassandra of that view, Ben Bagdikian, who tracks media consolidation in continuously updated editions of his book The New Media Monopoly, observed: "In 1983 there were 50 dominant media corporations; today there are five. These five corporations decide what most citizens will, or will not, learn."

Just as the dawn of the Electrical Revolution heralded the creation of the modern utility, the dawn of the Information Age necessitated a rethinking of the relationship between government and corporations. Telecommunications companies have long been quasi-utilities in that FCC regulations protect them from competition. The Telecommunications Act of 1996 effectively cemented that status, allowing companies like AOL and Time Warner, and now Time Warner and Comcast, to merge into enormous Information Age media conglomerates with correspondingly enormous influence on government. Unlike the electrical utilities, which often traced their roots to municipal ownership and were clothed in popular support, telecom utilities are often massive national concerns burdened by increasing public scrutiny.

Are modern folk afraid of night? Having made themselves at home in a civilization obsessed with power, which explains everything in terms of energy, do they fear the night for their dull acquiescence and the pattern of their beliefs? Be the answer what it will, today's civilization is full of people who have not the slightest notion of the character or the poetry of night, who have never even seen night.

DAVID NYE

being wired as a right

CHAPTER THREE

How does a luxury become a right? A conservative interpretation of the term "human right" generally refers to an entitlement without which it would be impossible for an individual to live, or to live freely. More broadly, a right could be interpreted to mean anything without which it would be impossible for a citizen to participate fully in their own community. Transformative new technologies tend to create an intuitive sense that access might be considered a "right" by either definition.

Even before access to electricity became common, Americans intuitively understood that electricity was not like other technologies. This is reflected in the innovative new regulations and legislation the government forged to manage the nation's electrical interests. For example, Part Three of this chapter discusses the legal innovations that were created to accommodate uninhibited access to streetcars. Later regulation sanctioned new electrical institutions like utilities. Finally, the New Deal era electrification regime cemented a clear precedent that being wired was, in fact, a right that citizens could expect their government to help provide.

In the recent policy analysis, Rethinking Rights and Regulations, editor Lorrie Faith Cranor reflects upon the role of institutions in adapting to unfamiliar technologies:

The first, and perhaps most difficult, challenge to developing policy responses to a new technology is the construction of a conceptual framework to guide its development. Unfortunately, there is no recipe for framework development. Rather, a new understanding emerges in fits and starts (or misstarts), building on insights drawn from eclectic sources.

As noted, electricity was originally regulated along the lines of public works projects of the past, like the Erie Canal: a government investment in otherwise private enterprises could be justified for its ability "to improve business conditions and to raise property values, which in turn enlarged the tax base." Perhaps it is this foundational framework that helped empower early utilities with monopolies and guaranteed rates of return, leading to a disappointing rate of electrical service through the late 1930s, nearly half a century after the Columbian Exposition promised the imminent elimination of human suffering through the technology.

In 2006, Alaskan Senator Ted Stevens was subjected to a firestorm of jeering political pundits and popular backlash when he described the Internet as "a series of tubes" while articulating his opposition to regulating it. While this characterization was probably ill advised, it represented an attempt to contextualize a new technology within an old framework. When, as Part Three will discuss, John Perry Barlow described the Internet as an independent republic in 1996, also while articulating his opposition to regulation, he was widely praised for his allusion, which was rapidly distributed through the nascent Web. Clearly, the public is intuitively conscious of the implications of these frameworks. As the Internet becomes even more prominent in the everyday lives of Americans, selecting the correct framework will be essential to ensure that the Information Age does not tear society apart in the way the Electrical Age did.

Of course, not all new technologies come to be perceived as potential "rights". This section will discuss the process by which access to electricity came to resemble a right that government could be expected to protect, and then compare it to the progress made by Internet service since the beginning of the Information Age. The chapter is divided into three parts, each focusing on a type of stratification that unequal access to technology threatens to create. Left unchecked, such stratification could divide the American community into two separate nations. In the 1930s, it became too great to ignore, as evidenced by the REA and other programs. Hopefully, technological stratification instigated by the Information Age will not create the same type of societal dissonance. But many futurists believe that just such a future is imminent unless we take a proactive approach to averting it.

Ultimately, any debate about the role of technology in society will focus on technological determinism: does technology drive social progress, or vice versa? Typically we take the former for granted. As Cranor noted:

There is a strong tendency, especially in the popular press, to think and write of new technologies as exogenous forces shaping society. The reality is that the interplay between society and technology is a complex improvisational dance in which each partner responds dynamically to the moves (or anticipated moves) of the other.

The history of electrification confirms this notion. Electricity did not transform America alone. It was not until government actors found the correct "framework" for regulation that electricity's full transformative potential could be captured. Milder examples of societal checks on technology exist within the history of information: could the transistor have been marketed if the organizational mentality of the 1950s had not subsided? Could fiber optic networks have emerged if deregulatory fervor had not ended AT&T's monopoly on the long-distance market in the early 1980s? In other words, does society even have the power to direct the consequences of new technologies for the greater good, or does the influence of technology autonomously shape society, blind to the needs and desires of the individuals within it?

This chapter will examine, with the benefit of hindsight, the negative aspects of electrification on American society in the early 20th century and meditate on potential analogies of the past two decades, as well as consider possible analogies that might emerge in the near future.

Psychological Stratification

part one

Electricity began as a luxury, accessible to only an elite few. Thomas Edison's great patron J.P. Morgan installed a generator under his Madison Avenue mansion shortly after the invention of the light bulb, making him the owner of the first truly electrified home. As electricity became more common among the moneyed classes, it was coveted more for its symbolic association with prosperity and modernity than for its actual utilitarian value. By the turn of the century, when many urban homes were illuminated, "Most perceived it not in terms of labor saving but in terms of conspicuous display or novelty." This might explain the popularity of the illuminated Christmas tree; GE first sold strings of light bulbs for that purpose in 1901. Obviously, with the vast majority of Americans left in the dark, precious light was not being lavished upon pine trees for lack of a more economical use; like the brilliantly illuminated Main Streets before home lighting was popular, the most common uses of electricity "blended utility, decoration, and social display."

By World War I, electricity had become an integral part of the daily experience of most urban Americans. As an economy measure for the war effort, the United States Fuel Administration ordered the illuminated signage of the Great White Way turned off. Within days, the political pressure to reverse the order overwhelmed the agency. Merchants reported plummeting sales, both because their billboards languished in the dark and because Times Square lost its attraction without an electric aura. Many complained of the phantom character of the normally vibrant thoroughfare. Although urban electrification was a relatively recent innovation, it had already become an inextricable part of the city and its inhabitants, however dispensable its utility. As David Nye recalls:

People do not merely use electricity. Rather, the self and the electrified world have intertwined. The rhythmic excitement of the dazzling electric city had already become commonplace by 1910, and the short blackout of World War I was experienced as an unacceptable psychic loss.

Though it may have been psychologically important to the inhabitants of New York at the turn of the century, electricity was still thought of as an extravagance by most Americans, even into the early 1920s. By then, electricity was considered just one of the many conveniences that most urban Americans enjoyed but nearly everyone else did without. Electric light was safe and convenient compared to the noxious oil lanterns still used in rural areas, but urban life had long been safer and more convenient than life in the country. Equal access to electric light was not considered an important national issue. In retrospect, this might seem incredible, but at the time it was not universally clear that modernization was necessarily predicated upon electrification. In that context, inequality of electrical service appeared far more benign, and certainly not something which necessitated government intervention. One might imagine that a metropolitan woman would have given up access to electric light before access to the private laundry cooperatives that were common only in cities at the time, yet no one would argue that the government should be compelled to aid in the cleaning of laundry in rural areas.

Such reasoning may have informed "the popular belief in 1923 that only private enterprise should develop the agricultural market." Until the late 1920s, there were no efforts to push a rural electrification agenda at the federal level. Instead, power industry trade groups collaborated with agricultural organizations to explore the possibility of establishing rural lines through strategic partnerships, the most prominent of which was the Committee on the Relation of Electricity to Agriculture (CREA).

Even by the mid 1920s, it was not abundantly clear that the utilitarian value of electricity could justify its installation in rural areas, an endeavor that required a far larger investment per customer than in urban locales. Even if a farmer did live within range of an electric utility, he would be required to pay for the construction of the power line, and even then would likely pay twice as much per kilowatt-hour as an urban customer.

One of the few accomplishments of the CREA was a comprehensive experiment in which 379 farms were electrified on the condition that each recipient keep careful records of their usage for three years. Disappointingly, the agricultural use of power proved modest, often because stubborn farmers refused to adopt the efficient new technologies, like irrigation, that electricity made possible. Most farms used far less than the 600 to 1,000 kilowatt-hours a year thought necessary to justify an investment in electrification. And strikingly, the vast majority of power usage was not for agricultural applications at all, as D. Clayton Brown notes:

As the Alabama project progressed it was also clear that electricity had its greatest impact in easing the burden of keeping the house… Participants devoted more time to evening activities such as reading and listening to the radio… Electric service, besides reducing toil, increased the participants’ self-satisfaction because it brought tangible evidence of modernization.

Although this discovery might be considered more valuable than the study's stated objective (measuring the economic viability of electrification), the Alabama experiment was largely considered a failure due to the low kilowatt-hour usage it observed. But the question the CREA asked in the 1920s was never "What should the government do to expand rural electrification?" It was instead "What opportunities to profitably serve the public are private power companies missing?"

This stood in stark contrast with the European model of electrification, which by the mid-twenties had far outperformed that of the United States. In part, this may have been because European countries were smaller and farms there tended to be closer to one another than they were in America, requiring less investment in power lines per customer. But this explanation, by itself, is insufficient. Even the atrophied Weimar Republic had achieved 60 percent rural coverage by 1927. And to see the benefits of rural electrification, an American did not have to look across an ocean; while touring the farms of Ontario, Canada, Senator George Norris recounted the numerous benefits of government-subsidized service, especially improved pride and personal satisfaction on the domestic front.

Norris would ultimately become a great hero of electrification, facilitating massive federal projects like the TVA and the favorable loans offered by the REA, even in the face of sometimes violent opposition from self-described free-enterprise fanatics. Although he started his political career as a staunchly corporatist conservative, Norris transformed into a self-professed Fighting Liberal (the title of his autobiography), in large part because of his cynicism toward the massive electrical trusts he saw as dominant in American political life. Raised on a farm, Norris knew the struggle of rural life and the great benefits of electrification. He knew that the psychological effect of highly conspicuous, if largely symbolic, electrical appliances in the home could foster a sense of community and citizenship, even if the economic benefits would accrue later.

One of the most overlooked benefits of early electrification is the mobility it brought its beneficiaries. By 1925, the physical range of daily life had vastly expanded for people with access to streetcars. While a rural resident without an interurban connection in 1925 would rarely see beyond the county limits, a resident of suburban America could by then travel from town to town cheaply on a daily basis.

No longer dependent on horses and wagons, a woman's circle of daily life widened from a few miles to twenty or more; they could easily visit friends, shop in distant places, or take a job while living at home.

Although it was economically liberating to suddenly have the ability to compare prices, seek jobs miles away, and travel distances without the need to hire a wagon, it was psychologically liberating as well. Urban commuters were now free to associate with friends and family in distant places and to escape the constant scrutiny of tight-knit local communities. Meanwhile, many rural farming communities were underserved by streetcars, leaving residents stranded, sometimes literally, their mobility increasingly impaired compared to that of other regions.

By the early 1920s, electricity had yet to realize its full utilitarian potential, and it was already contributing enormously to America's social stratification. Electrification split the United States into two communities: one luminous with electric light, constantly reminded of the glorious new technologies to come; the other forced either to breathe the noxious fumes of flickering gaslight or to succumb to darkness. One enjoyed the benefits of new and exciting home appliances that constantly promised to lighten the drab monotony of domestic life; the other was trapped in a constant struggle to keep up with simple tasks like pumping and carrying gallons of water every day. One was poised to experience the greatest technological renaissance in generations; the other lived lives claustrophobically similar to those of their peasant ancestors centuries before.

The technologically fueled optimism of electrified America was salt in the wound of those without service. While many American communities fell by the wayside during the broader Industrial Revolution, the stark symbolism of electricity, or lack thereof, cemented their lifestyles as archaic remnants of a forlorn era.

The psychological stratification between Americans with and without access to information technology is no less distinct. By the mid 1990s, academics and politicians spoke ominously about a "digital divide" that would disenfranchise disconnected Americans. As a 1996 issue of Popular Science observed, "Computer ownership and Internet access are highly stratified along lines of wealth, race, education, and geography." It further noted that the divide was growing, and rapidly. Fortunately, the past decade has seen an incredible proliferation of digital technology that has generally tempered those fears. For example, a 2014 Pew poll found little dissonance between Internet access in rural towns (83 percent) and urban cities (88 percent). The starkest divide was between young and old, with nearly all young adults enjoying Internet access and only 57 percent of adults over 65 doing so.

These generally positive numbers are a result of the exuberant race to participate in the nascent Internet economy that took place in the 1990s. Like the dawn of the Electrical Age, the beginning of the Information Revolution was marked by the emergence of a collective perception of impending progress and prosperity. To nearly everyone who understood it, the Information Superhighway threatened to disrupt the established social and economic order and replace it with a new generation of progressive, open-minded, and technically competent leaders. As historian Fred Turner put it:

In the mid 1990s, as first the Internet and then the World Wide Web swung into public view, talk of revolution filled the air. Politics, economics, the nature of the self, all seemed to teeter on collapse… The stodgy men in grey flannel suits who had so confidently roamed the corridors of industry would shortly disappear, and so too would the chains of command on which their authority depended.

Turner is credited with framing a previously underappreciated narrative of the Information Revolution as the realization of an ethic drawn from the countercultural movement of the 1960s. He argues that the information society arose from the ethic presented in Stewart Brand's Whole Earth Catalog, namely the embrace of the "power of the individual to conduct his own education, find his own inspiration, shape his own environment, and share his adventure with whoever is interested." As previously noted, this was an ethic to which many figures involved in the development of the personal computer and the Internet subscribed.

This narrative is only beginning to gain mainstream recognition, featuring prominently in Walter Isaacson's 2014 book, The Innovators. In the 1990s, however, municipalities were not consciously building data infrastructure so that they could indulge in the fantasies of Brand and his cohort. Cities invested heavily in communications infrastructure because it was generally deemed necessary to remain competitive in a rapidly changing economy. Small towns and rural areas like the previously discussed Lusk, Wyoming, wired themselves as an act of defiance against their dwindling stature. In fact, their consciousness that technology might serve a symbolic rather than a utilitarian purpose was documented in an article published shortly after the Microsoft advertisement aired. One resident told a reporter that she doubted technology would have an economically meaningful effect on Lusk:

The town needed a goal; it needed a vision. That's what this has been for us. But ‘intelligently applied’ is where we've really had to work at it... Everybody assumes that if you get the connections, that solves your problem… If you stumbled onto a spaceship from another planet and didn't know how to run it, so what?

In other words, access to digital networks cannot be presumed to guarantee participation in a digital economy and society. The article went on to cite the seemingly frivolous uses schoolchildren had for new school computers (like researching their favorite television shows) as evidence that the Internet had not yielded all that it promised. This is not entirely unfamiliar. The article brings to mind the patronizing Alabama report on electrification that found little utilitarian value in service, despite the fact that women were using it copiously to better their daily lives.

Despite these reservations, the adoption of information technology across most demographics in America was ultimately successful, to the extent that nearly all Americans now have access to it. Nearly half a century after light bulbs were commercialized, America still had dismal electrification numbers; just two decades after personal computers were connected to the Internet, users are basically representative of the American population in general. In part this was due to the enthusiastic rhetoric of Al Gore and other politicians, who cleared the path for rapid inclusion very early on. In part it may be because equipment like fiber optic cables and Internet routers is relatively cheap and was built on an existing telecommunications infrastructure (whereas electric light necessitated that new ground be broken). Finally, America's enthusiasm for adopting technology may have grown from changing perceptions of progress itself. As Graham Thomas observed, many academics seemed to presume a certain linear inevitability in the advancement and adoption of technology:

[They assume that] individuals, regions, and nations will "catch up"; those who are not connected now will, or should be, soon. This is the real annihilation of space by time: the assumption that the whole world shares a single timeline of development, in which some groups are further ahead than others along this shared path.

This type of progressive pathology was perhaps not so prevalent before the Electrical Revolution, a time when technological determinism, the theory that scientific developments govern social change and progress, might seem to have been undermined by decades of technological stagnation on the farm. Technological advances, to the average American, probably seemed more closely related to industrial and economic expansion than to moral progress. But once electrical applications and the individual "intertwined", as David Nye elegantly put it, the seemingly linear path of technological advancement seemed to lend itself to social progress. As Merritt Roe Smith, co-editor of the volume on technological determinism Does Technology Drive History?, postulated:

This technocratic idea may be seen as an ultimate culminating expression of the optimistic, Universalist aspirations of enlightenment rationalism. But it tacitly replaced political aspirations with technical inspiration as the primary agent of change.

Perhaps the influence of technocracy planted during the Electrical Revolution helped prepare Americans for the Information Age by equating social progress with technological development. If so, the appeal of the Internet to this notion is overt; the Information Superhighway is innately social in that it exists to facilitate interactions between people. But beyond inviting a technocratic psychology of progress, many argue that computers and the psyche become one through the use of the Internet. As with our relationship to electricity, we have become "intertwined" in a symbiotic relationship with computers and the Internet.

Vincent Miller, in Understanding Digital Culture, argues that by using information technology to enhance their memories, comprehension, and senses, users effectively become "cyborgs", in that their psychology is dependent on machines and vice versa. People learn to exist within "cyberspace", a place in which human creativity meets machine discipline, forcing both to conform to one another, changing not what people think, but how:

Human computer interfaces create a kind of embodied functionality in which the computers become transformed by being increasingly sophisticated in their ability to sense human movement and intentionality, and humans become increasingly adept at learning to move in ways that computers understand.

Justice Kennedy recently quipped in his majority opinion in Riley v. California that information technology, like Internet-connected phones, "are now such a pervasive and insistent part of daily life that the proverbial visitor from Mars might conclude they were an important feature of human anatomy," a notion that contributed to the Court's decision to protect handheld digital devices from unwarranted search after an arrest.

In the 1990s, to those who understood it, the potency of this combination of man and machine was both exhilarating and harrowing. Would the Information Superhighway divide the nation in two, one group of Americans living as computer-augmented cyborgs, the other trapped in a distant past like the unelectrified rural farmers of the early 20th century? Fortunately, the United States' efforts to mobilize its resources to ensure participation in the Information Revolution have proven far more effective than early attempts at electrification. However, as the next section will discuss, democratic access to technology does not necessarily lead to a more democratic society.

Economic Stratification

part two

As previously noted, urban electric utilities had already adopted a slow-growth model, encouraging customers with ample electrical coverage to consume more with additional appliances rather than expanding into rural areas. An inevitable consequence was that urban regions became further integrated with electric networks, with more lights, kitchen appliances, electric heaters, fans, refrigerators, electric ranges, and so on. Electric equipment manufacturers, meanwhile, developed bigger, more efficient machines for large factories. Because of the reluctance of corporations to enter the rural market, the economic advantages of regions within range of service compounded the economic dominance they had already achieved over rural society through the Industrial Revolution.

By 1900, most electric power in the United States was consumed by industrial factories, even though only 4 percent of factories were electrified. Most used only electric lights, which dramatically reduced accidents and, subsequently, insurance premiums. By 1930, 78 percent of factories were electrified. Hundreds of new applications for electricity had not only replaced steam-driven machines, they were doing tasks the old machines were unable to do. Many automated processes were introduced that exploited the precise movements of electric motors and helped factory output triple from 1900 to 1940.

Meanwhile, agricultural jobs remained as tedious and menial as ever, though electrification could have shouldered much of the burden, both domestically and in the field. The electric pump, for example, made irrigation feasible on many dry and unproductive farms, and it also saved hours of back-breaking work by women, who traditionally pumped gallons of water each day and carried it, often hundreds of feet, back home in large jugs. In other words, electrification could both increase the economic productivity of land and relieve workers of tedious jobs with little economic value.

As historian Ronald Tobey describes, the effect of electrification through the 1920s was to exacerbate the economic tensions already present in American society. Because the first widely sold electric appliances were generally manufactured and marketed only to the wealthiest families and industries, that period saw a striking cultural dissonance between the income groups that were targeted as customers and those that were not.

Power utilities and electrical manufacturers of the private electrical industry accepted the nation’s fragmented society and segmented economy as the framework for marketing domestic electrical appliances, pitching their products to the households in the top quintile of the nation’s income.

This notion was not lost on some observers of the time. For example, the social critic, economist, and perennial technocratic muse Thorstein Veblen despised what he saw as an unjustly privileged aristocracy of pretentious industrialists. A decade before Frederick Winslow Taylor evangelized about the moral need to maximize efficiency, Veblen chastised the “leisure class” for lacking the integrity and merit he felt industry required if it was to be managed in a socially responsible way:

The collective interests of any modern community center in industrial efficiency. The individual is serviceable to the ends of the community somewhat in proportion to his efficiency in the productive employments, vulgarly so called… Not much is to be said for the beauty, moral excellence, or general worthiness and reputability of such a rosy human nature (in attaining entrance to the managerial class).

In other words, Veblen believed that American industry would never allow competent managers to direct industrial output, because greed and pedigree counted for more than competence in the accumulation of industrial power.

Despite the fact that electrification remained a luxury that tilted American economic life further in favor of the rich, the democratizing potential of electricity was lost on no one. As noted, many electricity companies were quick to highlight the humanitarian aspect of their work, and engineers and technocrats dreamed that they would one day be exalted as the slayers of drudgery. Thorstein Veblen himself saw the industrial complex as a potent beneficial force in American society, however malign the incentives of its owners. Yet by the mid-1920s, after four decades of commercial electric light, electrification had succeeded mostly in improving the lot of a relatively genteel demographic.

Sometimes the fantasy that average Americans’ lives had already been markedly improved seeped prematurely into the public’s imagination. Bernhard Ostrolenk, an economist and a harsh critic of private power interests, was particularly galled by this trend. While the premise of a utility was that its monopoly status would protect it from competition in exchange for low rates, he felt private interests had betrayed that agreement, noting that heavy industry (whose rates were unregulated) often paid less than a third of what most rural households did per kilowatt-hour. He understood state and national regulatory commissions to be shills for big power interests, and felt that the optimism surrounding electrification had been fabricated in order to protect them.

‘Electric light and power’, says the Federal Power Commission, ‘has come to be almost as essential in our daily lives as the bread we eat and the water we drink.’ This statement by an important Government body is more of a wish than a fact, for this essential in raising the standard of living is still a luxury reserved for the few in the higher income brackets.

Meanwhile, as the poor and middle class watched the wealthy enjoy the fruits of the Industrial Revolution in brightly illuminated cities and towns, the wealthy became increasingly insular and consumerist. By the 1920s, electrified, affluent households were becoming cluttered with new status-enhancing electrical devices. One might presume from this that the rural Americans who lacked service simply could not afford electrical appliances. This is not true; Americans who did not own appliances could almost always afford them, as evidenced by the fact that the urban poor typically used domestic appliances nearly as much as the rich did. In the electrified lower-working-class neighborhoods of Chicago, for example, 95 percent of households owned an electric iron, 87 percent a vacuum cleaner, and 25 percent a washing machine by 1929. Low-income households could easily afford electric power, but obstruction from electric utilities prevented them from receiving service. In other words, the rural poor were excluded from the Electrical Revolution not for lack of money but for lack of service.

Long before aggressive New Dealers worked to democratize rural access, early tremors of an emerging “right” to electricity could be felt in a different context. By the 1910s, traction streetcars were fueling conspicuous class conflict in many cities and towns where they had been installed. Frequent fatal streetcar collisions would sometimes pit the ‘everyman’ against elite operators in the press. Local referenda and strikes against streetcars and their owners inflamed class tensions, created ample newspaper fodder, and even became a foundation of Walter Rauschenbusch’s “Social Gospel” movement. Political radicals of the time immediately understood service to be a right, calling for “public ownership of all (streetcar) services, arguing that no private company should profit from a basic human need.” Although streetcars were obviously not that basic a human need (the vast majority of the world’s population seemed to get along fine without them), the sentiment was not difficult to understand.

By 1900, streetcars were essential to the everyday lives of millions of urban Americans. In addition to mitigating the intractable congestion of horse carts that had plagued many American cities since the mid-19th century, streetcars often extended service to the outer limits of urban municipalities, enabling the rise of modern suburbia. The suburbs quickly attracted masses of middle-class city dwellers, who paid, on average, a 5-cent fare. At a time when the typical working-class wage was approximately 2 dollars a day, regular rides were too expensive for most working-class American families. David Nye calculated that a middle-class family would spend 48 dollars on 960 trolley rides, eroding 10 to 20 percent of an average household’s income, about the same proportional cost as cars today.

The high cost of fares effectively divided the working population… skilled workers and the middle class could afford to move out of the city center, but the unskilled had to remain behind… The poor who stare out of Jacob Riis’s photographs simply could not afford to spend ten cents for a round trip in carfare every day.
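A back-of-the-envelope check of these figures is possible, assuming they are annual (the 5-cent fare and 960 rides come from Nye; the roughly 300 working days at the 2-dollar wage are my assumption, not his):

\[
960 \times \$0.05 = \$48, \qquad \frac{\$48}{\$2 \times 300} = \frac{\$48}{\$600} \approx 8\%
\]

of a single wage-earner’s income for one daily rider; a household with a second regular rider, or a lower wage, quickly reaches the 10 to 20 percent range that Nye describes.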

The term “wrong side of the tracks” comes to mind in this context. While that phrase is generally believed to describe the geography of towns with trains running through them, its etymology is unclear. In my experience, train tracks, which generally do not run through affluent areas in the first place, rarely divide towns so neatly, since it is considered undesirable to live near a noisy train track at all. Some folk lexicographers believe that because the wind might be more likely to blow soot toward one side of town than the other, that side became the “wrong side”. In any case, given that the phrase first came into regular use in the 1920s, I believe it is just as likely to have originated in streetcar culture; to be from the “wrong side of the tracks” could mean to reside in a less desirable city center rather than in an affluent suburb connected to it by an electric streetcar, literally on the other side of the proverbial tracks.

The Electrical Revolution served to stratify Americans by economic class through both its nature and its handling by corporate managers. It empowered the wealthy with useful and conspicuous new devices that often improved their economic standing, while trapping the poor both in immobile rural areas and in filthy city centers.

By the 1980s, careful observers could tell that the Information Age was preparing to unseat the legacy of the Industrial Revolution. Early in that decade, investment in factories was surpassed by business spending on computer hardware and software. There was a popular notion that the dehumanizing aspects of the Taylorist industrial factory, in which a person’s performance was constantly monitored and disciplined, would disintegrate in the face of a more humanistic, more progressive economy that depended less on rote labor and instead valued individual creativity and ingenuity. As Gavin Poynter observed:

The optimist view of the transformation of work which accompanies the emergence of the Information Age associates the new ways of working with the end of work practices and forms of industrial organization that gave rise to alienation and inequality in the workspace.

This, however, was far from a universal consensus. Some saw a resurrection of factory work in the office. Alvin Toffler, for instance, saw little distinction between the fundamental critique of Taylorism in the factory and the ethic that had emerged in office work by the 1980s. Increasingly, he argued, each person was assigned a single repetitive task requiring no discretion, creativity, skill, or craft (a notion that might be familiar to anyone who has ever worked in data entry). Toffler argued that if the computer age was to free humanity from the yoke of dehumanizing busywork, it would have to have consequences far greater than merely moving the proverbial cogs in the machine from the factory to the office; it would have to fundamentally restructure the nature of work in American society. He predicted that, just as the Industrial Revolution had used machines to create massive employment in generally arduous and unproductive factory jobs and then replaced those jobs with mechanical automation, dehumanizing office jobs would also be offset by technology. Because computers could increasingly replace repetitive processes even in a white-collar context, he was optimistic that the restructuring would occur.

More recent analysis has revealed that this dream has generally gone unfulfilled. Echoes of Ronald Tobey’s previously noted notion, that technology merely augmented a societal framework already tilted in favor of an aristocratic elite, have become louder as the technology industry grows into a more ruthless and high-stakes corporate endeavor. Paraphrasing the belief of those who hold that information technology simply exacerbates an unfair economic order, Robert Hassan wrote:

Information is not a thing, an entity; it is a social relation, which under modern capitalism expresses the characteristics and prevailing relations of power. In other words, the technologies of computerization and automation are being used by capitalism to expand and deepen its rule over society.

To the extent that Americans have not shared equally in the economic benefits of the Information Revolution, this is certainly the case. From 1980 to 2000, for example, blue-collar workers were nearly twice as likely to be displaced as white-collar workers, even as manufacturing output rose, presumably because computer-aided automation made them redundant. Many economic commentators have argued that the new emphasis on service jobs in the American economy will require more education than the old industrial order did, further imperiling the fates of disadvantaged Americans.

It is too early to tell what the effect of technology will be on inequality and economic stratification in America. More theoretical commentators hold extremely dissonant views on the matter. The influential economist Jeremy Rifkin, for example, believes that, just as computers made ubiquitous information virtually free, the rest of the economy will follow, effectively rendering capitalism obsolete in a world of utopian abundance. Meanwhile, the technologist and engineer Jaron Lanier believes that the digital economy is inherently stratifying, since it permits a dwindling number of capitalists to manage organizations of ever-increasing size while simultaneously automating even relatively skilled jobs.

These notions are not far removed from those of the technocrats of the Electrical Revolution, who thought, as noted, that the paradox of abundance would automate jobs and make goods cheaper, such that, despite lower prices, most people would be unable to partake in the abundance shepherded by new technology. However, unlike the old technocrats, who despised capitalism and saw its proponents as “leisure-class” robber barons unfit to manage industry on behalf of humanity, the ‘neo-technocrats’ tend to accept capitalism as the framework that created the possibility of abundance in the first place.

As Jeremy Rifkin explained in The Zero Marginal Cost Society, capitalism succeeds because producers compete by reducing costs and increasing efficiency. Ultimately, he believes, producers will become so efficient that everything will be essentially free:

Capitalism’s operating logic is designed to fail by succeeding… While capitalism is far from putting itself out of business, it’s apparent that as it brings us ever closer to a near zero marginal cost society, its once unchallenged prowess is diminishing, making way for an entirely new way of organizing economic life in an age characterized by abundance rather than scarcity.
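Rifkin’s logic can be restated in textbook terms (a gloss of mine, not his notation): competition pushes the price \(p\) of a good toward its marginal cost \(MC\), the cost of producing one additional unit, so

\[
p \to MC \quad \text{and, as } MC \to 0, \quad p \to 0.
\]

Digital goods, which can be copied at essentially no cost, are the limiting case that Rifkin extrapolates to the rest of the economy.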

This is a point that is more or less agreed upon among today’s neo-technocrats. Where they disagree is over how society will respond should these assumptions prove accurate. Rifkin, for example, thinks that capitalism will reform itself with little turbulence or need for intervention. He sees the restructuring of the media and entertainment industry as instructive in how the rest of the economy might cope with this new environment. Even though digital technology has eroded the music industry’s ability to sell records and the news industry’s ability to sell papers, for example, both are finding creative new ways to serve their customers.

Jaron Lanier, on the other hand, sees rapid consolidation of wealth and power as an inevitable consequence of the paradox of abundance instigated by the Information Revolution, one that can only be averted through intervention. As he sees it, the key question is not “how much will be automated?” but how society will function after everything is.

Thorstein Veblen believed that industry fundamentally exists to serve the needs of its owners rather than consumers and citizens; the fact that it employs millions of people and aids in the operation of society is a secondary consideration, and one that Lanier does not think will survive once digital technology advances far enough. In response to this argument, the inventor and futurist Ray Kurzweil, author of The Singularity Is Near, in which he argues that the exponentially increasing speed of human progress will soon shepherd an infinitely progressive utopia, points out that, in general, stratification has not had that effect. He cites, for example, the rapidly depreciating value of a cell phone as a status symbol: “There are societies around the world in which the majority of the population were farming with their hands two decades ago and now have thriving information based economies with widespread use of cell phones”.

Compared to the Electrical Revolution, the Information Revolution does not appear to have created nearly the same degree of economic stratification. Certainly digital technology has made continuous employment in some sectors turbulent, yet most economists would agree that economic gains through increased productivity have generally offset the ill effects of computer-assisted automation. While the specter of massive capitalist consolidation that Lanier and Veblen cite is certainly troubling, the rejoinder that Rifkin and Kurzweil would, I think, agree on is that both the fruits of production and the means of production will become increasingly accessible.

The media industry, for example, is frequently cited as among those most affected by the zero-marginal-cost syndrome. While, as previously noted, some firms have consolidated into disturbingly large conglomerates, other aspects of publishing have become fantastically cheap and democratic. The most expensive part of starting a newspaper, as Benjamin Franklin recalled in his autobiography, used to be the purchase of a press and associated equipment. Today, we might imagine that Franklin would spare himself the trouble of dabbling with cheapskate crown governors and start a blog; as David Brooks once postulated, were he alive today, “he’d probably join the chorus of all those techno-enthusiasts who claim that the Internet and bio-tech breakthroughs are going to transform life on Earth wonderfully.”

Urban-Rural Stratification

part three

By the late 1920s, corporate resistance to publicly subsidized electric power had become a pathology that drove most of the utilities’ operational and marketing decisions. As previously noted, Sam Insull pioneered immersive corporate marketing, indoctrinating students by distributing “educational” materials to schools and aggressively marketing utility securities to consumer-investors. Meanwhile, the utilities’ marketing machine was introducing a new corporatist-tinged, semi-legal lexicon to mainstream American consciousness. Phrases and concepts emphasizing the importance of markets in American life began to collectively take on the appearance of doctrine:

(refrains like) ‘The sacredness of private property’, ‘constitutional guarantees of individual liberty’, (and) ‘promotion of private enterprise’ can be traced to the electric power issue as the testing ground of a new concept in the relation between government and quasi-public business corporations.

By the Great Depression, the power trusts controlled what may have been the most organized and effective political advocacy group in the United States. When, in 1935, Congress prepared to consider legislation aimed at enabling the development of rural markets, “the private utilities subjected the members of Congress and the public to one of the most carefully planned and executed lobbying campaigns in American History”, pioneering aggressive political advocacy tactics that are now common, like mobilizing investors and customers to send telegrams to their Congressmen with scripted requests written by well-funded trade organizations.

Utility executives would often act in bad faith to preserve the primacy of private interests in the electrical industry, using methods that extended far beyond mere lobbying efforts or exuberant marketing campaigns. In part, this was because the industry wanted to eventually develop rural markets itself. Yet at the same time, “electric companies were afraid to build rural lines because of potential REA intervention… if this proposal became law, it meant ruin for them.” Instead, “they found an effective weapon in the legal and bureaucratic procedure in the organization of a cooperative.”

One popular method was to erect “spite lines”; part of the REA legislation mandated that government-sponsored cooperatives could not cross the territory of a private operator. To contain cooperative expansion, some private utilities would send confederates to cooperative town meetings where new service was being planned, then rapidly erect power lines along seemingly arbitrary and unprofitable routes designed to disrupt public plans for service.

All of this led to a kind of corporately enforced urban-rural stratification. People in non-affluent areas could neither expect the electrical utility trusts to bring them service nor reliably establish it through other means. Yet by the late 1920s, electrification was undeniably the only path to modernity and improvement. This effectively perpetuated the division of America into two separate nations: one with the economic capacity for self-advancement, and one confined to the past, without what was, by 1935, a half-century-old technology.

This division was potent enough that it became a major campaign platform for Franklin Roosevelt, as well as the thrust behind numerous New Deal agencies. Rural electrification first became a touchstone of his presidency on the campaign trail, when FDR daily crossed the threshold between wealthy, urban, electrified cities and poor, rural, backward small towns. Roosevelt traced the defining moment in his decision to make electrification a focus of his rhetoric to a 1932 visit to Tupelo, Mississippi. Soon after, he regularly “wove the theme of domestic modernization as social modernization” into speeches and extemporaneous remarks during his presidency. In his words:

The introduction of electric cook stoves and all the other dozens of things which, when I was in the Navy, we called gadgets, is improving human life. They are things not especially new so far as invention is concerned, but more and more they are considered necessities of American life in every part of the country.

With FDR’s ascent to the presidency, the rhetoric didn’t let up, and it was rewarded with the enactment of programs like the REA and TVA, which were generally considered both political and humanitarian success stories, despite rabid opposition from moneyed utility interests. By 1934, Roosevelt had perfected his “neo-Jeffersonian” tone. At a TVA press conference, he downplayed the technological aspect of electrification altogether; technological advancement in rural areas was merely a means to a larger end:

Electric power is really a secondary matter. What we are doing there is taking a watershed with about three and a half million people in it, almost all of them rural, and we are trying to make a different kind of citizen out of them from what they would be under their present conditions.

This contrasted deeply with the ideology of urban development of the past. In 1909, one-seventh of America’s industrial output emerged from the 700 square miles surrounding the heart of Manhattan, and the vast majority of the remaining output could be found in a handful of other large cities. Metropolitan areas were coming to be conceived of as vast machines that engineers, politicians, and businessmen collectively managed for the advancement of the state. To maximize their benefit, they had “to be efficiently designed and managed for maximum output”.

Politicians and legal actors, eager to ensure America’s unimpeded industrial future, worked to accommodate industry however they could. One important Rhode Island case, for example, held that streetcar operators could not be held liable for the nuisance caused by their loud and often dangerous cars. Other precedents enshrined the rights of private owners of public utilities into constitutional law. Several federal court cases prevented municipalities from reducing the prices charged by electrical utilities by invoking the right to a guaranteed profit established in Smyth v. Ames, which had originally applied only to railroads. A flurry of further litigation won streetcar companies the right to operate on Sundays, permanently disrupting the usual calm of Sabbath services in many urbanized towns.

Needless to say, the process of creating the great Urban-American “machine” was one that necessarily sidelined rural development as a relic of another era. The obsession with industrial development directed attention away from the small towns and pastures whose virtues had once defined the American ideal. Instead it emphasized the integration of technological improvement and progressivism in the cities as the new hallmarks of Americanism.

As previously noted, the dawn of the Information Age brought with it a resurgence of neo-Jeffersonian hopes and predictions, mainly centering on the improved ability of white-collar workers to “telecommute” to important and influential jobs while also reaping the wholesome benefits of rural life. So far, these predictions have yet to be meaningfully realized, and the term “to telecommute” has taken on a less appealing meaning; it now generally refers to urban office workers who use technology to continue their jobs from home. Yet despite the fact that the original proponents of “telecommuting” were hardly prophetic, the phrase they coined has a continuing resonance in the Internet age.

Shortly before the passage of the Telecommunications Act of 1996, a consortium of telecommunications companies, including AT&T, IBM, and a host of international telecom monopolies, sponsored the Aspen Institute to create a report, Building a Global Information Society. The very first heading of the first chapter (“Historically Not New”) attempted to establish that the ascendancy of the “information society” predated the Internet, Al Gore, and the deregulation of the 70s and 80s. The growing rhetoric around reform, it argued in the next section (“Criticisms and Observations”), was driven not by the telecommunications environment but by growing political hype around the information superhighway. The Internet was an “evolution, not a revolution”, one whose success was predicated on the diligence of the international telecommunications utility machine.

What the Aspen Institute failed to grasp was that, unlike the telecommunications technologies before it, the World Wide Web demanded a new “conceptual framework”. This was because the character of the information age had fundamentally changed in the 1990s. The emerging framework held that the Internet was taking on the qualities of a physical place: “cyberspace”. Although in a narrow sense the operation of the information superhighway merely involved the rapid transfer of digital signals, which telecoms had been doing for decades, the World Wide Web lent a certain aura to this process, creating a new “virtual reality” that, in a theoretical sense, contained space and time and cultivated consciousness and even culture. By 1996, technologists like Vincent Miller were approaching the Internet through a sociological lens, exploring the nuances of online linguistics, identity, economics, politics, and even warfare, all concepts formerly relegated to the material world.

This is why the phrase “telecommuter” has had such resonance over the past two decades; it established early on that the Internet was a place where someone could go to work or spend time. Along with other terms popularized in the 1990s, like the “World Wide Web”, the “Information Superhighway”, and “Cyberspace”, “telecommuting” captured the physical essence of an entity that lacked physicality.

To the politicians tasked with regulating the Internet and others unversed in digital culture, this concept might have been difficult to comprehend. But to Stewart Brand’s cohort of tech-savvy futurists, the online world was not just a place; it was a nation. John Perry Barlow, former Grateful Dead lyricist, Republican political operative, and Electronic Frontier Foundation co-founder, ostensibly declared the Internet’s independence in response to the Telecommunications Act while attending the World Economic Forum in Davos in 1996:

Governments derive their just powers from the consent of the governed. You have neither solicited nor received ours. We did not invite you. You do not know us, nor do you know our world. Cyberspace does not lie within your borders. Do not think that you can build it, as though it were a public construction project. You cannot. It is an act of nature and it grows itself through our collective actions.

While Barlow’s widely read declaration did much to articulate the image of the Internet as a place, it was perhaps overzealous in its aggressive assertion of autonomous sovereignty. The presence of the Internet would, of course, have wide-ranging implications within the reality-bound physical world. Like all transformative technologies, the creation of cyberspace inevitably resulted in all sorts of sociological and economic shifts. As Anthony Wilhelm observed in Digital Nation:

The threat of information and communications technologies creating massive advantages for some societies and regions and perhaps contributing to the demise of others, follows a long line of relatively recent inventions that spawned modern metropolises and unrivaled economic productivity while also leading to the disappearance of other places and ways of life.

The “digital divide”, the disparity between people with access to high-speed Internet and people without it, is not nearly as pronounced as the electrical divide before it. Yet there are still many disturbing lapses in service that invite comparisons between modern telecommunications providers and the early electrical utilities. Today, 19 million people in America lack even the option to subscribe to high-speed Internet, regardless of whether they could afford it. Further, Internet service in America is, on average, slower and more expensive than service in many comparable developed nations.

According to Susan Crawford, a columnist with Wired magazine who tracks developments in communications regulatory policy, telecom companies have been slow to create cheaper and better service and to increase coverage because they lack the incentive to do so. She attributes this to the cozy relationship between major American telecoms and the FCC, their primary regulator. The companies’ first line of defense, she says, is to encourage regulatory barriers at the state level, for example by barring municipalities from subsidizing their own networks and by making it difficult for competitors to obtain regulatory approval, barriers the FCC could simply eliminate through policy change.

The old critics of ineffective regulatory policy, like the previously noted Bernhard Ostrolenk in Electricity: For Use or for Profit, lacked the vocabulary to convey to the public the corrosive effects of public utilities on American governance. Ostrolenk, for example, described a complacent system marred by “a long, sordid story of misleading propaganda, political interference, outright bribery of legislators, commission members, college professors, editors, and others on whom the public is expected to rely upon for protection.” He also described how utilities preferred to achieve growth by encouraging greater use of electricity by current customers rather than by expanding coverage to new ones. Today, critics like Susan Crawford are armed with terms like “regulatory capture” and “harvesting” that describe how telecoms use the very same tactics. In the evolving discussion over issues like media consolidation and net neutrality, a new generation of consumer advocates may find it easier to articulate complex regulatory issues than their predecessors did.

Advocates have reason to be worried. In addition to the concerns over social divides that unequal access invites, economic issues have also emerged. Just as heavy industrial jobs gravitated toward urban environments, the white-collar service jobs that the Information Age created further entrenched economic prosperity and social influence in the urban areas of America. Despite its widespread adoption, digital technology promises to be even more devastating to rural communities. As noted, electrification, while skewed toward industrial use, had clear benefits for agriculture. The Information Age seems less democratic in that it clearly benefits service-based jobs primarily found in cities while adding little value to rural economies. As an Economic Research Service study found, Internet access had little if any discernible impact on farm productivity or wages, yet the Internet is absolutely essential in urban economic centers, where over 80 percent of Fortune 500 companies require job seekers to apply through online resume submission, a norm that clearly benefits people who are fluent on the Web.

Yet comparing the economics of rural and urban regions in America neglects an arguably more important cultural dissonance that has emerged in tandem with the Information Age. Technology has historically served to advance the cultural mores of urban communities at the expense of rural communities, even as the latter struggle to adapt.

As Harold Platt notes, new technologies tend to appeal to the cultural values of city dwellers. Electrification dispersed urban communities with streetcars, slowed the ebb and flow of housewives’ incessant excursions to markets with the refrigerator, and allowed families to hear news and entertainment at home through the radio rather than at community gatherings. These trends inevitably led to a greater sense of individual autonomy and less reliance on the communal values generally held dear among rural populations:

Electricity supplied the technical means to construct not only a better physical environment for the entire (urban) community but also a particular cultural vision of the ideal city, one that embodied deep seated values of privatism and social segregation.

In other words, the adoption of new technologies has historically served to help cities achieve their own utopian vision, one that involves not just machine-like economic efficiency but mechanical social organization as well. This has been as true of the Information Age as it was of the Electrical Revolution. While modern communications technology gives its users the ability to engage in all sorts of previously unfathomable social encounters, for example through blogging, social networks, and instant messaging, the Internet, by its nature, gives rise to the type of social segmentation that Platt spoke of.

Robert Hassan, author of The Information Society, described the Internet’s effect on communication as the “commodification” of social interactions. He argued that, throughout history, as communication has become more formal, human interactions have become more transactional. Communicating through the Web, he argues, is the ultimate realization of the commodification of social relations. Hassan believes that the information society threatens to unhinge traditional social discourse and replace it with a new dialogue:

Informal networks of communication, such as people speaking face to face, are commoditized when those people begin to text or e-mail each other… the effect is to connect the individual with an instrumental logic that has buying and selling as its core rationale… it would seem logical to argue that there are limits to the extent to which our culture may be commoditized before it mutates into something qualitatively different.

Regardless of their economic state, rural communities may have a hard time adapting to a continuously changing social atmosphere that overwhelmingly lends itself to the regimented, rational, and mechanical social values of urban communities. As the cultures stratify, could there be a resurrection of a Grapes of Wrath-style dissonance, in which rural communities become the victims of an ascendant urban class, each lacking the ability or even the inclination to empathize with the other? This is the true specter of the technologically fueled stratification that electrification invited, and that the Information Age now threatens to invite again.

Cover Art Credit
From the New World
YANG YONGLIANG

The cover art and other inserts were taken from the Chinese artist Yang Yongliang’s exhibition From the New World, which I first saw at the Metropolitan Museum of Art in 2013. Trained in traditional calligraphy and painting since his boyhood in the 1980s, he began working in experimental and contemporary art using digital technology in 2005. Over that period, China experienced an increase in electrification from under 50 percent of households to nearly 100 percent today.

Conclusion

It is difficult to contextualize the emergence of new technology within the social mores of a society that values tradition and looks to past experience for wisdom as it advances toward the future. Through the Electrical Revolution, America was blindsided by the enormity and newness of rapidly advancing technology. The fact that it took more than half a century to achieve a reasonable rate of electrification is an unacknowledged calamity in American history. Without a past framework to draw on, the trajectory of electrification was governed largely by the most dangerous social and political orthodoxies that had emerged through the Industrial Revolution: corporatism, corruption, and tolerance of social stratification.

Those destructive values were not a result of electrification per se, but a byproduct of the institutions that managed its adoption. With new technologies come new institutions to support them. In the past, that meant the emergence of government agencies, like the REA; industry trade and advocacy groups, like the NELA and CREA; and quasi-public corporations, like the electric utilities and streetcar operators. Today a similar set of institutions is emerging to meet the needs of the Information Age. Some of them have an incredible capacity for social good: institutions that make education free and accessible, like online schools; that allow inventors and entrepreneurs to advance technology, like venture capital firms; or that allow people to associate in new and creative ways, like social media platforms.

But institutions are also emerging that bear a troubling resemblance to the most corrosive entities of the Electrical Revolution. First among these are the telecommunications companies, which today have financial strength and political influence rivaling those of the early utilities. Like the utilities, many telecoms, and the policy that governs them, trace their original success to a deeply corporatist legacy: government-enforced corporate success. As Susan Crawford argued, the telecoms depend for their enormous profits not on customer satisfaction but on an implicit monopoly. Today there is an increasingly contentious national conversation about the role of telecoms in American life.

Net neutrality, the principle that Internet providers should not be able to price discriminate between different websites, offers a perfect example of this. Less than a month before this paper was published, Barack Obama recommended that the FCC regulate telecom companies as utilities under Title II of the Communications Act of 1934, a law designed to ensure that telephone trusts, namely AT&T, acted on behalf of society despite their monopoly status. Senator Ted Cruz, the de facto voice of opposition to the policy, ardently argued that such regulation restricts innovation and leads to higher prices. Rather than enshrining the telecom monopoly in law by making the companies utilities, perhaps the monopoly could be broken. We can learn from the mistakes of the past.

This paper should be interpreted as a critique of an absolutist view of Technological Determinism. It should be clear that people have enormous power in deciding how technology and society converge, and this responsibility should not be taken lightly. We expect technology to shepherd progressive social change, yet we are often disappointed. A uniting theme of all three chapters of this paper is that the early adoption of technology often embeds traditional values even deeper within society, despite its conspicuous capacity to enable new and exciting progressive values. As noted, the observed effect of both information and electric technologies was that they “accepted the nation’s fragmented society and segmented economy as the framework” for future development.

This is not to say that America should become a nation of Luddites; the adoption of new technologies, by its nature, as Stewart Brand so articulately understood, invites potentially liberating social, political, and economic change. Technology that changes daily life as fundamentally as electrification or ubiquitous information demands a certain amount of social upheaval in order to capture its full benefit.

Charlie Chaplin understood this notion with acuity. Four years after he epically articulated the urban critique of the Industrial Revolution in Modern Times, America heard his voice for the first time in the “talkie” The Great Dictator, which premiered in 1940. In his legendary speech at the end of that film, he presented an ethic that reconciled that critique with a far more nuanced, deeply technocratic vision:

We have developed speed, but we have shut ourselves in. Machinery that gives abundance has left us in want. Our knowledge has made us cynical. Our cleverness, hard and unkind. We think too much and feel too little. More than machinery we need humanity. More than cleverness we need kindness and gentleness. Without these qualities, life will be violent and all will be lost… 

The radio and the airplane have brought us closer together. The very nature of these inventions cries out for the goodness in men - cries out for universal brotherhood - for the unity of us all. You, the people have the power, the power to create machines. The power to create happiness! You, the people, have the power to make this life free and beautiful, to make this life a wonderful adventure… 

Let us fight to free the world, to do away with national barriers, to do away with greed, with hate and intolerance. Let us fight for a world of reason, a world where science and progress will lead to all men’s happiness.

Unlike mainstream Technocracy, the speech dispensed with the elitism and exclusivity of the movement. I think this is an ethic that today’s hackers would deeply admire. As Laurie Cranor wrote, “the interplay between society and technology is a complex improvisational dance.” The trust-busting fervor of the 1930s ended the ironically corporatist obsession with free enterprise and allowed rural electrification. The counterculture of the sixties destroyed the organizational ethic and enabled the creation of the personal computer. The deregulatory mood of the 1980s incited the rapid installation of the fiber-optic backbone of the Information Superhighway. The moral of this paper is that technology cannot create positive change for society until society first changes on behalf of technology. Today, society has a rich heritage and history to draw from as it considers how it will adapt to the challenges that new technologies invite.