Wednesday, May 20, 2015

The Era of Impact

Of all the wistful superstitions that cluster around the concept of the future in contemporary popular culture, the most enduring has to be the notion that somehow, sooner or later, something will happen to shake the majority out of its complacency and get it to take seriously the crisis of our age. Week after week, I field comments and emails that presuppose that belief. People want to know how soon I think the shock of awakening will finally hit, or wonder whether this or that event will do the trick, or simply insist that the moment has to come sooner or later.

To all such inquiries and expostulations I have no scrap of comfort to offer. Quite the contrary, what history shows is that a sudden awakening to the realities of a difficult situation is far and away the least likely result of what I’ve called the era of impact, the second of the five stages of collapse. (The first, for those who missed last week’s post, is the era of pretense; the remaining three, which will be covered in the coming weeks, are the eras of response, breakdown, and dissolution.)

The era of impact is the point at which it becomes clear to most people that something has gone wrong with the most basic narratives of a society—not just a little bit wrong, in the sort of way that requires a little tinkering here and there, but really, massively, spectacularly wrong. It arrives when an asset class that was supposed to keep rising in price forever stops rising, does its Wile E. Coyote moment of hang time, and then drops like a stone. It shows up when an apparently entrenched political system, bristling with soldiers and secret police, implodes in a matter of days or weeks and is replaced by a provisional government whose leaders look just as stunned as everyone else. It comes whenever a state of affairs that was assumed to be permanent runs into serious trouble—but somehow it never seems to succeed in getting people to notice just how temporary that state of affairs always was.

Since history is the best guide we’ve got to how such events work out in the real world, I want to take a couple of examples of the kind just outlined and explore them in a little more detail. The stock market bubble of the 1920s makes a good case study on a relatively small scale. In the years leading up to the crash of 1929, stock values in the US stock market quietly disconnected themselves from the economic fundamentals and began what was, for the time, an epic climb into la-la land. There were important if unmentionable reasons for that airy detachment from reality; the most significant was the increasingly distorted distribution of income in 1920s America, which put more and more of the national wealth in the hands of fewer and fewer people and thus gutted the national economy.

It’s one of the repeated lessons of economic history that money in the hands of the rich does much less good for the economy as a whole than money in the hands of the working classes and the poor. The reasoning here is as simple as it is inescapable. Industrial economies survive and thrive on consumer expenditures, but consumer expenditures are limited by the ability of consumers to buy the things they want and need. As money is diverted away from the lower end of the economic pyramid, you get demand destruction—the process by which those who can’t afford to buy things stop buying them—and consumer expenditures fall off. The rich, by contrast, divert a large share of their income out of the consumer economy into investments; the richer they get, the more of the national wealth ends up in investments rather than consumer expenditures; and as consumer expenditures falter, and investments linked to the consumer economy falter in turn, more and more money ends up in illiquid speculative vehicles that are disconnected from the productive economy and do nothing to stimulate demand.

That’s what happened in the 1920s. All through the decade in the US, the rich got richer and the poor got screwed, speculation took the place of productive investment throughout the US economy, and the well-to-do wallowed in the wretched excess chronicled in F. Scott Fitzgerald’s The Great Gatsby while most other people struggled to get by. The whole decade was a classic era of pretense, crowned by the delusional insistence—splashed all over the media of the time—that everyone in the US could invest in the stock market and, since the market was of course going to keep on rising forever, everyone in the US would thus inevitably become rich.

It’s interesting to note that there were people who saw straight through the nonsense and tried to warn their fellow Americans about the inevitable consequences. They were denounced six ways from Sunday by all right-thinking people, in language identical to that used more recently on those of us who’ve had the effrontery to point out that an infinite supply of oil can’t be extracted from a finite planet.  The people who insisted that the soaring stock values of the late 1920s were the product of one of history’s great speculative bubbles were dead right; they had all the facts and figures on their side, not to mention plain common sense; but nobody wanted to hear it.

When the stock market peaked just after the Labor Day weekend in 1929 and started trending down, therefore, the immediate response of all right-thinking people was to insist at the top of their lungs that nothing of the sort was happening, that the market was simply catching its breath before its next great upward leap, and so on. Each new downward lurch was met by a new round of claims along these lines, louder, more dogmatic, and more strident than the one that preceded it, and nasty personal attacks on anyone who didn’t support the delusional consensus filled the media of the time.

People were still saying those things when the bottom dropped out of the market.

Tuesday, October 29, 1929 can reasonably be taken as the point at which the era of pretense gave way once and for all to the era of impact. That’s not because it was the first day of the crash—there had been ghastly slumps on the previous Thursday and Monday, on the heels of two months of less drastic but still seriously ugly declines—but because, after that day, the pundits and the media pretty much stopped pretending that nothing was wrong. Mind you, next to nobody was willing to talk about what exactly had gone wrong, or why it had gone wrong, but the pretense that the good fairy of capitalism had promised Americans happy days forever was out the window once and for all.

It’s crucial to note, though, that what followed this realization was the immediate and all but universal insistence that happy days would soon be back if only everyone did the right thing. It’s even more crucial to note that what nearly everyone identified as “the right thing”—running right out and buying lots of stocks—was a really bad idea that bankrupted many of those who did it, and didn’t help the imploding US economy at all.

It’s probably necessary to talk about this in a little more detail, since it’s been an article of blind faith in the United States for many decades now that it’s always a good idea to buy and hold stocks. (I suspect that stockbrokers have had a good deal to do with the promulgation of this notion.) It’s been claimed that someone who bought stocks in 1929 at the peak of the bubble, and then held onto them, would have ended up in the black eventually, and for certain values of “eventually,” this is quite true—but it took the Dow Jones industrial average until the mid-1950s to return to its 1929 high, and so for a quarter of a century our investor would have been underwater on his stock purchases.

What’s more, the Dow isn’t necessarily a good measure of stocks generally; many of the darlings of the market in the 1920s either went bankrupt in the Depression or never again returned to their 1929 valuations. Nor did the surge of money into stocks in the wake of the 1929 crash stave off the Great Depression, or do much of anything else other than provide a great example of the folly of throwing good money after bad. The moral to this story? In an era of impact, the advice you hear from everyone around you may not be in your best interest.

That same moral can be shown just as clearly in the second example I have in mind, the French Revolution. We talked briefly in last week’s post about the way that the French monarchy and aristocracy blinded themselves to the convulsive social and economic changes that were pushing France closer and closer to a collective explosion on the grand scale, and pursued business as usual long past the point at which business as usual was anything but a recipe for disaster. Even when the struggle between the Crown and the aristocracy forced Louis XVI to convene the États-Généraux—the rarely-held national parliament of France, which had powers more or less equivalent to a constitutional convention in the US—next to nobody expected anything but long rounds of political horse-trading from which some modest shifts in the balance of power might result.

That was before the summer of 1789. On June 17, the deputies of the Third Estate—the representatives of the commoners—declared themselves a National Assembly and staged what amounted to a coup d’état; on July 14, faced with the threat of a military response from the monarchy, the Parisian mob seized the Bastille, kickstarting a wave of revolt across the country that put government and military facilities in the hands of the revolutionary National Guard and broke the back of the feudal system; on August 4, the National Assembly abolished all feudal rights and legal distinctions between the classes. In less than two months, a political and social system that had been welded firmly in place for a thousand years came crashing to the ground.

Those two months marked the end of the era of pretense and the arrival of the era of impact. The immediate response, with a modest number of exceptions among the aristocracy and the inner circles of the monarchy’s supporters, was frantic cheering and an insistence that everything would soon settle into a wonderful new age of peace, prosperity, and liberty. All the overblown dreams of the philosophes about a future age governed by reason were trotted out and treated as self-evident fact. Of course that’s not what happened; once it was firmly in power, the National Assembly used its unchecked authority as abusively as the monarchy had once done; factional struggles spun out of control, and before long mob rule and the guillotine were among the basic facts of life in Revolutionary France. 

Among the most common symptoms of an era of impact, in other words, is the rise of what we may as well call “crackpot optimism”—the enthusiastic and all but universal insistence, in the teeth of the evidence, that the end of business as usual will turn out to be the door to a wonderful new future. In the wake of the 1929 stock market crash, people were urged to pile back into the market in the belief that this would cause the economy to boom again even more spectacularly than before, and most of the people who followed this advice proceeded to lose their shirts. In the wake of the revolution of 1789, likewise, people across France were encouraged to join with their fellow citizens in building the shining new utopia of reason, and a great many of those who followed that advice ended up decapitated or, a little later, dying of gunshot or disease in the brutal era of pan-European warfare that extended almost without a break from the cannonade of Valmy in 1792 to the battle of Waterloo in 1815.

And the present example? That’s a question worth exploring, if only for the utterly pragmatic reason that most of my readers are going to get to see it up close and personal.

That the United States and the industrial world generally are deep in an era of pretense is, I think, pretty much beyond question at this point. We’ve got political authorities, global bankers, and a galaxy of pundits insisting at the top of their lungs that nothing is wrong, everything is fine, and we’ll be on our way to the next great era of prosperity if we just keep pursuing a set of boneheaded policies that have never—not once in the entire span of human history—brought prosperity to the countries that pursued them. We’ve got shelves full of books for sale in upscale bookstores insisting, in the strident language usual to such times, that life is wonderful in this best of all possible worlds, and it’s going to get better forever because, like, we have technology, dude! Across the landscape of the cultural mainstream, you’ll find no shortage of cheerleaders assuring everyone that everything’s going to be fine, that even though they said ten years ago that we only have ten years to do something before disaster hits, why, we still have ten years before disaster hits, and when ten more years pass by, why, you can be sure that the same people will be insisting that we have ten more.

This is the classic rhetoric of an era of pretense. Over the last few years, though, it’s seemed to me that the voices of crackpot optimism have gotten more shrill, the diatribes more fact-free, and the logic even shoddier than it was in Bjorn Lomborg’s day, which is saying something. We’ve reached the point that state governments are making it a crime to report on water quality and forbidding officials from using such unwelcome phrases as “climate change.” That’s not the action of people who are confident in their beliefs; it’s the action of a bunch of overgrown children frantically clenching their eyes shut, stuffing their fingers in their ears, and shouting “La, la, la, I can’t hear you.”

That, in turn, suggests that the transition to the era of impact may be fairly close. Exactly when it’s likely to arrive is a complex question, and exactly what’s going to land the blow that will crack the crackpot optimism and make it impossible to ignore the arrival of real trouble is an even more complex one. In 1929, those who hadn’t bought into the bubble could be perfectly sure—and in fact, a good many of them were perfectly sure—that the usual mechanism that brings bubbles to a catastrophic end was about to terminate the boom of the 1920s with extreme prejudice, as indeed it did. In the last decades of the French monarchy, it was by no means clear exactly what sequence of events would bring the Ancien Régime crashing down, but such thoughtful observers as Talleyrand knew that something of the sort was likely to follow the crisis of legitimacy then under way.

The problem with trying to predict the trigger that will bring our current situation to a sudden stop is that we’re in such a target-rich environment. Looking over the potential candidates for the sudden shock that will stick a fork in the well-roasted corpse of business as usual, I’m reminded of the old board game Clue. Will Mr. Boddy’s killer turn out to be Colonel Mustard in the library with a lead pipe, Professor Plum in the conservatory with a candlestick, or Miss Scarlet in the dining room with a rope?

In much the same sense, we’ve got a global economy burdened to the breaking point with more than a quadrillion dollars of unpayable debt; we’ve got a global political system coming apart at the seams as the United States slips toward the usual fate of empires and its rivals circle warily, waiting for the kill; we’ve got a domestic political system here in the US entering a classic prerevolutionary condition under the impact of a textbook crisis of legitimacy; we’ve got a global climate that’s hammered by our rank stupidity in treating the atmosphere as a gaseous sewer for our wastes; we’ve got a global fossil fuel industry that’s frantically trying to pretend that scraping the bottom of the barrel means that the barrel is full, and the list goes on. It’s as though Colonel Mustard, Professor Plum, Miss Scarlet, and the rest of them all ganged up on Mr. Boddy at once, and only the most careful autopsy will be able to determine which of them actually dealt the fatal blow.

In the midst of all this uncertainty, there are three things that can, I think, be said for certain about the end of the current era of pretense and the coming of the era of impact. The first is that it’s going to happen. When something is unsustainable, it’s a pretty safe bet that it won’t be sustained indefinitely, and a society that keeps on embracing policies that swap short-term gains for long-term problems will sooner or later end up awash in the consequences of those policies. Timing such transitions is difficult at best; it’s an old adage among stock traders that the market can stay irrational longer than you can stay solvent. Still, the points made above—especially the increasingly shrill tone of the defenders of the existing order—suggest to me that the era of impact may be here within a decade or so at the outside.

The second thing that can be said for certain about the coming era of impact is that it’s not the end of the world. Apocalyptic fantasies are common and popular in eras of pretense, and for good reason; fixating on the supposed imminence of the Second Coming, human extinction, or what have you, is a great way to distract yourself from the real crisis that’s breathing down your neck. If the real crisis in question is partly or wholly a result of your own actions, while the apocalyptic fantasy can be blamed on someone or something else, that adds a further attraction to the fantasy.

The end of industrial civilization will be a long, bitter, painful cascade of conflicts, disasters, and accelerating decline in which a vast number of people are going to die before they otherwise would, and a great many things of value will be lost forever. That’s true of any falling civilization, and the misguided decisions of the last forty years have pretty much guaranteed that the current example is going to have an extra helping of all these unwelcome things. I’ve discussed at length, in earlier posts in the Dark Age America sequence here and in other sequences as well, why the sort of apocalyptic sudden stop beloved of Hollywood scriptwriters is the least likely outcome of the predicament of our time; still, insisting on the imminence and inevitability of some such game-ending event will no doubt be as popular as usual in the years immediately ahead.

The third thing that I think can be said for certain about the coming era of impact, though, is the one that counts. If it follows the usual pattern, as I expect it to do, once the crisis hits there will be serious, authoritative, respectable figures telling everyone exactly what they need to do to bring an end to the troubles and get the United States and the world back on track to renewed peace and prosperity. Taking these pronouncements seriously and following their directions will be extremely popular, and it will almost certainly also be a recipe for unmitigated disaster. If forewarned is forearmed, as the saying has it, this is a piece of firepower to keep handy as the era of pretense winds down. In next week’s post, we’ll talk about comparable weaponry relating to the third stage of collapse—the era of response.

Wednesday, May 13, 2015

The Era of Pretense

I've mentioned in previous posts here on The Archdruid Report the educational value of the comments I receive from readers in the wake of each week’s essay. My post two weeks ago on the death of the internet was unusually productive along those lines. One of the comments I got in response to that post gave me the theme for last week’s essay, but there was at least one other comment calling for the same treatment. Like the one that sparked last week’s post, it appeared on one of the many other internet forums on which The Archdruid Report is reposted, and it unintentionally pointed up a common and crucial failure of imagination that shapes, or rather misshapes, the conventional wisdom about our future.

Curiously enough, the point that set off the commenter in question was the same one that incensed the author of the denunciation mentioned in last week’s post: my suggestion in passing that fifty years from now, most Americans may not have access to electricity or running water. The commenter pointed out angrily that I’d described the twilight of industrial civilization as a ragged arc of decline spanning one to three centuries; now, he insisted, I was saying that it was going to take place in the next fifty years, and this apparently convinced him that everything I said ought to be dismissed out of hand.

I run into this sort of confusion all the time. If I suggest that the decline and fall of a civilization usually takes several centuries, I get accused of inconsistency if I then note that one of the sharper downturns included in that process may be imminent.  If I point out that the United States is likely within a decade or two of serious economic and political turmoil, driven partly by the implosion of its faltering global hegemony and partly by a massive crisis of legitimacy that’s all but dissolved the tacit contract between the existing order of US society and the masses who passively support it, I get accused once again of inconsistency if I then say that whatever comes out the far side of that crisis—whether it’s a battered and bruised United States or a patchwork of successor states—will then face a couple of centuries of further decline and disintegration before the deindustrial dark age bottoms out.

Now of course there’s nothing inconsistent about any of these statements. The decline and fall of a civilization isn’t a single event, or even a single linear process; it’s a complex fractal reality composed of many different events on many different scales in space and time. If it takes one to three centuries, as usual, those centuries are going to be taken up by an uneven drumbeat of wars, crises, natural disasters, and assorted breakdowns on a variety of time frames with an assortment of local, regional, national, or global effects. The collapse of US global hegemony is one of those events; the unraveling of the economic and technological framework that currently provides most Americans with electricity and running water is another, but neither of those is anything like the whole picture.

It’s probably also necessary to point out that any of my readers who think that being deprived of electricity and running water is the most drastic kind of collapse imaginable have, as the saying goes, another think coming. Right now, in our oh-so-modern world, there are billions of people who get by without regular access to electricity and running water, and most of them aren’t living under dark age conditions. A century and a half ago, when railroads, telegraphs, steamships, and mechanical printing presses were driving one of history’s great transformations of transport and information technology, next to nobody had electricity or running water in their homes. The technologies of 1865 are not dark age technologies; in fact, the gap between 1865 technologies and dark age technologies is considerably greater, by most metrics, than the gap between 1865 technologies and the ones we use today.

Furthermore, whether or not Americans have access to running water and electricity may not have as much to say about the future of industrial society everywhere in the world as the conventional wisdom would suggest.  I know that some of my American readers will be shocked out of their socks to hear this, but the United States is not the whole world. It’s not even the center of the world. If the United States implodes over the next two decades, leaving behind a series of bankrupt failed states to squabble over its territory and the little that remains of its once-lavish resource base, that process will be a great source of gaudy and gruesome stories for the news media of the world’s other continents, but it won’t affect the lives of the readers of those stories much more than equivalent events in Africa and the Middle East affect the lives of Americans today.

As it happens, over the next one to three centuries, the benefits of industrial civilization are going to go away for everyone. (The costs will be around a good deal longer—in the case of the nuclear wastes we’re so casually heaping up for our descendants, a good quarter of a million years, but those and their effects are rather more localized than some of today’s apocalyptic rhetoric likes to suggest.) The reasoning here is straightforward. White’s Law, one of the fundamental principles of human ecology, states that economic development is a function of energy per capita; the immense treasure trove of concentrated energy embodied in fossil fuels, and that alone, made possible the sky-high levels of energy per capita that gave the world’s industrial nations their brief era of exuberance; as fossil fuels deplete, and remaining reserves require higher and higher energy inputs to extract, the levels of energy per capita the industrial nations are used to having will go away forever.
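For readers who prefer their principles in compact form, White’s Law is conventionally written as a simple product, following Leslie White’s own mid-twentieth-century formulation (the symbol names below follow his customary usage):

```latex
% White's Law, in Leslie White's customary notation:
%   C : degree of cultural (here, economic) development
%   E : energy harnessed per capita per year
%   T : efficiency of the technological means that put that energy to work
C = E \times T
```

Read this way, the argument above amounts to the observation that as fossil fuel depletion drives E down, C must fall with it unless T rises enough to compensate, and thermodynamic limits on efficiency keep T from rising indefinitely.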

It’s important to be clear about this. Fossil fuels aren’t simply one energy source among others; in terms of concentration, usefulness, and fungibility—that is, the ability to be turned into any other form of energy that might be required—they’re in a category all by themselves. Repeated claims that fossil fuels can be replaced with nuclear power, renewable energy resources, or what have you sound very good on paper, but every attempt to put those claims to the test so far has either gone belly up in short order, or become a classic subsidy dumpster surviving purely on a diet of government funds and mandates.

Three centuries ago, the earth’s fossil fuel reserves were the largest single deposit of concentrated energy in this part of the universe; now we’ve burnt through nearly all the easily accessible reserves, and we’re scrambling to keep the tottering edifice of industrial society going by burning through the dregs that remain. As those run out, the remaining energy resources—almost all of them renewables—will certainly sustain a variety of human societies, and some of those will be able to achieve a fairly high level of complexity and maintain some kinds of advanced technologies. The kind of absurd extravagance that passes for a normal standard of living among the more privileged inmates of the industrial nations is another matter, and as the fossil fuel age sunsets out, it will end forever.

The fractal trajectory of decline and fall mentioned earlier in this post is simply the way this equation works out on the day-to-day scale of ordinary history. Still, those of us who happen to be living through a part of that trajectory might reasonably be curious about how it’s likely to unfold in our lifetimes. I’ve discussed in a previous series of posts, and in my book Decline and Fall: The End of Empire and the Future of Democracy in 21st Century America, how the end of US global hegemony is likely to unfold, but as already noted, that’s only a small portion of the broader picture. Is a broader view possible?

Fortunately history, the core resource I’ve been using to try to make sense of our future, has plenty to say about the broad patterns that unfold when civilizations decline and fall. Now of course I know all I have to do is mention that history might be relevant to our present predicament, and a vast chorus of voices across the North American continent and around the world will bellow at rooftop volume, “But it’s different this time!” With apologies to my regular readers, who’ve heard this before, it’s probably necessary to confront that weary thoughtstopper again before we proceed.

As I’ve noted before, claims that it’s different this time are right where it doesn’t matter and wrong where it counts.  Predictions made on the basis of history—and not just by me—have consistently predicted events over the last decade or so far more accurately than predictions based on the assumption that history doesn’t matter. How many times, dear reader, have you heard someone insist that industrial civilization is going to crash to ruin in the next six months, and then watched those six months roll merrily by without any sign of the predicted crash? For that matter, how many times have you heard someone insist that this or that policy that’s never worked any other time that it’s been tried, or this or that piece of technological vaporware that’s been the subject of failed promises for decades, will inevitably put industrial society back on its alleged trajectory to the stars—and how many times has the policy or the vaporware been quietly shelved, and something else promoted using the identical rhetoric, when it turned out not to perform as advertised?

It’s been a source of wry amusement to me to watch the same weary, dreary, repeatedly failed claims of imminent apocalypse and inevitable progress being rehashed year after year, varying only in the fine details of the cataclysm du jour and the techno-savior du jour, while the future nobody wants to talk about is busily taking shape around us. Decline and fall isn’t something that will happen sometime in the conveniently distant future; it’s happening right now in the United States and around the world. The amusement, though, is tempered with a sense of familiarity, because the period in which decline is under way but nobody wants to admit that fact is one of the recurring features of the history of decline.

There are, very generally speaking, five broad phases in the decline and fall of a civilization. I know it’s customary in historical literature to find nice dull labels for such things, but I’m in a contrary mood as I write this, so I’ll give them unfashionably colorful names: the eras of pretense, impact, response, breakdown, and dissolution. Each of these is complex enough that it’ll need a discussion of its own; this week, we’ll talk about the era of pretense, which is the one we’re in right now.

Eras of pretense are by no means limited to the decline and fall of civilizations. They occur whenever political, economic, or social arrangements no longer work, but the immediate costs of admitting that those arrangements don’t work loom considerably larger in the collective imagination than the future costs of leaving those arrangements in place. It’s a curious but consistent wrinkle of human psychology that this happens even if those future costs soar right off the scale of frightfulness and lethality; if the people who would have to pay the immediate costs don’t want to do so, in fact, they will reliably and cheerfully pursue policies that lead straight to their own total bankruptcy or violent extermination, and never let themselves notice where they’re headed.

Speculative bubbles are a great setting in which to watch eras of pretense in full flower. In the late phases of a bubble, when it’s clear to anyone who has two spare neurons to rub together that the boom du jour is cobbled together of equal parts delusion and chicanery, the people who are most likely to lose their shirts in the crash are the first to insist at the top of their lungs that the bubble isn’t a bubble and their investments are guaranteed to keep on increasing in value forever. Those of my readers who got the chance to watch some of their acquaintances go broke in the real estate bust of 2008-9, as I did, will have heard this sort of self-deception at full roar; those who missed the opportunity can make up for the omission by checking out the ongoing torrent of claims that the soon-to-be-late fracking bubble is really a massive energy revolution that will make America wealthy and strong again.

The history of revolutions offers another helpful glimpse at eras of pretense. France in the decades before 1789, to cite a conveniently well-documented example, was full of people who had every reason to realize that the current state of affairs was hopelessly unsustainable and would have to change. The things about French politics and economics that had to change, though, were precisely those things that the French monarchy and aristocracy were unwilling to change, because any such reforms would have cost them privileges they’d had since time out of mind and were unwilling to relinquish.

Louis XV, who finished up his long and troubled reign a supreme realist, is said to have muttered “Après moi, le déluge”—“Once I’m gone, this sucker’s going down” may not be a literal translation, but it catches the flavor of the utterance—but that degree of clarity was rare in his generation, and all but absent in that of his increasingly feckless successor. Thus the courtiers and aristocrats of the Old Regime amused themselves at the nation’s expense, dabbled in avant-garde thought, and kept their eyes tightly closed to the consequences of their evasions of looming reality, while the last opportunities to excuse themselves from a one-way trip to visit the guillotine and spare France the cataclysms of the Terror and the Napoleonic wars slipped silently away.

That’s the bitter irony of eras of pretense. Under most circumstances, they’re the last period when it would be possible to do anything constructive on the large scale about the crisis looming immediately ahead, but the mass evasion of reality that frames the collective thinking of the time stands squarely in the way of any such constructive action. In the era of pretense before a speculative bust, people who could have quietly cashed in their positions and pocketed their gains double down on their investments, and guarantee that they’ll be ruined once the market stops being liquid. In the era of pretense before a revolution, in the same way, those people and classes that have the most to lose reliably take exactly those actions that ensure that they will in fact lose everything. If history has a sense of humor, this is one of the places that it appears in its most savage form.

The same points are true, in turn, of the eras of pretense that precede the downfall of a civilization. In a good many cases, where too few original sources survive, the age of pretense has to be inferred from archeological remains. We don’t know what motives inspired the ancient Mayans to build their biggest pyramids in the years immediately before the Terminal Classic period toppled over into a savage political and demographic collapse, but it’s hard to imagine any such project being set in motion without the usual evasions of an era of pretense being involved. Where detailed records of dead civilizations survive, though, the sort of rhetorical handwaving common to bubbles before the bust and decaying regimes on the brink of revolution shows up with knobs on. Thus the panegyrics of the Roman imperial court waxed ever more lyrical and bombastic about Rome’s invincibility and her civilizing mission to the nations as the Empire stumbled deeper into its terminal crisis, echoing any number of other court poets in any number of civilizations in their final hours.

For that matter, a glance through classical Rome’s literary remains turns up the remarkable fact that those of her essayists and philosophers who expressed worries about her survival wrote, almost without exception, during the Republic and the early Empire; the closer the fall of Rome actually came, the more certainty Roman authors expressed that the Empire was eternal and the latest round of troubles was just one more temporary bump on the road to peace and prosperity. It took the outsider’s vision of Augustine of Hippo to proclaim that Rome really was falling—and even that could only be heard once the Visigoths sacked Rome and the era of pretense gave way to the age of impact.

The present case is simply one more example to add to an already lengthy list. In the last years of the nineteenth century, it was common for politicians, pundits, and mass media in the United States, the British empire, and other industrial nations to discuss the possibility that the advanced civilization of the time might be headed for the common fate of nations in due time. The intellectual history of the twentieth century is, among other things, a chronicle of how that discussion was shoved to the margins of our collective discourse, just as the ecological history of the same century is among other things a chronicle of how the worries of the previous era became the realities of the one we’re in today. The closer we’ve moved toward the era of impact, that is, the more unacceptable it has become for anyone in public life to point out that the problems of the age are not just superficial.

Listen to the pablum that passes for political discussion in Washington DC or the mainstream US media these days, or the even more vacuous noises being made by party flacks as the country stumbles wearily toward yet another presidential election. That the American dream of upward mobility has become an American nightmare of accelerating impoverishment outside the narrowing circle of the kleptocratic rich, that corruption and casual disregard for the rule of law are commonplace in political institutions from local to Federal levels, that our medical industry charges more than any other nation’s and still provides the worst health care in the industrial world, that our schools no longer teach anything but contempt for learning, that the national infrastructure and built environment are plunging toward Third World conditions at an ever-quickening pace, that a brutal and feckless foreign policy embraced by both major parties is alienating our allies while forcing our enemies to set aside their mutual rivalries and make common cause against us: these are among the issues that matter, but they’re not the issues you’ll hear discussed as the latest gaggle of carefully airbrushed candidates go through their carefully scripted elect-me routines on their way to the 2016 election.

If history teaches anything, though, it’s that eras of pretense eventually give way to eras of impact. That doesn’t mean that the pretense will go away—long after Alaric the Visigoth sacked Rome, for example, there were still plenty of rhetors trotting out the same tired clichés about Roman invincibility—but it does mean that a significant number of people will stop finding the pretense relevant to their own lives. How that happens in other historical examples, and how it might happen in our own time, will be the theme of next week’s post.

Wednesday, May 06, 2015

The Whisper of the Shutoff Valve

Last week’s post on the impending decline and fall of the internet fielded a great many responses. That was no surprise, to be sure; nor was I startled in the least to find that many of them rejected the thesis of the post with some heat. Contemporary pop culture’s strident insistence that technological progress is a clock that never runs backwards made such counterclaims inevitable.

Still, it’s always educational to watch the arguments fielded to prop up the increasingly shaky edifice of the modern mythology of progress, and the last week was no exception. A response I found particularly interesting from that standpoint appeared on one of the many online venues where Archdruid Report posts appear. One of the commenters insisted that my post should be rejected out of hand as mere doom and gloom; after all, he pointed out, it was ridiculous for me to suggest that fifty years from now, a majority of the population of the United States might be without reliable electricity or running water.

I’ve made the same prediction here and elsewhere a good many times. Each time, most of my readers or listeners seem to have taken it as a piece of sheer rhetorical hyperbole. The electrical grid and the assorted systems that send potable water flowing out of faucets are so basic to the rituals of everyday life in today’s America that their continued presence is taken for granted.  At most, it’s conceivable that individuals might choose not to connect to them; there’s a certain amount of talk about off-grid living here and there in the alternative media, for example.  That people who want these things might not have access to them, though, is pretty much unthinkable.

Meanwhile, in Detroit and Baltimore, tens of thousands of residents are in the process of losing their access to water and electricity.

The situation in both cities is much the same, and there’s every reason to think that identical headlines will shortly appear in reference to other cities around the nation. Not that many decades ago, Detroit and Baltimore were important industrial centers with thriving economies. Along with more than a hundred other cities in America’s Rust Belt, they were thrown under the bus with the first wave of industrial offshoring in the 1970s.  The situation for both cities has only gotten worse since that time, as the United States completed its long transition from a manufacturing economy producing goods and services to a bubble economy that mostly produces unpayable IOUs.

These days, the middle-class families whose tax payments propped up the expansive urban systems of an earlier day have long since moved out of town. Most of the remaining residents are poor, and the ongoing redistribution of wealth in America toward the very rich and away from everyone else has driven down the income of the urban poor to the point that many of them can no longer afford to pay their water and power bills. City utilities in Detroit and Baltimore have been sufficiently sensitive to political pressures that large-scale utility shutoffs have been delayed, but shifts in the political climate in both cities are bringing the delays to an end; water bills have increased steadily, more and more people have been unable to pay them, and the result is as predictable as it is brutal.

The debate over the Detroit and Baltimore shutoffs has followed the usual pattern, as one side wallows in bash-the-poor rhetoric while the other side insists plaintively that access to utilities is a human right. Neither side seems to be interested in talking about the broader context in which these disputes take shape. There are two aspects to that broader context, and it’s a tossup which is the more threatening.

The first aspect is the failure of the US economy to recover in any meaningful sense from the financial crisis of 2008. Now of course politicians from Obama on down have gone into overtime grandstanding about the alleged recovery we’re in. I invite any of my readers who bought into that rhetoric to try the following simple experiment. Go to your favorite internet search engine and look up how much the fracking industry has added to the US gross domestic product each year from 2009 to 2014. Now subtract that figure from the US gross domestic product for each of those years, and see how much growth there’s actually been in the rest of the economy since the real estate bubble imploded.

What you’ll find, if you take the time to do that, is that the rest of the US economy has been flat on its back gasping for air for the last five years. What makes this even more problematic, as I’ve noted in several previous posts here, is that the great fracking boom about which we’ve heard so much for the last five years was never actually the game-changing energy revolution its promoters claimed; it was simply another installment in the series of speculative bubbles that has largely replaced constructive economic activity in this country over the last two decades or so.
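The experiment just described is simple enough to sketch in code. The GDP and fracking figures below are hypothetical placeholders, not actual Bureau of Economic Analysis data; the point is the method: subtract the boom sector from each year’s total and compare the growth rates of what’s left.

```python
# Sketch of the GDP experiment described above. All figures are
# hypothetical placeholders (trillions of dollars), NOT actual data;
# substitute real GDP and fracking-sector figures to run it properly.

gdp = {2009: 14.4, 2010: 15.0, 2011: 15.5, 2012: 16.2, 2013: 16.7, 2014: 17.4}
fracking = {2009: 0.1, 2010: 0.3, 2011: 0.5, 2012: 0.8, 2013: 1.0, 2014: 1.3}

years = sorted(gdp)
# The economy with the boom sector stripped out
rest = {y: gdp[y] - fracking[y] for y in years}

for prev, curr in zip(years, years[1:]):
    total_growth = (gdp[curr] - gdp[prev]) / gdp[prev] * 100
    rest_growth = (rest[curr] - rest[prev]) / rest[prev] * 100
    print(f"{curr}: headline {total_growth:+.1f}%, ex-fracking {rest_growth:+.1f}%")
```

With made-up numbers like these, the ex-fracking growth rate comes out lower in every year, simply because the subtracted sector is growing faster than everything else; plug in the real figures to see how much of the headline “recovery” survives the subtraction.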

What’s more, it’s not the only bubble currently being blown, and it may not even be the largest. We’ve also got a second tech-stock bubble, with money-losing internet corporations racking up absurd valuations in the stock market while they burn through millions of dollars of venture capital; we’ve got a student loan bubble, in which billions of dollars of loans that will never be paid back have been bundled, packaged, and sold to investors just like all those no-doc mortgages were a decade ago; car loans are getting the same treatment; the real estate market is fizzing again in many urban areas as investors pile into another round of lavishly marketed property investments—well, I could go on for some time. It’s entirely possible that if all the bubble activity were to be subtracted from the last five years or so of GDP, the result would show an economy in freefall.

Certainly that’s the impression that emerges if you take the time to check out those economic statistics that aren’t being systematically jiggered by the US government for PR purposes. The number of long-term unemployed in America is at an all-time high; roads, bridges, and other basic infrastructure are falling to pieces; measurements of US public health—generally considered a good proxy for the real economic condition of the population—are well below those of other industrial countries, heading toward Third World levels; abandoned shopping malls litter the landscape while major retailers announce more than 6000 store closures. These are not things you see in an era of economic expansion, or even one of relative stability; they’re markers of decline.

The utility shutoffs in Detroit and Baltimore are further symptoms of the same broad process of economic unraveling. It’s true, as pundits in the media have been insisting since the story broke, that utilities get shut off for nonpayment of bills all the time. It’s equally true that shutting off the water supply of 20,000 or 30,000 people all at once is pretty much unprecedented. Both cities, please note, have had very large populations of poor people for many decades now.  Those who like to blame a “culture of poverty” for the tangled relationship between US governments and the American poor, and of course that trope has been rehashed by some of the pundits just mentioned, haven’t yet gotten around to explaining how the culture of poverty all at once inspired tens of thousands of people who had been paying their utility bills to stop doing so.

There are plenty of good reasons, after all, why poor people who used to pay their bills can’t do so any more. Standard business models in the United States used to take it for granted that the best way to staff any company, large or small, was to have as many full-time positions as possible and to use raises and other practical incentives to encourage employees who were good at their jobs to stay with the company. That approach has become increasingly unfashionable in today’s America, due partly to perverse regulatory incentives that penalize employers for offering full-time positions, and partly to the emergence of attitudes in corner offices that treat employees as just another commodity. (I doubt it’s any kind of accident that most corporations nowadays refer to their employment offices as “human resource departments.” What do you do with a resource? You exploit it.)

These days, most of the jobs available to the poor are part-time, pay very little, and include nasty little clawbacks in the form of requirements that employees pay out of pocket for uniforms, equipment, and other things that employers used to provide as a matter of course. Meanwhile housing prices and rents are rising well above their post-2008 dip, and a great many other necessities are becoming more costly—inflation may be under control, or so the official statistics say, but anyone who’s been shopping at the same grocery store for the last eight years knows perfectly well that prices kept on rising anyway.

So you’ve got falling incomes running up against rising costs for food, rent, and utilities, among other things. In the resulting collision, something’s got to give, and for tens of thousands of poor Detroiters and Baltimoreans, what gave first was the ability to keep current on their water bills. Expect to see the same story playing out across the country as more people on the bottom of the income pyramid find themselves in the same situation. What you won’t hear in the media, though it’s visible enough if you know where to look and are willing to do so, is that people above the bottom of the income pyramid are also losing ground, being forced down toward economic nonpersonhood. From the middle classes down, everyone’s losing ground.

That process doesn’t continue any further than the middle class, to be sure. It’s been pointed out repeatedly that over the last four decades or so, the distribution of wealth in America has skewed further and further out of balance, with the top 20% of incomes taking a larger and larger share at the expense of everybody else. That’s an important factor in bringing about the collision just described. Some thinkers on the radical fringes of American society, which is the only place in the US you can talk about such things these days, have argued that the raw greed of the well-to-do is the sole reason why so many people lower down the ladder are being pushed further down still.

Scapegoating rhetoric of that sort is always comforting, because it holds out the promise—theoretically, if not practically—that something can be done about the situation. If only the thieving rich could be lined up against a convenient brick wall and removed from the equation in the time-honored fashion, the logic goes, people in Detroit and Baltimore could afford to pay their water bills!  I suspect we’ll hear such claims increasingly often as the years pass and more and more Americans find their access to familiar comforts and necessities slipping away.  Simple answers are always popular in such times, not least when the people being scapegoated go as far out of their way to make themselves good targets for such exercises as the American rich have done in recent decades.

John Kenneth Galbraith’s equation of the current US political and economic elite with the French aristocracy on the eve of revolution rings even more true than it did when he wrote it back in 1992, in the pages of The Culture of Contentment. The unthinking extravagances, the casual dismissal of the last shreds of noblesse oblige, the obsessive pursuit of personal advantages and private feuds without the least thought of the potential consequences, the bland inability to recognize that the power, privilege, wealth, and sheer survival of the aristocracy depended on the system the aristocrats themselves were destabilizing by their actions—it’s all there, complete with sprawling overpriced mansions that could just about double for Versailles. The urban mobs that played so large a role back in 1789 are warming up for their performances as I write these words; the only thing left to complete the picture is a few tumbrils and a guillotine, and those will doubtless arrive on cue.

The senility of the current US elite, as noted in a previous post here, is a massive political fact in today’s America. Still, it’s not the only factor in play here. Previous generations of wealthy Americans recognized without too much difficulty that their power, prosperity, and survival depended on the willingness of the rest of the population to put up with their antics. Several times already in America’s history, elite groups have allied with populist forces to push through reforms that sharply weakened the power of the wealthy elite, because they recognized that the alternative was a social explosion even more destructive to the system on which elite power depends.

I suppose it’s possible that the people currently occupying the upper ranks of the political and economic pyramid in today’s America are just that much more stupid than their equivalents in the Jacksonian, Progressive, and New Deal eras. Still, there’s at least one other explanation to hand, and it’s the second of the two threatening contextual issues mentioned earlier.

Until the nineteenth century, fresh running water piped into homes for everyday use was purely an affectation of the very rich in a few very wealthy and technologically adept societies. Sewer pipes to take dirty water and human wastes out of the house belonged in the same category. This wasn’t because nobody knew how plumbing works—the Romans had competent plumbers, for example, and water faucets and flush toilets were to be found in Roman mansions of the imperial age. The reason those same things weren’t found in every Roman house was economic, not technical.

Behind that economic issue lay an ecological reality.  White’s Law, one of the foundational principles of human ecology, states that economic development is a function of energy per capita. For a society before the industrial age, the Roman Empire had an impressive amount of energy per capita to expend; control over the agricultural economy of the Mediterranean basin, modest inputs from sunlight, water and wind, and a thriving slave industry fed by the expansion of Roman military power all fed into the capacity of Roman society to develop itself economically and technically. That’s why rich Romans had running water and iced drinks in summer, while their equivalents in ancient Greece a few centuries earlier had to make do without either one.
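White’s Law can be stated compactly: development is some increasing function of energy per capita, D = f(E/P). A minimal sketch of that relationship follows; the societies, units, and figures are wholly invented for illustration, not historical estimates.

```python
# White's Law in miniature: economic development tracks energy per capita.
# The societies, units, and figures below are invented for illustration;
# they are not historical estimates.

def energy_per_capita(total_energy, population):
    """Annual energy available per person, in arbitrary units."""
    return total_energy / population

# (total annual energy, population), both in arbitrary units -- hypothetical
societies = {
    "agrarian empire": (50_000, 10_000),
    "early industrial": (400_000, 20_000),
    "fossil-fueled modern": (9_000_000, 30_000),
}

for name, (energy, pop) in societies.items():
    print(f"{name:>20}: {energy_per_capita(energy, pop):>7.1f} units per capita")
```

The absolute numbers mean nothing; the ordering is the point. Whatever the total energy budget, it’s the per-person share that sets the ceiling on how much economic and technical elaboration a society can sustain.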

Fossil fuels gave industrial civilization a supply of energy many orders of magnitude greater than any previous human civilization has had—a supply vast enough that the difference remains huge even after the vast expansion of population that followed the industrial revolution. There was, however, a catch—or, more precisely, two catches. To begin with, fossil fuels are finite, nonrenewable resources; no matter how much handwaving is employed in the attempt to obscure this point—and whatever else might be in short supply these days, that sort of handwaving is not—every barrel of oil, ton of coal, or cubic foot of natural gas that’s burnt takes the world one step closer to the point at which there will be no economically extractable reserves of oil, coal, or natural gas at all.

That’s catch #1. Catch #2 is subtler, and considerably more dangerous. Oil, coal, and natural gas don’t leap out of the ground on command. They have to be extracted and processed, and this takes energy. Companies in the fossil fuel industries have always targeted the deposits that cost less to extract and process, for obvious economic reasons. What this means, though, is that over time, a larger and larger fraction of the energy yield of oil, coal, and natural gas has to be put right back into extracting and processing oil, coal, and natural gas—and this leaves less and less for all other uses.
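The arithmetic behind catch #2 is simple to sketch. If EROEI—energy returned on energy invested—is the ratio of the energy a source yields to the energy spent getting it, the fraction left over for the rest of the economy is 1 − 1/EROEI. The EROEI values below are illustrative, not measurements of any actual field or fuel.

```python
# Net energy arithmetic behind catch #2. The EROEI values are illustrative,
# not measured figures for any particular field or fuel.

def net_energy(gross, eroei):
    """Energy left for the rest of the economy after extraction's own share."""
    return gross * (1 - 1 / eroei)

gross_output = 100.0  # arbitrary units of primary energy per year

for eroei in (50, 20, 10, 5, 2):
    surplus = net_energy(gross_output, eroei)
    print(f"EROEI {eroei:>2}: {surplus:.0f} units of surplus, "
          f"{gross_output - surplus:.0f} plowed back into extraction")
```

Note that the surplus holds up well while EROEI stays high and then collapses quickly once it drops into the single digits; that nonlinearity is part of what makes the squeeze so easy to miss until it’s well under way.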

That’s the vise that’s tightening around the American economy these days. The great fracking boom, to the extent that it wasn’t simply one more speculative gimmick aimed at the pocketbooks of chumps, was an attempt to make up for the ongoing decline of America’s conventional oilfields by going after oil that was far more expensive to extract. The fact that none of the companies at the heart of the fracking boom ever turned a profit, even when oil brought more than $100 a barrel, gives some sense of just how costly shale oil is to get out of the ground. The financial cost of extraction, though, is a proxy for the energy cost of extraction—the amount of energy, and of the products of energy, that had to be thrown into the task of getting a little extra oil out of marginal source rock.

Energy needed to extract energy, again, can’t be used for any other purpose. It doesn’t contribute to the energy surplus that makes economic development possible. As the energy industry itself takes a bigger bite out of each year’s energy production, every other economic activity loses part of the fuel that makes it run. That, in turn, is the core reason why the American economy is on the ropes, America’s infrastructure is falling to bits—and Americans in Detroit and Baltimore are facing a transition to Third World conditions, without electricity or running water.

I suspect, for what it’s worth, that the shutoff notices being mailed to tens of thousands of poor families in those two cities are a good working model for the way that industrial civilization itself will wind down. It won’t be sudden; for decades to come, there will still be people who have access to what Americans today consider the ordinary necessities and comforts of everyday life; there will just be fewer of them each year. Outside that narrowing circle, the number of economic nonpersons will grow steadily, one shutoff notice at a time.

As I’ve pointed out in previous posts, the line of fracture between the senile elite and what Arnold Toynbee called the internal proletariat—the people who live within a failing civilization’s borders but receive essentially none of its benefits—eventually opens into a chasm that swallows what’s left of the civilization. Sometimes the tectonic processes that pull the chasm open are hard to miss, but there are times when they’re a good deal more difficult to sense in action, and this is one of those latter times. Listen to the whisper of the shutoff valve, and you’ll hear tens of thousands of Americans being cut off from basic services the rest of us, for the time being, still take for granted.

Wednesday, April 29, 2015

The Death of the Internet: A Pre-Mortem

The mythic role assigned to progress in today’s popular culture has any number of odd effects, but one of the strangest is the blindness to the downside that clamps down on the collective imagination of our time once people become convinced that something or other is the wave of the future. It doesn’t matter in the least how many or how obvious the warning signs are, or how many times the same tawdry drama has been enacted.  Once some shiny new gimmick gets accepted as the next glorious step in the invincible march of progress, most people lose the ability to imagine that the wave of the future might just do what waves generally do: that is to say, crest, break, and flow back out to sea, leaving debris scattered on the beach in its wake.

It so happens that I grew up in the middle of just such a temporary wave of the future, in the south Seattle suburbs in the 1960s, where every third breadwinner worked for Boeing. The wave in question was the supersonic transport, SST for short: a jetliner that would fly faster than sound, cutting hours off long flights. The inevitability of the SST was an article of faith locally, and not just because Boeing was building one; an Anglo-French consortium was in the lead with the Concorde, and the Soviets were working on the Tu-144, but the Boeing 2707 was expected to be the biggest and baddest of them all, a 300-seat swing-wing plane that was going to make commercial supersonic flight an everyday reality.

Long before the 2707 had even the most ghostly sort of reality, you could buy model kits of the plane, complete with Pan Am decals, at every hobby store in the greater Seattle area. For that matter, take Interstate 5 south from downtown Seattle past the sprawling Boeing plant just outside of town, and you’d see the image of the 2707 on the wall of one of the huge assembly buildings, a big delta-winged shape in white and gold winging its way through the imagined air toward the gleaming future in which so many people believed back then.

There was, as it happened, a small problem with the 2707, a problem it shared with all the other SST projects; it made no economic sense at all. It was, to be precise, what an earlier post here called  a subsidy dumpster: that is, a project that was technically feasible but economically impractical, and existed mostly as a way to pump government subsidies into Boeing’s coffers. Come 1971, the well ran dry: faced with gloomy numbers from the economists, worried calculations from environmental scientists, and a public not exactly enthusiastic about dozens of sonic booms a day rattling plates and cracking windows around major airports, Congress cut the project’s funding.

That happened right when the US economy generally, and the notoriously cyclical airplane industry in particular, were hitting downturns. Boeing was Seattle’s biggest employer in those days, and when it laid off employees en masse, the result was a local depression of legendary severity. You heard a lot of people in those days insisting that the US had missed out on the next aviation boom, and Congress would have to hang its head in shame once Concordes and Tu-144s were hauling passengers all over the globe. Of course that’s not what happened; the Tu-144 flew a handful of commercial flights and then was grounded for safety reasons, and the Concorde lingered on, a technical triumph but an economic white elephant, until the last plane retired from service in 2003.

All this has been on my mind of late as I’ve considered the future of the internet. The comparison may seem far-fetched, but then that’s what supporters of the SST would have said if anyone had compared the Boeing 2707 to, say, the zeppelin, another wave of the future that turned out to make too little economic sense to matter. Granted, the internet isn’t a subsidy dumpster, and it’s also much more complex than the SST; if anything, it might be compared to the entire system of commercial air travel, which we still have with us for the moment. Nonetheless, a strong case can be made that the internet, like the SST, doesn’t actually make economic sense; it’s being propped up by financial gimmickry with a distinct resemblance to smoke and mirrors; and when those go away—and they will—much of what makes the internet so central a part of pop culture will go away as well.

It’s probably necessary to repeat here that the reasons for this are economic, not technical. Every time I’ve discussed the hard economic realities that make the internet’s lifespan in the deindustrial age roughly that of a snowball in Beelzebub’s back yard, I’ve gotten a flurry of responses fixating on purely technical issues. Those issues are beside the point.  No doubt it would be possible to make something like the internet technically feasible in a society on the far side of the Long Descent, but that doesn’t matter; what matters is that the internet has to cover its operating costs, and it also has to compete with other ways of doing the things that the internet currently does.

It’s a source of wry amusement to me that so many people seem to have forgotten that the internet doesn’t actually do very much that’s new. Long before the internet, people were reading the news, publishing essays and stories, navigating through unfamiliar neighborhoods, sharing photos of kittens with their friends, ordering products from faraway stores for home delivery, looking at pictures of people with their clothes off, sending anonymous hate-filled messages to unsuspecting recipients, and doing pretty much everything else that they do on the internet today. For the moment, doing these things on the internet is cheaper and more convenient than the alternatives, and that’s what makes the internet so popular. If that changes—if the internet becomes more costly and less convenient than other options—its current popularity is unlikely to last.

Let’s start by looking at the costs. Every time I’ve mentioned the future of the internet on this blog, I’ve gotten comments and emails from readers who think that the price of their monthly internet service is a reasonable measure of the cost of the internet as a whole. For a useful corrective to this delusion, talk to people who work in data centers. You’ll hear about trucks pulling up to the loading dock every single day to offload pallet after pallet of brand new hard drives and other components, to replace those that will burn out that same day. You’ll hear about power bills that would easily cover the electricity costs of a small city. You’ll hear about many other costs as well. Data centers are not cheap to run, there are many thousands of them, and they’re only one part of the vast infrastructure we call the internet: by many measures, the most gargantuan technological project in the history of our species.

Your monthly fee for internet service covers only a small portion of what the internet costs. Where does the rest come from? That depends on which part of the net we’re discussing. The basic structure is paid for by internet service providers (ISPs), who recoup part of the costs from your monthly fee, part from the much larger fees paid by big users, and part by advertising. Content providers use some mix of advertising, pay-to-play service fees, sales of goods and services, packaging and selling your personal data to advertisers and government agencies, and new money from investors and loans to meet their costs. The ISPs routinely make a modest profit on the deal, but many of the content providers do not. Amazon may be the biggest retailer on the planet, for example, and its cash flow has soared in recent years, but its expenses have risen just as fast, and it rarely makes a profit. Many other content provider firms, including fish as big as Twitter, rack up big losses year after year.

How do they stay in business? A combination of vast amounts of investment money and ultracheap debt. That’s very common in the early decades of a new industry, though it’s been made a good deal easier by the Fed’s policy of next-to-zero interest rates. Investors who dream of buying stock in the next Microsoft provide venture capital for internet startups, banks provide lines of credit for existing firms, the stock and bond markets snap up paper of various kinds churned out by internet businesses, and all that money goes to pay the bills. It’s a reasonable gamble for the investors; they know perfectly well that a great many of the firms they’re funding will go belly up within a few years, but the few that don’t will either be bought up at inflated prices by one of the big dogs of the online world, or will figure out how to make money and then become big dogs themselves.

Notice, though, that this process has an unexpected benefit for ordinary internet users: a great many services are available for free, because venture-capital investors and lines of credit are footing the bill for the time being. Boosting the number of page views and clickthroughs is far more important for the future of an internet company these days than making a profit, and so the usual business plan is to provide plenty of free goodies to the public without worrying about the financial end of things. That’s very convenient just now for internet users, but it fosters the illusion that the internet costs nothing.

As mentioned earlier, this sort of thing is very common in the early decades of a new industry. As the industry matures, markets become saturated, startups become considerably riskier, and venture capital heads for greener pastures.  Once this happens, the companies that dominate the industry have to stay in business the old-fashioned way, by earning a profit, and that means charging as much as the market will bear, monetizing services that are currently free, and cutting service to the lowest level that customers will tolerate. That’s business as usual, and it means the end of most of the noncommercial content that gives the internet so much of its current role in popular culture.

All other things being equal, in other words, the internet can be expected to follow the usual trajectory of a maturing industry, becoming more expensive, less convenient, and more tightly focused on making a quick buck with each passing year. Governments have already begun to tax internet sales, removing one of the core “stealth subsidies” that boosted the internet at the expense of other retail sectors, and taxation of the internet will only increase as cash-starved officials contemplate the tidal waves of money sloshing back and forth online. None of these changes will kill the internet, but they’ll slap limits on the more utopian fantasies currently burbling about the web, and provide major incentives for individuals and businesses to back away from the internet and do things in the real world instead.

Then there’s the increasingly murky world of online crime, espionage, and warfare, which promises to push very hard in the same direction in the years ahead.  I think most people are starting to realize that on the internet, there’s no such thing as secure data, and the costs of conducting business online these days include a growing risk of having your credit cards stolen, your bank accounts looted, your identity borrowed for any number of dubious purposes, and the files on your computer encrypted without your knowledge, so that you can be forced to pay a ransom for their release—this latter, or so I’ve read, is the latest hot new trend in internet crime.

Online crime is one of the few fields of criminal endeavor in which raw cleverness is all you need to make out, as the saying goes, like a bandit. In the years ahead, as a result, the internet may look less like an information superhighway and more like one of those grim inner city streets where not even the muggers go alone. Trends in online espionage and warfare are harder to track, but either or both could become a serious burden on the internet as well.

Online crime, espionage, and warfare aren’t going to kill the internet, any more than the ordinary maturing of the industry will. Rather, they’ll lead to a future in which costs of being online are very often greater than the benefits, and the internet is by and large endured rather than enjoyed. They’ll also help drive the inevitable rebound away from the net. That’s one of those things that always happens and always blindsides the cheerleaders of the latest technology: a few decades into its lifespan, people start to realize that they liked the old technology better, thank you very much, and go back to it. The rebound away from the internet has already begun, and will only become more visible as time goes on, making a great many claims about the future of the internet look as absurd as those 1950s articles insisting that in the future, every restaurant would inevitably be a drive-in.

To be sure, the resurgence of live theater in the wake of the golden age of movie theaters didn’t end cinema, and the revival of bicycling in the aftermath of the automobile didn’t make cars go away. In the same way, the renewal of interest in offline practices and technologies isn’t going to make the internet go away. It’s simply going to accelerate the shift of avant-garde culture away from an increasingly bleak, bland, unsafe, and corporate- and government-controlled internet and into alternative venues. That won’t kill the internet, though once again it will put a stone marked R.I.P. atop the grave of a lot of the utopian fantasies that have clustered around today’s net culture.

All other things being equal, in fact, there’s no reason why the internet couldn’t keep on its present course for years to come. Under those circumstances, it would shed most of the features that make it popular with today’s avant-garde, and become one more centralized, regulated, vacuous mass medium, packed to the bursting point with corporate advertising and lowest-common-denominator content, with dissenting voices and alternative culture shut out or shoved into corners where nobody ever looks. That’s the normal trajectory of an information technology in today’s industrial civilization, after all; it’s what happened with radio and television in their day, as the gaudy and grandiose claims of the early years gave way to the crass commercial realities of the mature forms of each medium.

But all other things aren’t equal.

Radio and television, like most of the other familiar technologies that define life in a modern industrial society, were born and grew to maturity in an expanding economy. The internet, by contrast, was born during the last great blowoff of the petroleum age—the last decades of the twentieth century, during which the world’s industrial nations took the oil reserves that might have cushioned the transition to sustainability, and blew them instead on one last orgy of over-the-top conspicuous consumption—and it’s coming to maturity in the early years of an age of economic contraction and ecological blowback.

The rising prices, falling service quality, and relentless monetization of a maturing industry, together with the increasing burden of online crime and the inevitable rebound away from internet culture, will thus be hitting the internet in a time when the global economy no longer has the slack it once did, and the immense costs of running the internet in anything like its present form will have to be drawn from a pool of real wealth that has many other demands on it. What’s more, quite a few of those other demands will be far more urgent than the need to provide consumers with a convenient way to send pictures of kittens to their friends. That stark reality will add to the pressure to monetize internet services, and provide incentives to those who choose to send their kitten pictures by other means.

It’s crucial to remember here, as noted above, that the internet is simply a cheaper and more convenient way of doing things that people were doing long before the first website went live, and a big part of the reason why it’s cheaper and more convenient right now is that internet users are being subsidized by the investors and venture capitalists who are funding the internet industry. That’s not the only subsidy on which the internet depends, though. Along with the rest of industrial society, it’s also subsidized by half a billion years of concentrated solar energy in the form of fossil fuels.  As those deplete, the vast inputs of energy, labor, raw materials, industrial products, and other forms of wealth that sustain the internet will become increasingly expensive to provide, and ways of distributing kitten pictures that don’t require the same inputs will prosper in the resulting competition.

There are also crucial issues of scale. Most pre-internet communications and information technologies scale down extremely well. A community of relatively modest size can have its own public library, its own small press, its own newspaper, and its own radio station running local programming, and could conceivably keep all of these functioning and useful even if the rest of humanity suddenly vanished from the map. Internet technology doesn’t have that advantage. It’s orders of magnitude more complex and expensive than a radio transmitter, not to mention the 15th-century technology of printing presses and card catalogs; what’s more, on the scale of a small community, the benefits of using internet technology instead of simpler equivalents wouldn’t come close to justifying the vast additional cost.

Now of course the world of the future isn’t going to consist of a single community surrounded by desolate wasteland. That’s one of the reasons why the demise of the internet won’t happen all at once. Telecommunications companies serving some of the more impoverished parts of rural America are already letting their networks in those areas degrade, since income from customers doesn’t cover the costs of maintenance.  To my mind, that’s a harbinger of the internet’s future—a future of uneven decline punctuated by local and regional breakdowns, some of which will be fixed for a while.

That said, it’s quite possible that there will still be an internet of some sort fifty years from now. It will connect government agencies, military units, defense contractors, and the handful of universities that survive the approaching implosion of the academic industry here in the US, and it may provide email and a few other services to the very rich, but it will otherwise have a lot more in common with the original ARPANET than with the 24/7 virtual cosmos imagined by today’s more gullible netheads.

Unless you’re one of the very rich or an employee of one of the institutions just named, furthermore, you won’t have access to the internet of 2065.  You might be able to hack into it, if you have the necessary skills and are willing to risk a long stint in a labor camp, but unless you’re a criminal or a spy working for the insurgencies flaring in the South or the mountain West, there’s not much point to the stunt. If you’re like most Americans in 2065, you live in Third World conditions without regular access to electricity or running water, and you’ve got other ways to buy things, find out what’s going on in the world, find out how to get to the next town and, yes, look at pictures of people with their clothes off. What’s more, in a deindustrializing world, those other ways of doing things will be cheaper, more resilient, and more useful than reliance on the baroque intricacies of a vast computer net.

Exactly when the last vestiges of the internet will sputter to silence is a harder question to answer. Long before that happens, though, it will have lost its current role as one of the poster children of the myth of perpetual progress, and turned back into what it really was all the time: a preposterously complex way to do things most people have always done by much simpler means, which only seemed to make sense during that very brief interval of human history when fossil fuels were abundant and cheap.

***
In other news, I’m pleased to announce that the third anthology of deindustrial SF stories from this blog’s “Space Bats” contest, After Oil 3: The Years of Rebirth, is now available in print and e-book formats. Those of my readers who’ve turned the pages of the two previous After Oil anthologies already know that this one has a dozen eminently readable and thought-provoking stories about the world on the far side of the Petroleum Age; the rest of you—why, you’re in for a treat. Those who are interested in contributing to the next After Oil anthology will find the details here.

Wednesday, April 22, 2015

A Field Guide to Negative Progress

I’ve commented before in these posts that writing is always partly a social activity. What Mortimer Adler used to call the Great Conversation, the dance of ideas down the corridors of the centuries, shapes every word in a writer’s toolkit; you can hardly write a page in English without drawing on a shade of meaning that Geoffrey Chaucer, say, or William Shakespeare, or Jane Austen first put into the language. That said, there’s also a more immediate sense in which any writer who interacts with his or her readers is part of a social activity, and one of the benefits came my way just after last week’s post.

That post began with a discussion of the increasingly surreal quality of America’s collective life these days, and one of my readers—tip of the archdruidical hat to Anton Mett—had a fine example to offer. He’d listened to an economic report on the media, and the talking heads were going on and on about the US economy’s current condition of, ahem, “negative growth.” Negative growth? Why yes, that’s the opposite of growth, and it’s apparently quite a common bit of jargon in economics just now.

Of course the English language, as used by the authors named earlier among many others, has no shortage of perfectly clear words for the opposite of growth. “Decline” comes to mind; so does “decrease,” and so does “contraction.” Would it have been so very hard for the talking heads in that program, or their many equivalents in our economic life generally, to draw in a deep breath and actually come right out and say “The US economy has contracted,” or “GDP has decreased,” or even “we’re currently in a state of economic decline”? Come on, economists, you can do it!

But of course they can’t.  Economists in general are supposed to provide, shall we say, negative clarity when discussing certain aspects of contemporary American economic life, and talking heads in the media are even more subject to this rule than most of their peers. Among the things about which they’re supposed to be negatively clear, two are particularly relevant here; the first is that economic contraction happens, and the second is that letting too much of the national wealth end up in too few hands is a very effective way to cause economic contraction. The logic here is uncomfortably straightforward—an economy that depends on consumer expenditures only prospers if consumers have plenty of money to spend—but talking about that equation would cast an unwelcome light on the culture of mindless kleptocracy entrenched these days at the upper end of the US socioeconomic ladder. So we get to witness the mass production of negative clarity about one of the main causes of negative growth.

It’s entrancing to think of other uses for this convenient mode of putting things. I can readily see it finding a role in health care—“I’m sorry, ma’am,” the doctor says, “but your husband is negatively alive;” in sports—“Well, Joe, unless the Orioles can cut down that negative lead of theirs, they’re likely headed for a negative win;” and in the news—“The situation in Yemen is shaping up to be yet another negative triumph for US foreign policy.” For that matter, it’s time to update one of the more useful proverbs of recent years: what do you call an economist who makes a prediction? Negatively right.

Come to think of it, we might as well borrow the same turn of phrase for the subject of last week’s post, the deliberate adoption of older, simpler, more independent technologies in place of today’s newer, more complex, and more interconnected ones. I’ve been talking about that project so far under the negatively mealy-mouthed label “intentional technological regress,” but hey, why not be cool and adopt the latest fashion? For this week, at least, we’ll therefore redefine our terms a bit, and describe the same thing as “negative progress.” Since negative growth sounds like just another kind of growth, negative progress ought to pass for another kind of progress, right?

With this in mind, I’d like to talk about some of the reasons that individuals, families, organizations, and communities, as they wend their way through today’s cafeteria of technological choices, might want to consider loading up their plates with a good hearty helping of negative progress.

Let’s start by returning to one of the central points raised here in earlier posts, the relationship between progress and the production of externalities. By and large, the more recent a technology is, the more of its costs aren’t paid by the makers or the users of the technology, but are pushed off onto someone else. As I pointed out in a post two months ago, this isn’t accidental; quite the contrary, it’s hardwired into the relationship between progress and market economics, and bids fair to play a central role in the unraveling of the entire project of industrial civilization.

The same process of increasing externalities, though, has another face when seen from the point of view of the individual user of any given technology. When you externalize any cost of a technology, you become dependent on whoever or whatever picks up the cost you’re not paying. What’s more, you become dependent on the system that does the externalizing, and on whoever controls that system. Those dependencies aren’t always obvious, but they impose costs of their own, some financial and some less tangible. What’s more, unlike the externalized costs, a great many of these secondary costs land directly on the user of the technology.

It’s interesting, and may not be entirely accidental, that there’s no commonly used term for the entire structure of externalities and dependencies that stand behind any technology. Such a term is necessary here, so for the present purpose,  we’ll call the structure just named the technology’s externality system. Given that turn of phrase, we can restate the point about progress made above: by and large, the more recent a technology is, the larger the externality system on which it depends.

An example will be useful here, so let’s compare the respective externality systems of a bicycle and an automobile. Like most externality systems, these divide up more or less naturally into three categories: manufacture, maintenance, and use. Everything that goes into fabricating steel parts, for instance, all the way back to the iron ore in the mine, is an externality of manufacture; everything that goes into making lubricating oil, all the way back to drilling the oil well, is an externality of maintenance; everything that goes into building roads suitable for bikes and cars is an externality of use.

Both externality systems are complex, and include a great many things that aren’t obvious at first glance. The point I want to make here, though, is that the car’s externality system is far and away the more complex of the two. In fact, the bike’s externality system is a subset of the car’s, and this reflects the specific historical order in which the two technologies were developed. When the technologies that were needed for a bicycle’s externality system came into use, the first bicycles appeared; when all the additional technologies needed for a car’s externality system were added onto that foundation, the first cars followed. That sort of incremental addition of externality-generating technologies is far and away the most common way that technology progresses.

We can thus restate the pattern just analyzed in a way that brings out some of its less visible and more troublesome aspects: by and large, each new generation of technology imposes more dependencies on its users than the generation it replaces. Again, a comparison between bicycles and automobiles will help make that clear. If you want to ride a bike, you’ve committed yourself to dependence on all the technical, economic, and social systems that go into manufacturing, maintaining, and using the bike; you can’t own, maintain, and ride a bike without the steel mills that produce the frame, the chemical plants that produce the oil you squirt on the gears, the gravel pits that provide raw material for roads and bike paths, and so on.

On the other hand, you’re not dependent on a galaxy of other systems that provide the externality system for your neighbor who drives. You don’t depend on the immense network of pipelines, tanker trucks, and gas stations that provide him with fuel; you don’t depend on the interstate highway system or the immense infrastructure that supports it; if you did the sensible thing and bought a bike that was made by a local craftsperson, your dependence on vast multinational corporations and all of their infrastructure, from sweatshop labor in Third World countries to financial shenanigans on Wall Street, is considerably smaller than that of your driving neighbor. Every dependency you have, your neighbor also has, but not vice versa.

Whether or not these dependencies matter is a complex question. Obviously there’s a personal equation—some people like to be independent, others are fine with being just one more cog in the megamachine—but there’s also a historical factor to consider. In an age of economic expansion, the benefits of dependency very often outweigh the costs; standards of living are rising, opportunities abound, and it’s easy to offset the costs of any given dependency. In a stable economy, one that’s neither growing nor contracting, the benefits and costs of any given dependency need to be weighed carefully on a case by case basis, as one dependency may be worth accepting while another costs more than it’s worth.

On the other hand, in an age of contraction and decline—or, shall we say, negative expansion?—most dependencies are problematic, and some are lethal. In a contracting economy, as everyone scrambles to hold onto as much as possible of the lifestyles of a more prosperous age, your profit is by definition someone else’s loss, and dependency is just another weapon in the Hobbesian war of all against all. By many measures, the US economy has been contracting since before the bursting of the housing bubble in 2008; by some—in particular, the median and modal standards of living—it’s been contracting since the 1970s, and the unmistakable hissing sound as air leaks out of the fracking bubble just now should be considered fair warning that another round of contraction is on its way.

With that in mind, it’s time to talk about the downsides of dependency.

First of all, dependency is expensive. In the struggle for shares of a shrinking pie in a contracting economy, turning any available dependency into a cash cow is an obvious strategy, and one that’s already very much in play. Consider the conversion of freeways into toll roads, an increasingly popular strategy in large parts of the United States. Consider, for that matter, the soaring price of health care in the US, which hasn’t been accompanied by any noticeable increase in quality of care or treatment outcomes. In the dog-eat-dog world of economic contraction, commuters and sick people are just two of many captive populations whose dependencies make them vulnerable to exploitation. As the spiral of decline continues, it’s safe to assume that any dependency that can be exploited will be exploited, and the more dependencies you have, the more likely you are to be squeezed dry.

The same principle applies to power as well as money; thus, whoever owns the systems on which you depend, owns you. In the United States, again, laws meant to protect employees from abusive behavior on the part of employers are increasingly ignored; as the number of the permanently unemployed keeps climbing year after year, employers know that those who still have jobs are desperate to keep them, and will put up with almost anything in order to keep that paycheck coming in. The old adage about the inadvisability of trying to fight City Hall has its roots in this same phenomenon; no matter what rights you have on paper, you’re not likely to get far with them when the other side can stop picking up your garbage and then fine you for creating a public nuisance, or engage in some other equally creative use of their official prerogatives. As decline accelerates, expect to see dependencies increasingly used as levers for exerting various kinds of economic, political, and social power at your expense.

Finally, and crucially, if you’re dependent on a failing system, when the system goes down, so do you. That’s not just an issue for the future; it’s a huge if still largely unmentioned reality of life in today’s America, and in most other corners of the industrial world as well. Most of today’s permanently unemployed got that way because the job on which they depended for their livelihood got offshored or automated out of existence; much of the rising tide of poverty across the United States is a direct result of the collapse of political and social systems that once countered the free market’s innate tendency to drive the gap between rich and poor to Dickensian extremes. For that matter, how many people who never learned how to read a road map are already finding themselves in random places far from help because something went wrong with their GPS units?

It’s very popular among those who recognize the problem with being shackled to a collapsing system to insist that it’s a problem for the future, not the present.  They grant that dependency is going to be a losing bet someday, but everything’s fine for now, so why not enjoy the latest technological gimmickry while it’s here? Of course that presupposes that you enjoy the latest technological gimmickry, which isn’t necessarily a safe bet, and it also ignores the first two difficulties with dependency outlined above, which are very much present and accounted for right now. We’ll let both those issues pass for the moment, though, because there’s another factor that needs to be included in the calculation.

A practical example, again, will be useful here. In my experience, it takes around five years of hard work, study, and learning from your mistakes to become a competent vegetable gardener. If you’re transitioning from buying all your vegetables at the grocery store to growing them in your backyard, in other words, you need to start gardening about five years before your last trip to the grocery store. The skill and hard work that goes into growing vegetables is one of many things that most people in the world’s industrial nations externalize, and those things don’t just pop back to you when you leave the produce section of the store for the last time. There’s a learning curve that has to be undergone.

Not that long ago, there used to be a subset of preppers who grasped the fact that a stash of cartridges and canned wieners in a locked box at their favorite deer camp cabin wasn’t going to get them through the downfall of industrial civilization, but hadn’t factored in the learning curve. Businesses targeting the prepper market thus used to sell these garden-in-a-box kits, which had seed packets for vegetables, a few tools, and a little manual on how to grow a garden. It’s a good thing that Y2K, 2012, and all those other dates when doom was supposed to arrive turned out to be wrong, because I met a fair number of people who thought that having one of those kits would save them even though they last grew a plant from seed in fourth grade. If the apocalypse had actually arrived, survivors a few years later would have gotten used to a landscape scattered with empty garden-in-a-box kits, overgrown garden patches, and the skeletal remains of preppers who starved to death because the learning curve lasted just that much longer than they did.

The same principle applies to every other set of skills that has been externalized by people in today’s industrial society, and will be coming back home to roost as economic contraction starts to cut into the viability of our externality systems. You can adopt them now, when you have time to get through the learning curve while there’s still an industrial society around to make up for the mistakes and failures that are inseparable from learning, or you can try to adopt them later, when those same inevitable mistakes and failures could very well land you in a world of hurt. You can also adopt them now, when your dependencies haven’t yet been used to empty your wallet and control your behavior, or you can try to adopt them later, when a much larger fraction of the resources and autonomy you might have used for the purpose will have been extracted from you by way of those same dependencies.

This is a point I’ve made in previous posts here, but it applies with particular force to negative progress—that is, to the deliberate adoption of older, simpler, more independent technologies in place of the latest, dependency-laden offerings from the corporate machine. As decline—or, shall we say, negative growth—becomes an inescapable fact of life in postprogress America, decreasing your dependence on sprawling externality systems is going to be an essential tactic.

Those who become early adopters of the retro future, to use an edgy term from last week’s post, will have at least two, and potentially three, significant advantages. The first, as already noted, is that they’ll be much further along the learning curve by the time rising costs, increasing instabilities, and cascading systems failures either put the complex technosystems out of reach or push the relationship between costs and benefits well over into losing-proposition territory. The second is that as more people catch onto the advantages of older, simpler, more sustainable technologies, surviving examples will become harder to find and more expensive to buy; in this case as in many others, collapsing first ahead of the rush is, among other things, the more affordable option.

The third advantage? Depending on exactly which old technologies you happen to adopt, and whether or not you have any talent for basement-workshop manufacture and the like, you may find yourself on the way to a viable new career as most other people will be losing their jobs—and their shirts. As the global economy comes unraveled and people in the United States lose their current access to shoddy imports from Third World sweatshops, there will be a demand for a wide range of tools and simple technologies that still make sense in a deindustrializing world. Those who already know how to use such technologies will be prepared to teach others how to use them; those who know how to repair, recondition, or manufacture those technologies will be prepared to barter, or to use whatever form of currency happens to replace today’s mostly hallucinatory forms of money, to good advantage.

My guess, for what it’s worth, is that salvage trades will be among the few growth industries in the 21st century, and the crafts involved in turning scrap metal and antique machinery into tools and machines that people need for their homes and workplaces will be an important part of that economic sector. To understand how that will work, though, it’s probably going to be necessary to get a clearer sense of the way that today’s complex technostructures are likely to come apart. Next week, with that in mind, we’ll spend some time thinking about the unthinkable—the impending death of the internet.