If science continues burgeoning the way it did in the Twentieth Century, by the year 2070 everyone on Earth will be a postdoctoral research fellow.
If knowledge systems like the Internet proliferate at their present pace, all the world's data will fit into a pill, cheaper and easier to digest than a potato chip.
These two wry forecasts illustrate the problem with futuristic punditry. Extrapolations can fool you.
That doesn't keep us from trying, though.
In the first of these articles I talked about humanity's obsession with the future, rooted in unique bits of brain-matter called the prefrontal lobes. The urge to look ahead is so compelling, we devote much of our economy to all kinds of forecasting, from weather reports and stock analyses to financial and strategic planning, from sports handicapping to urban design, from political prophets to those charlatans on psychic hotlines. Which variety of seer you listen to can often be a matter of style. Some prefer horoscopes, while others like to hear consultants in Armani suits present a convincing "business case."
Each of us hopes to prepare for what's coming, to improve our fate in the years ahead. This may be humanity's most distinctive trait, explaining our mastery over the world. But the task is muddied by life's essential competitiveness. If several rivals get the same data and plot the same trends, each will try to change the equation, shifting things in their favor. No wonder people seem conflicted over information policy and "privacy." We need knowledge to hold others accountable, yet each of us worries that others know too much about us.
These quandaries will only grow more intense as human cognitive powers expand in coming years. Memory will be enhanced by vast, swift databases, accessed at the speed of thought. Vision will explode in all directions as cameras grow ever-smaller, cheaper, more mobile and interconnected. In such a world, it will be foolish to depend on the ignorance of others. If they don't already know your secrets, there is a good chance someone will pierce your veils tomorrow, without you ever becoming aware of it. The best firewalls and encryptions may be bypassed by a gnat-camera in your ceiling or a whistle-blower in your back office.
How can you be sure it hasn't already happened?
The secrecy-option always had this basic flaw — that it's not robust or verifiable. Some businessfolk, like Jack Stack, author of The Great Game of Business, have adapted by using open-book management to reduce costs, enhance employee morale, foster error-detection, eliminate management layers and do business in ways that make it irrelevant how much their competitors know.
Companies that pay millions to conceal knowledge will strive endlessly to plug leaks, yet gain no long-term advantage or peace of mind. Because the number of ways to leak will expand geometrically as both software and the real world grow more complex. Because information is not like money or any other commodity. The cracks that it can slip through are almost infinitely small, and it can be duplicated at almost zero cost. Soon information will be like air, like the weather, and as easy to control.
Let's take this a bit farther. Say you're walking down a street in the year 2015. Your sunglasses are also cameras. Each face you encounter is scanned and fed into a global pattern-search.
Your glasses are also display screens. Captions seem to accompany pedestrians and passing drivers, giving names and compact bios. With an eye-flick you command a fresh view from an overhead satellite. Tapping a tooth, you retrieve in-depth data about the person in front of you, including family photos and comments posted by friends, associates... even enemies.
As you stroll, you know that others see you similarly captioned, indexed, biographed.
Sound horrific? Well, what are you going to do about it? Outlawing these tools will only keep common folk from using them. Elites — government, corporate, criminal, technical and so on — will still get these new powers of sight and memory, despite the rules. So we might as well have them too.
Compare this future to the old villages where our ancestors lived, until quite recently. They, too, knew intimate details about everyone they met on a given day. Back then, you recognized maybe a thousand people. But we won't be limited by the capacity of organic vision and memory. Our enhanced eyes will scan ten billion fellow villagers while databases vastly supplement our recall. We'll know their reputations, and they will know ours.
This portrayal of our near future may cause mixed feelings, even deep misgivings. Will it be the "good village" of Andy Hardy movies... safe, egalitarian and warmly tolerant of eccentricity? Or the bad village of Sinclair Lewis's Babbitt and Main Street, where the mighty and narrow-minded suppress all deviance from a prescribed norm?
We'd better start arguing about this now — how to make the scary parts less scary, and the good parts better — because there's no stopping the clock. The village is coming back, like it or not.
All this new technology should make it easy to index and track forecasts made by those seeking public trust — the "futurists" out there soaking up much of the economy: analyzing trends, casting horoscopes, or creating business plans. The same statistical tools that today hold scientists accountable may debunk the worst scammers and demagogues. They may also reveal anomalies — people who just happen to be right a lot! (I wonder how this column will score?)
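One standard statistical tool for this kind of accountability already exists: the Brier score, long used to grade weather forecasters. Here is a minimal sketch (the sample pundits and their predictions are invented for illustration) showing how tracked probabilistic forecasts could be scored, with lower scores indicating better calibration:

```python
def brier_score(forecasts):
    """Score a list of (predicted_probability, actual_outcome) pairs,
    where actual_outcome is 1 if the event happened, else 0.
    0.0 is a perfect record; higher numbers mean worse calibration."""
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

# A confident pundit whose bold predictions mostly failed...
pundit = [(0.9, 0), (0.8, 1), (0.95, 0)]
# ...versus a cautious forecaster who hedged honestly.
cautious = [(0.6, 0), (0.7, 1), (0.4, 0)]

print(brier_score(pundit))    # ~0.584 (worse)
print(brier_score(cautious))  # ~0.203 (better)
```

The key design point is that the score rewards honesty about uncertainty: claiming 95% confidence and being wrong costs far more than admitting 60%, which is exactly the property needed to separate calibrated forecasters from lucky or loud ones.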
Tools like the Internet promise new ways to empower private citizens, making them smarter consumers and voters... or else turning them into perfect prey for opportunists. Some foresee instant democracy — or demarchy — with millions of citizens "meeting" in virtual assemblies, then voting on issues of the day, skipping the intermediate stage of legislatures and elected officials. As in Periclean Athens, we may replace the delegated authority of a republic with rapid, direct polling of the sovereign electorate from their homes, with the flick of a button.
Some commentators depict this possibility with horror — public issues reduced to shallow sound bites and "deliberated" with the maturity of a mob. Yet, similar dire predictions were made a century ago, when citizens established the initiative process in California and other western states. Today voters get thick booklets filled with pro and con arguments. They hear debates on public radio. All told, the effects aren't as awful as opponents forecast around 1900.
Elitist gloom is a cliché that crops up whenever common folk are about to be enfranchised or empowered with new prerogatives. Remember how the credit-reporting industry foretold disaster if consumers were allowed to look over their own credit records? While it can feel satisfying, this habitual disdain seems tiresome in light of how much better-educated, less bigoted, and more savvy people are today than their ancestors ever imagined.
Is it so hard to envisage that tomorrow's citizens — our children — may rise to fresh challenges, as we have done?
We had better hope they do, because some form of demarchy is unavoidable. Public opinion polls already play a crucial role in the two-way exchange of sovereignty between officials and the electorate. Future high-tech surveys will sample a wired, sophisticated populace in real time. Whether this turns into a nightmare, or a dazzling extension of rambunctious citizenship, may depend on how completely people are informed, and how seriously they take their responsibilities.
Do you see your neighbors as helpless victims of modern times — clueless consumers and couch-potatoes — devouring fast food and passive entertainment? What about the millions who seem engaged in myriad spirited activities, from gardening to choreographed group-skydiving? Radio societies refine their own spacecraft designs. Exotic seed clubs maintain winnowed gene pools. Aficionados revive dead languages, while others frenetically invent new kinds of sports to achieve 15 minutes of fame on TV. Hobbies drive the economy, even more than our passion for predictions. Might this trend turn out to be important?
Why not? It happened once before, in Victorian times, when proficient amateurs became a major force in human innovation. As both skill and free time multiply in the next century, the same thing may happen again, ten-thousand-fold.
Are we entering a "Century of Amateurs?" Society may be increasingly influenced by new kinds of know-how, developed outside older centers of expertise like universities, corporations or government bureaus. This new trend is illustrated by the rise of Linux and the "open-source movement," unleashing legions of passionate amateurs into a realm formerly dominated by the cubicled minions of major corporations. Might even more out-of-control creativity emerge when cheap chemical synthesis-in-a-box arrives on every desktop, letting private citizens concoct new organic compounds at will?
There will be a dark side to such inventiveness. Hateful types will misuse new technologies to wreak harm. In the long run, we may survive this kind of "progress" only if decent people are vastly more numerous and competent than vicious types.
In other words, we'll be all right, if humanity as-a-whole grows more sane.
"Sane?" Did I really say that?
Well, yes. In the long run, our grandchildren may need far better understanding of that word than we have today.
The 20th Century dawned amid enthusiastic hopes for a useful paradigm of human nature and psychology. Simplistic models were promulgated by followers of vaunted sages, from Marx to Freud, but these naïve hopes were all dashed against reefs of human complexity. Our chief accomplishment in later generations was to demolish countless hoary fables about humanity: myths based on self-deception and over-reliance on cultural norms. For example, we've learned to chip away at age-old rationalizations for racism, sexism and oppression.
Alas, this necessary debunking also put under dark suspicion any attempt to use words like sanity. Post-modernists decry the term as meaningless, but that may be going too far. Like an elephant fondled by blind men, sanity is hard to define, but we can often tell when it is there, or not. Tragedies tend to happen in its absence.
Entering a new century, are we finally ready to try again for a new definition? One that is culturally neutral, based on satiability, empathy, diversity and adaptability? One that celebrates human eccentricity, while at the same time drawing gentle offers of help toward those who fume among us, like smoldering powder-kegs?
Already, studies of brain chemistry suggest that many of our most pleasant behaviors — athletics, sex, music, affection, parenting — are reinforced by psychoactive compounds we release into our own bloodstreams. Studying this reinforcement system — and how some modern humans hijack it for abuse — may be more useful than any tool yet brought to bear in the agonizing "war" against illicit drugs.
I may be deluded in thinking we've made progress toward a saner world, and in predicting more dramatic strides in days to come. But consider the alternative — a near-future world of ten billion people, many of them poor and angry, yet also able to instantly access everything from atom bomb designs to complete maps of the human genome. A "Blade Runner" future, oft-portrayed in lurid sci-fi films, where fantastic technology is unmatched by advances in maturity. Where crucial decisions are made by opulent and unaccountable elites. The image is kind of cute, in a ninety-minute noir movie. But in real life, such a world would self-destruct. It must serve as a stage to something better, or else something much worse.
Navigating that path will be the demanding task of citizens in the coming Transition Age. Those mighty folk will determine Earth's destiny — whether we achieve our potential or sink into a nightmare worse than any in our past.
It's quite a challenge.
Prepare your kids to face it.
Copyright © 1999 by David Brin. All rights reserved.
"Probing the Near Future" (published in full here) originally appeared in AOL's Online Magazine iPlanet in late 1999 as part two of a series commissioned specifically to discuss the new Millennium. Part One, "The Self-Preventing Prophecy," and Part Three, "Do We Really Want Immortality?," are also available on this site.
Sinclair Lewis, Babbitt (book)
Sinclair Lewis, Main Street (book)
David Brin blogs at Contrary Brin and comments on Facebook, Twitter, Google+, and Quora specifically to discuss the political and scientific issues he raises in these articles. If you come to argue rationally, you're voting, implicitly, for a civilization that values open minds and discussions among equals.
David Brin's science fiction novels have been New York Times Bestsellers, winning multiple Hugo, Nebula and other awards. At least a dozen have been translated into more than twenty languages. They range from bold and prophetic explorations of our near-future to Brin's Uplift series, envisioning galactic issues of sapience and destiny (and star-faring dolphins!).
Short stories and novellas have different rhythms and artistic flavor, and Brin's short stories and novellas, several of which earned Hugo and other awards, exploit that difference to explore a wider range of real and vividly speculative ideas. Many have been selected for anthologies and reprints.
Since 2004, David Brin has maintained a blog about science, technology, science fiction, books, and the future — themes his science fiction and nonfiction writings continue to explore.
Who could have predicted that social media — indeed, all of our online society — would play such an important role in the 21st Century, restoring the voices of advisors and influencers? Lively and intelligent comments spill over onto Brin's social media pages.
David Brin's Ph.D. in Physics from the University of California, San Diego (in the lab of Nobelist Hannes Alfvén) followed a master's in optics and an undergraduate degree in astrophysics from Caltech. Every science show that depicts a comet now portrays the model developed in Brin's Ph.D. research.
Brin's non-fiction book, The Transparent Society: Will Technology Force Us to Choose Between Freedom and Privacy?, continues to receive acclaim for its accuracy in predicting 21st Century concerns about online security, secrecy, accountability and privacy.
Brin speaks plausibly and entertainingly about trends in technology and society to audiences willing to confront the challenges that our rambunctious civilization will face in the decades ahead. He also talks about the field of science fiction, especially in relation to his own novels and stories. To date he has presented at more than 200 meetings, conferences, corporate retreats and other gatherings.
Brin advises corporations and governmental and private defense- and security-related agencies about information-age issues, scientific trends, future social and political trends, and education. Urban Developer Magazine named him one of four World's Best Futurists, and he was cited as one of the top 10 writers the AI elite follow. Past consultations include Google, Microsoft, Procter & Gamble, and many others.