One of the prevailing narratives of our time is that we are innovating our way into the future at break-neck speed. It’s just dizzying how quickly the world around us is changing. Technology is this juggernaut that gets ever bigger, ever faster, and all we need to do is hold on for the wild ride into the infinitely cool. Problems get solved faster than we can blink.
But I’m going to claim that this is an old, outdated narrative. I think we have a tendency to latch onto a story of humanity that we find appealing or flattering, and stick with it long past its expiration date. Many readers at this point, in fact, may think that it’s sheer lunacy for me to challenge such an obvious truth about the world we live in. Perhaps this will encourage said souls to read on—eager to witness a spectacular failure as I attempt to pull off this seemingly impossible stunt.
The (slightly overstated) claim is that no major new inventions have emerged in my 45-year lifespan. The 45 years prior, however, were chock-full of monumental breakthroughs.
A Tale of Three Times
Before diving into the defense of my bold claim, let’s set the stage with a thought experiment about three equally-separated times, centered around 1950. Obviously we will consider the modern epoch—2015. The symmetric start would then be 1885, resulting in 65-year interval comparisons: roughly a human lifetime.
So imagine magically transporting a person through time from 1885 into 1950—as if by a long sleep—and also popping a 1950 inhabitant into today’s world. What an excellent adventure! Which one has a more difficult time making sense of the updated world around them? Which one sees more “magic,” and which one has more familiar points of reference? The answer is obvious, and is essentially my entire point.
Take a moment to let that soak in, and listen for any cognitive dissonance popping inside your brain.
Our 19th Century rube would fail to recognize cars/trucks, airplanes, helicopters, and rockets; radio and television (the telephone was 1876, so just missed this one); toasters, blenders, and electric ranges. Also unknown to the world of 1885 are inventions like radar, nuclear fission, and atomic bombs. The list could go on. Daily life would have undergone so many changes that the old timer would be pretty bewildered, I imagine. It would appear as if the world had blossomed with magic: voices from afar; miniature people dancing in a little picture box; zooming along wide, hard, flat roads at unimaginable speeds—much faster than when Uncle Billy’s horse got into the cayenne pepper. The list of “magic” devices would seem innumerable.
Now consider what’s unfamiliar to the 1950 sleeper. Look around your environment and imagine your life as seen through the eyes of a mid-century dweller. What’s new? Most things our eyes land on will be pretty well understood. The big differences are cell phones (which they will understand to be a sort of telephone, albeit with no cord and capable of sending telegram-like communications, but still figuring that it works via radio waves rather than magic), computers (which they will see as interactive televisions), and GPS navigation (okay: that one’s thought to be magic even by today’s folk). They will no doubt be impressed with miniaturization as an evolutionary spectacle, but will tend to have a context for the functional capabilities of our gizmos.
Telling ourselves that the pace of technological transformation is ever-increasing is just a fun story we like to believe is true. For many of us, I suspect, our whole world order is built on this premise.
On the flip side, I can think of loads of things about modern life that would have been perfectly familiar even to an ancient Egyptian. These are on the side of what it means to be human: laughter, drama, jealousy, shelter, bodily functions, family, jerk-wads, motherly love, tribalism, scandal, awe over the stars, etc. Because these are such constants, it is not hard for me to imagine key elements of the far future of humanity (see previous list). As far as technology goes: buzzing electric toothbrushes? I’d be foolish to count on them. But I’d bet on the wheel remaining important.
Space Leaps
Another interesting consideration: the 65-year time span we considered before is very similar to the amount of time it took to go from the first airplane to landing people on the Moon (in 65.6 years, we went from no powered flight to Moon-walking). Prior to the flight era, humans might have been able to get tens of meters off of terra firma without risking likely death. The Moon landings extended this pre-flight scale by seven orders of magnitude, a pace of about an order-of-magnitude per decade. Not only have we not kept pace—we should have seen humans twice as far as Pluto by now and at the light-year scale by 2040—but we stopped our upward/outward march completely! Try convincing someone in 1965 that the U.S. would not have a human space launch capability 50 years later, or that we would retreat from far-flung human exploration after 1972, and they would think you stark-raving mad.
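For the numerically inclined, here is a back-of-the-envelope check in Python; the 30-meter pre-flight altitude and the round one-order-per-decade extrapolation pace are assumed figures, not measurements:

from math import log10

PRE_FLIGHT_M = 30.0   # assumed safe pre-1903 altitude: "tens of meters"
MOON_M = 3.84e8       # mean Earth-Moon distance, meters
SPAN_YR = 65.6        # Kitty Hawk (Dec 1903) to Apollo 11 (Jul 1969)

orders = log10(MOON_M / PRE_FLIGHT_M)   # ~7.1 orders of magnitude
print(f"{orders:.1f} orders in {SPAN_YR} yr: {10 * orders / SPAN_YR:.2f} per decade")

def reach_m(year):
    # distance attained if a round one-order-per-decade pace had continued
    return MOON_M * 10 ** ((year - 1969.5) / 10)

PLUTO_M = 5.9e12        # ~39.5 AU, in meters
LIGHT_YEAR_M = 9.46e15
print(f"2015: {reach_m(2015) / PLUTO_M:.1f} times Pluto's distance")   # ~2.3
print(f"2040: {reach_m(2040) / LIGHT_YEAR_M:.2f} light-years")         # ~0.46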
In My Life
I was born 9.5 days after the epoch of Unix Time, at the beginning of 1970. It’s very convenient for several reasons. 1-9-70 is 1970. President Nixon’s birthday is the same, and I was born when he was in office. It doesn’t make me a crook. Remembering my age in a particular year is easy math, especially so close to the New Year. And if I want to know my age in seconds, I just grab Unix time from any computer programming language’s time library function call. Answer: 1.44 billion seconds.
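A minimal sketch of that trick, with Python assumed as the language of convenience:

import time

BIRTH_OFFSET_S = 9.5 * 86400   # born ~9.5 days after 1970-01-01T00:00:00Z
age_s = time.time() - BIRTH_OFFSET_S
print(f"Age: {age_s / 1e9:.2f} billion seconds")   # ~1.44 billion in 2015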
So my claim is that I was born into a post-invention world. I can’t possibly mean this in the extreme. I myself invented the first cryogenic image slicer, and co-invented a nifty airplane detector that is selling to observatories. But these are not big deals—just derivative products.
The big deals are: the computer revolution, the internet, mobile phones, GPS navigation, and surely some medical innovations. But I would characterize these as substantial refinements in pre-existing gizmos. It’s more an era of hard work than of inspiration. I’m not discounting the transformative influence of the internet and other such refinements, but instead pointing out that the fundamental technological underpinnings—the big breakthroughs—were in place already.
Computers existed before I was born, and even talked to each other over (local) networks. Mobile phones have a long history predating my birth. GPS navigation is a space-based refinement of the older LORAN system, which is also based on timing of signal receipt from transmitters at known locations. Lasers (now important for optical drives and many other devices) were invented before I was born and were even used to measure the Earth-Moon distance to few-decimeter precision in 1969. The microwave oven was invented just after World War II; the first countertop model became available in 1967.
Medically?
Before my birth it was understood that vitamin C fixes scurvy and vitamin D fixes rickets. Prior to the 20th Century we already had vaccines for smallpox, cholera, anthrax, and rabies. The 1920s saw insulin, penicillin, and vaccines for diphtheria, tuberculosis, whooping cough, scarlet fever, and tetanus. In later years, we got vaccines for yellow fever, polio, measles, mumps, and rubella. Since my birth, we’ve seen vaccines for chicken pox, hepatitis A and B, meningitis, Lyme disease, and rotavirus, and possibly malaria and Ebola this year. Obviously we have not stopped the march, and that’s encouraging. But consider that the amount of funding poured into medical research has skyrocketed in my lifetime, so the progress per dollar spent is surely declining. The easy battles were fought first, naturally. Cancer, multiple sclerosis, and a raft of other pernicious diseases resist cures despite large continuing investments. But I admit a lack of expertise when it comes to medical research/progress (see overview here), so take this one with some Epsom salt.
Energy
I am more familiar with—and concerned about—energy technologies. What’s new on the table since my birth? Solar, wind, hydro/tidal, geothermal, nuclear fission (including thorium), wave, biofuels, fuel cells, etc.: all were demonstrated technologies before I was born. Where are the new faces? It’s not as if we have lacked motivation. Energy crises are not unknown to us, and there have been times of intense interest, effort, and research in my lifetime. Tellingly, the biggest energy innovation in my time is enhanced recovery techniques for fossil fuels: perhaps not the most promising path to the future.
We continue to work on nuclear fusion (note that we have succeeded in producing fusion in Tokamaks, for instance, and also in the spectacular explosions of hydrogen bombs). Should we succeed at controlled, sustained, net-positive fusion, we would qualify it as a new face at the table. I might characterize it as the most expensive way to create electricity ever devised (and electricity is not the hard nut to crack). If that’s our only substantial hope for game-changing innovation, we risk losing this game.
The true game changers would turn sunlight into liquid fuels. Agricultural routes compete for food and require substantial sustained labor (low EROEI), and algae may have water and gunk problems (see post on the biofuel grind). Artificial photosynthesis remains a favorite fantasy for me, but there may also be thermo-chemical approaches using concentrated sunlight.
But I digress: I’m trying to make a point about lack of fundamental inventions in my lifetime, and the energy domain fits the same pattern.
Social Progress
One realm that has seen substantial progress in my lifetime is not technological, but social. Tolerance for different races, ethnicities, sexual orientations, and other conditions/choices marking individuals as “different” has improved in most parts of the world. This is not without exception, and at times appears to lurch backwards a bit. But there is no doubt that the world I live in today is more tolerant than the one I grew up in. And only part of that involves moving from Tennessee to California.
The one caution I cannot resist raising is that I view this tolerance as stemming from a sated world. In times of plenty, we can afford to be kind to those who are different. We are less threatened when we are comfortable. If our 21st Century standard of living peaks—coincident with a peak in surplus energy (i.e., fossil fuels)—then we may not have the luxury of viewing our social progress as an irreversible ratchet. Hard times revive old tribal instincts: different is not welcome.
Down with the Narrative
To me, this is all the more reason to raise awareness that we ought not take our future for granted. I believe that the narrative we have elected to believe—that progress is an unstoppable force and ever-accelerating technology will save us—is ironically the very attitude that can bring “progress” crashing down.
I think we should admit that our hypothetical 1885 person would be more bewildered by the passage of 65 years than the 1950 “modern” human. I think we should admit that the breathtaking pace of major breakthroughs has actually declined. That’s different from stopping, note. I think we need to take our energy predicament seriously, and acknowledge that we have few new ideas and don’t have any consensus on how to design our future infrastructure given the pieces we already know very well.
Note to commenters
I can predict that this post will be offensive to many and that the comments will be loaded with anecdotes and “what about X, you moron” sorts of posts—but hopefully more politely put. I will likely have little time to respond to each such thing. Just know I am not downplaying how transformative refinements (internet, computers, etc.) can be. But also know that the odd counterexample has a hard time dismantling the larger picture: just imagine that I could lob three pre-1950 inventions back for every post-1950 offered if I deemed it worth the time to play that game. Now, if you find yourself in this “offended” category, ask yourself: why is this so upsetting to you? How reliant are you on the narrative of progress for your sanity and understanding of our world and its future? I’m just sayin’—you might want to have that looked at.
Yep, yep, yep! You’re pointing out the diminishing returns of technology 🙂 I’ve already read about it here: http://theanarchistlibrary.org/library/jason-godesky-thirty-theses#toc1
It’s a long read, but I’d love to hear if you can find any major flaws with it, because I can’t – only nitpicking.
I have put it in a slightly different way. The major difference between people of my parents’ generation and my generation (or my daughter’s generation) is the “democratisation of stuff”. Most of the items that are nowadays regarded as essential everyday items were available in the 30s and 40s — but only for the very rich.
Examples: radios, TV, refrigerators, fresh fruit, foreign holidays, cars, food variety, well heated/cooled homes, food safety.
Nope, that hasn’t changed much either. The problem is that you have an American perspective. For most of the world these things have not been “democratised” at all. There are now 7 billion people on the planet, and the majority of them have few of these things we take for granted.
“Most of the items that are nowadays regarded as essential everyday items were available in the 30s and 40s — but only for the very rich.
Examples: radios, TV, refrigerators, fresh fruit, foreign holidays, cars, food variety, well heated/cooled homes, food safety.”
Interestingly enough, living in 2015 I have been without TV, radio, a fridge, fresh fruit, a holiday of any sort, a car, and a well heated and cooled home more than I ever lacked them as a child of the 1980s.
I agree. I’ve thought a lot recently about the world into which my grandmother was born in 1901 vs. the one she left in 1984, as contrasted with the one into which I was born in 1954 and the one I see now. She saw autos, phones, airplanes, computers, spaceflight, electrification, etc. come into wide use (yes, the phone and electrification existed before she was born but were certainly not in widespread use). The only monumental changes my friends and I have come up with in my lifetime are medical imaging, robotics, and most significantly in my opinion, our ability at any time and any place (almost) to use a handheld device to find out almost anything that we want to know.
“The only monumental changes my friends and I have come up with in my lifetime are medical imaging, robotics, and most significantly in my opinion, our ability at any time and any place (almost) to use a handheld device to find out almost anything that we want to know.”
I very much agree with Tom’s argument in this blog post but I would caution those of us who are techno-skeptics to not underestimate the transformative potential of the internet. I stress the word potential since so much of its use (speaking only anecdotally) is for porn, selfies, spam, kitten videos, celebrity gossip, and other such nonsense. But the capacity to learn is incredibly powerful. So too, of course, is the ability of the powers-that-be to monitor our activities and distract us with reptilian-brain triggers (cue Donald Trump).
Much is said of the democratizing influence of the Internet. The ‘net, where anyone can create a web page, anyone can read it, most anyone can comment on it (not ALL pages have comment sections), is a huge, democratizing influence. Democracy (in this sense) is pivotal to creating a “bazaar,” where everyone can see what everyone else is doing, what everyone else is saying, and memes and designs can evolve rapidly. When Tom says that we’ve made small improvements on lots of things, that’s typical of the bazaar.
And yes, that’s a reference to “The Cathedral and the Bazaar,” by Eric S. Raymond. If you’ve not read that, I highly recommend it. It expresses many things that I’ve felt and observed, but codified them into language I hadn’t developed.
The cathedral is all about an overriding plan, lots of moving parts and a long gestation period. It may take decades to build that cathedral, involving thousands of people, all around one person’s vision. The transistor was more of a cathedral, involving deep R & D and the deep pockets and widely varied resources of Bell Labs. So was the integrated circuit, the brainchild of a couple of men, made real by larger organizations (Texas Instruments and Fairchild Semiconductor).
The cathedral builders of the middle ages took what they saw in others’ designs and ran with them, learned from them, evolved them. Similarly, once transistors and ICs were on the market, the bazaar took them and ran with them, both in applying them and in creating competing products. Because the bazaar isn’t just about observation and evolution, it’s also where you sell your wares. And once your wares hit the market, it’s game on.
You need both. A lack of cathedrals results in a lack of major, revolutionary technologies. A lack of bazaars means that those technologies come about, but are slow to find greater application, slower to evolve, slower to become widely available. The last few decades have witnessed a VERY active bazaar.
Long-term intellectual property, whether through copyrights on art (current standard is death of author + 70 years) or patents on hardware/software, attempts to kill the bazaar. We’ve spent time and treasure developing this; no one else can even dream of running with it without being sued into insolvency. Alternately: we bought this idea from someone; no one else can run with it without paying us (patent trolls). There’s a solid case for having copyrights and patents, but there needs to be recognition that all of these things MUST GO into the public domain. And not at some unspecified point in the far, distant future. The sooner they’re in the public domain, the sooner the bazaar can run with them, without the risk of being sued.
Organizations which would turn the ‘net into a walled garden, showing ONLY that which they can monetize or which they approve, blocking access to other things, attempt to kill the bazaar. If all you can see is what your particular ISP or government endorses, how are you going to find those disruptive things, created/evolved by others, which could improve your life?
We need cathedrals and bazaars. Anything which puts the kibosh on them needs to be fought. Actively. Otherwise, you’ll see significant deceleration in the bazaar. And that’s what brings the most visible improvement for the majority of us.
Neural networks, narrow AI, and, coming sooner than one might guess, AI superintelligence. Here is a blog post that summarizes the best guess of many industry experts on where we are in that timeline:
http://waitbutwhy.com/2015/01/artificial-intelligence-revolution-1.html
Having computers vs having AI is like having fire vs creating the steam engine. The steam engine was the first real time we harnessed power greater than people or animals could provide. That’s the driving force that has allowed us to transform the world as much as we have in recent history.
AI is the equivalent for intelligence: we are starting to harness intelligence greater than any human could provide. What that driving force will allow us to do is likely the best or worst thing humanity will ever know.
AI is not going to be anywhere nearly as transformative as steam was. Why? Any chimp-level AI is still quite a few generations of technology away, and we don’t really have many generations of technology left before some serious problems begin to bite us in the ass.
Anything less than chimp-level intelligence is not going to be of much use on a planet with 7+ billion freak-smart-chimps walking around. The problem we face is *not* a lack of intelligence. The problem is the *diminishing utility* of intelligence. All the low-hanging fruit has been picked. The only thing a super-intelligent AI will say is “good job freaky-chimps, you figured it all out. 42.”
We’ve already seen that there are lots of uses for microcontrollers dumber than a cockroach. Programmable monkey brains working for “free” would be a fairly big deal. Also, if the monkey thinks orders of magnitude faster than you, that matters. Would you be more productive at your job with some reliable, focused, monkey aides? (My boss would be…) I’d guess that the failure modes – and success modes – of AI can be quite different, and thus radically useful, relative to primate intelligence. In short, many kinds of intelligence, all of which can be monetized.
AI is a bit like practical fusion: it’s been just around the corner for the last fifty years.
Fusion would actually be useful, though.
John Michael Greer hijacked your blog? 🙂 No seriously, good post!
Thanks for the compliment. I have not read his works extensively, but what I have read has been characteristically profound and exceptionally well-written. I am not in the same league, but thanks anyway.
[late edit: I am mainly familiar with Greer’s eulogy for space, and also read something that seemed to bridge a middle road between utopian and apocalyptic thinking: which struck me as sensible. Some of the comments that follow paint a picture that is unfamiliar to me, anyway.]
I read Greer consistently. While he’s extraordinarily knowledgeable about, as he puts it, the history of ideas, he seems a bit cocksure. I’m always a bit skeptical of people who KNOW.
I understand what you’re saying about Greer.
OTOH, his writings have given me a number of good heuristics, i.e. ways to think. Case in point, his explanation that apocalypse and triumphant futurism (the Singularity et al) are just two sides of the same “this time it’s different” coin.
Greer was insightful in his early years, but in the last year or two he hasn’t written much that’s interesting, insightful, or even accurate. It’s a hard task, no doubt, to write a new insightful blog post every week, but he’s at the point where most of the new posts are derivatives of old ones. Also, I’ve noticed, as you say, that he’s too certain he’s right about things; this is compounded by the fact that he’s made many small, easily-verified factual errors over the past couple of years that he never corrects (and often repeats) even when they’re pointed out.
I’m hoping a new writer steps into his shoes and picks up where he left off.
I just stumbled over this on Wikipedia:
“Iggers (1965) says the great failing of the prophets of progress was that they underestimated the extent of man’s destructiveness and irrationality. The failing of the critics of the Idea of Progress, he adds, came in misunderstanding the role of rationality and morality in human behavior.”
Greer redux – aside from peddling his particular strain of “apocalypse lite”, I take exception with the proposal that yearning for progress – defined in some meaningful way (such as life expectancy, literacy, health, etc.) – is somehow unsavory, undesirable, and should be denounced as the ultimate cause of our mistakes. The solution to ignorance and irrationality is not to abolish reason, or – passively and with a certain giddiness – wait for its “inevitable” demise. As the scientific method demonstrates, our individual cognitive limits can, to a significant extent, be transcended by cooperation and process. At its core, Greer’s message is just another glibertarian denial that consensus, governance and policy are necessary and powerful means. That’s even less helpful than all the ignorant cheering for inevitable “progress”. Whatever can be said about Space Cadets, at least they have dreams.
“I am not in the same league, but thanks anyway.”
You’re selling yourself short here. This blog is the most erudite example from what I call the “energy decline” movement. You are probably the most knowledgeable person in that movement. If you’re not convincing, then nothing in this movement is convincing.
-Tom S
Hi,
first, thank you for your post. I like the way you approach topics from an unconventional angle, as very few media outlets do.
I was born in 1990 and I must say that I mostly agree with your post. But as a young man I also wonder about our future (I don’t mean that you do not; you just do not talk about it). To sketch the future, let’s use your description of past and present innovations. As you said, our new products are refinements of pre-existing technologies. However, I would add that these innovations also came from combining existing technologies in new fields, e.g. GPS + battery + telephone gives a smartphone, which finally leads to autonomous cars when integrated into a vehicle. Following this line, I believe that the next thing that will change humanity is not a yet-undiscovered technology but the merging of all our current knowledge to create a true artificial intelligence and fully autonomous robots. I am afraid that at that point our energy problems will be the least of our problems…
And even if a big discovery is made, e.g. a grand unified theory, I strongly believe that artificial intelligence will have a bigger impact on our society and lifestyles than anything else (without meaning the rise of cyborgs).
Cheers!
P.S. sorry for possible mistakes. I am not an English native speaker.
IMO, the availability to women of effective contraception which they control has fundamentally changed the human condition. Sex now has a much, much lower chance of resulting in pregnancy. Improvements in healthcare, including the availability of safe abortions, mean that pregnancy is much less likely to result in death. This results in fundamental changes to society. The pill is not a refinement of earlier contraceptive technologies and its level of effectiveness is far superior.
I’m not convinced by the notion that computers are just a form of interactive television. Consumer uses of computers may be but computers and robotics have also transformed the workplace. Possibly you could argue that this is just continuing an existing trend towards automation?
By the way, you may be interested to read David Graeber’s book, “The Utopia of Rules”, in which he makes much the same argument. (Neither of you completely convince me).
Interesting blog post, as always. While I do agree that completely new inventions seem to have slowed since the 60s, I think that the refinement of existing technologies has more than made up for that. Most importantly, global communication has blossomed in the past decade.
You do talk about the Internet and computers, but I feel that you gloss over the importance of this. When looking back even within my lifetime (I am about 10 years younger than you) I can see major changes in the availability of information. People have information literally at their fingertips about things that would have taken days if not weeks of research 20 – 30 years ago. We have the ability to communicate with family, friends, and strangers worldwide at essentially no cost (sure there is the sunk cost of a computer and some low monthly connection fee, but you are not charged by the minute for a video call or per email). For those who do not have access to computers, public libraries in many parts of the world will provide such services for low or no cost.
This information explosion has allowed me personally to do things that would have been very difficult previously. I can teach myself new skills that would have taken specialized courses (or at least textbooks) in years past. Even simple things like looking up information on how to fix my washing machine can make a difference in real life (there are YouTube videos on how to fix just about everything!)
Anyway, keep up the excellent blog; I have been following for a few years now and always enjoy your well reasoned and thoughtful viewpoints.
Cheers
Good points–thanks. Interestingly, as an educator, I sense that the era of easy information has had a negative impact on developing cognitive skills. Much like calculators erode mental computation skills. Maybe this is all okay, but it makes us brittle as a society: take away the conveniences (power outage or worse) and we have little to fall back on.
I was fortunate, I suppose, in that I was born into a world of slide rules and adding machines (I have two slide rules, “just in case”), university libraries for journals, the Dewey Decimal system, etc., and my (blind) father taught me to calculate in my head. So I’ve got that advantage along with access to the world’s knowledge and Wolfram Alpha’s calculating ability on my smart phone. On the other hand, I’m 61.
I still have THREE slide rules. Two K&E Log Log Duplex Decitrig (pocket and briefcase size) and one Pickett. Twenty or so years ago (I was not retired yet) I was relearning how to use the K&E’s. Preparing for a meeting which I feared might take a wrong turn due to a pompous know-it-all, I put the pocket K&E in my briefcase and removed my HP calculator.
THE ISSUE came up. I calculated several exponential growth numbers on the slide rule. All talk ceased, everybody stared at the slide rule. Somebody pulled out a real calculator and got the numbers right after several tries. Yeah, that distraction was derailed.
However, everybody, even the nontechnical folks, recognized the slide rule for what it was. Last year a granddaughter took the Pickett to High School chemistry. Nobody had heard about them. The teacher had read about them, but told her to put it away, it was interesting but a useless distraction.
Understanding of genetics. Breeding as an ad hoc practice has been around for millennia, but in 1950, although DNA was known to be involved somehow, we didn’t even know its structure. Now we’ve got genome sequencing costs down to $1000 and falling, CRISPR-Cas9 tailored gene editing, and we are starting to have discussions on the ethics of germ-line modifications because a practical technology for that is in sight if not here.
The -onics. Semiconductor electronics was (just) known in 1960; now we’ve got photonics, plasmonics, spintronics, etc. These probably fall under the “sufficiently advanced technology” banner – most people aren’t going to see or understand them, but they promise interesting changes to our devices.
2D materials might count, if we can actually do something with them other than get research grants for “the next big thing”.
I think you are broadly correct. I think your timescales are mildly wrong, however: a lot of really enormous changes happened longer ago than 1935. I would push things back to, perhaps, 1870 or before.
My mother is nearly 80, and was born in a first-world country (the UK), and not into poverty or anything like it. The house she was born in had no electric light, and I think there was probably no electric light until the end of the war. I’ll repeat: her parents were not poor or living somewhere stupidly remote, they were just not living in a city. They did have, for instance, at least one car, a radio, and (I think) servants or at least a maid. The world has changed hugely in her lifetime, and most of the really big changes happened before 1970.
But it also changed hugely in the period before she was born. Her mother’s generation were probably the first not to expect to lose children or die in childbirth (if you wander around 19th-century graveyards you will rapidly realise just how common both of these events were): the medical revolution was not complete by 1936 but it was well under way, and it made an enormous difference.
Cars made in 1930 were about half as fuel-efficient as the best cars made today (mine gets around 30 MPG) and were completely practical, if rather more dangerous than we would be comfortable with now.
Further back: by 1900 a middle-income person could travel from London to Edinburgh in a day, and might well expect to commute by rail. A rich person could cross the Atlantic for a holiday. When was the first transatlantic cable? The industrial revolution really was a revolution. Not to mention things like increasing crop yields which go back even further.
I am both old and cynical, but I often think that, looking back from a few hundred years time, the moon landings will be seen as the high point of our civilisation. Not the end, but the moment when things started slowing down.
First transatlantic cable: between Google and Wikipedia, this is very easy — no need to get up and look through an expensive encyclopedia, or go to public library to look it up. First electric telegraph at all was 1845; first transatlantic telegraph was 1858, though the cable broke after 3 weeks; a more durable one was finally achieved in 1866.
Telegraph had expanded a lot by 1852. By 1874 all the inhabited continents were connected; by 1880, 100,000 miles of undersea cable had been laid. In 1844 it took ten weeks to sail from London to Bombay and back; in 1874 a telegram took 4 minutes (not sure if that’s one way or message + reply).
Some firsts I have saved up:
first telegraph 1845
first photograph 1826, of person 1836 (10 minute exposure), 1851 wet collodion (seconds), 1884 film, first color 1861 Maxwell 3-filter plates later used by Prokudin-Gorskii for 1910 Russia, first commercial color 1907
first Kodak amateur camera 1888
first commercial intercity railway 1830, 7000 miles laid in Britain by 1850s
early US railroads 1830s, but 1850s long distance at reasonable rates
first elevated rail in NY 1867
high speed rail 1964 shinkansen
first subway 1863, first electric 1890
first fax kind of 1865, for transmitting drawing; 1881 for photo scanning
So, this is kind of the thing: if you look at top signal speeds, there’s a big change with the optical telegraph, and even bigger with the electric telegraph, and a small one to radio, and we don’t anticipate going any faster. Light speed, after all. The telegraph alone ‘shrank’ and unified the world in a way that won’t happen again, at that level. OTOH, increasing the *ubiquity* of fast communications is also a major change: a British Empire telegraph is a big deal, but so is being able to e-mail someone in Afghanistan from my pocket. There’s a sense in which the first is more fundamental, but it’s hard to conclusively say it’s more important.
Likewise, inventing the photograph, being able to capture scenes as seen, is a big deal. But so is everyone having a camera in their pocket (and with 2 billion projected smartphone users, we’re getting there.)
***
But all that said, once people became city dwellers in electrified houses with home appliances and cars, things outside of IT didn’t change nearly as much.
It’s not just top signal speeds but the incredible increase in the top speed of physical travel. It wasn’t any faster to travel from Rome to Paris in 1800 than it was when Caesar Augustus ruled. The coming of the railroad decreased that by almost an order of magnitude.
The telegraph and the railroad are the foundations of globalization.
I agree. In fact I always thought this was kind of obvious? When you read about the industrial revolution, it’s full of these huge inventions that massively changed the way we live. There are so many it’s hard to even list them all. The inventions we have now seem like toys in comparison; it’s hard to imagine a history book putting the iPhone 6 on the same level as, say, the Model T.
If anything, it’s even worse than you make it sound. The 20th century had a lot of inventions that were mundane but actually did a lot to boost productivity: I’m thinking of things like the filing cabinet, the paperclip, and rubber bands. We’re running out of low-hanging fruit like that to invent, and it shows up as much slower productivity growth.
I haven’t thought about progress this way, but you make a good point. We invented many things before 1950; now we are digesting them. Social progress is a reaction to world changes made by inventions.
I see a huge difference in popular culture – how we think the future will be and how people thought about this years ago. Compare Star Trek, the Jetsons, and others, where the future is bright and shiny, with current zombie/alien/electrical-apocalypse movies. I believe in social terms we are slowly trying to accept the future as a decline of humanity.
What will the future bring? I don’t think we have digested all of the modern world yet. We still have much to learn as global societies. So probably a further decline of our progress will come.
When will we have the next big invention? What will it be? Not sure. But note that many inventions were made near not-so-peaceful times. Will such times come again? Do they have to, in order to force us to make scientific progress?
Interesting point about the flavor of movies. Yes: there are many many movies/shows now about how we messed the world up (or it messed up on our watch) and have to live with the consequences.
“Compare Star Trek, the Jetsons, and others, where the future is bright and shiny, with current zombie/alien/electrical-apocalypse movies.”
Sure, but that’s cherry-picking. There were all kinds of movies about “invasion from outer space” in the 1960s, far more than now. The first zombie movie (Night of the Living Dead) was from 1968, and most of the classic zombie films were from the 70s. There were lots of movies and books in that era about AI or robots taking over (the book “I, Robot” was published in 1950). There were many movies about nuclear apocalypse (Dr Strangelove and many others). It’s possible that movies have become somewhat less optimistic on average since 1960 but that would be hard to measure.
“I believe in social terms we are slowly trying to accept the future as a decline of humanity.”
There was definitely more doomsday stuff when I was a kid in the late 1970s. Most people don’t remember it, but the original peak oil and energy collapse movements were from the late 70s, and were far more popular than now. Back then, there was actual emergency gasoline rationing, and a lot of people thought it meant the end of civilization. The President of the US gave speeches about a looming energy crisis which meant catastrophe for us all, unless drastic action was taken. Time magazine (the most popular news magazine in the country by far) had a special issue devoted entirely to the energy crisis. Books like “Limits to Growth” and “The Population Bomb” were part of everyday conversation. Also there were lots of movies about nuclear annihilation (The Day After). All of those things were intended as non-fiction. Of course, not everyone believed that stuff, but I think doomsday thinking was more popular then. These days, the issue of peak oil barely registers with most people.
-Tom S
I wonder about this change. It seems to coincide with a couple political and social movements, as well. Socially, around 1980, the mass of Baby Boomers had reached adulthood. Politically, the shift from Carter to Reagan occurred at the same time. Simultaneously, there was a sudden shift towards leveraged growth on a scale not previously seen.
From where I stand, it does appear that cultural dominance of post-war generations removed the sensibleness of previous generations and replaced it with a sort of fingers-in-the-ears attitude towards the future. There was a total rejection of the narrative of decline, including the idea of a controlled descent.
I certainly don’t want to read too much into this. Generational differences are overblown (the apotheosis of the Greatest Generation over the last 15 years comes to mind), but the mathematical reality of our energy situation was obviously known 40 years ago and was much more widely accepted and discussed. It’s something that has stuck out to me since the economic events of the summer of 2008, and the fluctuations in the price of oil during that same time.
I don’t remember the 70s so I can’t compare. You may be right that before the utopian futures we saw in ’80s-’90s movies there was a phase when the average fare was dystopian.
As for cherry-picking: we had Mad Max, Terminator, and so on, which were doomsday scenarios. Those were also from the ’80s-’90s.
But my feeling is that on average in the ’80s-’90s we felt that the future was bright. Those doomsday movies were just interesting entertainment. Even in Eastern Europe under Communist control, where I grew up, we believed that the future would be better.
Now my feeling is that progress has slowed down. Even computers don’t get faster as quickly as they did 10-15 years ago. And there is no popular TV show that tells us about a future made better by science. No shows about humans exploring the galaxy. We have moved to thinking about magic, supernatural, irrational things like vampires, ghosts, zombies.
So maybe we can find one or two shows that contradict this, but in general we have moved from science to magic and from a utopian to a dystopian future.
I don’t understand why you think the distinction between new inventions and refinements of existing inventions is so important. I think what you are driving at is:
“I think we need to take our energy predicament seriously, and acknowledge that we have few new ideas and don’t have any consensus on how to design our future infrastructure given the pieces we already know very well.”
and
“To me, this is all the more reason to raise awareness that we ought not take our future for granted.”
And those are all fine points, but the question I have is: why do we need *new* ideas to solve the problems we face? If I could reduce the cost of solar by 1 or 2 orders of magnitude, we could all but eliminate the use of carbon-based fuels; who cares that solar technology isn’t at all new? Likewise electric cars. Not at all new, but without continuous refinement they are unlikely to replace our existing IC-based cars. If you care about the end result, the improvement of an idea is as important, if not more important, than the idea itself.
Mentally, we reserve a place of honor for the big innovators; we care about who is first. But as far as effect on the real world is concerned, making something work is more important than making something first.
A solid point: I think we could make do with the fundamental energy inventions on the table as of 1970—refined as they are today. I think the world essentially acknowledges that this would be hard: these are not superior substitutes for our current sources of energy. So we spend much of our effort seeking new transformative technologies rather than getting down to business and formulating/executing a plan using the pieces we have.
It’s not just about making do; it’s about how we think about innovation. We put an undue emphasis on the truly new, but maybe we shouldn’t. Yeah, the Wright brothers were the first to fly (well, fly and land successfully) a heavier-than-air craft, but without the countless people improving on that idea, we would have a useless curio rather than a usable service.
The same thing is true when we think about innovation to solve our energy problems. Yeah, maybe someone will come up with some zero-point reactor that solves all our issues, but the much more likely scenario is that we walk toward the solution with a series of ‘make this old process 5% more efficient’ and ‘this thing that was impractical is now worthwhile because someone figured out how to make it cost half as much’ steps.
Not to get all dime-store psych on you, but I think that perhaps you have a bias toward the new because you are an academic. I know when I was studying CS, the university types would favor more ‘elegant’ solutions even if a simple, unclever solution was provably better. It’s just the nature of academics to celebrate the clever and new over the mundane but useful. If you want a good example of that, look at how many researchers are dedicated to papers that feature new results vs. researchers focused on repeating and validating other works.
To be clear, I’m more responding to what I perceive as a hangup of our society at large rather than a personal preference for the new. Personally, I build stuff, and have greater appreciation for simple, direct, low-level, bare-bones solutions that are robust and I can easily understand than complex, feature-laden, elegant solutions that are more prone to malfunction. I embrace the solutions on the table and say “get on with it” rather than hold on to a dream of “better” around the corner. This post tries to prod people into a similar mentality: don’t expect monumental breakthroughs to be the salvation. So I’m the opposite of what you assume, and my computer code is anything but elegant (works, though!).
I agree that we have a slowdown, and even regress, in energy-hungry areas, because energy has simply become more expensive than before.
For those who are still excited about IT and Moore’s law, I’d like to remind them that the doubling period has slowed from the initial 12 months to almost 24 months (and similarly in related fields — disk capacity, network speeds, etc.).
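To put rough numbers on what that slowdown means (a quick sketch; the 10-year window is an arbitrary choice):

def growth_factor(years, doubling_months):
    # cumulative growth under a fixed doubling period
    return 2 ** (years * 12 / doubling_months)

for months in (12, 24):
    print(f"{months}-month doubling over 10 years: {growth_factor(10, months):,.0f}x")
# 12-month doubling gives ~1,024x per decade; 24-month gives only ~32x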
But in other fields of miniaturization (and therefore energy-modest areas) we can still hope for some (r)evolutions.
Sorry, I can’t agree. This analysis focuses entirely on the invention of goods and technologies, but that is not a perspective that is capable of fully addressing the history of human innovation, which is also a history of crucial social change.
Let’s take a western European person from 1750 and bring them to 1850. The material world that they would experience would be very similar – industrialization’s most disruptive technologies were still not widespread. However, industrialization had, by 1850, substantially restructured western European economic and social relations. Early industrialization produced primarily things and used primarily technologies that would have been familiar to a late pre-industrial person, but it was a revolution nonetheless.
The same is true now. Computers have been around since before either of us was born, but their effects on our society have not been consistent throughout that entire period. There’s no fundamental innovation separating DARPANet from Facebook, but it’s a revolution nonetheless. Between our births and now, we’ve entered a world in which a literal majority of all humans are now directly, instantaneously, connected to each other. Computerization has similarly brought about a revolution in our economic relations, even though we’re still producing most of the same things. This is a revolution of scale.
I’m reminded of Charles Holland Duell’s apocryphal quote that “Everything that can be invented has been invented.” Our time traveller from 1750 might very well utter these words, but only because the sheer magnitude of the paradigm shift of industrialization is visible only in hindsight.
If we are all around in 2150, I think that we will view this period the same way we now look at the industrial revolution.
I agree with the basic idea — I’m used to 1850-1900, 1900-1950, 1950-now comparisons — but can’t resist quibbling on a few points.
“Tellingly, the biggest energy innovation in my time is enhanced recovery techniques for fossil fuels” I’m not sure why that’s a bigger innovation than reducing solar costs by 10x, or practical lithium batteries.
Medically, a huge change is birth control, whether The Pill or IUDs. Both have precursor research earlier in the century, but you don’t see good versions of either until after 1950. A huge deal for women, not just in preventing pregnancies but often in making periods less painful, or even making them go away.
If you had described the capabilities of a smartphone in a fantasy novel, most readers would probably have rejected it as too-unrestrained magic. Worldwide communications, locator and map genie, entertainment both canned and interactive, pocket access to the knowledge of humanity, weather sensor…
Genetic engineering is pretty radical, and likely to get more so.
So yeah, compared to inventing radio (or the telegraph) and steam engines and electricity and washing machines, I think there’s been a huge slowdown in radical changes, but it is a slowdown, not a stop. And there’s potential for bigger changes: exowombs, true AI, anti-aging. Nothing I’m holding my breath for, but possible.
Totally agree. I think the ubiquity of modern technology has had a veiling effect on the plateauing of fundamentally new knowledge. They follow tightly related but distinct paths. The inflection point of that knowledge-creation plateau does seem somewhere between 1950 and 1970, with the arc of technological creation lagging by 50 to 60 years. There’s a similar lag at the beginning of this era, between Newton, Leibniz, Hooke, Boyle, etc. and the practical applications put to use by Watt and his contemporaries 80 or so years later.
I hadn’t thought about inventions like this, but after a little contemplation, I mostly agree. However, I fail to see the point or argument in your post; usually you have one. In this case, however, what you write about seems just to describe the logical evolution of technology:
Refinement of existing concepts is natural for humans and it is what we have always been good at, because of our ability to transfer large amounts of information from generation to generation.
Transformative inventions come about because someone somewhere asks or answers a new question about the world: What is light? What is heat / energy? How and why do the planets move? Why do things fall down? How can I turn the power of boiling water into movement? All of these are related to things we can directly experience with our senses. I believe we have simply run out of unexplained things in our lives. There are just not many obvious questions left. The fundamental questions which are still unanswered are so refined that it takes a lifetime of study just to understand the question, and ridiculously expensive equipment to research them: Think particle physics/LHC or Astronomy/Space telescopes.
I think (or rather hope) that the great inventions of our time will be rather social in nature than technological. We have to cooperate to overcome war, poverty, overpopulation, climate change and pollution if we want to have anyone to remember us. And I believe this will only be possible with very fundamental changes in our society. Many of the solutions may use technology (wind and solar energy, crypto currencies, social networks…) but those will not be the invention in and of themselves, just the tools.
Sorry the point got lost. There’s always a point, even if I don’t make it as obvious as it should be. In this case, to the extent we suffer under a religious belief that breakneck innovation of fundamentally new ideas will be our saving grace, think again. Over-reliance on technological solutions yet to be formulated is perhaps not wise, given the slowdown and plucking of the low-hanging fruit.
I agree that transformation on the cultural/values side may have more power at this point. This may mean expecting less, slowing down, trying no-growth economies, and releasing ourselves from misplaced techno-fervor.
There are a lot of people who *do* think the pace of technological advancement is ever-increasing, has never been higher, etc. Which is probably true if you look at the number of patents, but is less so when you look at major transformations and discoveries, vs. changing drugs slightly to keep monopoly profits on their patents.
Refinement is important — I’ve read Newcomen and Watt got steam engines from 1% efficiency to 15% efficiency, which is huge — but at the same time, as you say, major transformations like “heat engine”, “electricity”, “periodic table”, “germ theory” don’t seem to be coming at the pace they did for a few centuries. Arguably the last big physics breakthrough was fission, in 1938 (and it took only 7 years to make a bomb out of that! less than 18 for commercial power.)
And yeah, you could transform society a lot just by applying tools we have. Direct democracy, Keynesian economics, electrified public transit, heat pumps, sane water pricing…
“Now consider what’s unfamiliar to the 1950 sleeper. Look around your environment and imagine your life as seen through the eyes of a mid-century dweller. What’s new? Most things our eyes land on will be pretty well understood. The big differences are cell phones (which they will understand to be a sort of telephone, albeit with no cord and capable of sending telegram-like communications, but still figuring that it works via radio waves rather than magic), computers (which they will see as interactive televisions), and GPS navigation (okay: that one’s thought to be magic even by today’s folk).”
To use your rhetorical techniques in dismissing newer technologies as refinement rather than invention:
What’s unfamiliar in 2015 to the 1885 sleeper? Cell phones they will understand as a refinement of the telephone (1876), but using radio waves to transmit (1878, although the mechanism was unclear at the time) the signal instead of a wire; computers are but highly refined electrified versions of the mechanical calculating machines of their day; and the principles of gravitation (1687) allowing for the orbiting of GPS satellites had been understood for centuries, the radio waves they use were already known, the noninfinite speed of light had been known for some time (1676), and at the end of the day GPS is about turning navigation into a geometry problem to be solved – exactly what it was in 1885 as well (you have to dig down into GPS’ corrections for the ionosphere’s behavior and General Relativity to get any truly novel principles to show our awakened sleeper).
Modern cars would most certainly not be “magic” to someone from 1885 who kept up with the times: the internal combustion engine was invented in 1859 and the steam-powered automobile was invented in 1769. Replacing one type of engine with a different one and making the whole thing much more efficient is impressive but it does not magic make.
On the energy front, the physical laws concerning the conservation of energy were established in the course of the 19th century, so the sources from which we might extract useful work for our own use have been well mapped out for some time, the only novel form since then being nuclear energy. Any new broad category would require new physics (well, we know the principle of extracting energy by throwing our garbage into black holes, but this seems to pose some significant engineering challenges not likely to be overcome on decadal timescales). Solar panels have gone from ~5% efficiencies in 1970 to breaking the Shockley-Queisser limit, and can even be printed out now.
On the US lack of manned spaceflight capability as seen in 1965, forget 2015, how about 1974-1981? The Saturn 1B and CSM used for Apollo-Soyuz in 1975 were only available as surplus stock.
What this post lacks is a quantitative aspect: for what it’s worth, the USPTO’s rate of patent issuance has roughly doubled every 40 years for at least the last century and a half. But a lot of this increasing complexity is hidden from public view: in 1970, most people would have a good idea of how every component of their car worked. Today, the computer algorithms learning your driving style and relating your right foot’s position to the fuel/air mix being injected into the engine’s cylinders for optimal fuel economy and/or torque are opaque to all but a few.
Aeroplane wings still look pretty much the same and fulfill the same role today as in 1970, but now they are the result of an enormous amount of computer modeling for their shape and materials research to make them more efficient, give better performance, be lighter and stronger. From an everyday perspective we can’t tell the difference so are under the illusion that nothing has changed. It has taken truly enormous amounts of invention to keep pushing up computer processor transistor densities and hard drive densities over the last 40 years, but the only thing most of the public know about all this vast creativity and experimentation going on behind the scenes is its result of increasing performance and capacity.
We can’t all be experts in everything, and we can’t possibly appreciate the details of what we don’t have time to learn. With our civilization using ever more complex technologies in a multitude of fields we simply can’t see very far below the surface functionality of many of the devices we use.
Well constructed. I don’t find a great deal of disagreement. Two points: 1) while virtually all “inventions” can be cast in the light of refinement, we still see a qualitative difference in the kinds of devices we’re exposed to daily across the two spans I compare (difference in primary instantiation vs. growth in under-the-hood complexities); 2) efficiency improvements have been important in the more recent span, but have a saturation effect built in, so that this gift will not continue indefinitely.
The patent rate increase is, interestingly, close to the population trend.
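A rough check of that comparison, assuming round historical values for world population:

from math import log

patent_rate = log(2) / 40              # 40-year doubling: ~1.7% per year
pop_rate = log(6.1e9 / 1.6e9) / 100    # world population, 1900 to 2000: ~1.3% per year
print(f"patents: {patent_rate:.1%}/yr, population: {pop_rate:.1%}/yr")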
I think that you are missing the qualitative impact computers are having and the sheer number of inventions that are being crammed into that single category.
As an engineer, most of my working day is spent interacting with a computer in one form or another. Many of my tasks are significantly easier, or even possible at all, because of computers.
Going from work to home I can instantly know the weather forecast, decide whether to bike or not, choose and be guided on the best route, find out where the buses and cabs are in real time and compare prices.
At home I continue using computers to order groceries (only non-GMO of course… and while I’m on Amazon, how about an inflatable kayak?), talk to family and friends (across four continents via free video chat), play completely interactive games (with both computer and distant human opponents), and learn new things (about anything and everything, for free, at my own pace). I can even be involved in graduate-course-level debate about society and technology from my home.
Obviously it’s not all good. I do rely more on my calculator (or did, at least before I learned Excel, MATLAB, EES, etc.). I am more open to distraction and the mountains of garbage online (wth is with all the cat videos?). It is possible to become addicted to the interactive games (anything that triggers dopamine release…). I get drawn into intellectual debates at midnight when I should be sleeping…
Sure, computers are only one of many appliances and account for far from the majority of our physical effort, energy consumption, etc. Also, like every other revolutionary technology, they are mostly used to do the same things humans have always done. But at least for me, the ubiquity of computers has resulted in very large qualitative changes.
I would also be careful about inferring too much from the patent rate.
This comment already summarizes many of the thoughts I have on this subject. I would like to add that I believe the original article is probably correct in a manner of speaking: so far as inventions that replace physical human effort are concerned, the rate of change is substantially slower than it was when the first (steam) and second (electrification) industrial revolutions were at their peaks. I also agree that low-hanging fruit has been picked preferentially and that we shouldn’t rely, with almost blind faith, on technological innovation to solve more problems than it creates. This is especially true for something as fundamental as energy generation.
However, I think the article misses some major points about the nature of technological progress and the recent lack of inventions. I believe the assertion that any of the inventions listed from any time period were “new” as opposed to “derived” is a false dichotomy. For instance, the first airplane was a glider (already existing) coupled with an IC engine (already existing), much like the first smartphones were “merely” combinations of existing technologies with a few additions and minor innovations to make it all work together.
As for the recent lack of inventions and progress, I think toast21 is right in asserting that a) the change in computers is a bigger qualitative deal than many realize, and b) a lot of the brain power is being focused on narrower and less obvious tasks than electrifying household appliances. I do not know whether the end result will be as dramatic as the first two industrial revolutions, but one has merely to look at some of the inventions in biotech to be both amazed and very concerned.
I see I’m not the only one to immediately think of Greer and his assertion that the ‘civil religion of progress’ is the ruling mythology of our age.
I am surprised no one has brought up Joseph Tainter’s thesis that declining marginal returns on increasing complexity bring on the collapse of complex civilizations. Think of the simple 19th century laboratories of Professor Murphy’s own field compared to the expense and complexity of the LHC.
I don’t see any degree of social or technological innovation rescuing us from the consequences of peak oil and peak top soil (now receding in the rear view mirror) and the looming peak water and peak phosphorus.
http://www.postcarbon.org/the-law-of-diminishing-returns/
The two eras you point out cleanly divide progress in physics (even more so if we push back another 12 years, to before Maxwell’s treatise). Take a physicist from 1865 and drop them in 1950, and I don’t think they could even understand the new developments. Do the same from 1950 to today, and I bet most could still contribute, given a little time to catch up with the literature.
Modern physics has provided no game changers in the field of energy production and I don’t think it ever will. With our understanding of ‘new physics’ and effective field theories, any undiscovered forces are either too weak to be useful or just out of our reach.
Social “Progress” — you left out wealth inequality, crushing debt, asset bubbles, energy wars, and refugees.
I’m surprised you, a physicist, didn’t resort to measuring things. Life expectancy, literacy, energy consumption, agricultural productivity, energy efficiency, computational power, etc. There is also the more arguable (but, if kept consistent, arguably more important) Human Development Index. If you look at the numbers you’ll be forced to agree we’ve improved a lot. I call this progress.
Life expectancy in particular is interesting – it almost looks like a case study in refuting the idea of “unstoppable progress”. I find papers stating that the transition to agriculture reduced life expectancy drastically from hunter-gatherer times before it eventually recovered (and by now exceeds the latter). Urbanization likewise appears to have an impact (as well as events such as the dissolution of the Soviet Union) if looked at by specific regions. The time series is by no means monotonically increasing, and one could certainly argue that advances have largely leveled off at this point.
Yes, you could call this “progress”, but that would just make Tom’s point. The question is not whether there are still incremental changes, or even an overall trend; the question is how you would argue that it will *inevitably* continue when it obviously did not in the past.
There is a determinism here, a foolhardy sense of inevitability, but it is not fatalism (and the attitude is not stoicism) because the connotations are all wrong. Panglossing over the details, this… progressism, for lack of a better word, passively awaits the advances that will come *no matter what we do*. It denies free will, declaring that we can’t [bleep] up history even if we tried our damnedest.
I should add that ignorant cheering for “progress” is essentially the posture of a freeloader – yeah, the future will be bright, and I won’t have to lift a finger!
It takes hard work, and we might fail. To say otherwise is a vicious insult to the ancestors who lived and died to get us where we are, however haphazard and incidental the path usually was, and however ill-advised many of their decisions might have been. [Bleep]-ups ‘R Us, because we know better than anybody before us ever did.
I don’t disagree with your premise, but I think you’ve forgotten one major innovation, which is so intrinsic to our modern world that it’s hard to imagine life without it: plastic. Pre-1950, there were thermoset plastics, such as Bakelite (awesome stuff; I had a phone made of it, and some other knick-knacks).
Plastic as we know it, however, was largely unavailable to the average consumer until the late 1950s or even later. Certainly not on the same scale as Internal Combustion (existent, but not widespread, in 1885) and Flight, but still something of a miracle, speaking as one who has been involved with the development of consumer goods.
Along the same lines as the article, you can read this paper:
Huebner, J. (2005): “A possible declining trend for worldwide innovation”. In Technological Forecasting & Social Change, DOI: 10.1016/j.techfore.2005.01.003.
Tom, I love your blog, but I think you are leaving a few things out. How about the router, Google’s search algorithm, WiFi, the iPhone, cellular service, fiber optics, HTML, CRISPR/Cas9, 3D printing, integrated circuits, Li-ion batteries, UAVs, nuclear-powered ships and spacecraft, cloud computing, GPS, rare-earth magnets, hundreds of new plastics and other chemical compounds, alloys and materials. Nanomaterials, metamaterials. Flash memory. I’m sure there are more, but most of my knowledge is in computer tech.
Perhaps we disagree as to what a fundamental or important invention is. Yes, ENIAC was built in the 1940s, but that doesn’t make Google less fundamental just because it depends on the computer. I would say that importance is related to how much something affects everyday life. Early computers, as in 1950: zero. Now? How many Google searches do you do an hour?
Tom, I’ve been around since 1954, and I generally agree with your thoughts. I think what you are actually touching on here was well defined by a science fiction author many years ago in an essay he published in the pages of Analog magazine. I’m working from memory here, as I no longer have access to the article, but the author’s day job was as a historian. He realized that the history of innovation is characterized by three main stages. First, a major paradigm shift in world view is introduced. This step usually takes two to three generations (60 to 100 years) to become widely adopted (my interpretation: the thought police of the old guard need to die before new thought has freedom). The second step is a period of innovation where the implications of the new understanding of the world are put to practical use. We call this technology. This period typically lasts 200 years or so. The third is a period of technological stagnation where little change happens, because you are waiting for a new world view to emerge. This can last for many centuries.
If I remember correctly, the author identified six such thought shifts in the history of humans. Fire was likely the first. The list is of course open to debate, but the foundational shift that set the stage for “modern” life was the theory of chemistry in the 1660s. Try to name an innovation that uses something more profound in world view than that. Okay, chaos theory and quantum behavior could qualify, but wait: chaos theory first emerged in the 1880s. So it was the author’s contention that we are currently in an unusual innovation period (practical applications of our current world view) that has stretched longer than the norm because of overlapping paradigm shifts. There is no way to know in advance how long this may continue, but as you have pointed out about the rising cost of incremental change, it certainly will not continue indefinitely. It seems likely to me that the lowering EROEI of our primary fuel sources may have the unintended consequence (definable with chaos theory?) of shortening this ride.
Another great post!
I’ll further comment on the myopia you described. People will say something like “I can download an encyclopedia’s worth of knowledge in a minute. Generations ago had to use Morse code, which would have taken forever. We have to handle more change.”
But people generations ago weren’t thinking of themselves as slow compared to us in their future, or thinking about the things in our lives that they lacked. They were saying the same thing about their own past (like how a cable could get a message across a continent almost instantly, compared to delivery by horse), not lamenting that the message wasn’t a whole encyclopedia.
We could just as well say that today we are limited compared to our future, but we don’t. It’s hard to imagine people in the past did. It seems another reason not to think of our time as particularly special.
On the social development side, the 1885-1950 sleeper would have learned that people harnessed civil engineering to mobilize and fight two world wars, including multiple genocides of millions of people. I guess you could say the 1950-2015 sleeper would have seen that we found ways not to repeat that, but I don’t think it’s the same.
I would nominate the emancipation of women and their increasing participation in all aspects of society and life as a major social innovation and advance. I see it as more significant than the emancipation and fuller participation of ethnic minorities: women are half of humanity.
We can credit (blame) universal education, birth control, medical advances and more for making it possible. What made it happen is a deeper question.
You might enjoy (parts of) this thread, which is kind of the rear-view mirror perspective on Tom Murphy’s take on the recent past:
http://www.antipope.org/charlie/blog-static/2015/09/the-present-in-deep-history.html
Your point figured prominently there; the question remains whether this important change is irreversible. The emancipation of women – just like the end of slavery – could be just an artifact of energy abundance from fossil fuels.
When I was 14 years old, I came home from High School one afternoon and told my father that one kid in History class asked the teacher why, if Archimedes and Hiero had made a steam turbine that spun around, didn’t they make a steam engine do real work? The teacher had replied that in those days there was so much cheap slave labor that there was no need for labor-saving machines.
Dad snorted. He told me that slave labor never was cheap, other than briefly. A steam engine is a power source, not a labor-saving machine (like the flying shuttle). In the 19th century U.S. and Brazil, slaves operated steam engines. Mostly, though, no engineer could ever have made a practical, economic steam engine if all he had to work with were outrageously expensive metals of erratic, uncertain physical properties.
My father’s comments might be interpreted as suggesting that order(s)-of-magnitude cost reduction can be as transformative as anything else.
“slave labor was never cheap”
Has the question of the profitability and sustainability of slavery been conclusively resolved?
See
The Economics of Slavery in the Ante Bellum South. Alfred H. Conrad; John R. Meyer. The Journal of Political Economy, Vol. 66, No. 2 (Apr., 1958), 95-130.
Jefferson, with hands-on experience, calculated the profit accruing from his slaveholding estate at four percent annually from childbirths alone.
http://www.smithsonianmag.com/history/the-dark-side-of-thomas-jefferson-35976004/?no-ist=&preview=_p&page=3
There is a difference between seeing the opportunity cost of slavery when many “energy slaves” are available to us, and claiming that slavery was not profitable – and would not become profitable again if we fail to solve the issue of energy supply. After all, slavery was sustained for thousands of years of human history.
You might find this interesting:
http://hereandnow.wbur.org/2014/11/19/slavery-economy-baptist
One aspect of the post-carbon future that Greer, Kunstler et al. have yet to discuss is whether or not the lack of “energy slaves” for many of us makes a return to human slaves for a few of us a profitable proposition again.
I’ve got nothing useful to add that has not already been said above.
I do want to say how refreshing and pleasant it is to find an intelligent post with civil intelligent responses, even when there is disagreement. Nice community here. I wish Tom would write a little more often.
Tom,
There is at least some academic work backing up this argument.
See: http://onlinelibrary.wiley.com/doi/10.1002/sres.1057/abstract
Are you familiar with the work of John Gray? No no, not the love guru, but the political philosopher from the UK?
He’s been eviscerating the progress myth for some time now.
PS – I should like to say that many social innovations we are beginning to take for granted now aren’t exactly new: many of our ancient ancestors lived before the invention of race, and many of them would not have batted an eye at a woman on the front lines or in a position of power. Ancient Kemet had more female rulers than the US, Canada, and UK combined.
It just feels like we’re further along because we took a good dozen steps backwards after Christianity escaped the Middle East.
I think you are looking at too small a time frame. Let’s do 100 years: 1850, 1950, and 2050. Obviously we can’t say exactly what 2050 will be like, but we have a very good idea if we extrapolate certain trends. We should be on Mars by then; in fact, we are making serious plans for exploring and colonizing Mars now. I personally don’t think we will have any legitimate colonies on Mars for a long time, but I’m certain that we will at least walk on Mars by 2050. We have already landed on the Moon since 1950. Coal as an energy source will be nearly obsolete. I know everyone always jokes about fusion, but with several recent advances, and with experimental fusion reactors already being built and coming online sometime in the next 15 years or so, it is safe to say that the joke is almost over.
Look at how convenient life is in the developed world now compared to 1950; it will only get easier as time goes by. You could also compare how wars were fought in 1950 and now: they would be amazed at the difference, and it will only continue to change. Laser weapons and rail guns are practically a reality in the military now; robots and unmanned aircraft are everywhere on the battlefield, and soldiers can video chat with their families over Skype from the battlefield. We already have access to massive amounts of information thanks to the Internet, and computers exist that are capable of having a conversation with anyone. An emergency trip to the hospital today involves all sorts of technology that didn’t exist in 1950; I wouldn’t be surprised if by 2050 the X-ray were considered archaic.
I’m aware there is a lot of debate about how far we can take computer technology given the limitations of our current techniques, but I’m also aware how many billions of dollars we are spending to develop and refine alternative techniques. By 2050 the average desktop will be almost as smart (as far as processing is concerned) as all of humanity, if the current trends are able to continue; the next decade will indicate where we are headed with that. If trends do continue, then computers that powerful will be able to replace almost every job on earth. We will have near-perfect weather mapping capabilities with meter-level forecasts. Currency as we know it might not survive. We also already have companies trying to get into the business of asteroid mining, and in another three decades I’m sure they will be raking in profits faster than they can spend them. I think the differences will be almost endless even if computers don’t advance as much as we expect. At the same time, these differences won’t be at the front and center of our everyday lives but in the background, where we hardly notice them. In my opinion the invisible technology will be the biggest miracle of all.
Your certainty about the future frightens me. The whole point is to challenge our instinct to extrapolate (sometimes meaninglessly) by holding the conversation to known periods, so that extrapolation is seen not to work so well. Certainty and unchallenged assumptions can become a serious threat if they placate us away from taking real challenges seriously.
I used very cautious and uncertain words in my comment. Most of what I spoke of was how we are in the infancy of something much greater, with everything that is already in the works. The only thing I am truly certain of is that technology won’t be as omnipresent in appearance as it is today. I think it will fade into the background of day-to-day life for most things. If that is the case, it would be hard to impress someone from the past, because they wouldn’t be able to truly see all of the technology around them without going out of their way to experience it (battlefields, hospitals, extreme environments). My biggest worry personally is that living the easy life that technology has promised us will be too convenient, and we will begin to lose our collective desire to advance and understand everything around us.
I was just responding to the actual words: certain trends; certain to walk on Mars; safe to say; it will; etc. I’m highly sensitive to misplaced certainty, and your post raised some flags. Granted, you also couch many of your statements with qualifiers, and those never set me off.
I understand what you are saying. I did my best to speak with certainty only on the few things that I feel are absolutely certain. I also understand that feelings don’t really have a place in a discussion like this, so I’m sure I could have done better. If fusion and a Mars walk aren’t absolute certainties in the next 35 years, then I will be very concerned: to me it would say that we have quit moving forward, because the necessary technologies are almost all in place for both of those to happen. I honestly don’t have high hopes that we will do anything productive on Mars before I’m dead, but I think Mars will soon become like the race to the Moon: everyone will get very excited, but when we finally get there, we won’t care anymore. As for fusion, I’ve been reading about some very necessary and interesting advances that would allow a fully functioning fusion plant to be up and running in under a decade if someone were just willing to pay the bill. The problem with geniuses is that they tend not to be billionaires. I have a hard time believing that things are slowing down; I follow science and technology very closely. We still have so many amazing ideas on the drawing board, and we are still learning at a rapid pace how our universe works so that we can bring those ideas to fruition. I hope that the excitement we have for research and invention lives on for centuries, because I think it will spell the beginning of our end when that excitement finally vanishes.
All of this is reasonable, and may be how things go. However, the cost of a Mars “race” may prove to be higher than our society is willing to prioritize. This will be especially true if resource limitations begin to make ordinary things more expensive. So I can’t be as optimistic in light of possible foundational shifts. The fact that no one wants to foot the bill for a fusion reactor is also a warning: this is the most expensive form of electricity presently imaginable. So what if the fuel is free/unlimited: can we afford the machine?
“i think mars will soon become like the race to the moon. everyone will get very excited but when we finally get there we wont care anymore”
If we can predict that a Mars race will be unproductive and end in not caring, just like the Moon race, why should we bother entering the race in the first place? Might as well start not caring right now and save the trillion dollars. It’s precisely our confidence that we could land a person on Mars[1] if we threw enough money at it that makes it uninteresting as an endeavor in its own right. (Getting more science out of it would be another matter, but (a) no one spends that much on science and (b) while a geologist beats a robot, it’s not clear a geologist beats a geologist’s price tag of robots.)
[1] The really tricky bit seems to be getting them back off of Mars.
Regarding solar-to-liquid fuels: some years ago I came across http://www.jouleunlimited.com/ (http://www.jouleunlimited.com/how-it-works); the ex-CEO of Total now heads them. Basically they have genetically engineered cyanobacteria that take sunlight, water (which can be brackish), and CO2 to produce liquid fuels nearly directly (the directness comes from their genetic manipulation, which seems to be their primary innovation). This all sounds great, but what they don’t highlight is that the glamorous fuel-per-acre number they claim seems to assume a concentrated stream of CO2, because in the how-it-works, a picture shows “industrial emitter or pipeline”. They are probably very expensive (very low or negative net energy) if they use dilute atmospheric CO2: either they would have to burn energy to concentrate the CO2, or their rate of fuel production would drop because of the extra time required, effectively lowering solar utilization or efficiency. So if my understanding is correct, viewed as a whole they are basically trying to improve fossil fuel net energy using extra land near fossil-fuel-burning plants. But they market it as sustainable energy.
The entire technological endeavor has evolved to drain various large batteries in the environment: large herbivores, soils, forests, oil, natural gas, coal, uranium, and others. Most of what you experience is just complexity fluff made from all of the excess net energy. None of this is progress; it can be more clearly seen as an episode, an event, a phenomenon which will not cycle, unlike the complexity in the ecosystem. I see it as a complex cancer, although there are differences between metazoan renegade cells and what you participate in. I’ve provided some explanation at http://www.megacancer.com. Check Dual Systems Theory in the August archives.
If the history of life on earth were divided into ninety-year lifespans and each lifespan were represented by a sheet of copy paper, the reams of paper would run along a highway for about four miles. Industrial civilization comprises the last two sheets of paper, although we wiped out most of the large herbivores, destroyed soils and forests, and killed most of our competition in the last thirty sheets or so. This sudden, and soon self-limiting, technological evolution is not progress.
You might enjoy this:
Human domination of the biosphere: Rapid discharge of the earth-space battery foretells the future of humankind
Earth is a chemical battery where, over evolutionary time with a trickle-charge of photosynthesis using solar energy, billions of tons of living biomass were stored in forests and other ecosystems and in vast reserves of fossil fuels. In just the last few hundred years, humans extracted exploitable energy from these living and fossilized biomass fuels to build the modern industrial-technological-informational economy, to grow our population to more than 7 billion, and to transform the biogeochemical cycles and biodiversity of the earth. This rapid discharge of the earth’s store of organic energy fuels the human domination of the biosphere [..] The laws of thermodynamics governing the trickle-charge and rapid discharge of the earth’s battery are universal and absolute; the earth is only temporarily poised a quantifiable distance from the thermodynamic equilibrium of outer space.
https://collapseofindustrialcivilization.files.wordpress.com/2015/07/pnas-2015-schramski-1508353112.pdf
Hi Tom,
I have a question. Is there any likelihood you will be updating any of your older articles anytime soon to take into account the drop in the costs of renewable energy in the last four or five years?
The energy trap might not be quite so deep as it looked back then. For what it’s worth, a country with a flat-to-declining population can probably deal with a two percent annual drop in fossil fuel supplies without lowering economic productivity and living standards, by increasing energy efficiency and implementing conservation measures that ought not be TOO painful.
A per capita or per household tax on electricity consumption, for instance, could work wonders by encouraging people to upgrade appliances, install more insulation, and simply adjust thermostats a little.
Love your analysis. I’ve been following you for years. I understand where you are coming from in this piece.
I remember reading a sci-fi story in which there were a few high priests of technology, and many, many uneducated and lowly people who struggled to cope with anything. I see that today in the international divisions between societies and the prominence of mediaeval religions in current affairs. My view is that we need breakthroughs in the social sciences or political science more than better GPS, chips, or phones. I can’t see that happening, based on their track record.
I’ve upgraded my phone to iOS 9. Meanwhile, the slaughter of innocents continues in the Middle East, as it has for centuries, and is coming ever closer to home as a wave of fundamentalism.
Where does that leave us? A cure for cancer would be a fabulous and historic achievement. However, a cure for political/religious fundamentalism would positively impact more people’s lives.
The physical, biological, and medical sciences may feed/power/cure us. But it will be the other sciences that save us from ourselves. So, beyond inventing “cognitive dissonance”, are they stepping up to the plate?
Tom,
I mostly agree with you in this case. I’ve studied the history of the industrial revolution and I think that the period of 1850-1920 had the most rapid technological progress ever. In 1850, most people in the US were subsistence farmers whose lifestyles weren’t drastically different from millennia earlier. People in 1850s US and Europe had no electricity, no medicine (other than opium), no running water, no factories, no mass manufacture, no cars or trucks, and nothing else we associate with contemporary civilization. By 1920, on the other hand, the US was almost a first world country, with an electricity grid, a rail network covering the entire country, many people driving around in cars, factories, mass manufacture, ocean liners for travel, airplanes, and so on. The era of 1850-1920 had the most rapid progress the world has ever seen or probably will ever see again.
That said, I think you are understating two recent technological developments. First, computing technology has undergone sustained dramatic improvements over the last few decades, and we all take it for granted now (that’s how I’m responding to you). Modern computer technology would be baffling to people from the 1950s. You claim that 1950s people would see modern computers as “interactive televisions”, but that would just mean they didn’t understand what they were looking at. Computers are not refined televisions, and if people from the 1950s saw them as such, then it would mean that those people had no framework for understanding what they were seeing.
I also think you are seriously underestimating the recent advancements in solar photovoltaics. You say that “Solar, wind … all were demonstrated technologies before”. However, solar PV is 1000x less expensive than it was back then. Even when I was a child, in the 1970s, the best solar could do for the average person was power a desktop calculator at 0.01 watts. Now there have been massive improvements (costs have dropped by 80% in five years), and there are all kinds of promising and rapidly developing new technologies: various kinds of multijunction cells, perovskites, organic cells, dye-sensitized cells, quantum dot cells, the Siemens process for solar-grade silicon, and many others. When it comes to solar PV, we are living through a Renaissance right now. Even ten years ago, it would have been absurd to generate much grid electricity from solar PV. Now it has actually happened in some countries, and is happening in many others.
-Tom S
I also have one other response to your article. You rightly point out that almost everything we take for granted now was invented decades earlier. However, the refinement of an invention can be far more important than the invention itself. This point was made by some other commenters, but I think I can add some historical context here.
Let’s take the steam engine as an example. The Newcomen steam engine was invented in 1712 and was deployed at hundreds of sites in Great Britain. Those early steam engines were so inefficient (1% efficiency), and produced so little power for their weight, that they were useless for anything other than pumping water out of mines. They did not change society much. Furthermore, no technological progress was made on steam engines for approximately 60 years after that.
Then James Watt came up with the idea of the separate condenser. His refinement was followed by a series of other dramatic improvements. From 1775 to 1849, there was the Watt engine, followed by the compound engine, the Trevithick engine, the Corliss engine, and others. The efficiency of steam engines improved dramatically, from 1% to about 10%. All of a sudden (historically speaking), the steam engine was a practical invention. You could put steam engines in ships, in locomotives, and in factories. Steam engines started showing up everywhere, and they completely transformed society. For the first sixty years, steam engines were nothing important, but then they changed everything. It was the refinement, not the invention, which was important.
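To put that refinement in fuel terms, here is a quick sketch using the two efficiency figures from the paragraph above (the coal energy density is an assumed round number, not from the comment):

# Coal at ~30 MJ/kg (assumed round figure); one horsepower-hour is ~2.68 MJ.
coal_mj_per_kg, hp_hour_mj = 30.0, 2.68
for eff in (0.01, 0.10):   # Newcomen-era vs. mid-19th-century efficiency
    kg = hp_hour_mj / (coal_mj_per_kg * eff)
    print(f"{eff:.0%} efficient: {kg:.2f} kg of coal per horsepower-hour")
# ~9 kg of coal per horsepower-hour drops to ~0.9 kg: the difference between
# an engine that only pays at the pit head and one you can put on wheels.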
It is also worth pointing out that steam engines had been invented and forgotten about repeatedly in ancient times. Steam engines were invented in ancient Greece, again in ancient Rome, again in the golden age in the middle east, again in Renaissance Italy, and possibly other times. Without the subsequent refinements, the invention was so inefficient that it meant nothing and was forgotten.
A similar situation exists with computer technologies today. It was the refinement, not the invention, which was important. You can point out that the transistor was invented in 1947, but how much use was it back then? A single transistor weighed half a pound. It was a replacement for vacuum tubes in radios.
Inventions often seem unimportant at first. Artificial photosynthesis was invented in the late 1960s, and so far it’s had little effect. Then again, the steam engine had little effect for the first 60 years of its existence.
-Tom S
Useful invention timelines can vary a lot. Thousands of years from Hero’s “steam engine” to the metallurgy, thermodynamics, and rubber (from the New World!) needed to make something useful. Even decades on the more recent buildup, as you say. A couple of decades for photography to get useful. I think the basic physics of stellar fusion was worked out in the 1930s, maybe even the 1920s, and we still don’t have fusion power.
OTOH, 13 years from the first successful airplane to the first aerial dogfight in WWI. 7 years from discovering fission to the fission bomb, 18 to commercial fission power. 7 years from electric telegraph to telegraph wires sprouting everywhere that could afford them.
I guess we can say that making something useful can take a long time; once it is useful, it often reaches its potential very swiftly.
But I still think there’s lots of historically recent progress that followed pretty swiftly from the basic science enabling it. Electricity -> telegraph, electric power, radio, etc. Thermodynamics -> better steam engines, IC engines, gas turbines. The periodic table and 19th century chemistry -> plastics and chemical engineering. Penicillin -> the antibiotics industry. (Speaking of resources we might be mining out.) 1850-1920 saw lots of figuring out how the world works, and we put that to work in short order. There’s been a lot less fundamental discovery since then.
In my father’s lifetime (75 years and still counting) he has gone from no electricity (only Coleman white-gas table lamps), to a small Jacobs wind charger (for a few lights and the radio), to being hooked into the grid, and now finally back to where he feels he needs to ration his electricity use again for the good of the planet! One lifetime! We didn’t ride that electric wave very long, did we? Scares me.
Interesting, but I think you’re dismissing the “interactive televisions” a little too easily.
1950-to-present handily includes the entire history of the personal computer. Key word “PERSONAL”, although it’s also just two years short of including all stored-program machines.
One odd thing I’ve noticed: Greer points out that collapse is invisible to the layman because it is so slow, and posits 1973 as the point where our collapse started. In 1994, John Ralston Saul also singled out the same year, pointing out “By the early nineties, it was received wisdom to think of the eighties as an unhealthy anomaly. Yet there was no tendency to draw the logical conclusion. If the eighties did not constitute a recovery, then we are still in the crisis of the seventies.”
But 1973 is also pretty close to the starting gun of personal computing. It’s as if some Lesser God of Civilization, able to inspire individual humans but not revise the laws of physics or conjure more fuel, realized that a little extra gift would be needed to keep the kids of the seventies and beyond loyal to his program through the rocky road ahead….
Personal computers are a special invention because they are re-programmable to do new things. They’re like wishing for more wishes.
You could say that since we’ve been thwarted in our hopes to expand into space, the only remaining theoretical frontier, we have instead begun exploring a virtual frontier through our computers. The problem, though, is that while this sates our psychological hunger for a frontier better than the real thing, we can’t bring any resources back from these limitless platonic realms.
I’ll leave with one thought experiment to show how I think: Imagine a computer enthusiast of the seventies, with an Apple II and relatively good games for it, and an intellectual understanding of why those particular games had to be so simple. Now suppose a demon shows him computer games of our time, such as Minecraft, and then asks him to choose between a future where such games are available to him but we never again send a human to the Moon or further, or a future where space is practical but computer games quickly stop getting better. What would he choose?
Hi Michael,
“Greer points out that collapse is invisible to the layman because it is so slow, and posits 1973 as the point where our collapse started.”
In my opinion, what Greer has written is unfalsifiable pseudoscience. Furthermore, what he has written is extremely typical for pseudoscience. He is just making unfalsifiable assertions in order to avoid refutation. Greer is just employing the usual tactics of doomsday groups which have failed.
Greer’s theories are not even wrong. His theories are simply invalid; they are not legitimate scientific theories. They lack any criterion of falsifiability, and so do not meet the minimal criteria of valid scientific theories.
Greer has been around for a while, and he’s been following collapse groups since the early 1980s. When collapse didn’t happen, he responded by making his theory less and less falsifiable, until he removed all hint of testability, at which point his theory is simply invalid. He claims that collapse really DID start happening back in the 1970s, but that collapse is indistinguishable from not collapsing. That is just meaningless pseudoscience, because the “predictions” of his theory are now so totally vague that even robust worldwide economic growth (which actually happened) is compatible with “collapse”.
A similar tactic was used recently by a religious figure named Harold Camping. He bought space on hundreds of billboards and claimed that his calculations from the book of Revelation implied that the world would end on a certain day, a few years back. When it didn’t happen, he claimed that the world HAD ACTUALLY ENDED on that day, and that we were all now living in an illusion which exactly resembled the real world. In other words, the world ending looks exactly like the world not ending. Camping dealt with the failure of his predictions by making his theory totally unfalsifiable. In my opinion, that is exactly the tactic Greer is using, and it means that Greer is engaging in typical unfalsifiable pseudoscience and his theories are invalid (not even wrong, but invalid).
-Tom S
Sigh, I wait for a response, and get one that only talks about the part of my post that is the least interesting to me.
Greer is still falsifiable; he’s just pointing out that one kind of evidence for non-collapse isn’t valid. And while I don’t fully sign on to anyone’s collapse theory as certain, it is a scary world out there, and I think his point that collapse can happen while you’re not looking is well taken.
The best way to falsify Greer would be to demonstrate that energy prices have returned to sixties levels and seem stable there (and not due to some other breakdown that locks most customers willing to pay that price out of the market). They… haven’t.
Really? Unfalsifiable theories = opinions. Everyone’s entitled to those, including JMG.
Is anyone seriously arguing that humankind is not heading into a future where energy from fossil fuels will no longer be available to it? Of course, even that is an unfalsifiable hypothesis absent time travel.
Equally, from the standpoint of September 25, 2015, we can only take educated guesses at what that future will look like and how the process of getting there will take place or even how long it will take. JMG is evidently of the opinion that it will conform to what he describes as ‘deindustrialisation’ and that it will proceed stepwise over a very long time.
After all, either we (meaning global humanity today) are the ultimate civilisation or we’re not. If not, then it will eventually go through a process of disintegration/large-scale rearrangement/collapse/rapid evolution (take your pick of phraseology – they all amount to the same thing).
If it IS the ultimate civilisation, well, wow. Although in a fight between an abstraction and a truckload of coal, you’ve gotta bet on the coal.
BTW. Have you read any of JMG’s recent ‘retrotopia’ posts? ‘Quaint Max’.
I mostly agree. However, I think 1880-1950 was an extraordinarily hyperactive period of invention. You can link this to the industrial revolution, but I link it to the three most destructive wars ever fought by humanity (Franco-Prussian, WWI, and W²I²). Throw in the early Cold War (or do you think we landed on the moon for “science reasons”?) and you start to see what I mean. We invented stuff because we would have died otherwise. Ever it is with progress, both evolutionary and man-made.
On a related note, the Principle of Sufficiency: no more progress will be made if the current state is sufficient.
Take another historical period as an example: the Hundred Years War (1337-1453). A war so bad, they apparently killed mathematics. (Groan!) The first practical matchlock muskets were invented in this period, despite gunpowder having been known for centuries. After a few post-war improvements, replacing match cord with flint and a strike plate, the flintlock musket (circa 1515) was the pinnacle of small-arms technology for 300 years! The percussion cap was invented in 1808.
Take a musketeer from Milan and drop him at the Alamo, and he will understand perfectly the operation of Davy Crockett’s Kentucky Rifle…
“So what if the fuel is free/unlimited: can we afford the machine?”
One could make the case that fission technology has already been a test of this trade-off. Well said – I’d like to paint this with a broader brush [cut as desired, hope you enjoy].
I’d like to propose that this affordability issue is one basis of “asymptotes of technology”. There is a decoupling of technology from science: we may understand more and more, but increased understanding does not automatically translate into cost savings or practicality. If the nuclear reactor is an example of implementation cost in application, the SSC and LHC (and ITER) are examples in the domain of research, so even the advance of knowledge is subject to negative feedback from cost and practicality. Science remains possible as long as technology is still able to provide at least a few of the required machines at acceptable cost – acceptable to society.
This disconnect between knowledge and practicality singles out the electromagnetic force: we have no way to control the weak force, and no practical way to operate at the energies of electroweak unification. The unification of the other fundamental forces requires even more energy, and we can practically exploit the strong nuclear force only in very limited special cases (via an exploit called the chain reaction). Gravity we can manipulate only by moving large amounts of mass. Science also provides upper limits on any possible unknown fifth force at conditions compatible with possible (let alone affordable) machinery. These I see as important, fundamental constraints.
Furthermore, I see advances in technology based on the fundamentals of the electromagnetic force as a function of how many degrees of freedom electromagnetic waves/photons have – which are known. Our means of building machines rests on the interaction of electrons and photons. Our means of manipulating large numbers of electrons “for effect” are chemistry and solid-state physics (fusion would be a lot simpler if we could manipulate electrons better, or muons affordably). Hence advances – e.g. abiotic photosynthesis – rely on materials science. But materials science advances, more and more, by increasing complexity, which will increasingly defeat prediction, even using simulation instead of models.
I like to consider genetic/biological technology a massively more complex version of materials science. There is no “genetic engineering” yet – we are barely plumbing the beginnings, with excessive confidence and very limited understanding, all too confident about how far and how fast we will be able to proceed. Protein folding is a good example of the breakdown of prediction and the cost of trial and error. But “progress” is still possible by virtue of the large number of stable and almost-stable elements available for rearranging electrons into ever more sophisticated spatial patterns, at conditions not too incompatible with the environment we (are able to) live in. Surprise discoveries that are both affordable and significant – abiotic photosynthesis again – are still possible (but by no means guaranteed, and more and more without any “proof of concept”).
However, we should have learned enough from centuries of observation and experiment to understand that it is really lucky breaks in the very fabric of our universe that form the backbone of our technology (and even our very existence). How much of it is based on arrangements of carbon? How much of the rest is based on the properties of silicon? Our exploitation of the strong nuclear force rests almost entirely on the properties of uranium. On a related note, how much of our ability to pursue science and technology was made possible only by the presence of fossil fuels, another – much less foundational – accident?
We know the foundations – the fundamental forces, the particles. We don’t understand all (maybe even most) of it, and we know we don’t know all of it – dark matter – but we do know, from collider experiments and from observing past states of the universe in the sky, that there are strict limits on what can affect matter at the energies and densities at which we can afford to build machinery – and on the extent to which anything we do not yet know about could be exploited. Much of what we see are effects at densities and energies we might never be able to produce in a laboratory affordably, or at all (again, fusion is an illustration). So maybe we already know most of the foundations that are technologically relevant at all?
However, carbon-based materials science might go far (e.g. along the example of DNA) toward creating a vastly different world. There is precedent going back a billion years or more. But that precedent also illustrates that (a) “progress” is not a synonym for “change”, and (b) while evolution is largely not a reversible process, it does have convergences and local optima, and it is dominated by a large but limited number of recurring patterns. Worse, the search space might well be prohibitively large, which limits trial-and-error searches for specific “bio”-technology by affordability. Oil or knowledge, what is mined easily is mined first.
The irony is that a lot of Singularity/Cornucopia thinking rests on a profound misunderstanding of a very specific and very well documented example of the dynamics of progress: Moore’s “Law”. As an example of an – extremely profitable – attempt to understand the interplay of fabrication economics and material properties well enough, and early enough, to exploit a unique opportunity (and build Intel), it is a textbook case of a smart way to think about changes in technology.
It is also a textbook case of Kurzweilitis: how empirical observation and reasonable prediction become that kind of “progress” by fiat, leading to unquestioned confidence based on ignorance.
Tom, I wonder whether you would consider the life and death of the Office of Technology Assessment – 1972 to 1995, RIP – a supporting datum for your suggestion that we might well be far along various asymptotes – or far into the spread of the mindset that advances are guaranteed to happen, rather than being the result of labor and effort gated by opportunity. I certainly would consider the DARPA foray into the imagined opportunities of hafnium isomers an example of where willful ignorance leads (JASON is another datum).
I just realized that Prof. Lewis’ research into artificial photosynthesis as Tom Murphy described here:
https://dothemath.ucsd.edu/2011/11/the-biofuel-grind/
is a perfect illustration of the point I am trying to make.
Try every combination of N elements from a subset of M out of the periodic table – and that’s not even brute force, as there is a theory to inform the selection of the subset. Superconductor research sounds very similar at times. This, I think, is the difference between foundational advances – new, relevant physics – and incremental advances: research becomes a breadth-first search guided by increasingly less specific predictions. Once you run out of elements in the periodic table, even the most brute-force search will end without a result; but if we do not know enough to predict this outcome, we have to try anyway.
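A toy Python illustration of why such breadth-first searches get expensive (the subset sizes and the element shortlist are invented for the example, not taken from any real campaign):

from itertools import combinations
from math import comb

# A theory-informed shortlist of M candidate elements, searched N at a time.
M, N = 30, 3
print(comb(M, N))  # 4060 ternary systems, before even varying stoichiometry,
                   # processing, or dopants: the search space multiplies fast.

shortlist = ["Si", "Ga", "As", "Cu", "In", "Se"]  # hypothetical subset
for candidate in combinations(shortlist, 2):
    # a real campaign would synthesize or simulate each candidate here;
    # the point is that this loop, not the theory, dominates the cost
    print(candidate)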
The hafnium isomer tale illustrates that even when we know enough, we’ll waste effort on the wrong objectives for the wrong reasons, too.
There is one scary thought that came to me: maybe, just maybe, we have invented everything there is to invent? Can we find some scientific area that gives us at least a hope of world-changing inventions in the future?
This may be a hint for a new blog post – can you name research or a theory that is being developed now and could potentially bring something big?
Taken literally, that’s silly: new inventions are being made every day, even big important ones, like CRISPR. And there’s at least a factor of a million to go in the energy efficiency of computation. And we know we don’t know all of physics.
But in spirit, that’s kind of what I’ve been saying. The 19th and early 20th centuries saw a wave of fundamental discoveries in physics, chemistry, and biology that isn’t happening any more. And the low-hanging applications of those discoveries got invented pretty quickly. We haven’t run out of refinements or even new applications, but at the level of whole new *categories* of things – like heat engines, plastics, antibiotics, or vaccines – invention has died down.
We also have reason to believe we’re not far off from the physical limits of efficiency in lots of things that aren’t computation. Heat engines are at ~33%; Tom has a post on light suggesting LEDs are within a factor of 6 of physical limits; communication speeds can’t exceed the speed of light; transportation speeds are limited by considerations other than how fast we can push something…
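For the heat-engine case, the headroom is easy to bound in a few lines of Python. The temperatures below are assumed for illustration; only the 33% figure comes from the comment above:

# Carnot limit for a heat engine between a hot source and a cold sink:
# efficiency <= 1 - T_cold / T_hot (temperatures in kelvin).
T_hot, T_cold = 850.0, 300.0   # assumed: steam-plant boiler vs. ambient
carnot = 1 - T_cold / T_hot    # ~0.65
actual = 0.33                  # typical figure cited above
print(f"Carnot limit: {carnot:.0%}; actual: {actual:.0%}; "
      f"remaining headroom: at most {carnot / actual:.1f}x")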
A lot of this is described in TechGnosis, by Erik Davis. Technology has in some ways become a religion, including the blind belief that more technology will always make things better. There are plenty of parallels between Christian beliefs and the predicted coming of the superintelligence that will be our savior, curing diseases, healing the land, and creating abundance. I always found those similarities a bit funny.
A lot of the issues with tech are blamed on the “business-as-usual” attitude of the modern world. The best invention we could have would be for business-as-usual to be replaced with something a lot better.
You’re not exactly wrong here, but you’re not right either.
Yes, a person from 1950 might look at my smartphone and understand it as a sort of miniaturized combination color-TV/walkie-talkie. But the key point is, that person would be wrong. As wrong as looking at a car and thinking of it as a buggy with invisible horses, or looking at a skyscraper and thinking of it as an unusually tall wattle-and-daub hut.
A characteristic of technology growth in the past forty years or so is that we often retain the surface appearances of things (or at least skeuomorphically reference them), so that for example our “phones” still have “area codes” and “phone numbers” which we (sometimes) still have to “dial”, but everything underneath has been ripped out and replaced with unicorn hair and pixie [bleep]. Your area code and phone number are not a pulse-tone signal to an analog switching network causing it to initiate a physical connection over copper wires; they are a domain query allowing two devices to establish a lossy stream of digital packets encoding highly compressed audio, routed via the currently shortest path in a multimodal network. The savviest AT&T engineer from 1970 wouldn’t have the slightest idea what was going on behind the scenes, but could make a phone call without much confusion, and so could anyone else. So if you look at surfaces, it looks like a trivial change.
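The “currently shortest path” bit is literal, by the way: under the hood it is plain graph search. A minimal Python sketch, with an invented toy network standing in for real routing tables (node names and weights are made up for illustration):

import heapq

# Toy network: edge weights stand in for current link cost/latency.
# A real route is recomputed as these weights change.
graph = {
    "phoneA":     {"cell_tower": 2, "wifi_ap": 1},
    "cell_tower": {"backbone": 5},
    "wifi_ap":    {"backbone": 3},
    "backbone":   {"phoneB": 4},
    "phoneB":     {},
}

def dijkstra(graph, start, goal):
    # Classic Dijkstra: always expand the cheapest frontier node next.
    pq, seen = [(0, start, [start])], set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, w in graph[node].items():
            heapq.heappush(pq, (cost + w, nbr, path + [nbr]))
    return None

print(dijkstra(graph, "phoneA", "phoneB"))
# -> (8, ['phoneA', 'wifi_ap', 'backbone', 'phoneB'])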
But the important thing about a technology is not its outward appearance, or its user interface, or even its broad capabilities; it’s the uses people put it to. I mean, “talking pictures” weren’t just silent movies with sound added, and they weren’t canned stage plays either; they were a new thing that went in new directions no one expected.
You can say “pish tush, we already had phones in 1970”, but we also had rolodexes and phone books and hackneyed jokes about parents never being able to use their phones because their teenage daughters were monopolizing it. “We had computers in 1970”, but we also had card catalogs in libraries, and engineers using drafting tools and slide rules, and a stenography pool at the office who used up lots of carbon paper. “We had cars in 1970”, but every car had a glove compartment stuffed with crumpled re-folded road maps. “We had television in 1970”, but we didn’t have incidents of police brutality filmed from six different angles and available for inspection within minutes of the event. This is a massively different world, even if we do still use refrigerators.
I think you contradict your own case. I agree that it’s the uses that really matter, but I think that reduces the importance of what’s under the hood. GPS is cheap and convenient, but what matters is that I be able to get maps easily when I try to get around, so buying local maps from the gas station or bookstore is closer to us than to the world without printed maps. In a city at night, what matters is that there’s lighting, whether gaslight or LEDs. Computer catalogs are neat, but a library with a card catalog is closer to that than it is to a library no one’s bothered to index.
I think the London of 1890s Sherlock Holmes, with city lights, police, trains, sewers, and Underground, is closer to us than it would be to the London of 1810s Austen novels, if she’d actually set any in London. A lot of ‘modernity’ was invented in the Victorian period; we’ve refined the capabilities, but that’s less revolutionary than inventing it in the first place. (Or re-inventing, for sewers.)