Evidence, Please?

I spend a fair bit of time asking myself the question: Am I crazy?

I mean, without really wanting to do so, I seem to have landed on a fringe view within our culture, which is not a comfortable place for me in a social sense. I don’t love it. The easiest—seemingly most likely—explanation for the glaring mismatch is that I’m the one off kilter.

My statement: Modernity (even if defined as starting 10,000 years ago) is a short-lived phase that will self-terminate—likely starting this century.

Common response: That’s crazy. Just look around you! We’ve created a new normal. Humans have transcended the bounds of nature—no longer mere animals. Ingenuity has unlimited potential, and we’re really on our way now. This changes everything, and we will never lose our technological mastery, now that we have found it. Modernity is our destiny—and kind of the whole point of it all.  It’s what makes us truly human.

But let’s look at evidence: like evidence that modernity is a new normal that can go on at least as long as our species is around (relevant timescales are 10^(6±1) years, or a million years plus-or-minus an order of magnitude: anywhere from a hundred thousand to ten million years).

What’s that? Zero evidence? Of course we can’t know. The future is not kind enough to present evidence to the present. Hmmm—maybe that’s because we’re so mean to the future, frantically robbing its lives of Earth’s bounty and biodiversity.

The basic observation that we can receive no evidence from the future cuts both ways, of course. I have no future evidence that modernity will begin shutting down within a century.

However, we are not completely in the dark, here. We know some things (see my previous post on things about which I can be relatively certain).  As obvious illustrations, we can be super-confident that day will follow night in a consistent cycle throughout our lives, that we will each die someday, and that the sun will render Earth uninhabitable on its way to spending its fuel.  In a similar fashion, we can lay claim to a host of other near-certainties even without evidence from the future.


Certainty

Image by Victoria from Pixabay

I struggle to strike a balance between certainty and circumspection. Our culture has a tendency to favor certainty, while one of my favorite and frequent fall-backs—seldom wrong—is: “we don’t know.” Certainty is often the hobgoblin of decontextualized, rigid, (only) logical thinking: an artificial by-product of incomplete mental models. That said, I feel that I can do more than throw up my hands on every issue. I can be fairly certain that I will never perform a standing jump to the moon or breathe underwater (without apparatus) like I often do in dreams.

Thus, I write this post in full appreciation of the red flag around certainty. Yet, in full consideration, I can indeed identify some elements of reality about which I can be fairly certain—to a reasonable degree. At the very least, these things would appear to be consistent with a robust account of how the world appears to work.

I’ll skip an exhaustive list of certainties, and stick with points that have some bearing on the meta-crisis of modernity.  But to illustrate that certainty is not always misplaced, I think most would agree that we can function under the certainty that, in the next billion years, say, gravity won’t turn off; the sun will continue to shine; Earth will keep rotating to produce the familiar day/night pattern; and if I pound my fist on the table, my hand won’t sail through it.  We are justified in “taking these to the bank.”  The items below are not all as iron-clad, but they are helpful in forming a basis.  I have asked myself for each one: “could I be convinced otherwise?”  Generally the answer is “yes, I suppose,” to varying degrees, but some would be a tough pull, requiring solid evidence.  Most of the content is a repackaging of points I have expressed before, but I hope in a useful, consolidated form.

So let’s get to it: here are things I am reasonably (functionally) certain about:


MM #8: Timeline

This is the eighth of 18 installments in the Metastatic Modernity video series (see launch announcement), putting the meta-crisis in perspective as a cancerous disease afflicting humanity and the greater community of life on Earth. This episode provides several ways to develop intuition about the brevity and temporary nature of modernity.

As is the custom for the series, I provide a stand-alone companion piece in written form (not a transcript) so that the key ideas may be absorbed by a different channel. The write-up that follows is arranged according to “chapters” in the video, navigable via links in the YouTube description field.


Outside the Fishbowl

Image by Jazella from Pixabay

One consequence of having developed a perspective on the long-term fate of modernity is a major disconnect when communicating with others. Even among people who have a sense for our predicament, my views often come across as “out there.”

Let me first say that I don’t enjoy it. Having different views than those around me makes me uncomfortable. I was never one to make a point of standing out or of having a contrary opinion for the sport of it (we all know those people). My favorite teams as a kid were the local ones (Falcons, Braves, Mocs), like everyone else around me. I wear blue jeans basically every day, blending into Americana. No tattoos, piercings, or “non-conformist” affectations. It is, in fact, because of my continual discomfort at having stumbled onto a divergent view that I am compelled to write and write and write about it. I feel trapped between what analysis suggests and what almost everyone else around me thinks/assumes. The discomfort means that I keep trying to discover where I’m wrong (my life would be easier!), but the exercise usually just acts to reinforce the unpopular view.

In this post, I want to try to turn the tables: make members of the mainstream feel uncomfortable for a change. It probably won’t work, but I’ll try all the same. I could have titled the post: “No, You’re Crazy.”

My mental image for this post is one of a fishbowl in a vast and varied space devoid of other fishbowls. The fish living in the bowl have each other, the enveloping water, a gravel floor, fake plants, a decorative castle, and manna from heaven morning and night. Concerns of the fish need not, and in a way cannot, extend beyond the boundaries of the bowl. The awkwardness is that the bowl is wildly different from the rest of the space in all directions. It’s the anomaly that the inhabitants deem to be normal. The analogy to ourselves in modernity should be clear…

What happens when the caretaker of the fishbowl disappears: when the food stops coming, and the environment becomes fouled? The artificial context of the bowl ceases to function or even make sense. The best outcome for the fish might be to get back to a pond or stream where they could live within their original context: woven into the web of life, enjoying and contributing to a rich set of “ecosystem services.” But getting there is not easy. Once there, figuring out how to live outside of the dumbed-down artificial construct presents another major challenge. As good as the fish seemed to have it, the fishbowl turns out to have been an unfortunate place to live. I invite you now to re-read this paragraph, substituting modernity for the fishbowl.


The Anthropic Biodiversity Principle

When I proposed ten tenets of a new “religion” around life a few months back, the first tenet on the list said:

The universe is not here for us, or because of us, or designed to lead to us. We are simply here because we can be. It would not be possible for us to find ourselves in a universe in which the rules did not permit our existence.

While simply stated—perhaps to the point of being obvious—it is shorthand for a fundamental principle that has become a wedge issue among professionals who seek to understand the nature of the universe we live in, and the rules by which it operates. In this post, I will elaborate on the meaning and the controversy behind this deceptively simple statement.

The Schism

As a form of entertainment accompanying my journey through astrophysics, I witnessed a schism develop at the deepest roots of physics and cosmology. In brief, many physicists pursue a common quest to elucidate the one logically self-consistent set of rules by which the universe works: a Theory of Everything (ToE), so to speak. In other words, every mystery (why the electron has the mass that it does, why the fundamental forces have the behaviors and relative strengths they do, why we have three generations of quarks and of leptons, and so on) would someday make sense as the only way the universe could have been.

An opposing camp allows that some of these properties may be essentially random and forever defy full understanding. Those in the ToE camp see this attitude as defeatist and point out that holding such a belief might have prevented discovery of the underlying (quantum) order in atomic energy levels, the unification of electricity and magnetism, reduction of a veritable zoo of particles into a small set of quarks, or any number of other discoveries in physics. Having self-identified in the “defeatist” camp, I knew for sure that the purists were just plain wrong about my position stifling curiosity to learn what we could. Our end goals were just different. I was content to describe the amazing universe—figuring out how rather than why it worked—and didn’t need to find a “god” substitute in an ultimate Theory of Everything (a big ToE).

The counter-cultural viewpoint I hold sometimes goes by the name The Anthropic Principle, simply because it acknowledges the fact that we humans are here—so that whatever form physics takes, it is constrained by this simple and incontrovertible observation to produce conditions supporting life. It amounts to a selection effect that would be insane to pretend isn’t manifestly true.

Scientists are perhaps too well trained to remove humans from the “equation,” and I can definitely get behind the spirit of this practice. After all, the history of science has involved one demotion after another for human importance: Earth is not the center of creation; the sun is not the center of the universe (the universe doesn’t even have a center)—or even the center of our galaxy; moreover, our galaxy is not special among the many billions. Ironically, even though the Anthropic moniker seems to attribute special importance to humans, the core idea is actually the opposite, translating to the ultimate cosmological demotion: our universe isn’t even special, just a random instance among myriad possibilities. Yet, I suspect the name itself is a barrier for many scientists, as it seems superficially to describe an idea built around humans—which is a non-starter for many.

I can definitely sympathize with this reaction, as an avowed hater of human supremacy—a sworn enemy of the Human Reich. Don’t get me wrong: I’m not a misanthrope. I love humans, just not all at once on a destructive, self-aggrandizing rampage. Yet for all my loathing of anthropocentrism, I am fond of the Anthropic Principle. What gives?

Basically, I have to ignore the unfortunate label. A rose by any other name is still a rose. I propose using a less problematic name that gets to the same fundamental point: The Biodiversity Principle. I’ll explain what the principle is (by any name), and eventually how it relates to modernity and the meta-crisis as a compatible foundation for long-term sustainability.


A Religion of Life

Image by Karen .t from Pixabay

The following discussion about belief systems may seem out of place coming from a recovering astrophysicist, and perhaps I am as surprised as you are. But my path has taken me to an unexpected place, so that I now think we would be wise to make a radical course change at the deepest level of what we believe.

Why should we consider a major change?

  • Because we don’t know everything, and never can.
  • Because what we do know tells us we’re on the wrong track, initiating a sixth mass extinction—not just from CO2, but from modernity itself.
  • Because we now (collectively) believe in the wrong things, like human supremacy and economics (gross).
  • Because these beliefs are actively hurting the living creatures of the planet, including us.

Science has revealed so much about the origins and rules of the universe, and how life came to be so exquisitely diverse. Let’s tap into what this tells us. Let’s also acknowledge that mysteries will always remain. Rather than continue to be paralyzed in this urgent time by what we don’t yet know, let’s fill in the gaps with belief—or even faith—rooted in the science we already do know. Let’s move beyond the current stories we tell ourselves in modernity, and fashion new ones that move us in a better direction—to the enduring benefit of all life on Earth.

I’m not sure I know how to tell this story, so please bear with me and accept my apologies for a long-ish read. For those who saw last week’s post, this one contains familiar echoes, but represents a fresh approach intended for a more general audience.


My God: It's Evolution!

I never thought it would happen to me, but I’ve had a divine revelation, of sorts. Have you heard the good news?

While being brought up in a Methodist church, and educated for 8 years in Catholic schools (where I went to mass five days a week for the first three years), I abandoned Christianity midway through high school—tentatively traversing an agnostic phase on my way to calling myself an atheist. Now I eat babies and make my clothing out of puppy skins. Just kidding: I am playing off childish myths about atheists, though I now recognize how completely ludicrous and backwards this perception is. Atheists actually eat puppies and use baby skin for clothing. Ah—I can’t stop kidding around.

I’ll skip over all the physics training, astrophysics exposure, outdoor experiences, etc. that contributed to my worldview. Suffice it to say that I found no shortage of phenomena in the world worthy of awe and appreciation. It was all the more amazing to reflect on the simple origins of everything and the emergence of astounding complexity—especially in the spectacle of life. To me, the idea that our biodiverse world rests on a relatively simple set of physical laws makes the outcome FAR more interesting and dazzling than does the comparatively unimaginative invocation of a sentient creator.

The revelation at hand did not arrive all at once. An initial grounding is partly contained in the reading journey I laid out some while back. Most importantly, the writings of Daniel Quinn (who lived for a time in a monastery aiming to be a hard-core Trappist monk) played a major role—recently reinforced by Alex Leff’s excellent podcast treatment of Ishmael. The revelation finally matured in the context of my post from last week on free will, and the illuminating responses it generated.


A Lifetime Ago

Image by brfcs from Pixabay

With a new year having just ticked over, it’s a fitting moment to think about time.  I have often compared modernity to a fireworks show—dazzling, short, then over—and indeed we often celebrate the New Year with a fireworks display.  Perhaps lasting 10 minutes, the display occupies one fifty-thousandth of the year.  This is like the past 50 years of explosive impact relative to the 2.5–3 million years of humans on Earth, or our 10,000-year agricultural period compared to the time since a different explosion: the Cambrian.  Our current ways are indeed as transient as a fireworks show, also marking a sort of culmination of a long era.  But let’s approach temporal perspective from a different angle.
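
For the number-inclined, these ratios are easy to check. Here is a minimal Python sketch; the 2.75-million-year and 540-million-year figures are round midpoints chosen for illustration, not precise dates:

    # Rough check of the fireworks-show ratios (illustrative round numbers).
    MINUTES_PER_YEAR = 365.25 * 24 * 60  # about 525,960

    ratios = {
        "10-minute show / one year": 10 / MINUTES_PER_YEAR,
        "50 years / ~2.75 Myr of humans": 50 / 2.75e6,
        "10 kyr of agriculture / ~540 Myr since the Cambrian": 1e4 / 540e6,
    }

    for label, r in ratios.items():
        print(f"{label}: 1/{1/r:,.0f}")  # all three land near 1/50,000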

Growing up, I thought of World War II as ancient history: long before my time. But now, I have lived more than twice the span that separated WWII from my birth in 1970. Only 25 years elapsed between the end of WWII and 1970, while we’re presently 54 years away from 1970. Twenty-five years, I now realize, is nothing! When I was born, WWII was still fresh in the minds of many who had lived through it.  Indeed, both my grandfathers fought in WWII, carrying the physical and psychological scars to prove it.  To my grandfathers at the time of my birth, WWII seemed like “only yesterday,” as 1999 seems to me now.

A related trick is to keep track of the date that was as distant from your birth as you are today: in other words, the year in which your anti-self, who lives backward in time, would find themselves. For instance, I was born at the beginning of 1970 (which also happens to be the start of Unix time), and therefore find myself about 54 years from my birth date. Fifty-four years prior to my birth date is thus 1916, smack in the middle of the First World War. I can probably expect my backward self to make it past the turn of the century, before the fireworks show of modern life really got underway: before airplanes, for instance.
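
The arithmetic simply reflects the current year through the birth year. A minimal sketch in Python (the function name is my own):

    def anti_self_year(birth_year, current_year):
        """Return the year as far before birth as 'now' is after it."""
        return 2 * birth_year - current_year

    print(anti_self_year(1970, 2024))  # -> 1916, mid-World-War-I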

But the main point of this post is that the past, and all of “history,” isn’t really that far back. We’ll play a game based on the question: who was the oldest person alive when the oldest person today was born, and so on back into the more distant past.
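
To see why such a chain stays short, note that each link spans one record-holder’s entire lifetime. A toy sketch, assuming a record lifespan of roughly 110 years (my assumption, not a figure from the post):

    # Hypothetical: each backward link is one record-holder's lifetime.
    year, links = 2024, 0
    while year > 1500:
        year -= 110  # approximate birth year of the previous record holder
        links += 1
    print(links, year)  # -> 5 1474: five links span half a millennium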


Confessions of a Disillusioned Scientist

AI-generated stranger; I’m not so young/attractive

After a rocket ride through science, I am hanging up the gloves, feeling a little ashamed and embarrassed to have devoted so much of my life to what I now see as a misguided cause that has done more harm than good in this world.

The previous post details my views about the limits of science. In this post, I will focus more on my own reaction as a human participant in the enterprise.

As is so often the case, my trajectory, in hindsight, looks straightforward and linear. Halley’s comet introduced me to the sky in 1985–1986 at age 15–16, quickly leading to my building a 10-inch Newtonian telescope on a German equatorial mount (using plumbing parts from my plumber neighbor). Through this telescope, I saw all nine planets in one night (when there were nine), an individual star (supernova) 36 million light years away, and a quasar 2 billion light years away. I was a physics major at Georgia Tech and spent every other quarter at the Naval Research Lab working on optical communications for space. I had my pick of graduate schools, and chose Caltech for its idyllic setting, its relaxed, collaborative atmosphere, and access to “big glass.” Within a few months of starting, I had gone on observing runs to the venerable Palomar 200-inch telescope and the Caltech Submillimeter Observatory on Mauna Kea. What a dream I was living! Meanwhile, I enjoyed many outdoor adventures with fellow grad students, some of whom have become life-long friends.

I did not expect to stay in academia (the statistics were not encouraging to a middling student), and interviewed at a few “industry” jobs while also dipping a toe into “prize” postdoc fellowships and create-your-own postdoc adventures. I picked one from the “adventure” bucket, to start a lunar laser ranging project as a test of general relativity at the University of Washington. Abandoning my graduate expertise in infrared astronomical instrumentation was risky, but I saw this postdoc as a last hurrah in academia, deciding that I might as well have fun. I loved the people I worked with, and savored my time in Seattle. Unintentionally, this gutsy move looked very attractive to faculty search committees, two of which tracked me down based on the reputation of my graduate work and then put me on their short lists after learning of my new direction. One of these led to a tenure-track job at UCSD starting in 2003, where I kept the pedal to the metal on the lunar ranging project. During a 20-year career there, I was never turned down for funding on my project, hit all the usual promotion steps at the expected times (tenure, then full professor), and felt that I had “made it” by all traditional measures. Having written and reviewed a large number of peer-reviewed papers and served as panel reviewer for NASA and the NSF for far more proposals than I ever wrote, I knew the “game” quite well. I had a versatile set of powerful tools that I could bring to bear on what seemed like almost any problem. Science was, in some ways, the essence of my being, and I found plenty of reward in it—both intrinsically and societally.

So, what happened?


Putting Science in its Place

Photo by Noam R

Although I might be described as a dyed-in-the-wool scientist, I’m about to say some things that are critical of science, which may be upsetting to some. It’s like those warnings on a movie or show: strong language, nudity, smoking, badmouthing science. So, before I lay into it, let me express some appreciation for what science does remarkably well.

  1. Science exemplifies careful observation—isolating confounding factors to focus on a particular interaction.
  2. Science follows a method that suppresses personal attachment to an idea: nature becomes the arbiter of truth.
  3. When it comes to elementary particles and fundamental physics, one can hardly do any better, although even an atomic nucleus is complex enough to defy exact treatment.
  4. Science advances by trying to tear itself apart, so that surviving notions are very strong.
  5. Because of science, we have a decent outline of how cosmology, evolution, and biophysical systems work.

It has its place.

But the very thing that makes science powerful is also its biggest weakness. It relentlessly pushes wrinkles aside, smoothing its zone of interest down to the least complex system one can obtain for study. This is ideal when wanting to observe a Bose-Einstein condensate in isolation, or the genetic mechanism for producing a certain protein. Science also tends to dissect a problem (or literal critter) into the smallest, disembodied pieces—which then have trouble relating back to the whole: the integrated web of relationships between the pieces. Other “ways of knowing” attempt to grapple with the whole, accepting it as it is and not applying reductionist tools.
