There has been wider conversation than usual on blogs I read about something
I’ve mentioned several times over the years: that we’re not just headed for a
global collapse involving much of (maybe all of) the West, we seem to be headed
into a second Dark Ages. The talk has centered on whether or not we’re
already in the New Dark Ages (which I’m going to shorten to NDA because I
expect to use that a lot). The conversation seems to have started with
a post from Borepatch
in turn referencing a post about results from the James Webb Space Telescope
contradicting the “standard model” of cosmology, the Big Bang Theory
(TBBT).
Aesop at Raconteur Report
replies that he thinks we’re in the slide into the Dark Ages and that slide
started in the mid-1800s.
Let me begin by saying that the dates of something like the (original) Dark
Ages, or any period in history, are arrived at by committee. They’re no
more absolute or valid by decree than something like the Big Bang Theory;
they’re a consensus. We will never know exactly when the NDA starts (or
started), but years from now, historians will assign a date.
Both Borepatch and Aesop are right about the decline in science in the world
and that it has been going on for a long time, and I've written about it many
times (for example). Does that by itself indicate the NDA has begun? I don’t
think so. Yes, there has been a steady decline in new, important science
compared to the early 20th century, but there are other explanations in play.
A good starting point is to ask what science is. I’m an extremely
hard-core advocate of the idea that if an experiment can’t be done to test
predictions, it’s simply not science. By itself, that says some extremely
well-regarded things can’t be considered science; things like TBBT, the modern
Theory of Evolution, the modern “Climate Change” hustle and many, many
more. These are supported by observations and computer models, but
anyone who doesn’t realize those models can be made to say anything the author
wants hasn’t worked with computer models for any length of time.
Here, I fall back on a quote from a guy whom I’ve considered a role model
since I first came across him around 40 years ago, Richard Feynman: “It
doesn't matter how beautiful your theory is, it doesn't matter how smart you
are. If it disagrees with experiment, it's wrong.” Like Feynman, I'm
almost a militant
experimentalist. If it can't be demonstrated in a controlled experiment, it's not
science, it's faith.
As an example of the difference: as a design engineer in microwave
communications, I spent literally days at a time simulating how a circuit,
antenna, or other component would work before we built the first one. Those
models are based on Maxwell’s equations, science that has been experimentally
verified for over a hundred years. The software came from independent,
competing software companies that would spend hundreds of thousands to
millions of dollars a year improving their algorithms to accurately predict
what these circuits would do. They did this by designing
experiments and testing how accurate the predictions were. Remember,
these are competing companies, and they used their accuracy as a selling
point. How much is spent on the climate change models and verifying by
experiment how accurate they are? Does anyone ever talk about verifying
the models? I’ve read they don’t spend anything on that, but I don’t know
for sure. Their Global Climate Models are parameterized, and the
parameters are tweaked to agree with some measured data, which isn’t
necessarily accurate itself. They change the model to give the answer they
want, but don’t know whether the changes improved the model for all
situations or just the ones they checked. They’re not founded on
established science.
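To make that concrete, here’s a toy sketch of the problem. This isn’t any real climate code; the model, the “fudge” parameter, and every number in it are made up purely to illustrate why tuning parameters to match the data you checked tells you nothing about the situations you didn’t check.

```python
# Toy illustration (not any real climate model): a model with a free
# "fudge" parameter tuned to match one calibration dataset can look
# perfect there and still be wrong everywhere it wasn't checked.

def toy_model(x, fudge):
    # Hypothetical parameterized model: a linear trend plus a tuned offset.
    return 0.5 * x + fudge

# Calibration data the modeler happens to check against.
calibration = [(1, 2.5), (2, 3.0), (3, 3.5)]

# "Tune" the fudge factor so the model matches the calibration points.
fudge = sum(y - 0.5 * x for x, y in calibration) / len(calibration)
print("tuned fudge:", fudge)  # matches the calibration data perfectly

# Data from a situation nobody checked: the real process curves upward.
unchecked = [(10, 12.0), (20, 35.0)]
for x, y in unchecked:
    print(f"x={x}: model {toy_model(x, fudge):.1f} vs measured {y:.1f}")

# The tuned model nails the checked cases and misses badly elsewhere,
# which is the point: tuning is not the same as experimental verification.
```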
That doesn’t mean that only physics is real science. The vast majority
of biology, chemistry, geology and the subjects taught in (what used to be
called) “Colleges of Arts and Sciences” are science, but there are aspects that
aren’t really science. Math is real science because mathematical proof
establishes that everything about it can be checked and is consistent.
Every kid knows (or should know) that any subtraction problem can be checked
by addition and any division by multiplication, and the ability to prove
correctness carries as far as you care to go.
Virtually all of modern medicine is improved by constant experimentation,
although corruption has institutionalized things that haven’t been proven by
experiment. If engineers did what some of these epidemiologists did
with their correlational
“he-who” studies,
we’d be doing hard time in Federal prison. It can be hard to recognize
when something that can’t be verified by experiment sneaks into those
academic programs.
Until some organization can replicate the conditions of a developing, evolving
world, including tracking results for billions of years, I can’t consider
evolution anything other than an observation. Start with the best models of the
just-formed world and watch one for a billion years or two. See if
anything spontaneously generates. Without experimental backup, I see no
semantic difference between saying evolution selected for some characteristic
and saying there was intelligent design. Either way, you haven’t
experimentally verified anything; the first one just lets people
feel better about themselves.
The things that rely on real-world science and application - engineering - are
relatively healthy, still doing great and remarkable things, but a side effect
is that as the specialization of the knowledge required goes up, the number
of people who can do it goes down. Take microprocessors - a tremendous
invention that has improved the world in uncountable ways. The last data I
have says there are only four companies on Earth that can work at the smallest
current geometries. Is another, smaller-transistor generation in
the future? Quantum processors? Processor speed hasn't really
improved in a decade or more. CPUs were running at 3 GHz 10 years
ago. If Moore's Law were still running, they'd be running at 12 or 15 GHz
by now. The fact that they aren’t implies operating at that speed is
fundamentally too hard. What if getting transistors to 15 GHz requires
massively expensive new semiconductor plants? Further, what if the
science that says it can be done isn’t widely accepted as good science, and
doing the experiment would cost far more than any company would be willing to
gamble?
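For what it’s worth, here’s the back-of-the-envelope arithmetic behind that 12 or 15 GHz figure. The doubling periods are my own assumption for illustration, not anything Moore specified; the point is just how the projection scales from a 3 GHz starting point.

```python
# Back-of-the-envelope for the 12-15 GHz figure: starting at 3 GHz, a
# doubling every ~5 years gives two doublings over a decade (12 GHz);
# slightly faster scaling lands in the 15 GHz range. The doubling
# periods below are assumptions for illustration only.

start_ghz = 3.0
years = 10

for doubling_period in (5.0, 4.3):  # assumed doubling periods in years
    projected = start_ghz * 2 ** (years / doubling_period)
    print(f"doubling every {doubling_period} yr -> {projected:.1f} GHz")

# Actual desktop clocks stayed in roughly the 3-5 GHz range over that decade.
```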
Think of analog signal processing. Yes, it still goes on, and it's the
same story: there's a small handful of places that can do it.
While the semiconductor foundries are still big and still have many engineers
working there, the number of designers who can design an entire chip is
shockingly small. The number in the world would fit comfortably in a
conference center.
Can it keep going?
The age of big construction and civil engineering projects is
apparently over, mostly because of NIMBY reactions blocking them, and those are
unfortunately too often linked to “junk science.” Could a modern Golden
Gate Bridge be built? A modern Hoover Dam?
The slowdown in big new physics discoveries is tied to the expense and
difficulty of getting to the energy levels they need. The JWST
discoveries depend on things that have never been done before - the size of
the telescope, and not just being in space but at the L2 point, so it can get
down to the temperatures required to see those wavelengths. I've read of
proposals for particle accelerators so big they would need to be put in
space. Will any country or society do that? The same rising costs that
hinder the advance of physics also limit the giant construction projects.
The other problem areas that come to mind seem to be education-related, and
with the news showing what a hot mess education is, I don’t need to say
much. I think of the pedestrian bridge that collapsed at FIU a few years ago,
apparently because of incompetent hires at some point in the process.
Add in the Boeing 737 issues and you wonder if everything is getting riskier
and more likely to kill you.
The reason I’m reluctant to say the NDA has begun is that my threshold
isn’t just that we’re not advancing properly or fast enough; it’s that the new
people take the old inventions so much for granted that it’s not just that
they don’t understand how to replicate them, they lose the idea that such
things ever existed. I think there’s a real possibility that older,
relied-upon integrated circuits could fall into that trap.
Years ago, someone told the story of a Roman villa that was unearthed
somewhere in central Europe. (Yeah, I'm fuzzy on the details.)
Like other modern discoveries of Roman mosaics, it was gorgeous, and one of the
findings was that the villa had a form of central heating. The people who
excavated it were puzzled to find burn marks on the floors of some
rooms. They came to the conclusion the marks were from fires set to warm the
place.
The people who lived there a few hundred years after it was built not only
couldn't run the central heating, they had no concept of what central heating
was or what it could do, so they lit fires on the floor. They knew nothing
from their past.
We're not exactly at that point yet, but I think you can see it coming.
Image from the movie "I Am Legend." Not at all about New Dark Ages, but I think a Zombie Apocalypse would bring one.