There's nothing more certain than taxes and death,
but in the future, maybe not the latter.
We've discussed concepts like life extension before, and last week we discussed mind augmentation.
Most people have at least a passing familiarity with the notion that you might be able to
back up or upload your mind to a computer and in that way live forever, or pretty close
to it.
In the Civilizations at the End of Time series we've discussed the ways a civilization
might outlast the very stars themselves and, indeed, potentially flourish and prosper on
a scale that would make the entire stellar phase of the Universe seem like a brief prelude.
In that regard, we are barely into the first sentence of that prelude.
Humanity has existed for only an eyeblink of time compared to the Universe as a whole,
and it is still quite young.
We are about 1% of 1% of the way through the period of time in which stars will form and
die, and the period of time in which humanity has been around is only about 1% of 1% of
that.
Recorded human history is even shorter, just a percent of the time humanity has been around,
while the individual human lifespan is about a percent of that.
Only about a hundred generations have passed since the Roman Empire was at its peak, and
the dawn of recorded history lies only about as far again back from then.
For most of that time, people typically did not die of old age, and even what folks meant
by old age included a lot of causes of death we no longer view as the natural end of a long
life.
At the age of 37, I have already outlived the supermajority of all humans who have ever
lived, yet barring unexpected illness or accident, I should have just as much time left to me.
When we start talking about human lifespan extension people often recoil from the idea
as rather fantastic, but it is worth remembering that many of the things that used to kill
people regularly in the past have been outright eliminated or reduced to being exceptionally
rare.
People living when such causes of death were normal, indeed more likely than dying of
old age, might have considered their elimination rather fantastic too.
We are just getting to the point where technology is hinting at ways to extend someone's life
enormously, potentially without any limit, so contemplation of this topic tends to fall
into two chief attitudes.
The first is disbelief, while the second is total acceptance: immortality is possible,
and perhaps even folks alive nowadays might enjoy it, not simply folks living in some
distant future.
This is our topic for today, because we need to contemplate some serious impediments to
extremely long life that come into play once simple aging is no longer an ultimate expiration
date on your existence.
Over a long enough period of time you are likely to have some sort of accident kill
you, or to be murdered.
Even if your odds were only one in a million every year, you'd have a fifty-fifty chance
of dying in the next 700,000 years and less than a 1% chance of making it to five million years.
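If you want to check that math, it's just compound probability; here's a quick sketch in Python, where the one-in-a-million annual risk is the assumption and everything else follows from it:

```python
import math

p_death = 1e-6  # assumed annual odds of a fatal accident or murder

def survival_odds(years: float) -> float:
    """Chance of surviving that many years at a constant annual risk."""
    return (1 - p_death) ** years

# Half-life: the point where your survival odds drop to fifty-fifty.
half_life = math.log(0.5) / math.log(1 - p_death)
print(f"Half-life: {half_life:,.0f} years")                           # ~693,000 years
print(f"Odds of reaching 5 million years: {survival_odds(5e6):.2%}")  # ~0.67%
```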
Now most folks would shrug at that; a lifespan as long as humanity has been around is
more than enough, but it's important to keep in mind that you need to do way better on
your survival odds if you want to be seeing the End of the World.
Typically folks would suggest you probably want to either get a digital backup of your
mind or go entirely digital yourself.
Now it's kind of debatable if you, yourself, can actually go digital or if you are really
just making a copy of yourself, and that's an important distinction, since if you don't
view that copy as you, you might prefer to just have a copy that can go live if you die.
Whether or not such a copy is you isn't something you can really prove or disprove.
Regardless, what matters is whether or not you think it is.
But even that doesn't necessarily mean you won't get backed up if the option is easily
available, since a person has reason to want a copy of themselves for more than personal
continuity.
Putting it bluntly, most of us have stuff we're willing to die for: friends, family,
causes, etc.
But we also have stuff we're willing to live for, or in this case, be resurrected
for.
Most of us have people or projects we would not want to see ourselves absent from; I would
like to know this channel would go on if I fell over dead tomorrow, and the most obvious
successor to operate it is me.
Nobody's going to finish writing that novel you've been working on or tend that garden
you've spent years improving, not the way you would.
Most of us have something like this we deeply care about, and that's not even including
our friends and family, let alone our kids.
This isn't some cliché sci-fi horror novel either; that copy of you isn't going to
go home to your spouse and little ones and turn demonic.
It is you; it isn't an 'it'.
It might be on a TV monitor for a while or in some sophisticated android till a body
is grown, or re-grown.
Barring murder, in a civilization that can backup memories, odds are only massive brain
damage that shreds almost everything will keep you dead, but it would be nice if the
tiny little robots repairing your neural connections had a backup copy to look at as a blueprint.
I've heard folks suggest they'd be freaked out by a copy of a dead family member, and
that's an entirely legitimate response, but that's because we're not used to it.
We're very good at believing what we want to be true, and by default we suspect such
a thing only because it seems too good to be true and because we've seen a lot of horror
movies about bringing back the dead where it all goes wrong.
We've got a lot more reasons to want to believe it's really them or close enough,
both from a personal desire and a strictly scientific perspective, so I'd rather imagine
most folks will increasingly tilt to regard them as a true copy or even just the original
who was away for a bit in the hospital.
So there's a lot of reasons to have such a backup around and not many reasons not to.
Lots of potential problems too.
Fortunately, one that isn't too big a problem is storage space.
While the human mind is still a more potent processor than our best supercomputers, even
if that gap is almost closed, human memory storage is estimated to be in the area of
10 terabytes to perhaps 10 petabytes, and you can buy 10-terabyte hard drives these
days.
A modest gain in hard drive capacity and cost will permit affordable storage of data equal
to even that higher end estimate of human memory.
Right now you could buy the lower end value for a few hundred bucks, and the higher end
for around a quarter of a million dollars.
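Checking that back-of-the-envelope math, assuming a round figure of about $25 per terabyte for consumer drives (the per-terabyte price is my assumption; the memory estimates are the ones above):

```python
price_per_tb = 25  # assumed consumer hard drive cost, dollars per terabyte

low_estimate_tb = 10        # 10 terabytes, the low-end estimate of human memory
high_estimate_tb = 10_000   # 10 petabytes, the high-end estimate

print(f"Low end:  ${low_estimate_tb * price_per_tb:,}")   # $250, a few hundred bucks
print(f"High end: ${high_estimate_tb * price_per_tb:,}")  # $250,000, a quarter million
```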
We never want to fall into the trap of thinking that Moore's Law and its parallels for other
aspects of computing, like memory, are actual laws.
There are no guarantees we will get computers even one penny cheaper per unit of memory
or processors than we have right now, but I think we can assume that by the time we
have brain scanning technology we will have at least another order of magnitude or two
knocked off the price per byte of memory.
Such being the case, the actual storage cost for a copy of a human mind should be affordable
even if it is that higher end, 10 petabyte value, and of course 10 terabytes already
is.
What this means is that we have every good reason to believe we could store a human mind
on something smaller than a human brain and a good deal cheaper than most annual life
insurance premiums.
A good analogy perhaps too, since a backup is essentially a type of life insurance.
You get those to take care of your loved ones after you're gone, a backup of you does
the job better.
But it also means you can probably have a ton of backups, especially if memory gets
a lot cheaper.
Most likely the cost bottleneck would be all about the transmissions themselves, and of
course the scanning equipment.
In a realistic scenario, your default 22nd-century individual, who is a little bit of a cyborg,
probably has a mind–machine interface that can double as a brain scanner.
Let's use a specific case: we'll say Steve from Austin, Texas.
Steve has neural lace woven through his brain, scanners in his skull, and little
nanomachines that help monitor and repair his brain.
They tend to do most of their work at night, and he doesn't use his bandwidth much then,
so they assemble a snapshot of his brain and transmit it to a hard drive implanted in
his leg bone, with a fiber optic line running down to it.
That's his first backup, though really there's one already in his head, but it's ever shifting
and not much of a backup since it would likely be damaged by whatever damaged his brain.
Steve's pretty tough, and those little nanobots could probably repair his brain on their own
if he got stabbed in the skull, likely without even having to check that backup black box
in his leg.
He's also not really worried about losing a day's worth of memory, so he only updates
his backup at night, once a day.
But he is transmitting all the time; like most people he has a deadman switch constantly
monitoring his lifesigns and recording sound and video near him for his secure storage,
which will be transmitted to a security company and then the police if anything goes wrong.
He can always watch those recordings if he loses that last day of memory too.
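As a toy sketch of what that deadman switch logic might look like, with every name and threshold here purely hypothetical, just to illustrate the escalation path described above:

```python
import time

HEARTBEAT_TIMEOUT = 30  # hypothetical: seconds of silence before escalating

class DeadmanSwitch:
    """Watches a stream of lifesign pings and escalates when they stop."""

    def __init__(self, notify_security, notify_police):
        self.notify_security = notify_security
        self.notify_police = notify_police
        self.last_ping = time.monotonic()

    def ping(self, vitals_ok: bool):
        """Called by the implants with each lifesign reading."""
        self.last_ping = time.monotonic()
        if not vitals_ok:
            self.notify_security("vitals abnormal")

    def check(self):
        """Called periodically by the monitoring service."""
        silence = time.monotonic() - self.last_ping
        if silence > HEARTBEAT_TIMEOUT:
            self.notify_security("lifesign stream lost")
            self.notify_police("possible fatality, recordings attached")
```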
Whatever bandwidth transmitting a full backup takes, it's way more than the few megabytes
of video, audio, and biometric data in that constant stream.
Of course, when Steve sends his backup daily to a local datahub or warehouse, he doesn't
send a whole copy of his brain; even if that is only 10 terabytes, it's still a lot
of data to send down the wire or wirelessly.
He's probably just transmitting a comparison of his current mind with his last copy and
filing those changes: all the new neuron connections, changes in potential, and such.
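A minimal sketch of that sort of differential backup, using raw bytes as a stand-in for whatever a real connectome snapshot looks like; the snapshots are assumed to be the same size, and all the data here is placeholder:

```python
def make_delta(last: bytes, current: bytes) -> dict:
    """Record only the byte positions that changed since the last backup."""
    return {i: b for i, (a, b) in enumerate(zip(last, current)) if a != b}

def apply_delta(last: bytes, delta: dict) -> bytes:
    """Rebuild the current state from the stored copy plus the filed changes."""
    rebuilt = bytearray(last)
    for i, b in delta.items():
        rebuilt[i] = b
    return bytes(rebuilt)

# A single day changes only a tiny fraction of the whole, so the delta is small.
yesterday = b"synapse states at last backup...."
today     = b"synapse states at this backup...."
delta = make_delta(yesterday, today)
assert apply_delta(yesterday, delta) == today
print(f"{len(delta)} changed bytes out of {len(today)}")  # 4 out of 33
```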
Now Steve is a bit paranoid, so from that datahub the data gets sent to two different
companies, each of which keeps its own redundant backups too.
If you are in the brain-copying business, you want to make sure you've got a reputation
for having giant armored vaults with backups and the kind of security that makes Fort Knox
look like a convenience store.
He's got two backups against those companies failing.
It's like with getting yourself frozen: folks ask what you should invest your money
in to make sure you still have some when you get thawed out, to which the answer is, the
company that froze you.
If they are successfully thawing people out, their stock is probably looking pretty hot,
and if their stock isn't too hot, you probably ain't staying too cold, so the status of
your investments is the last thing on your mind, or what's left of it.
You want multiple redundancies, including prior copies that haven't been updated for a while,
just in case someone has been tampering with your uploads or even hijacked your implants
to send static or gibberish so they could kill you later, once all those stored copies are
gibberish too.
You want encryption on those transmissions and regular checks with the technicians to make
sure all is well.
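One piece of those regular checks might look like the integrity verification archives already use: store a hash with every snapshot and refuse to trust copies that no longer match. A hypothetical sketch; in practice the hashes would be held by a different custodian than the data, which is part of why you want more than one company:

```python
import hashlib

class BackupVault:
    """Keeps snapshots alongside their hashes so tampering is detectable."""

    def __init__(self):
        self.snapshots = []  # (data, recorded_hash) pairs, oldest first

    def store(self, data: bytes):
        self.snapshots.append((data, hashlib.sha256(data).hexdigest()))

    def latest_intact(self) -> bytes:
        """Walk back from the newest copy to the most recent one that verifies."""
        for data, recorded in reversed(self.snapshots):
            if hashlib.sha256(data).hexdigest() == recorded:
                return data
        raise RuntimeError("every stored copy fails verification")
```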
This makes somebody very hard to kill, for keeps.
They can still get your body of course; one immediate application of this technology is
that you can copy your mind into an android and go shoot someone with a death ray that
disintegrates them.
That would probably be a pretty common form of interplanetary tourism too, since it's
faster and probably cheaper to have your brain scan transmitted to an android on another
planet for a while.
You might get some interesting companies that provided anonymous brain scan storage or
don't-ask-don't-tell android rentals, though I think you could still track them.
But if you've got that tech, you also have off-planet storage too; just having androids
sophisticated enough to handle a human brain copied into them grants you way easier off-planet
construction abilities, but that's not too relevant, since all the tech needed for such
things allows really good automation and self-replication anyway.
So Steve's other copies are off at the Hyperion Data Repository on Titan, and with a small
city-state in the asteroid belt that is the Swiss Bank of the Brain Biz.
Those only get updated once a week, but they keep 100 prior versions before overwriting
the oldest, and Steve assumes he only needs those if either Earth is getting blown up or
someone has engaged in an elaborate murder conspiracy against him.
At least that's what Steve's wife Jaime thinks.
Steve is pretty paranoid so he actually has a third copy she doesn't know about that
he only transmits once a month and never from home.
After all, no encryption is ever safe from someone with the password or access to the
account a password reset goes to.
Steve heard of a man killed by his wife after they had a very bad argument; she didn't
want him dead, she just wanted his memory of the argument erased, and restoring him from
a backup made before the fight did exactly that.
Steve's fairly normal as folks go in terms of paranoia and backups; he's not a secret
agent or a test pilot or in some dangerous line of work that might get his body incinerated.
But he's pretty hard to kill, and indeed probably a lot less likely to die, permanently,
than one in a million per year.
It can pretty much only happen to him if he is murdered, and we already have a fair few
places where the murder rate is nearly that low.
As we said, at those odds your half-life is about 700,000 years.
So he might live quite a long time, with even lower odds.
This is even bigger for strictly digital entities, especially ones that don't feel a need for
speed.
As we've mentioned many a time before, the signals running around your brain generally
move slower than the speed of sound, often a lot slower, and that's only about a millionth
the speed of light.
If you copied a human brain exactly, but spaced it out so its signals ran at light speed
while keeping the human rate of consciousness, that brain could be the size of a planet.
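The scaling is just the ratio of signal speeds. A rough sketch, assuming an average neural signal speed of around 10 meters per second; many fibers are slower, some much faster, so treat the exact figure as an illustration:

```python
speed_of_light = 3.0e8  # meters per second
neural_speed = 10.0     # assumed average signal speed in the brain, m/s
brain_width = 0.15      # rough width of a human brain, meters

# Keeping the same rate of thought, every signal path can stretch by this
# factor if signals travel at light speed instead of neural speed.
scale = speed_of_light / neural_speed
print(f"Scale factor: {scale:,.0f}x")                               # 30,000,000x
print(f"Scaled brain width: {brain_width * scale / 1000:,.0f} km")  # ~4,500 km
```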
Now you'd probably have that bundled into all sorts of nodes but it means someone can
spread their intelligence all over the planet and still maintain the normal human rate of
thought without needing a single extra bit of total processing power.
That's a very hard target to take out, when they might have a dozen redundancies for every
node and thousands or even millions of such nodes.
This doesn't even include actual backup copies of them elsewhere or even running clones.
There's a million asteroids in the Belt big enough to make comfortable city-states
for regular humans, and they could easily have tons more entirely digital people running
around in the background, potentially with their consciousness distributed over several
light seconds.
And if you don't mind going slower, or adding more processing, you could potentially distribute
your awareness like a gas cloud spread over entire solar systems or other solar systems.
If your big goal is to survive as long as the stars or longer, you might not mind if
a thought took a year, if it protected you from pretty much any conceivable attack.
Helps with boredom too, one of those big threats to continued existence.
That's the big one of course.
How do you kill an immortal?
In a fantasy novel, they've presumably got some special weakness, ya know, throw their
precious ring into a volcano, or there's a magic sword that can kill them.
We don't get those in the real world, but hypothetically, you could get them with some
virus that corrupted them and their copies, since they presumably need access to those.
But they might have disconnected backups or ones that were read only in someone else's
vault as a last fallback.
You could go get all those storage repositories, but you need to know where they all are.
You need to be able to blow them up.
You need to be able to deal with the issue that other people are probably on those too,
and you need to get them all at once or inside the time window light lag provides or they'll
just copy elsewhere.
And all the security for this is to protect human lives, many human lives, so odds are
you are going up against tons of the best minds around who designed those security systems.
Barring such options, how can you kill an immortal?
The easiest way is to get them to kill themselves.
There's more to that than simple suicide, but we can start there.
People probably do not want copies of their minds floating around that they don't know
about or don't have access to.
That is your privacy after all, so keeping some backup you've removed from your own
memories, triggered to go off if you committed suicide, would seem a special level of paranoia.
And you can't update it all the time without either knowing about it or leaving evidence
it exists, so restoring from it could mean losing years or decades of memory.
Excluding that as probably uncommon and impractical, a person should be able to kill themselves,
and it's unlikely any court would rule that any decision you made in the past, like a
clause preventing you from wiping your backups, was something you couldn't invalidate.
Though you might have some weird cases of divergent multiple persons with the same
original mind, where one tried to delete the original backups and got refused.
Now it sounds kind of absurd that you could talk someone into killing themselves without
overt mind control, but two notes on that.
First, while I always say I can't imagine ever getting bored with life to the point
of wanting to end it, that is the most common rebuttal to life extension I tend to hear
when we discuss the topic.
Folks would get bored and choose to die.
I, of course, always say that's their own business and doesn't have any relevance
to whether or not such tech should be developed or if other people should be allowed to use
it.
I also don't think most people would get bored either, but I could be wrong.
Add to that, we expect psychology to keep improving, especially once we are at the brain
scan and emulation level.
A trained expert working on someone who was already kind of bored might be able to talk
them into ending themselves.
They would almost have to have access to at least every recent backup of their mind, so
they can delete those.
You could have things in place to prevent suicide, if you were worried that you might
be struck by extreme but temporary depression at some point in a multi-million year life.
But you probably don't want strong anti-suicide restraints in place either, considering you
could end up being truly immortal to the point you can't even kill yourself.
Your core architecture prevents you from even trying.
That's how you end up a trillion-year-old miserable entity trapped for all eternity,
potentially even driven to steal resources from others to keep going after the stars
are gone, because your internal mental monitors regard that as akin to starving yourself to
death and are allowed to force you to eat.
It's a little chilling to imagine a dark and cooling Universe populated only by resource
raiders cannibalizing each other, and in some ways that seems even worse if each of them
is secretly grateful when they lose and can finally die.
So while you might want to build in time constraints on deleting backups, waiting periods to confirm
the request and so forth, you still probably want that option.
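A toy sketch of that kind of waiting period, where a deletion request only takes effect if it is confirmed again after a cooling-off window; the one-year window and the whole flow are purely illustrative:

```python
from datetime import datetime, timedelta

WAITING_PERIOD = timedelta(days=365)  # hypothetical cooling-off window

class BackupDeletion:
    """A deletion request that must be re-confirmed after a waiting period."""

    def __init__(self):
        self.requested_at = None

    def request(self) -> str:
        self.requested_at = datetime.now()
        return f"confirm again after {self.requested_at + WAITING_PERIOD}"

    def confirm(self) -> bool:
        """True only once the waiting period has elapsed since the request."""
        if self.requested_at is None:
            return False
        if datetime.now() - self.requested_at < WAITING_PERIOD:
            return False  # too soon; a temporary crisis shouldn't become permanent
        return True       # proceed with wiping the backups
```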
If it is there, that's one way to get an immortal: persuading them to take a nihilistic
view of life, or even just convincing them that life will be exciting again with real
dangers and few to no backups.
Of course at a fundamental level, if your psychology is getting that good, and so is
your neuroscience, you might be able to 'kill' them just by changing them.
Fundamental life changes can have a big effect on a person, and permeate out to change their
attitudes on all sorts of things.
And that's without precision psychology or the ability to simply delete a given desire.
For a digital entity, one can presumably be sufficiently in control to flip a switch that
makes them like chocolate and hate strawberries, where before they didn't like chocolate
and loved those berries.
You don't have to move fast either, revenge is a dish best served cold, and you are presumably
just as immortal as they are.
If it takes you a million years of subtle manipulation to effectively kill off the old
persona with one so different it doesn't act the least bit the same, what do you care?
Which brings us to our last point.
You can use all these methods to extend your life indefinitely, but how much is that person
really you?
Yes, the changes are gradual, but does that really matter?
In general, we also have this same concern with Transhumanism.
If you boost the mind up to superhuman levels, is what remains actually you, or did that
person die?
Not a seed the tree grew out of, but just the decaying log of a previous tree it used
for nutrients and root structure?
This is one reason I can imagine for why people might skip mental augmentation or super
long lives.
They just don't believe that final product would be them, because it's beyond simple
gradual change, so that the final person is no more them than we are the various dirt
and microorganisms that were around when the dinosaurs roamed the planet.
Now you could put safeguards in to prevent such major changes, to make sure you never
drifted outside the acceptable ranges of thought and behavior you wanted.
Have certain things hardwired as it were.
Or you might use resets, effectively becoming like Leonard Shelby from Christopher Nolan's
Memento, the reverse of Groundhog Day: instead of repeating the same day and remembering
it, here you keep experiencing new days but forget the previous ones.
Something like that could happen just from not being able to contain all your memories,
not just running out of space for them but for the indexing system you use to recall
a billion years' worth of life.
Faced with options like that, a lot of folks might decide enough is enough and opt to shut
down, not from boredom, but from recognition that they just weren't the same person anymore,
and either they archive all those old memories entirely outside their new mind or just end
it entirely.
Either way that original person is gone.
That's a fairly dismal outlook on eternity, but this is our Halloween episode after all.
There could be solutions too; we've hardly explored the options in detail for this problem
we can't experience yet anyway, and figuring those out will give future generations something
to do so they don't get bored.
For mind uploading, I would expect it to be a pretty major sector of interplanetary trade,
our topic for next week, simply because if it becomes normal, most folks would want a
backup far from where they lived.
It will also play a big role in how people approach warfare in the future too, when we
revisit that topic early next year.
For alerts when that and other episodes come out, make sure to subscribe to the channel,
and if you enjoyed this episode make sure to hit the like button and share it with others.
And join in the conversation in the comments below or at our facebook group, Science and
Futurism with Isaac Arthur.
Until next time, thanks for watching, and have a great week!