if a helmet that absorbs shock better is bigger or heavier, does that have risks of its own? the added bulk and weight increase the risk of spinal cord injuries
if you have better helmets, will people hit with their heads more?
is the lack of safety part of the thrill?
it isn't as easy as it would seem to use technology to make us safer
in the chart on the right of football player weight trends and the force of hits: Morris Badgro was a typical 190 lb. player in 1927; Haloti Ngata was a 335 lb. player in 2006
How big does a risk have to be for the government to
regulate it?
the risk needs to be proven first--but how much harm do you allow to happen while waiting for a higher level of proof?
if the risk cannot be easily evaluated by ordinary
people then there should be regulation
how much does the risk affect others, or just the person who takes the risk?
is it enough to have safety warnings instead of banning something?
are the people taking the risks able to make good
decisions?
Technological Fix: the idea that we can fix something with technology and not have to change our behavior.
Nye considers two kinds of examples, accidents and military
technology. He has a political view you may not agree
with on military technology, but you need to understand the
arguments he is making whether you agree or not.
Is a technology safe? This turns out to be a complicated
question
very complex technologies bring special risk issues: there are so many things that can go wrong that you have to think about risk differently
for many big technologies research goes into trying
to predict the risk of an accident
you can never get to a risk of zero: sometimes risk
reduction is impractical and sometimes less severe risks
are tolerated if the cost of risk reduction would be
greater than the improvement
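The point that less severe risks are tolerated when reduction costs more than it saves can be made concrete with a toy expected-value comparison (all the numbers below are hypothetical, chosen only to illustrate the logic):

```python
# Toy cost-benefit check (all numbers hypothetical): a fix is worth
# doing only if the expected loss it avoids exceeds what it costs.
p_accident = 1e-4          # annual probability of the accident
loss = 50_000_000          # damage if the accident happens ($)
mitigation_cost = 20_000   # annual cost of the fix ($)
new_p_accident = 1e-5      # accident probability after the fix

# Expected dollars of loss avoided per year by the fix
avoided_loss = (p_accident - new_p_accident) * loss

print(f"avoided loss: ${avoided_loss:,.0f}/yr, fix: ${mitigation_cost:,}/yr")
print("worth it?", avoided_loss > mitigation_cost)
```

With these particular numbers the fix avoids about $4,500 of expected loss per year but costs $20,000 per year, so a pure cost-benefit analysis would tolerate the risk--which is exactly the kind of tradeoff the notes describe.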
should a crewed spacecraft be 99.9% safe (likely to
complete a mission without harming the people) or 99.99%
safe?
"Apollo 8 has 5,600,000 parts and 1.5 million
systems, subsystems and assemblies. With 99.9 percent
reliability, we could expect 5,600 defects." Jerome
Lederer, Director of Manned Space Flight Safety
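Lederer's figure is a simple expected-value calculation; a quick sketch using the parts count from the quote (the loop over reliability levels is only illustration):

```python
# Expected number of defective parts among the Apollo 8 parts count
# quoted above: expected defects = parts * per-part failure rate.
parts = 5_600_000

for reliability in (0.999, 0.9999):
    expected_defects = parts * (1 - reliability)
    print(f"at {reliability:.2%} per-part reliability: "
          f"{expected_defects:,.0f} expected defects")
```

Even pushing per-part reliability to 99.99 percent still leaves hundreds of expected defects, which shows why per-part reliability alone can never get a system of this size to zero risk.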
how much are we willing to pay for safety?
reduce risk by redundancy--have duplicate
systems--have greater safety at increased cost
redundancy doesn't work in all situations
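The redundancy tradeoff can be sketched with a basic probability identity, assuming the duplicate systems fail independently--the very assumption that common-cause failures (say, one power outage taking out both duplicates) break, which is one reason redundancy doesn't work in all situations:

```python
# Probability that at least one of n duplicate systems works, if each
# works with probability p and failures are independent.
def redundant_reliability(p: float, n: int) -> float:
    return 1 - (1 - p) ** n

for n in (1, 2, 3):
    print(n, round(redundant_reliability(0.99, n), 6))
```

Under independence, each duplicate multiplies the failure probability by another factor of (1 - p)--but each duplicate also adds cost, which is the safety-versus-cost tradeoff the notes describe.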
that depends: do people choose to take the risk, how trustworthy do we think the technology should be...
what tradeoff do we make between risk and cost?
different kinds of risks are taken more or less
seriously
Why do we react strongly to technological disasters? (maybe
less so than 30 years ago)
we pay a lot more attention to one big accident than
many small ones
our perceptions of safety aren't necessarily accurate
we are more frightened of big accidents--more in
the news
people want to assume the systems we use everyday
are safe
but in many cases necessary maintenance has not
been done
there have been bridge collapses caused by bad design (Tacoma Narrows)
some are caused by shoddy construction
this one was caused by maintenance issues (and weak design)
we wonder where else are there similar risks
they get a lot of news coverage
because whatever trust we feel in our government
tends to carry over particularly to the technologies the
government develops to increase public safety
because we don't like being out of control--if
something is going to go wrong we want to feel we could do
something to prevent the situation or escape.
Therefore our tendency when faced with a disaster is to
find someone to blame
we don't like to think that accidents are
inevitable--any complex system will break down sometimes
because disasters attract readers/viewers so they are
played up by the news media, in oversimplified form
Disasters usually have complex causes
maintenance may intersect with design issues
sometimes what we do to prevent problems in one place
increases problems somewhere else--if you prevent floods
in one part of a river by building levees the water is
channeled downstream and may cause worse flooding there
sometimes the problem is the harmful consequences of
a technology that no one knew about in advance (or the
people who saw the problem developing were ignored)
sometimes accidents are caused by human error (this
is what is usually said about the Three
Mile Island nuclear power plant accident). But
could the system have been better designed to prevent
human error, or the operators better trained?
we had a speaker in spring 2006 who had worked on a
risk study of Oconee nuclear station in the 1970s--he
particularly remembered a valve with a label on the wall
next to it that said "this valve turns the wrong way"
how much can and should we do to save people from their own stupidity?
sometimes accidents happen because there are known risks but they were small enough to have been ignored
the risk of damage from insulation foam breaking loose, which caused the space shuttle Columbia accident, was known, but it was one of a list of over 100 critical risks--it would have been too expensive to fix them all
The case of the Challenger accident: they knew that cold weather could cause components (the solid rocket booster O-rings) to fail and chose to launch anyway
known measurable risks, known unknowns and unknown
unknowns
As our world becomes more
technological, risks increase
the more complex a system is the harder it is to
reduce risks
technological systems tend to become bigger and more
interconnected
we grow more dependent on technology
we can also use technology to reduce risk
we live safer lives
but the safety systems are complicated as well and
we become dependent on them
when we feel safer we take more risks
As technologies become more complex and more interwoven
into large systems it becomes harder to anticipate all
possible accidents.
blackouts are often caused by a small problem in one
place that resonates through the system in unexpected ways
critics of the original plan for the Strategic Defense Initiative (sometimes called Star Wars) saw this as a key issue
the idea was we could develop a system to protect
us from a nuclear missile attack
before that the strategy was "mutually assured
destruction"
to do this well you need to shoot down missiles
from satellites, and you only have a few minutes to do
it
the targeting is going to have to be done by a
computer program
some scientists at the time said it would be
impossible to debug the necessary computer program,
which would involve millions of lines of code and which
could not be fully tested until it was needed
at any given level of technology there are systems
that are so complex that we can't make them reliable
How much is worth spending on safety:
should all cell phone towers be required to have
backup generators?
is it worth increased costs to pay for that?
government would have to require it
you can do more and more--where do you stop?
All of these problems can apply to military technologies, but in addition military technologies have gotten us into the pattern of believing that greater destructive power makes us safer. Nye questions this: the more weapons we have, the safer we are--is this true?
Consider the arms race of the Cold War--let me try to tell
the story more neutrally than Nye does:
the U.S. built missiles armed with nuclear weapons
because they had two advantages over long range
bombers--they were impossible to defend against and they
were cheaper
the Soviets started building their own nuclear
weapons and missiles sooner than we expected
the competition became which country could deliver
the most bombs (one missile carrying many separately
targeted warheads was a key technological innovation)
the official military strategy became mutually assured
destruction--if the other side strikes
first you want to be able to retaliate and cause
unacceptable damage to the other side. Submarine-launched missiles were particularly valuable for this because they were unlikely to be destroyed by a first strike. Also, if the Soviet Union attacked, we wanted to be able to launch our counterattack before their missiles landed
this worked, neither side dared start a nuclear war,
but it was expensive and risky
Reagan's idea of a defense against nuclear missiles,
what became the strategic defense initiative, threatened
to destabilize that--if one side developed a way of
defending against retaliation it would theoretically be in
their interest to launch a first strike immediately while
they had that advantage
the Soviet Union collapsed for political reasons, but
some people believe that not being able to keep up in the
expanded arms race Reagan created contributed to its fall
today nuclear war is still unthinkable and that is
limiting what the US can do to help Ukraine--if we
escalate will the Russians then use nuclear weapons?
People have often made the argument that increasingly
destructive military technology would make war so horrible
that governments would no longer dare start wars
If anything the opposite has happened, at least for small
wars: the U.S. uses technology to reduce the number of
American soldiers killed, which makes war more politically
acceptable
Take this back to risk:
have advances in nuclear weapons increased risk?
not the risk of total war, which remains unthinkable
but we now have weapons that could wipe out life on earth (nuclear winter)
does technology make us too powerful?
our everyday technologies bring us other
kinds of risks