X-Message-Number: 942
Subject: More Luddites
Date: Wed, 01 Jul 92 16:46:28 -0400
From: 


Thomas Donaldson <> says:
>The point to remember, more than any other, is that the workers are also
>the consumers of goods made by these factories. 

As long as humans are making most of the purchasing decisions, I agree
that humans will stay in control of the system.  I doubt the humans
will be making most of the purchasing decisions forever, though.
Corporations can consume just as well as individuals.

>So long as human beings command these tools to be made, they won't
>make them so as to put themselves out of business. Or make THEMSELVES
>obsolete.

This is a tragedy of the commons situation.  There won't be anyone
saying "let's build this new machine to make humans obsolete".
Instead, they'll be saying "let's build this new machine to make a
little bit more profit to make our stockholders happier".  Eventually
"this new machine" will be something that fulfills the role of CEO,
and most of the stockholders will be other corporations with
recently-installed artificial CEOs.  The economy will march happily
onward without us, assuming we can design a non-anthropocentric legal
system.  Assuming the legal system continues to be human-centered, the
humans will become figureheads.  ("Potted plants" was the term I used
in an earlier post.)

And (David Krieger) says:

>Countless times in the past, occupations and
>the detailed technical abilities associated with them were "obsoleted"
>... by technological developments.  

I think we're using different meanings for "abilities".  I'm using a
model where a human uses a little bit of domain knowledge and a large
amount of skill common to all humans to get the job done, and your
model seems to put everything into the domain knowledge category.

>The computer was supposed to put hordes of file clerks and bookkeepers
>out of work -- and it did, but it also created even more jobs, for
>programmers, operators, ... [etc.].

Using "dexterity" as an abbreviation for all the skills people easily
learn but don't need to be explicitly taught, let's go through the
scenario for the file clerk:

The file clerk needs to know a little bit about how the file system
works, and a lot of "dexterity".  Now suppose a machine called a
PICK-EM-UP is invented that has artificial "dexterity".  So the
PICK-EM-UP is slightly modified to become a file clerk, and the file
clerk is displaced.  The clerk retrains and learns to do PICK-EM-UP
repair.  PICK-EM-UP repair requires knowing a little bit about how to
read the repair manual, and a lot of "dexterity".  Of course, by the
time the clerk finishes training, PICK-EM-UP repair is being handled
by PICK-EM-UPs.

Now the clerk needs to try to learn a skill that isn't mostly
"dexterity" before the machines get that skill too.  The invention of
the PICK-EM-UP makes the problem of retraining a displaced worker much
more difficult than it ever has been in the past.  Before this time,
"dexterity" and a little bit of training could make a person
employable.  After this time, lots of training is necessary to make a
person employable.  As the amount of training required increases, the
desirability of instead designing a machine to do the job increases.
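The race described above can be put in rough numbers.  Here is a toy
model (every number and name in it is hypothetical, chosen only to
illustrate the argument): a worker needs time to train up to a skill
level, while machine capability climbs toward that same level.
Retraining only pays off if the worker finishes before the machines
arrive.

```python
def retraining_pays(skill_level, machine_level, machine_growth,
                    training_rate=1.0):
    """Toy model of the retraining race (all quantities hypothetical).

    skill_level    -- skill required for the new job, in arbitrary units
    machine_level  -- current machine capability, same units
    machine_growth -- machine capability gained per year
    training_rate  -- skill a worker gains per year of training

    Returns True if the worker finishes training before machines
    reach the job's skill level.
    """
    years_to_train = skill_level / training_rate
    years_until_automated = (skill_level - machine_level) / machine_growth
    return years_to_train < years_until_automated

# Slow machine progress: a 10-unit skill takes 10 years to learn,
# but machines need 20 years to reach it -- retraining pays.
print(retraining_pays(skill_level=10, machine_level=0, machine_growth=0.5))

# Fast machine progress (the PICK-EM-UP era): machines reach the
# same skill in 5 years, before training finishes -- it doesn't.
print(retraining_pays(skill_level=10, machine_level=0, machine_growth=2.0))
```

The point of the sketch is just that as the target skill level rises,
the training term grows while the automation term shrinks, so the
window in which retraining is worthwhile closes.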

At some point the training ceases to be worthwhile.  There will be an
increasing chunk of the population that just can't keep up.

>The effect of technology is not to eliminate jobs, but to move the
>available jobs to a higher skill level.  

Right!  And eventually the required skill level is not achievable by
mere humans.

Tim

[ I see two issues here for cryonicists: (1) how do we improve our
  chances of being reanimated and (2) what will we do once we are
  reanimated?  The extent of decline in economic value of our (frozen)
  skills as technology advances seems to be the issue that this "Luddites"
  thread is now pursuing.  Suspension contracts, independent funding
  (Reanimation Foundation), and helpful friends and relatives are
  quite important factors, too.  Also, see the "Motivation for Reanimation"
  thread (msgs 86[37] 87[012] 875). - KQB ]
