Re: Cloning and Halacha

Yaakov Menken (menken@torah.org)
Wed, 19 Mar 1997 09:15:24 -0500

I don't see anything terribly wrong with cloning research as currently
charted. I wonder what Rabbi Lau found objectionable. The fact that it
_could_ be abused is not a reason to stop a positive effort that could also
help to provide needed medicines, for example. I think the moral issue is
not the _creation_ of a clone, but how we might treat a human clone
afterwards.

My reason for this speculation is found in the Talmud: when the Sages were
capable of creating something known as "golems," humanoid beings able to
follow commands but not to function as full humans, the distinguishing
factor between a golem, which had no soul, and a human, which did, was that
only the latter could speak. Although to the best of my knowledge a golem
couldn't speak at all, the intent here is to discuss intelligent speech -
even a parrot can repeat sentences. Human beings are different from animals
because of our ability to think, and we express our thoughts by speaking. A
golem didn't qualify as human because it couldn't think as humans do; it
could only follow instructions. {This is not to say, of course, that a
human who can't speak isn't human. We know from the conduct of the majority
that all humans think, and we know from the Torah that all - even
completely incapacitated people (r"l) - have souls.}

Over a thousand years later, the early computer scientist Alan Turing
proposed a nearly identical test for true artificial intelligence - that one
should be able to converse with a computer and have it respond in such a way
that one couldn't tell whether the words at the other end were typed (or
spoken) by a computer or by another human being. Someone told me that a
computer came quite close in a recent challenge, but I don't know any
details of the test that was run.
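(For anyone curious what that test looks like in practice, here is a rough
sketch, in Python, of the blind question-and-answer setup Turing described.
The "machine" and "human" respondents below are stand-ins of my own
invention, not real programs - the point is only the A/B questioning and the
interrogator's guess.)

    import random

    def machine_reply(question):
        # Placeholder "computer" respondent with a canned answer; a real
        # entrant would need to hold up its end of an open conversation.
        return "That is an interesting question."

    def human_reply(question):
        # The human respondent types an answer at the keyboard.
        return input("(human) " + question + " ")

    def imitation_game(questions):
        # The interrogator puts the same questions to two hidden parties,
        # labeled only A and B, then must guess which one is the machine.
        respondents = {"A": machine_reply, "B": human_reply}
        if random.random() < 0.5:  # hide which label is which
            respondents = {"A": human_reply, "B": machine_reply}
        for q in questions:
            print("Q:", q)
            for label, reply in respondents.items():
                print(label + ":", reply(q))
        guess = input("Which one is the machine, A or B? ").strip().upper()
        if respondents.get(guess) is machine_reply:
            print("Correct - the machine was spotted.")
        else:
            print("Fooled - the machine passed this round.")

    imitation_game(["What did you dream about last night?",
                    "Why did that dream matter to you?"])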

Now I'm not saying true AI would be a theological problem, just that these
sources _seem_ to indicate that computer science won't truly arrive at
artificial intelligence. A computer will never have a neshamah [soul], and
apparently you need one to think as we do. Actually, look at computer
science today - is there any facet of this amazing field where we've been
so unsuccessful as with AI? Even computers' failure to recognize continuous
speech comes down to their inability to adequately reason out the context of
a sentence, and thus to determine which of several similar-sounding words
the speaker probably said. In every other area of computer science we seem
to exceed the predictions of scientists and sci-fi writers - whereas with
AI, HAL was supposed to be ready for operation in the early 1990s. Someone
who read the book or saw the movie (2001) can remind us when the first AI
machine was supposed to achieve consciousness. We missed the deadline, that
much is clear.
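(A toy illustration of that speech point, again just a sketch of my own:
picking between similar-sounding words by looking at which neighbors they
have been seen with in a tiny made-up corpus. A real recognizer would need
vastly more context than this.)

    import re
    from collections import Counter

    # Tiny made-up corpus standing in for "knowledge of context"; a real
    # system would use an enormous amount of text, not three lines.
    CORPUS = """
    we rode to the beach and walked along the road
    the knight rode his horse down the old road
    she read the book last night and wrote a letter
    """

    def count_neighbors(corpus):
        # For every word, count which words appear right before or after it.
        words = re.findall(r"[a-z']+", corpus.lower())
        neighbors = {}
        for i, w in enumerate(words):
            ctx = neighbors.setdefault(w, Counter())
            if i > 0:
                ctx[words[i - 1]] += 1
            if i + 1 < len(words):
                ctx[words[i + 1]] += 1
        return neighbors

    def pick_word(candidates, prev_word, next_word, neighbors):
        # Score each similar-sounding candidate by how often it has been
        # seen next to the surrounding words; return the likeliest one.
        def score(w):
            ctx = neighbors.get(w, Counter())
            return ctx[prev_word] + ctx[next_word]
        return max(candidates, key=score)

    neighbors = count_neighbors(CORPUS)
    # The recognizer "hears" something between "rode" and "road"; the
    # surrounding words decide which was probably said.
    print(pick_word(["rode", "road"], "the", "was", neighbors))  # road
    print(pick_word(["rode", "road"], "he", "his", neighbors))   # rode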

Now how does this all relate to cloning? My question is, does a clone get a
neshamah [soul], or not? Perhaps a neshamah is only given to a human
created by G-d in partnership with a father _and_ mother; but perhaps
implantation in a mother's womb is enough. An interesting thought.

In any case, the speech criterion above does mean that if human clones are
actually possible - if a clone would grow up thinking and speaking as we do
- then a clone will have an entirely new neshamah and will be fully human in
every possible respect. Therefore, again, it would not be immoral to create
a clone, but to take out a clone's heart in order to perform a transplant
would be murder, plain and simple. Or so it seems.

--YM