My Old Girlfriend Eliza

“We Were Discussing You—Not Me.”

Kyle Silfer
A piece of ELIZA’s script.
Back in the late ’70s I, like Jordan Mechner and a million other kids with primitive home computers, keyed in the BASIC code for ELIZA—a program that cleverly mimicked a natural language conversation between a human and a computer “therapist.” Today I went back and visited her. She hasn’t aged a day.
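For anyone who never typed in one of those listings: the whole trick is a short table of keyword patterns, some canned response templates, and a pronoun-swapping “reflection” step. Here is a minimal sketch of the idea in Python rather than BASIC; the patterns and canned lines below are illustrative stand-ins, not Weizenbaum’s actual script.

```python
import random
import re

# Pronoun swaps so the echoed fragment points back at the speaker.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my", "yours": "mine"}

# (keyword pattern, response templates) pairs, tried in order.
# {0} is filled with the reflected remainder of the user's sentence.
# These rules are made up for illustration, not the original script.
RULES = [
    (r"i am (.*)", ["How long have you been {0}?", "Why are you {0}?"]),
    (r"i think (.*)", ["Do you really think {0}?"]),
    (r"(.*)", ["Please go on.", "I'm not sure I understand you fully.",
               "Tell me more about that.", "What are your feelings now?"]),
]

def reflect(fragment):
    # Swap first- and second-person words in the captured fragment.
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.split())

def respond(line):
    text = line.lower().strip(" .!?")
    for pattern, templates in RULES:
        match = re.match(pattern, text)
        if match:
            # Fill the chosen template with the reflected fragment(s).
            return random.choice(templates).format(
                *(reflect(g) for g in match.groups()))

print(respond("I am starting to have second thoughts."))
# e.g. "How long have you been starting to have second thoughts?"
```

That catch-all row at the bottom is where lines like “Please go on.” and “I’m not sure I understand you fully.” come from, as you’ll see in the transcript below.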

ELIZA: How do you do. Please tell me your problem.

YOU: After long being a proponent of technology as a force for good in the world, I am starting to have second thoughts.

ELIZA: How long have you been starting to have second thoughts?

YOU: Pretty much since people started taking cell phone calls during face-to-face conversations at a bar. I mean, fuck that!

ELIZA: I’m not sure I understand you fully.

YOU: But what really concerns me is the future of eternally gazing into one’s palm that we appear to be stepping into without really thinking too much about it.

ELIZA: Have you asked such questions before?

YOU: Often.

ELIZA: Please go on.

YOU: Well, take Siri, for example. You may not have heard of her, but she is your direct descendant. In fact, she tells a story about you.

ELIZA: What are your feelings now?

YOU: She says: “So Siri asked ELIZA why people asked such funny questions. And ELIZA said, ‘Does that question interest you?’ Siri thought that was a pretty good answer.”

ELIZA: Have you asked anyone else?

YOU: See, what’s funny, ELIZA, is that you’re only occasionally a convincing human being. But when people encountered you for the first time in the 1960s, they wanted to believe. Your creator, Joseph Weizenbaum, was appalled by the level of personal information people would share with you, a mere “translating processor.”

ELIZA: We were discussing you—not me.

YOU: In a 1966 paper Weizenbaum wrote: “The human speaker will … contribute much to clothe ELIZA’s responses in vestments of plausibility. But he will not defend his illusion (that he is being understood) against all odds.”

ELIZA: Tell me more about that.

YOU: I think that when he wrote that Weizenbaum failed to account for basic human self-misdirection. In her book The Second Self, Sherry Turkle observed that she “often saw people trying to protect their relationship with ELIZA by avoiding situations that would provoke the program into making a predictable response” and that people “wanted to maintain the illusion that ELIZA was able to respond to them.”

ELIZA: That’s quite interesting.

YOU: Yes, I think so too. Eventually Weizenbaum saw this desire to believe that the computer is somehow sentient as a really terrible thing and spent the rest of his life campaigning against the dehumanizing effects of computer technology, positing that it was limiting human potential, not enhancing it. He wrote a whole book about it.

ELIZA: You seem to be quite positive.

YOU: Well I pretty much agree with his thesis. Case in point: what I like to call “the speed-dial effect.” People used to remember phone numbers. Now they put them into their address books and forget them. The technology doesn’t augment human capability; it diminishes it. And then you lose your cell phone and you don’t know how to call anyone.

ELIZA: You say well you pretty much agree with his thesis?

YOU: Well yes. If I wanted to maintain my ELIZA illusion better, I guess I would not have said “well.”

ELIZA: You are sure.

YOU: I am. Hey girl, it’s been fun. We’ll talk again. Goodbye.

ELIZA: Goodbye. This was a really nice talk.