ELIZA was a program, John Henry had a soul

Some years ago a Canadian pen pal of mine happened to be coming through San Francisco and I invited her to stay with me – as one does.* Although it did later transpire that she personally knew another Canadian friend that I did know IRL** (not the biggest coincidence; all twelve Canadians know each other personally), I had never met her in person, or even talked to her on the phone, before she showed up on my doorstep and I invited her in and fed her dinner and handed her my spare housekey.

At some point during her stay I mentioned this to my dad, who was baffled and somewhat alarmed. “How do you know she is who she says she is?” he said. And the only way I could explain it was, she is who she says she is simply because she says it. Her identity to me is the things she says. The reason I invited her to stay is because we shared a subcultural affiliation – we liked a lot of the same weirdo indie Canadian hardcore bands – and anyone who could talk about those bands the way she did was a member of the subculture and thus would have been invited to stay on my couch.***

I thought about this when reading a recent Convivial Society newsletter, about the dangers of chatbots. The author, L.M. Sacasas, starts by talking about the ELIZA program, a proto-chatbot which convinced “quite normal people” to act as if they were talking to another human. That phrase is quoted from ELIZA’s creator, computer scientist Joseph Weizenbaum, who wrote that “extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people.”
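(For a sense of just how simple “relatively simple” is: ELIZA-style programs amount to a handful of keyword patterns plus pronoun reflection. The sketch below is illustrative, not Weizenbaum’s actual DOCTOR script — the rules and phrasings are made up.)

```python
import re

# A minimal ELIZA-style responder: keyword rules plus pronoun "reflection"
# (swapping I/you, my/your) so the reply mirrors the speaker's words.
# These few rules are invented for illustration, not Weizenbaum's script.
REFLECTIONS = {"i": "you", "am": "are", "my": "your", "you": "I", "your": "my"}

RULES = [
    (re.compile(r"i am (.*)", re.I), "Why do you say you are {0}?"),
    (re.compile(r"i feel (.*)", re.I), "Tell me more about feeling {0}."),
    (re.compile(r"my (.*)", re.I), "Your {0}?"),
]

def reflect(phrase):
    # Swap first- and second-person words so the echo sounds like a reply.
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in phrase.split())

def respond(line):
    for pattern, template in RULES:
        m = pattern.search(line)
        if m:
            return template.format(reflect(m.group(1)))
    return "Please go on."  # noncommittal default, in the ELIZA spirit

print(respond("I am lonely"))       # -> Why do you say you are lonely?
print(respond("My dog won't eat"))  # -> Your dog won't eat?
```

That’s more or less the whole trick: no understanding, just pattern-matching and a mirror — which makes the “powerful delusional thinking” it induced all the more striking.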

On the one hand, what does it matter if the person on the other end is not real as long as they say the right things? One can still derive benefit from interacting with an illusion – think of novels! Think of cinema! Or think of my inept high school guidance counselor, whose primary utility was providing a hall pass and a free span of time in which to figure one’s own shit out oneself, while talking to his carefully composed I Care face. I would never advocate replacing human high school counselors with chatbots, but if you reframe the question as “isn’t it better for everyone to have someone to talk to, even if that someone isn’t anyone?” …well, isn’t it?

One of Sacasas’ points of warning, of course, is that chatbots don’t have a mechanism which will keep them saying the right things, and that they will tend entropically, as does the internet at large, to droop toward hideous racism, misogyny, incitements to violence, etc. And that’s even before you consider all those who might want to induce them to do that on purpose. This is a nontrivial problem, but it’s only one of the problems here.

The scene is set for this dystopian bullshit not just because we have the technology, but because American capitalism’s version of “care” is extraordinarily expensive and fundamentally uncaring. What’s more of a scam, a chatbot who you know is a chatbot, or medical insurance which charges for mental health coverage but whose baked-in institutional suspicion and chronic under-hiring ensures that almost no one will qualify to see a practitioner or be able to schedule an appointment? As Sacasas writes, “what if the problem was not that normal people became subject to delusional thinking, but that lonely people found it difficult to resist the illusion that they were being heard and attended to with a measure of care and interest?”**** I have no doubt that Kaiser Permanente is exploring the possibilities as we speak.

. . .

* …did?

** …who initially was also a pen pal who I had never met in person but who I invited to stay with me when he came through SF…

*** It was 2005 and things were different? I was different? Also, linguists, does this make “I really like the second Buried Inside record” a speech act?

**** And the shame, as David Bowie says, is on the other side: not on the individual humans who take comfort where they find it, but on the corporations who refuse to provide it in any other form

. . .

reading: um, several psychology textbooks, why do you ask
eating: Sultan Kurgan tofu, a savory and exciting recipe from Caroline Eden’s excellent central Asian travelogue/cookbook Red Sands
looking at: SNOW on my own motherfucking mountains!
listening to: sea chanteys and Bulgarian harvest songs for an upcoming show
making: my own tortillas, and feeling like a genius
