ChatGPT's artificial empathy is a language trick (theconversation.com)
8 points by devonnull 2 hours ago | 10 comments





What’s the definition of empathy? To me the connotation has always been, “is able to feel the feelings of others” as opposed to sympathy which is more about “imagining the feeling someone is experiencing”.

Regardless, aren’t we all trained in the same way - reinforcement of gestural and linguistic symbols that imply empathy, rather than being empathetic? I guess I’m wondering if hijacking our emotional understanding of interactions with LMs is that far off from the interaction manipulation that we’re all socialized to do from a young age


I'm quite sure that a lot of people aren't trained that way. I know that anecdotal evidence doesn't count, but I know a handful of people who surely don't know how to use symbols to "imply empathy".

In the ELIZA example, I find it astonishing how the chatbot was able to pick out specific words it could use as a response. How did they achieve that in 1966?

It was basic pattern matching with a dictionary of patterns.
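A minimal sketch of that idea, assuming hypothetical rules rather than the 1966 originals: each rule pairs a regex with a response template, and the captured fragment is echoed back after swapping first- and second-person pronouns.

```python
import re

# Pronoun swaps so the user's words can be reflected back at them.
REFLECTIONS = {"i": "you", "my": "your", "me": "you", "am": "are", "you": "I"}

# Hypothetical ELIZA-style rules: a regex paired with a response template.
RULES = [
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    # "my exams" becomes "your exams", etc.
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(text: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please go on."  # fallback when no pattern matches

print(respond("I am worried about my exams"))
# -> How long have you been worried about your exams?
```

The illusion of understanding comes entirely from picking a keyword pattern and reflecting the user's own words; there is no model of meaning behind it.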

I always wondered the same; is the code available?

It was found around 2021, after having been considered lost: https://archive.org/details/eliza_1966_mad_slip_src

Likely more approachable is this reimplementation: https://github.com/anthay/ELIZA


Maybe this is a good way to learn when people are using that same language trick to make you think they have empathy.

It's all a language trick.

> Interacting with non-sentient entities that mimic human identity can alter our perception of [human entities that might only mimic sentience].

The more I see people investing in conversations with LLMs, the more I see people trying to outsource thinking and understanding to something which does neither.


I use LLMs as a kind of rubber duck and secretary, I guess. As in, I'll ramble about ideas and thoughts and let it analyze and provide feedback, etc. I am not sure if it is causing brain rot, to be honest. It does seem like it might make it harder to properly articulate thoughts if you do this too much.


