@clacke
Re. Not anthropomorphizing LLMs
I'm a sucker for this. Thank you for writing about it. I'll apologise to an inanimate object if I walk into it.
Some practical tips I find useful for following this myself:
1. Say "I prompted" rather than "I told" or "I asked".
2. Say the program "output" something rather than that it "replied".
3. I avoid the term "confabulation", since it is itself an anthropomorphization (in reality the program is doing exactly what the user instructed it to do), but if I were compelled to anthropomorphize, I would say "confabulation" rather than "hallucination".
I would be curious to know if you or any other readers have more tips!
The following cartoon is from:
https://www.smbc-comics.com/comic/precise