Basically, it is a thing... but it's a thing because people who are already prone to delusion, paranoia, etc., due to existing conditions or circumstances, are now discovering ChatGPT.
It's just a new manifestation of something that's been around for a long time...like when people who suffer from a mental condition think the people in TV shows are talking directly to them and telling them things.
Like... I am perfectly willing to believe that AI doesn't have any safeguards for this kind of thing, because human welfare is likely not the main concern of AI companies to begin with. But I don't think a person with a healthy mindset is going to fall into this kind of thinking.
(I don't have any links to any articles on hand, but this has been talked about in my schizophrenia group as something for us to be aware of, and to caution those who struggle with reality about using ChatGPT/AI bots.)
This is also an issue with unfiction (fictional pieces presented as something real), and it's why there's so much discussion in the community about labelling unfiction as such, and why that labelling is important.
oh geez I hadn't thought of this. multiple close relatives with schizophrenia.
thank you for explicitly saying this, because now I am actually looking at the article (with a grain of salt in mind) and then considering how best to look out for my people