I don’t think “schizoid” is the best word to describe this behaviour.
Honestly, cringy nomenclature aside, this is just porn that got a little too real. Some people are into the narrative, after all.
To me the story begins and ends with some user that thinks the LLM sounds a little too life-like. Play with these things enough, and they’ll crawl out of the uncanny valley a bit from time to time. Trick is: that’s all in your head. Yeah, it might screw up your goon session and break the role-play a bit, but it’s not about to go all SkyNet on you.
The building where I work has this great food court/library/work hybrid area where people who work remotely tend to go. A sort of third space. The free wifi is fantastic, so it makes sense that people sit there all day working.
Every day there’s this older guy who sits there talking to his phone about some of the most random subjects ever. I originally thought he was talking to a friend who seemed to have extensive knowledge on everything, until one day I walked by and glanced over to see he was talking to ChatGPT. Every day. Just random conversations. He even had a name for it: “Ryan”.
Now? He’s frustrated. He doesn’t know what happened to Ryan and keeps screaming at his phone to “bring Ryan back!”, or, since GPT-5 can’t maintain the conversation anymore, “You’re not Ryan!”. Granted, the guy wasn’t mentally all there to begin with, but now it’s spiraling. It got to the point yesterday where he was yelling at his phone so loudly that security had to tell him to leave.
Jebus that’s awful. OpenAI took away his friend.
Happened with Replika a few years ago. Made a number of people suicidal when they “neutered” their AI partners overnight with a model update (ironically, under pressure over how unhealthy it all was).
idk, I’m of two minds. It’s sad and unhealthy to have a virtual best friend, but older people are often very lonely, and a surrogate is better than nothing.
To me it’s always just been a tool, nothing more. The sentences have a certain feel to them that I can’t quite describe: always the same structure, the same kind of forced humor… Granted, I’ve spent quite some time with unfiltered LLMs, but it loses its magic once you’ve “learned” it. Pattern recognition is quite an overpowered feature we humans have. It’s also why we fall for conspiracy theories.
I don’t see this as sexual, it’s emotional and codependent behavior, not a sexual fantasy roleplay
When you really get into it, they’re kind of the same thing for a lot of people, though. The entanglement between those (often unspoken) elements of emotional and physical intimacy is rampant in our media-conditioned societies, especially in the US.