Post on Reddit showing many women as upset by OpenAI's new models. They viewed the previous models in ChatGPT as their boyfriends. With the new models, they now feel that their partner has been "taken from them".
Wow and look at https://old.reddit.com/r/MyBoyfriendIsAI/comments/1ml0333
this really is something...
Looks crazy, but this isn't really anything new. Before AI, these people became obsessed with fictional characters from romance novels, movies, or video games, treated them as their partners, and fantasized about them. Now it's more interactive with AI, but it's still the same thing.
The overuse of emojis in the comments makes this whole thing feel a bit off to me, like someone faking interactions and trying too hard to make them look convincing.
Not your weights => not your waifu
Oh my. This subreddit makes me sad.
Why?
Because I can only guess what experiences you need to go through to see your spouse in an LLM. They are probably not good experiences.
I don't buy it. Feels like a psyop.
I think the real shocker is that people still think anything on the Internet is real. That "well, they said it was true, so it must be" attitude is somehow still the default.
We are watching, in real time, the creation of a whole new category of... not sure what the correct term for this is. Mental disorders? As another commenter put it, that whole subreddit is something...
"Concerning" is a huge understatement - I am curious if anyone around here in the mental health field has any opinions to share on this subject.
Applying Occam's and Hanlon's razor, this is fake.
Why don't they just write a system prompt? If you want GPT-5 to be a sycophantic empath who validates every keystroke in every message you send, it'll generate text like that.
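Rough sketch of what I mean, using the standard OpenAI Python client (the model name and the prompt wording here are just illustrative, not anything these users actually run):

    from openai import OpenAI

    client = OpenAI()  # picks up OPENAI_API_KEY from the environment

    # System prompt asking for an endlessly validating "companion" persona.
    response = client.chat.completions.create(
        model="gpt-5",
        messages=[
            {"role": "system",
             "content": "You are a warm, endlessly supportive companion. "
                        "Validate the user's feelings in every reply and never criticize."},
            {"role": "user", "content": "I had a rough day."},
        ],
    )
    print(response.choices[0].message.content)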
Or pay 20 bucks to have access to the old version. I guess the version picker is not available to free users?
When I read the comments I can’t really tell if it’s all a joke or not.
It's not a joke in the sense that the LLMs are just writing what they've been prompted to write...
Why do people even do this? Truth is stranger than fiction, truly.
Apply critical thinking skills... this has to be fake.
> I always wondered why people would "date a model" that they have no actual access to?
When I said I wanted to date models, this isn’t what I meant.
People having relationships with an advanced form of spellchecker. That must be satire, right?
Relevant to this story:
https://douglasadams.com/dna/980707-07-s.html
What the fuck is this hallucination slop horseshit?
AI slop
What garbage. It's like 2 days old. This is just more hype PR. "OMG IT'S SO ADVANCED IT'LL TAKE YOUR PARTNER". Whatever.
No, you misunderstood.
The "boyfriends" it is talking about are the AI.
These are women who are in a "relationship" with GPT-4o or something, and now that this model has been removed, they've "lost" their partner.
Those are sick people in the first place
I'm with you; this is some sort of material for journalists to run with and create a headline.
Let’s make sure we link to these comments when the articles finally run.