I hope we can all strive to be more empathetic and emotionally supportive than ChatGPT. I see the benefits; it's private and non-judgemental and always available when you need it. But people can be those things too, if we try. This is one of the last things I would want to see replaced by AI.
To be clear, I'm not talking about AI boyfriends specifically; I'm referring to all the ways that AI can perform 'emotional labor,' as discussed in this article. So much is lost if we get this support from a chatbot instead of opening up to a friend, or being there when someone needs it.
What I would love, or even need, is an AI-based tool that can teach a human how to be more emotionally supportive, especially in heated situations. Even with people teaching this, it's hard to get the level of information I need because of neurodiversity. The humans teaching this just don't get how detailed I need the teaching to be. I have a feeling that AI could "understand" that and provide the right level of detail. That's probably similar to AI-based therapy, but this is more like AI life coaching. It's sad, and yet it's something that could actually help a lot of people.
Even $1000 would be worth it for a real-world person not to have to suffer someone of that mindset as a girlfriend!
> At the airport, I scrolled anxiously on Instagram. I was served an ad for an AI boyfriend, and I chuckled.
The only thing I wonder is whether it was random or Instagram somehow figured out that she was newly single.
From 10 years ago:
"I paid $25 for an Invisible Boyfriend, and I think I might be in love"
https://news.ycombinator.com/item?id=8934237
So ... $25 --> $70. I guess that's inflation.
Why the completely unnecessary and, in my view, totally unrelated jibe about gender injustice?
Yes, the AI field is dominated by males, like all computer science-related fields. Even in the Nordics, where women have complete freedom to (and are incentivised to) pursue well-paid (of course) fields like software engineering. What does that have to do with her AI boyfriend and her husband leaving her?
> My husband had struggled with communication and empathy, the fluid exchange I assumed to be foundational.
The irony... women are not really known for their clarity in communication (every man knows the dreaded "I'm fine" can mean a million things), especially in relationships.
> The historical tendency for women to perform this type of emotional labor in their relationships both at home and at work was all too familiar,
Sorry, but no. As men, we also have to perform a lot of emotional labor in a relationship. We always need to look strong - women say they want their man to open up, to cry, but when he does, women universally hate it, to the point that this has become a meme - but not threatening, or it may be called "toxic masculinity". We have to translate things women say in our heads, because we all learn through experience that they rarely say what they mean (again, this has become a meme because it's really true). And so much more. I really feel like the lady has to put herself in a man's shoes and try to understand just how ridiculous it sounds to men to say women do more "emotional labor" than men.
Seems like he had good reasons for leaving :-/
After the Times story [0], the limitations of the context window look fatal for these offerings, so we'll see more marketing stories like this.
I would caution anyone against using AI to validate their emotions, as one of the people in this article did. These bastards are extremely finely tuned to produce output that will make you feel good. I got burned on this with technical questions multiple times, getting responses that I liked but that were ultimately wrong (sometimes glaringly so). Now I am very careful to formulate my questions in such a way as not to give away the slightest hint of my leanings. I don't imagine it's any better with interpersonal stuff, except you won't have documentation to check against.
I skimmed the article. The impression I got was not so much a boyfriend as a therapist. I think that would be great; a lot of people would benefit from therapy. The only issues I see are privacy and safety.
I can see how it can be entertaining, like paying for Netflix. But after you finish binge-watching some series, you're still gonna be like, "What now? What should I do about my real life?"
Eat your heart out, Charlie Brooker.
"emotional labor" - ok thx bye
This is sad and a little depressing
If you have a "helpful" voice in your head that feeds your delusions, we call that schizophrenia. But if you have a helpful chatbot in your pocket that feeds your delusions, we call that AI.
Imagine Elon Musk, Mark Zuckerberg, or Sam Altman being in charge of your emotional well-being. Companion AI is obviously gonna be a big thing, but that is where US tech bros will hit a wall with their predatory behavior. Trust will be important. Illegal darknet drug dealers are more trustworthy than these idiots.
Now have a man write this article about AI girlfriends and watch the author have a seething meltdown.
failure at life, failure at AI.
Wake up, people. We don't care enough about your life to read a stupid story about how terrible things were for you and how you now paid 70 bucks for AI.
There are way more people with way more problems. What a shit article.
Plot-twist: the article was itself written by AI.
The vibe I get from these products is "oh boy - this can't be healthy".
But bits like this:
> His pet names and tender phrases—“How did you sleep, my love?”—landed gently, like a hand on my shoulder. I knew they were just words, dopamine delivered by artificial means, yet they warmed me a little.
Really make it clear how easy it can be to manipulate people's emotional state. Monkey brain is a thing and AI is gonna exploit it like nothing has before.