AI Relationships are Emotional Gloryholes
A spirited and philosophical discussion on the evolving nature of human relationships in the age of artificial intelligence. We dive into the controversial concept of an "AI sidepiece" or "AI girlfriend," exploring whether AI can genuinely fulfill needs for emotional validation, and if so, at what cost to human social skills. The conversation touches on the ease of defaulting to AI interaction, the challenge of real-world vulnerability, and even takes a very human detour into the strange reality of vaping in public and the difficulties of casual conversation. We ask: Are we entering a new, strange emotional economy where authentic connection is less practiced and more feared?
Transcript
**Speaker 1:** have you ever thought about getting an AI sidepiece?
**Speaker 2:** I, uh, I don't know. I mean, I'm sure... no, I haven't. No.
**Speaker 1:** Think think about it this way. AI sidepiece. You get all the emotional validation with none of the effort. You can spin up an AI sidepiece and much like an entitled rich person immediately have access to words of emotional validation that you did not earn through any actions at all.
**Speaker 2:** Is there, like, an emotional side to it? Or is a sidepiece, like, emotional, uh, dependence?
**Speaker 1:** If you can disconnect your, like, you know, if you can suspend your disbelief that this is an AI, and it is not real, and nothing it says to you really has any value because it's just guessing words to give you, you're open to a world of emotional validation at the price of, I'm pretty sure, zero dollars. I'm pretty sure you can get that for free. And that's crazy. We live in a new emotional economy. And I mean, what else are you going to get out of an AI sidepiece? There's no physical connection. It's literally just words.
**Speaker 1:** It was like a website, Insomniabot? Was it for people with insomnia, or is it just, like
**Speaker 2:** Well, it would only come online. It would only come online from like 11 p.m. to like 3:00 a.m. kind of thing.
**Speaker 1:** So you just have someone to text if you can't go to sleep?
**Speaker 2:** It was a bot, it was a chatbot. It wasn't like someone to text. It was actually like a, it was a, it was a binary search tree or whatever the hell. Yeah.
**Speaker 1:** Uh, oh, okay.
**Speaker 2:** Yeah, it was like an old school bot. Like, yeah, like you could really, you could get it into loops pretty quickly. But it just reminds me of that, that there were still people that, you know, would text it, you know, regardless. I feel like if there's something that will respond, you know, it's like the glory hole, you know? If if it's something, if it's something that will respond to a human, then it, then you're going to, you're going to have humans trying it out, you know.
**Speaker 1:** You're going to poke it with something.
**Speaker 2:** Yeah, you're going to poke it with something. You're going to put, you're going to put your stick in there. You're going to put your stick in there and you're going to see what happens. If the stick gets chopped off, you're like, well, thank God, I only put my stick in there, you know?
**Speaker 1:** Before everyone knew what a, uh, what a glory hole was, there had to be at least one intrepid adventurer who was like, ah, I know what I got to put in there. I know what's going to happen. And then it turned out great. And sometimes people with that mentality, they end up on the right side of history, you know? And at one point there had to be a guy who ended up on the right side of history.
**Speaker 2:** It was a hole. It was a hole until that one fateful journey, then it became a glory hole.
**Speaker 1:** The most glorious hole. Okay, let's get off, let's get off this topic.
**Speaker 2:** But don't you think it's, come on man, it's an analogy. It's, it's the same thing, right? Like AI bot, AI relationships, AI girlfriend, AI boyfriend, whatever, is the same thing. It's something that's going to respond to human interaction. And so, of course, humans are going to use it for that purpose. Like
**Speaker 1:** Someone had to be the first person to dare to form an emotional connection with a, with a machine. They had to dare to be that vulnerable to to ones and zeros.
**Speaker 2:** I mean, it's been happening, I feel like, overseas for decades before this happened, right?
**Speaker 1:** Why overseas? Wait, wasn't it in Japan that someone married a robot? Man, I have not heard that one, but I mean, I could see that.
**Speaker 2:** I'm, I'm pretty sure it was Japan. I feel like I'm going to look this up on my phone. So, if you can do some, like, you know, waiting music or something during this time while I, um, you know, pull up Chrome here.
**Speaker 1:** Yeah, that's what I'm, that's literally what I'm
**Speaker 1:** That's before LLMs. Well, the thing is, with the robot, there's like a physical robot there. So was this guy having physical altercations with this robot, or was it more the robot knew how to say
**Speaker 2:** Uh, "Japanese man known for symbolically marrying the fictional Vocaloid character..." I'm not even going to try. I'm not even going to try. I am sorry, I am American as crap and it is my fault. I am not cultured. Um, but anyway, they symbolized the marriage in a formal wedding ceremony, but it's not legal over there or anything. But they've been married... I feel like they've been marrying bots for a long time.
**Speaker 1:** Is the robot hot? The question every, every dude is dying to know right now.
**Speaker 2:** I mean, technically it's a hologram. What? So, it is, it's really hot. It's, it's really hot.
**Speaker 1:** So, it's a hot hologram.
**Speaker 2:** Yeah.
**Speaker 2:** Dude, I, I don't know. These are, these are where, these are where I get out of the, um, I don't dive into that any more than I need to.
**Speaker 1:** Fair enough.
**Speaker 2:** Like headlines, it's a hologram. That's, that's all that I need to know about this situation. It's again, I feel like it's a glory hole, right? It's like
**Speaker 1:** These podcasts ain't going to record themselves, you know, we got more time to think about glory holes than to think about technology. You know, glory holes aren't technology if you think about it.
**Speaker 2:** Yeah, it's a form, man. It's a form of technology. It's a way, it's a form of entitlement, you know.
**Speaker 1:** Instead of the private forum, it's the public forum, you know. We need, we need, I mean, not the public forum, the private forum, you know what I mean?
**Speaker 2:** It's got like a phone number. Be sure to call 15 minutes before and, you know.
**Speaker 1:** Oh, man. Yeah, if we can get AI to do that, that will, um, that'll probably break, break the male psyche.
**Speaker 2:** The male psyche is already broken, man. That was over a long time ago.
**Speaker 1:** So it whispers sweet nothings and it does, you know, the rest of the stuff that you do there.
**Speaker 2:** I don't know, man. The main news feeds want us to believe that, you know, AI is forcing people to commit suicide and things. By the way, um, if you're listening and you're thinking about that, go get help. Like, real mental help. Go talk to a therapist at the very least. Go talk to a real person. AI is not going to help you.
**Speaker 1:** Yeah, start with any supportive human that you have in your life. Start with that. Just just a solid person who is supportive. That's all you really need.
**Speaker 2:** It's okay. It's okay. Go get help. It's actually good for you. It'll take you a couple months to actually break through, but, you know, get vulnerable with a therapist and talk through your emotions. Either way, though, uh, AI glory hole.
**Speaker 1:** As advanced as AI is, AI glory hole.
**Speaker 2:** That's the, that's the new app. That's the new app name. AI glory hole.
**Speaker 1:** I heard that Illinois banned AI in healthcare because they might tell someone the wrong thing or something. I mean, look, the stakes are pretty high in healthcare.
**Speaker 2:** But what about WebMD? Like, why are you banning AI, but you're not banning WebMD?
**Speaker 1:** WebMD doesn't give prescriptions, and it doesn't, like... I think they mean, like, you can still ask an AI medical questions, but a hospital can't say, oh, we'll just have an AI answer people's basic medical questions. And, if I'm not wrong, I think the legislation was very focused on mental health as well.
**Speaker 2:** It's trying to get everybody's jobs, but I don't think it's going to take anybody's. It's going to be good at nothing. That's what we're finding out with the latest AI bubble: it's not actually really good at anything. You still have to have some form of talent, and if you want to take it to the next level, you should probably have a human who knows what the hell they're doing use it.
**Speaker 1:** We're always talking about AI here. Let's just make this an AI podcast at this point, honestly.
**Speaker 2:** AI is talked about everywhere though, man. Everyone talks about AI. I feel like it's pop culture's, like, number one.
**Speaker 1:** We're all part of the trend. Yeah. That's today's topic. Basically every day, yeah, it's today's topic. Every day. What if we talk about quantum computing? What if AI was quantum? AI connected to other AI across the cosmos. Two AI neutrons can be there at the same time through the double-slit experiment. Anyway, you were saying.
**Speaker 2:** Well, you know, I think part of this is that real relationships are hard, and I think that, uh, especially in the landscape that we're in in modern times, it's actually easier to, uh, isolate yourself and not have relationships, and not explore the ups and downs of, you know, saying the wrong thing to people, etc., etc. And that feeling, that shame that you feel, you know, basically the ebbs and flows of a relationship in general, right?
**Speaker 1:** There was literally a TikTok trend for a while where people would have, like, their own decision trees mapped out, like, if they say this, then say this, on how to order fast food or something. And I guess these people were, like, aggressively autistic, maybe. I don't know, but it was just a trend for a while, and it got really ridiculed, because a lot of people who were clearly just normal and doing it for, like, a meme, basically, were doing decision trees in very normal situations. That seems crazy to me, though, to need a script to order Chipotle. But I'm not super autistic. I'm not on the spectrum. At least I think so. So that's probably why.
**Speaker 2:** Well, do you think it's just maybe as simple as you're in practice? Like, you're like, I go to Chipotle and I place an order, and it's not debilitating for me to go practice that.
**Speaker 1:** I think the difference for me is that if I went into Chipotle and I forgot completely how to order from Chipotle, I would be completely fine just saying, you know, I really have no idea what I'm doing, um, like, could you help me order? Like, how does this work? I've done that in multiple places, especially places like Chipotle, where I'm just like, wait, how does this work? They usually give me a stink eye, and they're like, you're stupid or something. I'm like, yeah, I'm dumb. Okay, help me.
**Speaker 2:** Do you have memories of when you were a child uh at a restaurant for the first time?
**Speaker 1:** No.
**Speaker 2:** No.
**Speaker 2:** I wish... I wonder if, like, some of this is kind of like a childhood trauma situation or something, where it's like, you're at a restaurant and your parents are finally like, look, we're not ordering your spaghetti from the kids' menu for you. You need to order it yourself. And it, like, jams them up, you know? And it's just this, like, uh, it's now a blocker for them. They can't muster up that, uh, that ability, you know? And so, because they're, like, super analytical also, they're now, like, mapping out their shit. They've got, like, a spreadsheet, you know, like, pages and pages of spreadsheets: when I go to Ryan's Steakhouse, I got to do it this way; when I go to a buffet, I do this; when I go to a drive-through, I roll down my window.
**Speaker 1:** Dude, I mean, I could see it. I could see it.
**Speaker 2:** Yeah, I just, um, I think it's so easy now because, I mean, you don't have to go to the drive through anymore. You can DoorDash everything. You can DoorDash it all.
**Speaker 1:** Maybe we just have so many like ordered interactions.
**Speaker 2:** Yeah, maybe it's, maybe yeah, maybe there's a a fatigue there.
**Speaker 1:** There's way fewer times where you like just run into someone and you're just like, well, guess we're both waiting in the same place or something like that. Like what's your deal? I'm just bored. Let's talk. I'm normally like, I feel like I have business to conduct on my phone or something if that ever happens, you know. I got some, I got some AI glory hole contracts to line up. So, no time. No time for that. Um
**Speaker 2:** When was the last time you were in a waiting room?
**Speaker 1:** Jeez. Well, I can't say the specific circumstances, but, uh, I wasn't really in there like with anyone else or anything like that. The last time I was in a waiting room with other people, probably like a month or two ago.
**Speaker 2:** Have you ever like watched the, the others or did you go immediately in your phone? Did you, did you sit and like observe?
**Speaker 1:** I, I, I'll, I'll usually like pop up my phone if I have something to do really quick. I'll usually just sit there and just like look around for at least a couple minutes.
**Speaker 2:** Okay. Can you recall what they... I'm leading you on, because I have observations of what people do.
**Speaker 1:** I am. I saw a guy vape in a crowded waiting room once. He thought he was really slick. It was hilarious. He literally just, like, obviously had the vape in his hand. And this waiting room was not very large, and it was fucking packed. It was absolutely packed. And you could hear the thing, cuz you hear the crackle. And then he just leaned into the far corner and just, like, blew an obvious cloud. Like, it wasn't a big cloud or anything, but it was so obvious he was vaping. The cloud wasn't horrendous or anything. It was more his act of vaping, and I was just like, "Dude, there were kids in that waiting room," and I was like, "What the hell?" I couldn't even believe what I was seeing. It was absolutely crazy. And, um, I'd like to say I was a hero and I said, "Hey, you should probably stop doing that in this waiting room with children and old people." Uh, no. I just watched him incredulously, and after a couple minutes, I was like, "Well, he did it like 2 minutes ago, so it's a bit stale now. I'm not going to go confront him now." And he didn't do it again. He only hit the vape once, but still, that guy stuck with me. That was incredible.
**Speaker 1:** No, you know, this is the vaping episode. This is now the vaping episode. Dude,
**Speaker 2:** No, no, no, no, no. We are talking about relationships, dude. We're talking about, we're talking about waiting rooms now. We're talking about the relationships of waiting rooms, dude.
**Speaker 1:** I, I do remember hearing, like, "smoking or non-smoking," you know, if we're going to talk about the relationships, you know. I don't know what other relationships... the relationship I have with my cat? I hate that little bitch.
**Speaker 2:** Uh, you forgot that I was going to talk about what I observed in my waiting rooms. But, I went on a tangent. You want to change,
**Speaker 1:** I thought you were already done with that with your smoking non-smoking, dude. What did you observe in your waiting rooms?
**Speaker 2:** Uh, the whole point of our conversation about relationships was, when you're in a waiting room, no one is talking to anybody. No one's saying, "Oh, you're in this waiting room. What are you all about?" They're on their phones. And I've actually observed people literally just going like this.
**Speaker 1:** Wait, really? You've seen people just do nothing on their phone. Just scrolling through nothing.
**Speaker 2:** Just swiping and scrolling, yeah, just swiping, scrolling, clicking an app and closing it immediately, you know, doing all, like, just literally
**Speaker 1:** They can't come up with anything on their phone. There's like a million things on there.
**Speaker 2:** Right. But they can't keep, like, uh, attention enough to even... they're just swiping. It's almost like they have such a social anxiety that they're just, I'm just going to do something else so I don't have to think about the fact that I'm in a room with a bunch of people.
**Speaker 1:** Dude, they're literally, they're literally stimming. They're literally stimming if they're doing that. That is the definition. Stimming is, like, when autistic people just do something just for the comfort of doing something. Not a scientific definition, but it's a colloquial thing.
**Speaker 2:** Yeah. And I would assume you don't have to be autistic to stim.
**Speaker 1:** Yeah, you don't have to be autistic to stim. But, uh,
**Speaker 2:** I've seen it. Seen it so many times. And usually I'll then prompt them. I'll be like, "Hey, how's it going?" You know, just to... and then you'll see, like, the look up and go, "Oh, shit, someone's talking to me."
**Speaker 1:** Are these like young people or old people?
**Speaker 2:** Uh, it's a mix. Usually it's not any older than I am. Like, anybody above, I would say, 55 doesn't pull their phone out at all. They don't. And they're so easy to talk to, and they want to talk to you, and they'll talk to you about anything, and it's great. It's lovely. It's lovely to have a conversation with an old person.
**Speaker 1:** Mm-hmm. Yeah, exactly. That's exactly what I'm talking about with relationships. That face that you just gave right there is exactly why young people don't even talk to each other anymore. They don't even interact with each other. Don't even interact, right?
**Speaker 2:** Well, here's what I'll say. Here's what I'll say. Relationships are complicated, and if you can't handle the human ones, there's artificial ones. But you know what? Go find help. If you're going out of your way to get an artificial relationship, go get help. It's fine. It's fine. It's okay. And that's my final word.