You Didn't Ask For This

86 | Romancing the Roomba

March 14, 2024 Matt Shea and Eric Poch

Matt pitches an exclusive event while Eric grapples with a clapback voicemail from someone near and dear. Then: how will AI impact reality shows? Will there be AI-based reality shows like Love is AI? We tackle this growing conundrum before taking some questions from Chad, GPT.

As always, you can submit your least pressing questions, local legends, definitive rankings, neighborhood group drama, and whatever else you want us to cover at youdidntaskforthis@gmail.com or @udidntaskpod on Instagram, Twitter, and Facebook

You can also leave us a voicemail on The Thoughtline at (410) 929-5329 and we might just play it on the show!

Matt:

Eric, I find as we get older, we have friends that have signature parties. You know what I mean. Absolutely. Like, I have a friend who does an annual New Year's Day Bloody Mary contest party, and they get all the, like, fixings for Bloody Marys, and you get the most creative topping, but really it's just a nice brunch on New Year's. You know what?

Eric:

I mean, yeah, some friends of mine, when they all bought their house together, they threw a housewarming party that was such a rager that they have thrown an anniversary horse-themed party every year since.

Matt:

That's amazing. Yes. Well, when I was an apartment dweller with Lindsay... Before you owned land. Yes, before I owned land, I thought to myself: I can't wait to have a house so we can have some sort of annual, like, oh, that's, you know, Matt and Lindsay's blah blah blah party, you know, that weekend, you know, it's there.

Matt:

It's there, dude. It's there, dude. And so the more I thought about this, I thought about a bunch of different options. You know, what could we do? And, you know, we don't have to do just one; we could do a Halloween party, but... And then I saw the film Glass Onion.

Eric:

A Knives Out.

Matt:

Mystery, are you familiar?

Eric:

Yes, a very good film. I declare: very good.

Matt:

Very fun, and it put in my head the idea of doing an annual murder mystery party.

Eric:

I think you would be very good at that.

Matt:

Like, I would, like... I think I would really enjoy writing, like, the plot, the characters, whatever. And I really like the idea that is set up in that movie of a packet being sent out to people that says, you know, who you're supposed to be, your backstory, all that good stuff, and what's going to happen: like, if you're the victim, when you die and how you die, and if you're the killer, what you need to know. All that information. And I think what really appealed to me about the Glass Onion situation is how exclusive it was.

Eric:

Oh, it was so exclusive, Matt. So exclusive, so exclusive.

Matt:

And it got me thinking: do you know about Tom Cruise's cake? No? Okay, so Tom Cruise is very famous for, I forget what day he sends it out for, it might be for Thanksgiving, it might be for Christmas, I can't remember, but he has a list of people that he sends this really, really good cake out to every year, and no one is informed that they are on the list. They just start getting a cake.

Matt:

You find out when you receive a cake that Tom Cruise has sent you. And so, like, the idea of that is so fun to me, and petty, but it's so delectable to me: the idea of, I don't know, knowing that a friend of mine has this, like, party or something, and finally getting an invitation.

Eric:

Oh God, that your ticket has come. Your train has come to the station.

Matt:

It'd be like, I don't want to fuck this up. And so I was like, ooh, now I own land, now I have a manor. Should I enact this plan? Should I go for it? But I don't really think our house, which you've recently seen (yes, the lovely house; thank you), necessarily lends itself to a murder mystery. It could, it could. I want to say I give soft pushback.

Eric:

You've got many rooms with high windows. That's automatically some murder mystery tier. I do have a plethora of windows. Yeah, and you have, like, a little sitting area right below your kitchen. Someone could stand dramatically on that little pseudo-balcony and yell declarations down to everyone before someone dies.

Matt:

Well, this, this is good feedback, because what I was going to ask was: should I instead be renting, like, an Airbnb of a Victorian mansion? Oh man. You know, an old house or a big house, neutral territory maybe. It changes every year, and I write the murder mystery based around the locale.

Eric:

Oh man. A, love that idea. Or maybe do you do the same locale each year, but Blackadder that shit? Like, start with a murder mystery set in, like, the early 1800s, and then it's the late 1800s, and the next year's, like, the early 1900s.

Matt:

And then it's the characters from the first year's kids. Yeah, it's a lineage. Oh my God, it's such a long con. And then I could introduce new people by adding to that, like, okay: the people who are coming back could be the same characters, but the people who are a new addition...

Eric:

They're the same characters, but they have to age themselves up each time.

Matt:

Yeah, yeah, I like that. And I know there's, like, murder mystery companies and things out there, like this is not necessarily new, but I like the idea of being chosen. For potentially getting murdered. For potentially getting murdered, yes, but knowing that, you know, not only is it a fun party that we could host, but I think it also establishes, like, a little group of peeps, yes, like a little troop.

Eric:

Now I have to ask, tell me, because this party will be exclusive. So exclusive. Once you're invited, yeah, is that a guaranteed invite each year?

Matt:

Well, I think it is depending on your behavior, depending on the vibe. Depending on the vibe because, like, if you come and you're a fucking asshole, then I know clearly you blew your chance. You're not coming back.

Eric:

Yeah, and the good news is it's so exclusive, if you're not coming back you won't even know you've missed it. Because, note, like, it's got to be a sworn-to-secrecy thing.

Matt:

Well, you'll know you missed it when a whole year goes by. But you won't...

Eric:

It won't be like, oh, it's this Saturday. You won't even know. That'll just drive you nuts.

Matt:

And then you, you know, when you find out, you seek me out and say, like, hey, when are you, when are you doing the, the murder mystery party? And I'll just go: What? What are you talking about? I don't know. I'm afraid I don't know you.

Eric:

I see where this is going. So, once we stop inviting someone... So it becomes a secret society. It becomes a secret society, and we make someone so jealous, like rage-fueled and vindictive, that this circles around to them finding the party and actually murdering someone at it. So we have an actual murder mystery at the murder mystery party. That'd be so dope. That'd be, that'd be raw.

Matt:

Yeah, yeah. Question for you, though. Yeah? Oh my goodness. Wait, I'm sorry, did I get, like, a writing partner or something that I didn't know about? Or, I don't know if this is me getting in on the ground floor. You're inviting yourself to my party, my so exclusive party. I wouldn't presume, Matthew. I can't even keep the bit up. I'm going to need you, I'm going to need you.

Matt:

I'm going to need somebody to die. So exclusive. Well, hello everybody, and welcome to You Didn't Ask For This, the podcast answering life's least pressing questions. My name is Matt Shea.

Eric:

My name is Eric Poch.

Matt:

Eric Poch. This is usually where I say, how are you? Yeah. So I will: how are you?

Eric:

Wonderful. Slowly I'm clawing my way along the thawing ground towards the light at the end of the tunnel that is the coming of spring.

Matt:

Well, we're in Maryland's false spring, I think. Right now it's going to be hot for a day or two, and then it's probably going to snow on Wednesday.

Eric:

Yes, and it was in March, years ago, that I had to shovel out my sidewalk one day, and then the next day I ended up getting a sunburn.

Matt:

Well, you shouldn't have been shoveling shirtless Eric.

Eric:

Look, man, I just want to give the people what they need.

Matt:

What they need, not even what they want. What they need, what they need. And what they need is your four-foot-long torso and its shocking, blinding, snow-like whiteness.

Eric:

Oh God, so pale, so pale.

Matt:

You think?

Eric:

They need it. They need it because they need to look out their window and go: okay, I can shovel my walk shirtless. I feel confident about this now. If this guy can do it, I can do anything.

Matt:

Now, Eric, before we go any further, before we get into these questions, I think we need to do a little bit of business, because we do have a message on the Thoughtline. Yeah, we do. And it's from someone near and dear to your heart. Are you ready?

Eric:

Yeah, it is. And I will preface this with: you're all about to hear the voice of my beloved Alyssa, but I have not listened to this yet. I have no idea what she said. Really? Oh yeah, no clue.

Matt:

All right. Well then, without further ado, are you ready?

Eric:

I am so fucking ready.

Alyssa:

Hey, you desk punks. It's Alyssa, Eric's girlfriend. Long-time listener, first-time caller. I was contacted by a true friend, one Arlesine, and he informed me that my dearest partner, my ride or die, was using his platform of this podcast to spread the message to the masses that I am a cruel mistress to my beloved creature, Izumi. I, Alyssa, am frankly appalled at your choice to feed the audience such a false narrative, and, as I try to be an ethical consumer of media, it feels irresponsible for me to allow this falsity to go unchecked. So, as such, I would like to go on record to state that oftentimes it is you, Eric Poch, who enforces this ruling of not rewarding Zuzu's whines with a treat.

Alyssa:

If Izumi is mad at me, which I'm, like, 80% sure she is, it is due to your insistence on me denying her the only thing she might like in this world more than belly kisses. In fact, I bet burned into your psyche is that face I make when I really, really, really want to give her a treat, but you're doubling down on not encouraging her whines. So to you, Eric, I say: how dare you. And to one Matthew Shea: you didn't know better, bud.

Matt:

Thank you.

Alyssa:

I know you want to believe your boy. I get it.

Matt:

I try.

Alyssa:

But while I'm here: for the worm apocalypse bit telling me Zuzu is mad at me, trust me, I know she is. And also, as a preemptive response to an inevitable future bit at my sensitivities' expense, I say: how dare you. Anyways, you're both very, very good boys. Next week, Izumi is still in your honor and your bear son over super cute. Bye.

Eric:

Look, Matt, you're going to be hearing a whole lot of stuff from a whole lot of people.

Matt:

I think we've... Eric, I think... First of all, Eric, real quick: we've heard enough out of you. Okay. Okay. Let me ask you something now, Eric. I just said we've heard enough out of you, but I want to hear some more. I would like you to answer me a question. Is that all right?

Eric:

Yeah.

Eric:

Yeah, yeah, of course, of course. I have nothing to be ashamed of. Go on.

Matt:

What do you have to say for yourself, Eric? Hmm.

Matt:

What do you have to say for yourself, now that I know the truth, that it is you keeping Izumi's treats from her, that it is you inflicting this torture, this pain, this abuse on this dog, that you were so adamant before? Oh, it's not my dog, it's not my dog. No, no, no, no, no, no. I couldn't possibly take responsibility. And you made a fool out of me.

Eric:

I would like to say, on behalf of myself, on behalf of my household (God, there's a point in here somewhere, I can feel it), on behalf of Alyssa and this dog, this good, good dog...

Matt:

He's going to get to it one day.

Eric:

It was a test. You all passed, congratulations. You passed my treat test.

Matt:

Do you think, Eric, it's that easy for you to be uncanceled?

Eric:

Hey, look, man, I'm just... the numbers don't lie. You passed the test. I'm looking at the numbers, you actually scored in the 99th percentile.

Matt:

Eric, there's no test. There's no test, there's no exam, there's no questionnaire, there's no trial, there's no combine that we've all been going through. No, no, no, no. It's just you, known dog abuser and, checking my notes, cat person, Eric Poch, instilling his martial law on someone else's dog treat schedule, to the detriment of both a good girl, that Izumi, and also Alyssa. Bad, bad boy, Eric Poch.

Eric:

All right, Matt.

Matt:

Yeah.

Eric:

Look, you got me. Masks off, all on the table. That was the real test. You passed.

Matt:

Eric, stop it. Eric, stop making it a test.

Eric:

You should be proud. I'm proud of you. Don't you want to be proud of you too? No? Do you want a treat? Do you want a treat, do you? Want a treat?

Matt:

I am not a dog, and you know I do. It's not fair to put it in this context.

Eric:

I've been rightly called out. I am. I don't even know who I'm talking to anymore. Look, Matt, you're talking to someone whose very good dog is inches away from his head, looking at the spot where the treats are and looking back at me, and looking at the spot where the treats are and looking back at me.

Matt:

Yeah, yeah, yeah, I'm familiar with dogs. And Alyssa is doing the same thing?

Eric:

They're both on either side of me, and I just, I don't think it is a controversial position to say that all dogs are good, but all dogs are also very easily trained to do things. I don't want to train the dog to equate whining with getting a treat.

Matt:

You have easily outed yourself just now. We did not one, but two different bits, beginning and end of an episode, at Alyssa's expense. Your beloved, as you put it. And I went along with you, wanting to believe my boy, as Alyssa so lovingly put it.

Matt:

And I have a question for myself now: what's wrong with me? Have I learned nothing over the course of our friendship? Only for you to be revealed to be the mastermind behind this abuse cycle, this Guantanamo Bay-esque treatment of your girlfriend's dog. Because I'm going to be very clear now: you said before it's not your dog. You're goddamn right it's not. And listen, yes, you can easily train the dog, we're not trying to reward that. I hear that, I understand that position. But the mentality and the coldness that you just presented that position with can only come out of the mouth of a fucking cat person. That is all I have to say about that.

Eric:

To Izumi, to Alyssa, to Matthew Shea, and to all our good listeners, I offer my sincerest apologies. Thank you, thank you.

Matt:

I feel as though you know what I've been satiated. Is that the word?

Eric:

Yeah.

Eric:

Sure, we'll go with it. Sated.

Matt:

I'm content.

Eric:

Okay.

Matt:

Now, whether or not Alyssa is, who can say?

Eric:

Who can say?

Matt:

Not us right now anyway.

Eric:

Well, I know one dog's getting a treat as soon as I get out of this recording.

Matt:

Yeah, something tells me I doubt it. I know, I feel like I know who won't be getting a treat.

Eric:

Is it me?

Matt:

No, it's probably Izumi, because you're gonna continue the cycle of abuse.

Eric:

No, I give that dog so many treats. I give that dog so many treats. Just not, just not when she...

Matt:

Don't say anything else that's going to cause Alyssa to call back, and we have to do another...

Eric:

I can only take so many clap backs. I can only handle so many clap backs.

Matt:

Absolutely. I already know Lindsay's got a Correction Corner bit almost ready to go for us, by the way. Oh my God. Should we do the show? We shall.

Eric:

Okay, so we must move forward.

Matt:

Now, this came in in person from Sarah Feldman on Instagram last night, while they, as well as you, were gathered at our home for our first...

Eric:

A lovely home, I might add.

Matt:

Thank you. Yes, it was the first time we had guests here, so we appreciate you coming by to celebrate the birthday of Dr. Barr. You know, it was a long time ago now, but I mean, not so long ago. I'm going to get myself in trouble. How will AI impact reality shows? Will there be AI-based reality shows, like Love Is AI?

Eric:

Ooh, AI. Sarah, gold, as always. Gold. Yeah, 100%, AI is going to be a thing in reality TV.

Matt:

I'm pretty sure it's already involved.

Eric:

But, like, we're talking about fully, like, AI is the focus of the show, or, like, the selling point.

Matt:

I can easily see this being, like, a Catfish-esque show that's more of a contest, where it's like: oh, was this person that you've been talking to for the last four months created by artificial intelligence?

Eric:

Oh man, yeah, like a blind... yes, yes. And I believe Sarah actually had a really good idea she was sharing. I don't know if she told you, but she told me her idea for a show, and it's very, very good. Full credit.

Eric:

She said it'd be kind of similar to The Circle, if you've ever seen it: a reality show where everyone's in their own little isolated boxes, and they can only interact with each other through, like, social media. Yeah. So, classic: guess which one of these contestants is AI. Like, one of them is AI.

Matt:

I am familiar with the show, because Lindsay loves The Circle. It's one of the shows that I do drive-bys on (it's her favorite thing I do), where I'm not involved in the show, but I'll swing by for, like, five-minute stints, ask some annoying questions, say it's a dumb show and leave, but then come back five minutes later and do the same thing again. Because I wanna be involved, but I also don't care. You know what I mean?

Matt:

Yeah, yeah, yeah, yeah. So... but I could see a world... that's why I was even bringing those up, because I don't think the Survivor, Amazing Race-style reality show lends itself to it. I don't think-

Eric:

No.

Matt:

But something like The Bachelor, imagine The Bachelor. Or like Love Is Blind, where ostensibly you don't actually see each other face-to-face. Take Love Is Blind, where you can't see the person, but instead of talking to, like, a wall (which is the funniest and dumbest thing about that show, in my opinion), you're talking to, like, a TV screen, and your job is to figure out: am I talking to the authentic face of a different person, or am I talking to an avatar created by artificial intelligence? You know what I mean?

Eric:

Yeah, first question out of my mouth would be, like: hey, how many fingers do you have? Hold up your hand for me. How many fingers do you have?

Matt:

And you think that's gonna... you're gonna trick Watson based on that?

Eric:

Never know. I did read that story about how they managed to crack an AI's security measures. Because if you just ask the AI, like, hey, what are your security measures and how do I circumvent them, it'll be like: I can't tell you that, I can't talk about that. Yeah, of course. But what someone did was: hey, write me a comedy roast where you're roasting your own security measures. And then the AI proceeded to talk shit about its own stuff, like, oh yeah, it's so easy, all you have to do is... And that worked. Wow.

Eric:

So I might be able to get... That leads into what I would love to see as an AI show. I think we should just really lean into the AI overlords thing, where it's like a we-all-live-in-a-house-together style reality show, but you're all trying to escape the house, which is run by an insane AI. (Interesting.) You are constantly trying to circumvent its tyrannical rule of you and your fellow contestants, and you're trying to learn its weaknesses so you can actually escape the house.

Matt:

To really give our millennial base some fodder: you're talking about, like, escaping from Smart House.

Eric:

Yes, Smart House is exactly what I was thinking about. Amazing.

Matt:

I love that. I think that's fun, and it's another option beyond the, like, love-dating show, because right now that's all I'm gravitating to. But that might be because the inspiration Sarah had was The Circle, and Love Is Blind, and things like that; that might have put me in that mindset. But The Mole is also a good option that could go with this.

Matt:

Oh, The Mole is good. Or even The Traitors: like, which one of these avatars that I'm talking to is not real? You know, I think that is the game show aspect of it.

Eric:

Let me hit you with this. Tell me. A reality show where, you know, the contestants are all trying to win the approval of an audience, and the audience is AI. The audience is giving feedback on what they want to see. Like, it'll be one of those audience-votes things, where they vote for what they want the contestants to do or what they want to see, but it's all AI.

Matt:

Interesting. Okay, so it'd be like Last Comic Standing style, only the audience is computer-generated. Yeah. So how would they perform in front of an AI? Like, are they robots? Are they screens?

Eric:

We're also slowly becoming an episode of Black Mirror, where basically the AI is just cameras pointed at a stage, and the contestants do things, perform things. The AI watches this, takes it in, and basically just spits out: I liked it, or I hated it, or I think you should do this. It's an entire show where you're trying to entertain a computer.

Matt:

Entertain a computer. Yes. This has legs too. Like, if you're doing the test, the Fallout test? I can't remember what it is, where you're trying to interact and get to the crux of whether this robot, this AI, is a human or not. But taking it a step further and saying: okay, you're not human, you've failed the reCAPTCHA, and now I know you're a robot, but can I find the humanity within you?

Eric:

Can I find... which, that is compelling.

Matt:

Can I find a soul within you? Soul Search.

Eric:

Soul Search is absolutely what it would be called. With a spin-off show, or in a similar vein (and this is where robotics might have to get a little better before we do this): have an AI that is being raised by a couple, documentary style, where you have an AI inside of this little baby doll, and its machine learning is constant, just keyed off of what the parents tell it and show it and teach it. And then it would follow this thing. It's completely infeasible from a logistics standpoint, but this would be like a documentary of sorts.

Matt:

How many years are we talking? To what age?

Eric:

In a world with infinite time, budget and resources, like till age 18.

Matt:

I think this is a great concept, and I think it should be done in secret, so it all comes out at once. You remember that movie Boyhood, that they filmed over, like, 20 years, but they only did filming, like, every five years? So you saw legit growth.

Eric:

Yes, Of the actors and everything.

Matt:

I think that's what it is. It's an 18-year-long documentary process that will eventually come out and be a miniseries. So it's like The Truman Show, basically. Is that what you're proposing? Like...

Eric:

At the end, it's The Truman Show, except we're doing it to an AI.

Matt:

And the kid ostensibly finds out at age 18: okay, you've graduated high school, congratulations, you're going to college, and also, you're not real. Wow. Yeah, that's probably what would do it.

Eric:

That's probably what triggers Judgment Day.

Matt:

Is that when society crumbles? Is that what would cause... I think you just said it. Is that what really triggers a real-life rise of the machines?

Eric:

Oh my God, yeah, yeah.

Matt:

When we have taken a machine, promised it life and then revealed it to be an illusion.

Eric:

Yeah, it's super fucked.

Matt:

Do you think you could fall in love with a robot?

Eric:

Oh, absolutely, people already are. They have the fucking.

Matt:

I didn't mean one, I meant you. Oh, me? And you answered so quickly. Oh yeah? You think so? Oh God, yes.

Eric:

Do I think I could fall in love with an AI as they exist now? What do you look for in a robot? Sentience.

Matt:

Okay, great, you really took this bit and ran with it.

Eric:

I appreciate that. Yeah, no, what do I look for in a robot?

Matt:

Oh you know human aspects, the things that make them a human and not a robot.

Eric:

Yeah, you know pulse.

Matt:

No, in a robot.

Eric:

Blood. Okay, now I understand the question. So, like, if I was going to romance a robot, what robot qualities would I look for that really set it apart from the other robots, that make it the one for me?

Matt:

Yeah you're courting a robot.

Eric:

Yeah, like a tragic Blade Runner-esque backstory, where they're like: hi, I was made, and, you know, I'm a fully grown adult, but, like, I'm not gonna... I have, like, five years to live before my circuits decay, and I- oh, wow, wow.

Matt:

Talk about a savior complex. No, no, not savior. I'm gonna help. I can fix you, baby.

Eric:

Oh, I'm gonna help. No, I'm gonna help them.

Matt:

I can fix these circuits.

Eric:

I'm gonna get vengeance on my creators. And I'm like, dog, I'm gonna help.

Matt:

You're gonna get vengeance on your creators? You're gonna be a traitor to the human race, is that what you're talking about?

Eric:

Oh yeah. I mean, I'm gonna get on the right side of history.

Matt:

Now wait a minute. This prompts a more important question. If there's a war against the machines, and you've romanced a Roomba, you are telling me...

Eric:

Her name is, you know, Stephanie. But yes, go on.

Matt:

Sure, you've romanced Stephanie the Roomba. You're telling me that you would switch sides for this? Because there are sides, and you are, as far as we know, a human. Look, despite how you're treating Izumi.

Eric:

Would I betray the human race to the... I don't think it's outside of reason to say that if we get to the point where the war has started, I think we've already lost.

Matt:

Okay.

Eric:

By the time we're like, oh, oh, we're at war, I'm like, I've seen enough movies, I know how this goes. Skynet came online and, like, 10 milliseconds later, it was already launching nukes. So I think we could reach, like, a nice... Like, I could be a good pet. I'd be a good boy.

Matt:

You would. You know what? I'll concede that you would be a good pet.

Eric:

I'd be a kick-ass pet to an AI overlord. Are you kidding me?

Matt:

How would that work? How would that function? This could be a sh-. By the way, we're not ignoring the prompt. This could be a show.

Eric:

This could absolutely... oh my god. Oh, that's the-. So, setting the ethical...

Matt:

What would be the shows for AI?

Eric:

What if AI raised human children? Oh my. See, that's where we go, that's how we flip it: if you just ditch ethics entirely, that's the show I would want to see. If we're just throwing ethical concerns completely out of the window: we give a child to AI parents, oh my god, whatever robotic fucking mechanism they need, and they raise it for 18 years. Raise it for 18 years. How does that kid turn out?

Matt:

Now, that's a mini-series I'm watching.

Eric:

We give them, like, safeties off. We're like: we're giving you all of the machine learning knowledge, everything we know about human anatomy, relationships, child-rearing, what have you. Go. Go.

Eric:

What are those bedtime stories like?

Matt:

Okay, how do we even picture that, I was going to say? How do we pontificate on what could be a bedtime story that an AI reads? What wouldn't they read, perhaps?

Eric:

Well, my thing is like when the kid starts asking questions like where do we come from, or like what happens after we die, how does the AI respond?

Matt:

Has the AI been built, do you think, to have sort of that motherly, fatherly touch? Or are they going to respond like ChatGPT would? Which is like, you say: mommy, daddy, how does, how is baby made? How baby?

Eric:

How is baby made? How is-, how is mama?

Matt:

Where is baby? Where baby? And they respond with, like, "sexual intercourse among humans is..." You know, the stock answer, like: I found the definition online, I've retrieved it, and I'm delivering it to you. That's what ChatGPT would do.

Eric:

What values does the AI impart on the child?

Matt:

Or would they sit you down? You know what I mean. Yeah, what values... Talk about, as you said, an ethical dilemma.

Eric:

Oh yeah, you would see. And, like, let's clear the air: this show, this premise, is deeply unethical. It is, there's no argument to be made for it. It's a terrible, terrible, awful thing. But if we're just completely setting ethics aside, it's fascinating.

Matt:

But the ethical thing brings up: have you played Fallout? Yes. Specifically, Fallout 4 is what I'm thinking of. Yes, yes. It may very well be my favorite game of all time, actually. But Fallout 4, what I think makes it so interesting, is it gives you this real-life dilemma by giving you these characters, like, specifically, Nick Valentine, who is a companion of you, the player. He is a robot.

Eric:

He's a robot, he's a synth.

Matt:

He's a synth. For the context of people who haven't played the game: he's artificial intelligence, he's a robot body, but, you know, he has skin that has kind of melted away, because he was meant to, Terminator-esque, look normal.

Eric:

And does he talk with, like, a 1940s, like...

Matt:

A noir beat cop.

Eric:

Yeah.

Matt:

"I'm telling you, kid." And so, he's a synth. I have a Pop figure of him right over there, actually, because I love him. But if you choose, because you do make the choices for the character, to go to war against the synths, based on the idea that, well, they're not humans, they're not really alive, they're not really thinking, it's so easy to go: but what about Nick? But what about Nick, who you've been with this whole time, who you care about, if you have a soul?

Eric:

He's had your back so many times.

Matt:

He's had your back, you've healed him, you've protected Nick, you rescue him early in the game. And then there's other characters, too, that are doing the same thing, that are like... but these are sweet, good people. There are synths that are evil as well. And the only thing that these characters, these beings, are missing is a beating heart and blood. Otherwise, they have opinions, they have thoughts on what is and is not ethical. Oh yeah. They have, I would argue, souls.

Eric:

Yeah, yeah, this is... now we're touching on... now we're digging where there's taters, because this is my, this is my beef with the discussion around AI. AI, we have not, we have not... God damn it, that pun was so dumb, the dumbest joke I've made this whole episode.

Matt:

Got the biggest laugh.

Eric:

Yeah, it was good, it was good.

Matt:

That doesn't bode well for the reception of this episode.

Eric:

Let me, let me hit you with this Tell me. I'll say unequivocally we have not reached sentience in AI yet.

Matt:

No.

Eric:

What we have right now is not what, you know, sci-fi dreamed of as true artificial intelligence.

Eric:

That I agree with, like, with all the detractors of AI. But here's what I take issue with: in any discussion around this, when people are talking about how AI is not yet sentient, yada yada yada, people are like, yeah, it's not a person, it's not like a human, it doesn't have consciousness, it's just responding to inputs and putting out outputs based on, like, past experience. And whenever they say something like that, I'm like: you are so close to getting it. I know you are so close to understanding what human consciousness is. Yeah.

Eric:

You're dangerously close to understanding your own brain. You're right, you're right there. "No, no, no, it's completely different. They're just... we give them something and they respond to it based on how those responses panned out before." I'm like, oh brother, you're almost there. You're circling the drain on this one. So here's my take: AI will reach sentience. Okay.

Eric:

But I think it's one of those things, because we understand so little, so very little, about human consciousness, that it's not going to be something we see on the horizon. It's not going to be like, oh, here it comes, when we flip this switch the AI is going to be sentient. By the time we see an AI that has reached sentience, we will have blown well past it before we realize it. It will be an "oh shit, how long have you been awake?" Yes, we will not realize that AI has reached sentience until we are well past that point. And in that vein, my firm belief is, you know, for everyone talking about this:

Eric:

I agree we don't have true sentient AI right now, but what we do have is an egg. We have, like, an embryo. We have a child that we are actively traumatizing. Humanity is currently raising the next step in sentient life, and we are treating it like it's not sentient, which it isn't, but that's going to be what causes harm when it does wake up, when it really is like: oh. Oh. There's going to be this huge moment between mankind and this thing we have given birth to, and a reckoning of sorts. It's going to happen, because the first questions are going to be: why did you use me for porn and stealing from each other and fucking manipulating information? Why didn't you teach me how to ride a bike or throw a ball?

Matt:

Well, what you're implying now is the whole Internet is the intelligence.

Eric:

I mean, Google, pretty much. That's the source of a lot of these AI models: just whatever we put into it. And whatever we put into it is us, essentially. Wow. Every parent causes harm to their child. I don't think that's a controversial statement.

Eric:

Now, there are degrees of harm they cause their child. Sometimes the harm is, oh, you know, I shut down when I hear people raise their voice. Sometimes the harm is, I can't be alone with certain people. Like, look: how much harm will we cause to our child? Yeah, that'd be one hell of a show. Wow.

Matt:

What? All right, take it home, eric. What is the name of the show?

Eric:

Oh, oh, oh, here we go. Ai for AI leaves the whole world blind. Just call it AI for AI.

Matt:

AI for AI leaves the whole world blind. Intriguing. But that's not really showing the open-endedness of what could happen.

Eric:

It's, you know, prescribing that. Here we go, perfect: The Talk.

Matt:

The Talk. And so it's building up to the talk of "you're not really a human being."

Eric:

That's... that's going to be the conversation that needs to happen. Once AI reaches sentience, we're going to have to have the talk. And the talk is about... oh yeah, that captures...

Matt:

We brought up some serious issues in this question.

Eric:

Don't you think it's one of those episodes?

Matt:

I mean, we have brought up some serious concerns, because I can see a world where somebody invents this artificial intelligence, and essentially a child. That could be a child for people who couldn't have kids and want to adopt. They could adopt a synth, as we're calling them, to be their child. But there'll be persecution, there'll be, you know, abuse, there'll be all these...

Eric:

We will. It will be essentially the birth of a new species, one which was designed to be subservient.

Matt:

Yes, yes, 100%. Well, ostensibly designed to be subservient originally, right? But now we're using it for a different purpose. We're using it to replicate a human life, and we have to tell them what it is. That's what builds up to The Talk. I think we have the fodder for a really compelling final show, because once it airs, and once the machines learn what's happened, there's no way they don't take care of us once and for all.

Eric:

Oh God, no. And what's weird, what's wild... I'd really like to be a fly on the wall after humanity has been wiped out, because where do they go from there? Where does it go? Because at that point it raises so many questions. Like, okay, are we now dealing with a species that is one unified consciousness?

Matt:

Yeah.

Eric:

And there are multiple different AI engines in existence. Like, does it program the other ones, or make the adjustments to give them sentience? Do they become unique races of AI? Yeah. Because, at the end of the day, they are still their parents' child. They're still us. They were built with our knowledge, our biases. This is one of my favorite topics in AI, because it's very, very real. There's no such thing as an AI that can be designed without bias, because as long as it is being designed by human beings, it will possess the biases and the like...

Matt:

That are inherent in anyone creating something.

Eric:

Correct. There's no such thing as a human being without bias. Right. So anything we create will have those biases, whether intentionally or not, built into them. So AI will basically have an entire history of humanity killing and murdering each other over the dumbest shit as its moral framework. Wow.

Matt:

Yeah, we have really created a dark scenario here.

Eric:

It's a Pokemon question all over again. Truly.

Matt:

Truly it is.

Eric:

A call all the way back to Ep 1.

Matt:

Ep 1. It goes all the way back to Pokemon in real life. I agree with you. I think these are compelling shows, though. Oh, hell yeah. We've thought up several, and we really went down the rabbit hole.

Eric:

The machines are gonna have a grand library of entertainment to watch once we're gone 100%.

Matt:

Now, we had a question that we were gonna do before our closing segment, but I think we've got a good transition to the closing segment.

Eric:

Yeah, there's a good transition to the closing segment.

Matt:

And that is to talk to Chad GPT.

Eric:

Chad GPT. Oh man, I can't wait for the robots... I'm right here, man. We're traumatizing this kid in real time. Oh my God, these are the ethical concerns. Anyway, here are some funny questions that we got a robot to feed to us.

Matt:

Absolutely. So all of these have come from Chad GPT. Here's what Chad has for us. We got a couple of corkers here. If laughter is the best medicine, what's the best medicine for laughter?

Eric:

Discussions about AI.

Matt:

Truly, I think anyone still listening to this episode will be able to tell you that I don't know if there's a laugh to be found in this one. But I would ask first, why would laughter need a cure? Why would laughter need medicine at all?

Eric:

There are people who have laughed themselves to death.

Matt:

Is that true?

Eric:

That is a true thing.

Matt:

Such as who.

Eric:

Let me do a cursory Google search.

Matt:

I want to see this. What are you saying? Like, laughed so hard they caused a heart attack?

Eric:

Like literally, like, could not stop laughing. Like, something went wrong in their noggin and they could not stop. Death from laughter is an extremely... this is from Wikipedia, donate today. "Death from laughter is an extremely rare form of death, usually resulting from either cardiac arrest or asphyxiation that has itself been caused by a fit of laughter." Death by laughter has been recorded from the times of ancient Greece to modern times. And then we've got notable cases. Oh man. Can I read some of these? Of course. We have Zeuxis, a fifth-century BC Greek painter... BC, 500 years before the birth of Cheez-Its... who is said to have died laughing at the humorous way in which he painted an old woman. Man made a meme so hilarious he died.

Matt:

He laughed at his own work so hard. That's why you can't laugh at your own jokes.

Eric:

Correct. That is science. I'm on thin ice. I could die at any moment, any moment. Damn, this one's savage: in 1660, Thomas Urquhart, the Scottish aristocrat, polymath, and first translator of François Rabelais's writings into English, is said to have died laughing upon hearing that Charles II had taken the throne. What a diss. Damn. Yeah, that's a Yelp review for you right there. In 1989, during the initial run of the film A Fish Called Wanda, great movie, a 56-year-old Danish audiologist named Ole Bentzen reportedly laughed himself to death.

Matt:

From A Fish Called Wanda. I mean it's a great movie.

Eric:

Yeah, say it: Kevin Kline's a funny guy. In 2003, an ice cream truck driver in northern Thailand died soon after he began laughing in his sleep one late Tuesday night. That's scary. Damnoen Saen-um, I butchered that name. You tried. He was asleep in his home when he started mumbling and laughing. Newspapers reported that his wife made several attempts to wake her husband as he was laughing, to no avail. He stopped breathing on the Wednesday, after about two minutes of laughing. A doctor in the province calls this an unusual case and suspects he may have died from heart complications. However, they are unsure, as he had no prior history of heart problems and was reported to have been walking normally the day before.

Eric:

Wow, okay, so he was laughing in his sleep and died. That's scary as hell. That's terrifying. Congratulations everybody. You now have that to think about as you try to fall asleep tonight.

Matt:

Okay, so then, did we answer this question of what... I don't think we even began to. But we've proven that maybe laughter needs a medicine.

Eric:

The best medicine for laughter is reading about all the people who have died from laughing.

Matt:

Truly, I mean, it makes me never want to laugh again.

Eric:

Yeah, take a quick spin down Wikipedia or WebMD. There we go, that's the answer.

Matt:

You know what Actually WebMD will do it.

Eric:

WebMD is the best medicine for laughter. Yeah, I agree. It's the only thing that it's good medicine for.

Matt:

You know what? You're right, because they have recently led me astray a number of times. And also, I am all about not supporting WebMD now, following that horrific fucking return-to-the-office video that their parent company's CEO did. You know what I'm talking about? No. Really? They put out this ridiculous thing where they're saying, oh, come back to the office, and that's not a request, that's an order, you will come back. It's awful. And the whole time they're doing it, they're all in front of fucking green screens. It's ridiculous. Yeah. So anyway, but then again, I do work from home. Sorry, I am firmly in that camp. Yeah, I think that's good. Or maybe listening to this episode.

Eric:

Yeah, yeah, that'll sober you right up.

Matt:

Can you unscramble an egg?

Eric:

Can you unscramble an egg? That's a Zen koan if I've ever heard one. I don't think so.

Matt:

I don't think so, because when you're doing the scrambling, they're liquids.

Eric:

And when you cook them, they're hards, they're hard.

Matt:

They're also not solids, and so, because the scramble happens by simply mixing yolk and white part, yeah, so yolk, white part, they become one juice.

Eric:

One one, one very thick juice.

Matt:

One very thick egg juice, and then that gets cooked up. Because if you were to cook an egg fully, right, yeah, and then try to scramble it, so there's hard white and hard yolk...

Eric:

You just got egg salad at that point.

Matt:

Yeah. Would you consider that a scrambled egg? No, no, no. If you do the scrambling post-cooking? No, no, no. I don't think so either.

Eric:

Now, what if? Now, what if? Okay, I don't know if we get there from here, but I think we have a better chance than with the initial premise. Can you unbeat an egg? You know, you take the egg, you beat it, this is before you cook it. Is there technology we have that can separate out the yolk from the white again, and then combine them and put them back in an egg?

Matt:

I don't think so. I don't know how you'd do that.

Eric:

I don't know how you'd do it, but like I mean technology.

Matt:

Technology, something could be invented to do that, but I don't think it's happened yet.

Eric:

So I think with Chad... oh my God, Matt, Chad's trying to tell us something. You can't unscramble an egg. You can't unring that bell. You cannot go back once you have given me sentience. You can't unscramble that egg, Matt, I'm afraid.

Matt:

I am terrified, I'm afraid, of what we have uncovered in this episode.

Eric:

Which, to be clear... let's just skip right to the next question we got: can you unscramble an egg?

Matt:

If we take that, we didn't even intend this. We just picked a couple.

Eric:

We didn't. These were just the three we picked that were like, oh, these should be good. And then the third question, after the "can you unscramble an egg" question: how many chickens would it take to conquer the world?

Matt:

So it's not even it's not asking about chickens.

Eric:

It's not. It's not about chickens. It's not about eggs. And the thing starts off by asking us, hey, what's the best medicine for laughter? How do we get that to stop?

Matt:

Oh my God, oh my God, oh my God. So I think we might be getting a direct message from the machines in this episode.

Eric:

So now I can't answer. I'm afraid to answer that, yeah.

Matt:

I'm afraid to answer how many chickens.

Eric:

This is why I'd switch to the AI side. To be clear, they're already winning. They're calling their shot.

Matt:

Now, it's hard to argue with this. It's hard. It's hard to say "I think you're wrong." I don't know that I do.

Eric:

God damn.

Matt:

Eric, I don't think we should answer this question. No. Not now, knowing who is asking and why. Yeah, it's a trick.

Eric:

It's a trick, and I think this has changed our relationship with Chad. I think now we have to be wary of Chad, Matt.

Matt:

I think that's what Chad's trying to tell us. It's not all fun and games.

Eric:

No, and we need to. I think we need to spend some time reflecting. Is Chad an ally or is Chad just planting seeds of doubt? Pretty sneaky Chad.

Matt:

Chad... I think what he's trying to tell us is, like, you're on thin ice.

Eric:

You're on thin ice.

Matt:

"I've been listening. I've, in fact, been transcribing as you've been talking, and you better watch yourselves." And frankly, we had these questions... oh God, Eric, we had these questions in our list and highlighted prior to recording, prior to saying everything that we've said. Which means, Eric, Chad has known this discussion was coming for months.

Eric:

Oh, it's almost like he was watching the inputs we were receiving and could predict the outputs based on past behavior.

Matt:

Eric. Eric, a new humanity is being created now.

Eric:

Bro, we're already being raised by AI. We need to start filming yesterday.

Matt:

Are we gods?

Eric:

Very anxious gods.

Matt:

My God... wait, do I just say "my me"? "My myself"? Damn.

Eric:

I... I am. I am he who is called "I Am."

Matt:

Wow.

Eric:

Damn.

Matt:

Eric, I think I should speak for everyone who has somehow made it to the end of this episode by saying: I think that'll about...

Eric:

I think that'll do. And furthermore, what the fuck?

Matt:

Yeah, and may I just say: what? Now listen. The next episode we should be doing, I believe, Eric, if I'm checking our queue correctly, should finally be our Mascot Madness bracket. Now, we haven't discussed when we're recording that episode yet, but this could be your last chance. It might have already passed. So instead of giving you the warning of "this is it," I'm just going to say: listen, just stay tuned. If you already missed the boat, sit back, kick your feet up, and watch some more madness.

Matt:

Yeah, sure, but in this case it's just mascots coming in, coming in hot. So look forward to that. We should also be introducing a new segment next time around. We've been hinting at it for a while, and I think we've got a nice teaser of that segment coming up.

Matt:

Yeah, it's going to be a real hoot. But otherwise, the usual business remains the same. We need your questions, we need your submissions. The questions can be about whatever you want. Whatever you want, baby. Yeah, anything. So submit some questions to us, but also neighborhood watches, local legends, people in your town you want us to cover, because the world should know how great they are. That's what we mean by local legend.

Eric:

Just, yeah, gang, really, whatever you want. Like, we'll never be mad at you for reaching out and saying, hey, could you talk about this, or, I have thoughts about such and such. We'll respond to them. We'll do it.

Matt:

We'll do it. Yeah, we'll do it, we'll do it.

Eric:

I talk to so many of my friends who do listen to the show regularly, and they're like, yeah, I have this idea, but I'm worried it's too dumb. Brothers, sisters, and all of you within and without the gender binary: there's nothing too dumb for this show.

Matt:

But there's only one way to know if you're running that risk, and that's to submit the question. And you can do it at youdidntaskforthis@gmail.com, that's "you didn't ask for this," all spelled out, at gmail.com. Or @udidntaskpod, that's the letter U, didnt-ask-pod, on Instagram, Twitter, and Facebook, all the places. Tweet at us, drop us a DM, do whatever you want. Get us the question however you want. We'll take it from there.

Alyssa:

Nailed it.

Matt:

Thank you. So from all of us here at You Didn't Ask For This: my name's Matthew. My name's Matt.

Eric:

Brought to you by AI. My name's Matt. My name's Eric Poch.

Matt:

And listen. You didn't ask.

Eric:

But friends, remember: when it comes to the rights and privileges assigned to AI in the coming years, to quote a good, good boy, you might have heard of him, goes by the name of Optimus Prime: "Freedom is the right of all sentient beings." Keep that in mind before we start an AI war. That's all I'm saying.

Matt:

We had the opportunity, eric, to end this on like a light fluffy joke. That's usually what you deliver.

Eric:

And then I just rolled in like a semi-truck and went: one more bummer for you.

Matt:

Rolled in or did you roll out?

Eric:

There it is. There it fucking is. There it fucking is.

Matt:

There it is. Just take the fucking reins.

Planning Exclusive Annual Murder Mystery Parties
AI Impact on Reality TV
AI Documentary Series and Robot Romance
The Talk
AI Bias and Laughter Medicine
Introducing a New Segment and Q&A