Episode 391 - Replaced by AI?
Transcript:
Jen: Hello, Peter.
Pete: Hello, Jen.
Jen: The other night, I went out to dinner with a friend of mine. And we were talking about the future, as friends do.
Pete: As one does.
Jen: Sharing our hopes, our dreams, our fears. And this friend said to me, "I'm afraid, Jen, that you and I could be replaced by AI. Because we've put enough of our stuff out there that someone could just say to ChatGPT, 'I have this problem. What would Jen Waldman tell me to do about it?'" And I said to my friend, "Fear not. We shan't be replaced by AI, and here's why..."
Pete: (Dot, dot, dot.) Oh my gosh, I need to know. Help me, help me, help me. Because I think I and many others have probably had this same fear / conversation with friends. This is The Long and The Short Of It.
Pete: Enlighten me, please, Jennifer.
Jen: Pete, I know to my core that I, Jen Waldman, and you, Pete Shepherd, and my friend who I had dinner with cannot be replaced by AI, because we are able to do something that AI will never be able to do.
Pete: Oh my god, what?
Jen: And that is to love.
Pete: Oh, that's so nice. What do you mean?
Jen: Pete, I love my clients. I'm obsessed with them. I think about them when they are not in front of me. I go to the theater, I think of them. I go on a trip, I think of them. I eat ramen, I think of them. They are on my mind and in my heart all the time. And I care so deeply. And I always want to know more. And I'm dreaming on their behalf. And I know things that happened to them twenty years ago, and I also know things that happened to them twenty minutes ago because I was right there with them, loving on them. And so, AI can talk like me. It is true. You could teach AI how to answer exactly the way I would answer. But AI doesn't actually care. So I will never be able to be replaced by a robot, because I love too much.
Pete: Oh, I mean, what does one even say to that? That's just lovely. Like, I imagine if I'm a Jen Waldman client right now out there hearing this, I'm like, "Oh, Jen, that's so lovely." It's so true. So, okay, here's where my brain goes. AI is very good at trying to say what it thinks you want it to say.
Jen: Yes, it is.
Pete: You know what? It's almost like all of the people pleasers got into one room and were like, "We should program this thing to just do what we do, which is people please all the time." But there's a difference between people pleasing and love. There's a difference between care and statistically trying to think of the right thing to say based on what you think this person wants you to say, which I guess is a little bit of a version of what AI does at the moment.
Jen: What's interesting, Pete, is that in the online space, there have been, for well over a decade, a handful of people who have sort of sat at the top of the online creation world, making courses and things of that nature. And many of those people have decided to take their hands off of that steering wheel and move over to group coaching models, for this exact reason: AI can basically recreate all of their course literature and content, but AI cannot love on the client in real time, for real, with an understanding of what it feels like to have a dream. AI can ask you about your dream. But it doesn't know what it feels like to sit with it and really, really want something, and want to make a change in your life, and actually be afraid of doing something as small as sending an email because of the whole cascade of consequences. It just can't do it. And I know this because I've asked it to answer me like me, and I would never say things that it said if it knew the context in which I was asking.
Pete: Yeah. As an aside, when you were talking, I was wondering to myself, "Does AI not have FOPO?"
Jen: Well, that's the thing that makes it so great, is it has no feelings.
Pete: Right, right, right. FOPO being (for those that have just joined us, welcome) fear of people's opinions. Jen and I had this theory in an old episode called The Fear Onion that if you keep peeling back all the layers of fear, of the fear onion, the stinky, messy fear onion, at the core, at the root, we think might be the fear of other people's opinions. And that often drives our behaviour or our non-behaviour, because we're afraid of what other people think. And so, in the example you were sharing, something as small as sending an email, which would feel low stakes to an AI, can feel so high stakes for an individual, because of the stories they tell themselves about the person they're emailing and the work that they do and the fear of what might happen if they put a typo in this email, god forbid. And so, that FOPO is hard to translate into an AI model. Is that FOPO going to save us?
Jen: Well, it could. FOPO and love.
Pete: FOPO and love. I've always thought of FOPO as being not a good thing. I need to get better at not, you know, being driven by fear of what other people think. But maybe that's the thing that's going to save me?
Jen: Isn't that so interesting? Well, it is certainly the thing that helps us understand each other.
Pete: Right. Yeah. And relate to each other.
Jen: Like from a coaching perspective, when you're working with a client and they're up against an obstacle, a lot of the time the obstacle is built with FOPO at the center.
Pete: Yeah. This is, I think, adjacent. I think it's an aside. I think it's relevant. When you were talking about the example of a course creator, just to call on that for now, who has had a realization that the model of disseminating information is a model that an AI could do really well, that it can take my content and distribute it in a way, it could probably even create a video based on a video of me that actually kind of looks and feels and sounds like me, which is a whole crazy aside. But what it maybe isn't as good at is, yeah, this like dissemination of the humanity. And I'm thinking about a friend of mine, Mark Dombkins, who once said to me in a workshop that I was running, he observed the way we approached this workshop as, "We don't disseminate information. We disseminate culture." And that's a flip on every meeting he'd ever been a part of, every workshop he'd ever been a part of, which feels a lot like a, "Let me tell you all the information you need to know," as opposed to, "Through a conversation and through questions and through understanding and caring about the people in the room, can I create a certain culture of how we do things around here that is rooted in care and love and support and guidance? And there is no right answer." So there is actually necessarily no information to disseminate, because maybe there's a different answer for every individual depending on who you are. So I was just thinking about Mark, as you were sharing, of like disseminating information versus disseminating culture. Or instead of culture, maybe it's disseminating care, in this context. That, a group coaching experience is amazing and unique because you can't script it. You have no idea what's about to happen. And you have to be able to respond in the moment. Which sure, an AI could do. It could respond in the moment. 
But you respond in the moment with the context, with the care, with the fear, with the love, with the hopes, the dreams, the fears, the aspirations of that person in mind, and your own in mind, which creates a really unique experience, dot, dot, dot. I don't know why I'm spelling out my punctuation today.
Jen: I appreciate it, period.
Pete: I've been talking to Siri too much, full stop. Send.
Jen: That's funny. Yeah. You know, to look at the flip side of this, when I'm encouraging clients to use AI, it's usually when it would be helpful for them to take something from an unemotional angle or observe something from an unemotional point of view. Because AI doesn't have hopes, dreams, fears. And maybe most importantly, it doesn't care about you.
Pete: Yeah.
Jen: It doesn't care about you. So you say, "I want to connect with Person X, and I don't know what to say. Ask me some questions. And, you know, give me a shitty first draft." Actually, I don't like it to write my first drafts for me, but that's beside the point. And so, it does that. And you now don't have to battle through your fear to get the first draft on the page. That's great. But tomorrow, it's not going to lose sleep at night, wondering whether or not you found the courage to send the email. It's not going to take the initiative on its own (unless you prompt it to) to give you the nudge when you need it, or even when you don't know you need it, to reach out and say, "Hey, I know that email is going out at two o'clock today. You are on my mind. Let's celebrate at 2:05."
Pete: Yeah. I was thinking about that. Yeah, that well-timed message, because you've been thinking about something and you care about someone, to say, "Good luck." I also feel, just as a quick aside for the techno-optimists listening that are like, "Hmm, I don't know, Pete," I feel like maybe just saying, "...yet." I feel like all of this is true now. I have no idea what's going to happen with AI in the future, as it continues to get better and smarter and god knows what (depending on who you listen to) it might be able to do. But I agree with you, at the moment. I absolutely agree with you. The thing that now is coming to mind for me is this idea of, you said "love", we've also said the word "care". And I'm thinking about something we mentioned on a previous episode about AI from Seth Godin, which I think relates. Which was, I, Pete Shepherd, care so much about my clients. And at the moment, a really smart way to use these tools, which Seth inspired me to think about, is, "How do you use these tools to add value to these clients that you really care about?" Not, "How do you use it to make things easier for you?" Because the unique contribution or the unique thing that I have, I guess, is obviously the skills that one possesses to be able to serve our clients, but before that, it's the care that we have to even use those skills.
Jen: Right.
Pete: And so, if we care enough to think about ways to serve our clients, then we might be able to come up with ways of using these tools that we hadn't previously had access to. So, how did...I'm just curious. How did your friend react, perceive, build on this idea? Was he like, "Yes, but no." Was he like, "Yes, and...". And was he excited? Did he have anything to add to this conversation?
Jen: Well, I think he was very bought into the idea. Because like me, he's a teacher and a coach working with artists. And I mean, I'm sure this is true across industry lines, that this is just a universal truth: people want to be cared for. Just generally speaking, people would like to be cared for.
Pete: Right.
Jen: But in our line of work in particular, people are basically taking their beating hearts out of their chest and placing them in your hands. It's like, "Here's the most vulnerable thing I could possibly do, share something I've created. And I'm handing it to you and saying, 'Can I get some feedback on this?'" I mean, that's a really vulnerable thing. And I don't take that lightly. I may have mentioned this on the podcast before, but I do like to remind myself that by assuming the position I've taken in these clients' lives, that I am also assuming responsibility for taking care of their hearts. And a friend of mine, a dear friend of mine who also was my assistant on many projects, gave me a bracelet, a silver bracelet that literally has a charm of a human heart on it.
Pete: Wow. That's so cool.
Jen: So cool. Just as a reminder.
Pete: So I feel like the corporate version of this isn't necessarily, "I'm taking my beating heart out of my chest and handing it to you, board member, in the form of this report that I put together." But it's just this notion that there are people that one works with or comes across in the corporate world that really and truly give a shit, that really give a damn, that really care. And you can tell. And I feel like they're the people we're talking about, that are hard, if not impossible, to replace with AI. Because they're the ones, to your point, thinking about things in different ways, thinking about things / projects / ideas that they could bring to a business when they're on the train commuting. They're the ones using these tools to think about how to serve clients better. Because they care. Because they give a crap. And on the other side of that, there are people that don't quite give as much of a shit, don't quite give as much of a damn. And they've got their own reasons and they've got their own worldview as to why that's the case. But for them, I think it becomes harder to argue that an AI couldn't replace some of that. I feel like care is a differentiator, is kind of what I'm taking away from this. The love, the care, the support, the humanity of it all.
Jen: Yes. And for the listeners out there who we don't know personally...and Pete, maybe you can either confirm or deny this...I'm not that lovey-dovey, touchy-feely a person.
Pete: I can confirm.
Jen: This is not like the Hallmark Channel kind of love. This is like a deep, profound sense of care.
Pete: Yeah.
Jen: I saw a video on social media of a CEO of a company, a female founder and CEO, talking about how she asks her employees to tell her about their dreams, their dreams for their lives outside of work, their dreams for adventure, their dreams for their children, their dreams for their parents, just all of their dreams. And she also asks them, which I think is amazing, "What are your dreams for after you leave here?"
Pete: I love that.
Jen: And that just shows so much care, knowing that where you go to work is not the only part of your life that matters and not the only thing you dream about. I felt very inspired by that.
Pete: I love it. It's like a leadership of, "I care so much about you and creating the conditions for you to succeed, that I want to know all about what makes a successful life for you beyond these four walls." Yeah, that's nice. I like that a lot.
Jen: So what do you think, Pete? Replaced by AI, question mark?
Pete: Right now, I'm with you. I don't think that people like you, like me, like your dinner friend, like anyone listening who cares about the work they do and loves the work they do and the people that they serve, can have that care, that love, that human-centered focus be replicated / replaced by a robot (dot, dot, dot) yet.
Jen: And that is The Long and The Short Of It.