162. The Seduction of AI: Convenience, Creativity … Catastrophe? w/ Akira Chan
In this expansive and very human conversation, filmmaker and creative technologist Akira Chan joins Bryan Reeves and co-host Tait Arend to explore the wild, exhilarating, and sometimes terrifying frontier of artificial intelligence. From AI-generated songs and robot companions to the ways technology is already shaping our relationships, creativity, and emotional lives, this episode dives deep into what it means to stay human in an AI-driven world.
Grounded, curious, and refreshingly real, this dialogue asks the questions that matter most: How do we use AI to bridge connections, not replace them? What parts of our humanity should never be outsourced? And how can these tools actually make our relationships richer?
00:00 – Dogs, Heartbeats, and Nervous System Whisperers
03:00 – “This Is Not AI”… or Is It?
04:30 – How Bryan and Akira First Met
07:00 – The First Wave of AI Magic
08:00 – Why This Episode Matters Now
10:45 – Can AI Help Us Connect More Deeply?
11:30 – The AI-Generated Song for Bryan & Tait
15:45 – The Imposter Question: Is Anything Truly Original?
20:00 – “I Feel Manipulated Sometimes”
23:00 – Will Men Fall in Love with AI Companions?
26:00 – How AI Is Already Shaping Our Thinking
33:00 – AI Tools Akira Actually Recommends
37:00 – The Atrophy of Human Skills
45:00 – Robots Are Coming Into Our Homes
52:00 – The Line Akira Won’t Cross as a Filmmaker
1:00:00 – What We Must NOT Outsource to AI
1:04:30 – Wrapping with Heart
Contact Details
Connect with Akira
Website - https://akirachan.com/
Production Company - https://www.raremedia.tv/
Connect with Bryan Reeves
- Official Website: https://bryanreeves.com
- Instagram: https://www.instagram.com/bryanreevesinsight/
- Men’s Coaching Program: https://bryanreeves.com/elevate
00:00.031 --> 00:03.336
[SPEAKER_01]: Gary Vaynerchuk, who owns the company VaynerMedia.
00:03.536 --> 00:08.344
[SPEAKER_01]: One of his predictions is that our grandchildren are gonna be marrying AI.
00:08.624 --> 00:10.647
[SPEAKER_01]: And I'm not sure that he's wrong.
00:10.788 --> 00:21.364
[SPEAKER_01]: I actually think that he's very right because what we know to be true already right now is that there are sexual AI bots that people are interacting with.
00:21.344 --> 00:22.025
[SPEAKER_01]: right now.
00:22.426 --> 00:29.215
[SPEAKER_01]: And so is it really too far of a bridge to look at the ways in which we can be developing relationships?
00:29.576 --> 00:45.478
[SPEAKER_01]: I think by and large women are connectors, and so I don't really see women going down a rabbit hole where they will turn to AI exclusively, but men can get locked into something that doesn't demand anything of them.
00:46.049 --> 00:49.413
[SPEAKER_03]: Welcome to Bridging Connections, formerly known as Men This Way.
00:49.714 --> 00:56.783
[SPEAKER_03]: I'm your host, Bryan Reeves, former U.S. Air Force Captain turned author and professional coach to men, women, and couples.
00:57.484 --> 01:01.769
[SPEAKER_03]: Alongside me as co-host, my lifelong friend of over 40 years, Tait Arend.
01:02.270 --> 01:11.782
[SPEAKER_03]: Here we have the raw, real conversations we need to be having about the topics that matter most, relationships, purpose, health, spirituality, and more.
01:12.263 --> 01:13.885
[SPEAKER_03]: Please subscribe to stay connected.
01:14.266 --> 01:15.347
[SPEAKER_03]: All right, let's dive in.
01:20.643 --> 01:21.846
[SPEAKER_03]: Is the dog in or out?
01:22.226 --> 01:23.690
[SPEAKER_03]: I think the dog is in.
01:24.050 --> 01:24.712
[SPEAKER_03]: Oh my dog.
01:26.075 --> 01:27.097
[SPEAKER_02]: This is not AI.
01:27.658 --> 01:30.224
[SPEAKER_03]: We can't entirely be sure at this point.
01:30.605 --> 01:31.487
[SPEAKER_03]: No, you can't be sure.
01:31.507 --> 01:32.108
[SPEAKER_02]: It's really not.
01:32.128 --> 01:33.190
[SPEAKER_03]: There's really no way.
01:33.210 --> 01:34.012
[SPEAKER_03]: It's this.
01:34.032 --> 01:34.593
[SPEAKER_03]: That's Lucky.
01:34.734 --> 01:35.475
[SPEAKER_03]: Lucky's the name, right?
01:36.317 --> 01:36.938
[SPEAKER_03]: Yeah, lucky.
01:37.139 --> 01:40.947
[SPEAKER_03]: Lucky's in as long as he has something meaningful to contribute to the conversation.
01:41.264 --> 01:44.007
[SPEAKER_02]: Yeah, you'll hear a lot of fur and cuteness.
01:44.107 --> 01:48.251
[SPEAKER_02]: And luckily, he helps to regulate my nervous system around here.
01:48.632 --> 01:50.013
[SPEAKER_03]: Yellow John is a straight up.
01:50.033 --> 01:52.335
[SPEAKER_03]: She, you know, she is, she's a trained service dog.
01:52.856 --> 02:01.925
[SPEAKER_03]: We don't personally use her that much for that, but every time Sylvie's and my voices get a little activated, the dog, Yellow John, will walk up to one of us and put her paw on us.
02:02.026 --> 02:03.327
[SPEAKER_03]: Like, she does the service dog thing.
02:03.347 --> 02:04.909
[SPEAKER_03]: It's crazy. It's wild.
02:05.269 --> 02:10.174
[SPEAKER_03]: We sometimes have to tell her, hey, Yellow John, we're not fighting.
02:10.154 --> 02:11.275
[SPEAKER_03]: we're not fighting.
02:11.576 --> 02:16.762
[SPEAKER_03]: Like, she literally feels the energy in the room and wants to come and help regulate.
02:17.503 --> 02:19.065
[SPEAKER_03]: And sometimes we actually are fighting.
02:19.105 --> 02:19.846
[SPEAKER_02]: You are fighting.
02:19.886 --> 02:22.028
[SPEAKER_02]: They can actually hear your heart rhythm.
02:22.128 --> 02:28.436
[SPEAKER_02]: They can hear it. That's how they're so good with people who have diabetes or are prone to strokes or seizures.
02:28.516 --> 02:31.980
[SPEAKER_02]: They can hear subtle changes in your heart rate and your pulse.
02:33.041 --> 02:33.762
[SPEAKER_02]: And they wow.
02:33.842 --> 02:34.122
[SPEAKER_02]: Yeah.
02:34.243 --> 02:37.967
[SPEAKER_02]: So it's, you know, they can basically scan you and
02:37.947 --> 02:39.070
[SPEAKER_02]: help you out.
02:39.450 --> 02:41.776
[SPEAKER_02]: But yeah, dogs are pretty amazing.
02:42.457 --> 02:44.883
[SPEAKER_03]: That's basically the message no doubt.
02:45.143 --> 02:48.491
[SPEAKER_03]: Was that him giving an amen to the dogs are amazing comment?
02:48.912 --> 02:50.014
[SPEAKER_02]: Who wants to be at?
02:50.034 --> 02:51.337
[SPEAKER_02]: I mean, he's out, he wants to be.
02:52.400 --> 02:53.442
[SPEAKER_02]: Okay, now we go.
02:53.708 --> 02:56.253
[SPEAKER_03]: Do you need to do anything technical on your side, Akira?
02:56.514 --> 03:04.670
[SPEAKER_02]: I don't, I mean, I figured, you know, with this topic, of course, there's a lot of stuff you can look at, but I don't, I think we should just talk.
03:04.690 --> 03:10.321
[SPEAKER_02]: And if there's anything worth noting or referencing, I'll just, you know, I'll just talk through it.
03:10.541 --> 03:12.125
[SPEAKER_02]: I think that's the problem with anything related to AI.
03:12.185 --> 03:14.890
[SPEAKER_02]: People end up just having to, like,
03:14.870 --> 03:16.232
[SPEAKER_02]: read or watch a bunch of stuff.
03:16.252 --> 03:18.396
[SPEAKER_02]: I think we're having more of a conversation.
03:18.676 --> 03:24.386
[SPEAKER_03]: In fact, probably early on in the episode, I want to play the song that you sent to me.
03:25.949 --> 03:27.391
[SPEAKER_03]: Tait hasn't heard it yet, by the way.
03:27.611 --> 03:28.914
[SPEAKER_03]: Yes, I know you didn't.
03:29.114 --> 03:29.995
[SPEAKER_03]: There's a second song.
03:30.256 --> 03:31.638
[SPEAKER_03]: This one is even weirder.
03:32.660 --> 03:32.760
[UNKNOWN]: Yeah.
03:32.740 --> 03:35.864
[SPEAKER_03]: in an interesting way.
03:36.325 --> 03:39.670
[SPEAKER_03]: Well, I just mean, it has Tait's name in it and my name in it.
03:39.910 --> 03:41.693
[SPEAKER_03]: The other song didn't have our names in it.
03:42.774 --> 03:58.757
[SPEAKER_03]: It could still have been a song about just a subject that's out there in the world, you know. But the second song that you sent, Akira, it's like someone wrote it and sang it and recorded it and did it for Tait and me and the podcast.
03:59.678 --> 04:01.701
[SPEAKER_03]: And that's what makes it weirder.
04:03.554 --> 04:09.644
[SPEAKER_03]: So I do want to play that, but first, I want to just welcome you, introduce you to our listeners.
04:10.906 --> 04:12.829
[SPEAKER_03]: Akira Chan, or Akira?
04:13.109 --> 04:20.021
[SPEAKER_02]: Yes, you're the only one who likes to pronounce it the proper Japanese way, but I don't expect everyone else to.
04:20.261 --> 04:22.865
[SPEAKER_03]: Akira, Akira, I'm going to call you Akira for this episode.
04:24.508 --> 04:27.132
[SPEAKER_03]: We've been friends a long time now.
04:27.280 --> 04:27.821
[SPEAKER_03]: actually.
04:27.841 --> 04:29.385
[SPEAKER_02]: Yeah, a long time.
04:29.586 --> 04:34.959
[SPEAKER_02]: I don't have quite the mileage that you and Tate have, but I think definitely over 10 years.
04:35.521 --> 04:36.082
[SPEAKER_02]: Oh, easily.
04:36.363 --> 04:38.789
[SPEAKER_03]: And then some, I think it was probably like 2011.
04:39.912 --> 04:43.180
[SPEAKER_03]: Probably 11, 11, 11, up there.
04:43.160 --> 04:44.703
[SPEAKER_03]: It was right around that time.
04:44.723 --> 04:49.013
[SPEAKER_03]: Very likely on some sacred site.
04:49.273 --> 04:51.899
[SPEAKER_02]: We were slightly different people then. We were into some weird shit.
04:52.200 --> 04:55.968
[SPEAKER_02]: But of course, you were in a conscious band.
04:56.088 --> 05:02.502
[SPEAKER_02]: I was, yeah, I was in my long-hair, feathers-in-the-hair, spiritual journey phase.
05:02.482 --> 05:21.722
[SPEAKER_03]: Living in the ashram. And I visited you at that commune. I mean, I didn't go to visit you, but I visited the commune; my band went and performed there. And yeah, man, I think we really connected when you produced, or co-produced, the music video for my band, the Burning Man music video.
05:21.702 --> 05:22.063
[SPEAKER_02]: Yeah.
05:22.363 --> 05:28.454
[SPEAKER_02]: The theme, Tait, was: let's create a budget music video where the theme was Burning Man in your backyard.
05:28.794 --> 05:30.157
[SPEAKER_02]: Like, let's bring Burning Man here.
05:30.217 --> 05:40.555
[SPEAKER_02]: And it was an excuse for us to just wrangle all of our artist performer friends, you know, have the singer Ash do his song and throw a little party.
05:40.535 --> 06:05.850
[SPEAKER_02]: in addition to filming a music video, and it was nuts. I mean, amidst all the chaos, I remember, right, Bryan, you were roughly the only grounded human in the whole thing. Yeah, I was the manager, and, you know, there were a lot of things at play there. But everyone was in costume, everyone wanted to be in the video, everyone was, you know, all the performers and exhibitionists and
06:05.830 --> 06:16.532
[SPEAKER_03]: the giant tuna fish art car, I'll never forget that. And I'm about the only one wearing just regular clothes, just more observing, watching, just being present, you know, making sure shit didn't go sideways.
06:18.075 --> 06:20.520
[SPEAKER_03]: That got us free tickets to Burning Man, by the way.
06:21.462 --> 06:26.472
[SPEAKER_03]: We digress, Akira, we invited you today to talk about AI.
06:26.772 --> 06:53.370
[SPEAKER_03]: artificial intelligence, and not so much from, you know, a high-level expert, where's-it-all-going kind of perspective. But you've been an early adopter of so many AI tools, and as a filmmaker, right, a creative artist. The other thing I want our audience to know is that you and I have been in this men's group.
06:53.350 --> 06:56.837
[SPEAKER_03]: We call it our man cave elite group.
06:56.877 --> 06:58.340
[SPEAKER_03]: We've been at it about six years.
06:58.801 --> 06:59.222
[SPEAKER_03]: Almost.
06:59.242 --> 07:00.144
[SPEAKER_03]: Yeah, start around.
07:00.164 --> 07:00.705
[SPEAKER_03]: I hope we have more.
07:00.725 --> 07:06.256
[SPEAKER_03]: So, you know, I've seen you, you've sent us... man, you started blowing my mind early.
07:06.477 --> 07:09.242
[SPEAKER_03]: Like a couple years back, when you would take a photo of us.
07:10.465 --> 07:10.685
[SPEAKER_03]: Yeah.
07:10.705 --> 07:13.030
[SPEAKER_03]: And make it move.
07:13.010 --> 07:14.372
[SPEAKER_02]: You'd send the group photo over.
07:14.712 --> 07:26.950
[SPEAKER_02]: I would, I would run that photo through my little AI thing and send it back to the group, where all of a sudden our group photo is, like, all of us doing the Macarena together, or we're jumping onto, you know, a group of lions.
07:27.010 --> 07:29.093
[SPEAKER_02]: Yes, that's as good of a chase as then he's song.
07:29.133 --> 07:31.296
[SPEAKER_03]: And then there were fireworks going off in the sky.
07:31.476 --> 07:34.140
[SPEAKER_03]: And it just kept going from there.
07:34.120 --> 07:40.308
[SPEAKER_03]: Maybe now's even a good time to just play the song that you sent for bridging connections.
07:40.388 --> 07:43.673
[SPEAKER_03]: And then we'll just, we'll kind of set the table here, and then we're going to dive in.
07:45.195 --> 07:48.559
[SPEAKER_03]: Because I think, you know, you and I talked about this episode.
07:48.579 --> 07:53.666
[SPEAKER_03]: This was really an episode that you wanted to do, Tait, that you wanted to have a conversation about.
07:54.407 --> 07:59.754
[SPEAKER_03]: I'm curious, could you share a little bit to what your motivation was for that?
07:59.774 --> 08:01.697
[SPEAKER_03]: And I'll share the song before we really dive in.
08:02.065 --> 08:11.739
[SPEAKER_01]: Yeah, look, I think that, in a world of relevant topics of the day, this is one that people are thinking a lot about.
08:12.920 --> 08:28.242
[SPEAKER_01]: And I remember, you know, myself just being introduced a month after ChatGPT launched, and I remember thinking that the world has shifted underneath people's feet and nobody knows about it.
08:29.099 --> 08:42.894
[SPEAKER_01]: I mean, it turns out that many people know about it within just a month, but in the scheme of the world, nobody knew that everything had changed and things were about to get really interesting in a short period of time.
08:42.954 --> 08:45.741
[SPEAKER_01]: And I think that.
08:45.721 --> 08:48.645
[SPEAKER_01]: look, there's lots of conversations to have in this space.
08:48.665 --> 08:49.587
[SPEAKER_01]: You can talk about it.
08:49.647 --> 08:52.091
[SPEAKER_01]: Look, there's lots of thinking about where it's going to go.
08:52.391 --> 09:01.545
[SPEAKER_01]: But as I think about the kind of conversation that we can have today, one of the things that I think is so critical is, you know, Akira,
09:01.625 --> 09:05.291
[SPEAKER_01]: You're a brilliant storyteller, right?
09:05.331 --> 09:11.560
[SPEAKER_01]: You spent your career helping
09:11.540 --> 09:25.293
[SPEAKER_01]: You're also somebody who is on the front end of adopting some technologies. And so I think that one of the most important conversations for all of us to be having, and we are on a podcast called Bridging Connections, is this:
09:25.853 --> 09:39.606
[SPEAKER_01]: What is the way in which AI can help us bridge connections and help us live into a world where it actually supports our connections and the things that
09:39.873 --> 09:53.612
[SPEAKER_01]: this conversation, because we can have AI be, like, a thing that we're supposed to use and utilize to help productivity, and certainly we can do all those things. But how can AI enhance our connections?
09:54.672 --> 10:10.690
[SPEAKER_02]: Yes, that is the conversation that I feel is the most important, because that's what is not going to change as we navigate AI growing and becoming more and more integrated into our everyday lives.
10:10.670 --> 10:25.769
[SPEAKER_02]: if we were trying to, you know, talk tech, talk tools, talk about FLOPs and machine learning and all the technical stuff, by the time this podcast publishes, half of those things are going to be disrupted, just old data and irrelevant.
10:25.809 --> 10:34.160
[SPEAKER_02]: So this, the human conversation, you know, underneath AI advancement, is the conversation to have.
10:34.460 --> 10:36.863
[SPEAKER_03]: So I want to play this song
10:36.843 --> 10:42.172
[SPEAKER_03]: And I'm going to have to ask our editor to splice this thing in, because I want the audience to hear it as well.
10:42.853 --> 10:46.699
[SPEAKER_03]: The recording software that we're using right now, Riverside, I can't share it through the app.
10:46.719 --> 10:47.521
[SPEAKER_03]: We can't share it.
10:47.761 --> 10:48.863
[SPEAKER_03]: It won't record it live.
10:49.424 --> 10:49.844
[SPEAKER_03]: But Tait.
10:50.065 --> 10:50.285
[SPEAKER_03]: Tait.
10:50.686 --> 10:51.187
[SPEAKER_03]: And I don't hit.
10:51.207 --> 10:52.549
[SPEAKER_03]: I'm using headphones as well, Tait.
10:52.569 --> 10:55.754
[SPEAKER_03]: So even if I play it here on my computer, you won't be able to hear it.
10:55.794 --> 10:57.016
[SPEAKER_03]: So I put a link in the chat.
10:58.418 --> 10:59.801
[SPEAKER_03]: And would you click on that, Tait?
11:00.241 --> 11:01.804
[SPEAKER_03]: I want you to, I want you to hear it.
11:02.864 --> 11:05.888
[SPEAKER_03]: And we're not going to play the whole song.
11:05.908 --> 11:14.579
[SPEAKER_03]: What I really want you to just listen to, Tait, and I'll ask our editor to cut this in to overlay the song so that the audience hears it as well:
11:14.599 --> 11:19.485
[SPEAKER_03]: I want you to play basically just the first verse and chorus.
11:20.847 --> 11:22.109
[SPEAKER_03]: Go ahead, go ahead and play it.
11:22.269 --> 11:24.552
[SPEAKER_03]: We might even be able to hear it through your speakers.
11:32.243 --> 11:34.105
[SPEAKER_03]: Thumbs up if you can hear it, Tait.
11:34.285 --> 11:34.485
[SPEAKER_03]: Yeah.
11:34.565 --> 11:37.368
[SPEAKER_00]: Captain to coach.
11:37.388 --> 11:39.190
[SPEAKER_00]: The journey's been wild.
11:40.552 --> 11:42.133
[SPEAKER_00]: Walking the edge.
11:43.395 --> 11:45.176
[SPEAKER_00]: No wisdom compiled.
11:46.578 --> 11:47.939
[SPEAKER_00]: Bryan and Tait.
11:48.660 --> 11:49.121
[SPEAKER_03]: Exactly.
11:49.161 --> 11:49.881
[SPEAKER_00]: Two friends.
11:49.961 --> 11:52.544
[SPEAKER_00]: Side by side.
11:52.564 --> 11:54.186
[SPEAKER_00]: 40 years deep.
11:55.147 --> 11:58.150
[SPEAKER_00]: Through the ebb and the tide.
11:59.902 --> 12:01.844
[SPEAKER_00]: We'll talk in real time.
12:01.864 --> 12:06.428
[SPEAKER_00]: It's my gift.
12:28.825 --> 12:56.875
[SPEAKER_00]: Love, love, love...
12:57.884 --> 12:59.086
[SPEAKER_03]: There's a few in the case.
13:00.308 --> 13:01.189
[SPEAKER_03]: Definitely one in here.
13:01.910 --> 13:03.372
[SPEAKER_03]: Yeah.
13:03.392 --> 13:03.493
[SPEAKER_03]: Yeah.
13:03.513 --> 13:03.733
[SPEAKER_03]: Yeah.
13:04.614 --> 13:06.097
[SPEAKER_03]: That's how I go.
13:06.177 --> 13:06.357
[SPEAKER_03]: Yeah.
13:06.858 --> 13:07.178
[SPEAKER_03]: Tait.
13:07.258 --> 13:08.400
[SPEAKER_03]: Are you listening to the whole song?
13:09.221 --> 13:10.303
[SPEAKER_03]: Don't do that.
13:13.188 --> 13:14.029
[SPEAKER_03]: Come back to us, man.
13:14.089 --> 13:14.570
[SPEAKER_03]: Come back to us.
13:14.630 --> 13:15.772
[SPEAKER_01]: And I just want to keep listening.
13:15.792 --> 13:16.753
[SPEAKER_01]: But I guess we'll stop.
13:17.155 --> 13:36.970
[SPEAKER_03]: So let's say a few things about that, right? And again, our audience, you just heard what we were listening to as well, and we'll put a link, I'll put a link in the show notes to listen to the full song. I think one of the things about using AI in, like, the podcast, for example, is: what rights do we need to get?
13:36.950 --> 13:56.157
[SPEAKER_02]: So one thing that I'm learning, as a creative person who's bringing AI into my professional work, into stuff that is going online and being used for advertising, for broadcast, is there isn't any... you can't really copyright it straight out of the gate.
13:56.137 --> 14:03.446
[SPEAKER_02]: The program that I used to create that, this program called Suno, you can't copyright that song. It was AI-generated.
14:03.466 --> 14:06.049
[SPEAKER_02]: So to speak, fair use. Basically fair use.
14:06.630 --> 14:07.431
[SPEAKER_02]: It's very loose.
14:07.551 --> 14:15.020
[SPEAKER_02]: Yeah. However, if we were to change that, alter it about 10, 20, 30 percent, then you guys could license it.
14:15.040 --> 14:16.241
[SPEAKER_02]: It's officially your song.
14:16.662 --> 14:19.645
[SPEAKER_02]: And that changes the usage.
14:19.706 --> 14:25.873
[SPEAKER_02]: So that's a whole other category of subject
14:25.853 --> 14:30.193
[SPEAKER_02]: in my field of work, which is, you know, the media entertainment world.
14:30.454 --> 14:34.513
[SPEAKER_03]: So, Tait, first impressions. Your in-the-moment thoughts, first impressions.
14:34.865 --> 14:41.916
[SPEAKER_01]: I mean, it's really hard to tell what is AI in the entertainment space right now.
14:42.037 --> 14:44.200
[SPEAKER_01]: I mean that like you listen to a song like that.
14:44.320 --> 14:47.165
[SPEAKER_01]: I know that there's there's another really popular one.
14:47.966 --> 14:52.133
[SPEAKER_01]: I'm forgetting the name of it right now, but it's like, you listen to it, it's so full.
14:52.213 --> 14:53.074
[SPEAKER_01]: It's interesting.
14:53.114 --> 14:54.276
[SPEAKER_01]: He's got this great voice.
14:54.296 --> 14:56.059
[SPEAKER_03]: They're hitting the top of the charts these days.
14:56.820 --> 14:57.902
[SPEAKER_03]: AI songs.
14:58.355 --> 15:01.520
[SPEAKER_01]: Yeah, I'm well aware, exactly.
15:01.560 --> 15:04.163
[SPEAKER_01]: So it's just really interesting.
15:04.624 --> 15:14.218
[SPEAKER_01]: And obviously we started here with a song about you, from, you know, the story about you, from captain to coach.
15:14.378 --> 15:16.501
[SPEAKER_01]: And it's just, it's really interesting.
15:16.642 --> 15:21.028
[SPEAKER_03]: Something that stands out to me, Akira, is all of the AI songs that I've heard.
15:21.328 --> 15:23.852
[SPEAKER_03]: And this is just, again, my, I haven't done a deep dive into this.
15:23.892 --> 15:25.494
[SPEAKER_03]: But I've listened to, I've gone on to YouTube.
15:25.514 --> 15:25.915
[SPEAKER_03]: There's a whole
15:25.895 --> 15:28.581
[SPEAKER_03]: world of redone songs on YouTube.
15:28.641 --> 15:38.862
[SPEAKER_02]: There's a whole genre right now of taking 90s grunge or 50 Cent rap albums and turning them into soul versions of the song.
15:38.902 --> 15:40.585
[SPEAKER_03]: Fucking incredible.
15:40.685 --> 15:41.948
[SPEAKER_03]: They're so good.
15:42.008 --> 15:45.355
[SPEAKER_03]: Some of the best music I've ever heard.
15:45.605 --> 15:51.798
[SPEAKER_03]: And though, what occurs to me is, like, this song that we just heard is beautiful.
15:51.938 --> 15:52.379
[SPEAKER_03]: It's sweet.
15:52.439 --> 16:00.897
[SPEAKER_03]: It's like it's got a little Christian rock vibe to it, which is great music these days, but it doesn't sound original.
16:01.855 --> 16:26.715
[SPEAKER_03]: It doesn't sound like anything I haven't heard before. And I wonder, it's one of the questions I'm curious about with creativity, you know. Because, I mean, you do a lot now; that's an audio thing, right, and you do a lot of visual creative arts as well. But that's one thing that stands out to me: of all the AI music I've heard, I haven't heard anything that someone else hasn't already done.
16:27.117 --> 16:38.930
[SPEAKER_02]: Well, you know, I mean, this applies to just AI in general when it comes to things like creativity, critical thinking, if you're using it as a coach, as a therapist.
16:39.771 --> 16:44.796
[SPEAKER_02]: It's a very good, I want to say copycat, but I would say almost imposter.
16:45.137 --> 16:52.905
[SPEAKER_02]: Like that is the thing, you know, for a while, the benchmark for when AI would hit a certain level of,
16:52.885 --> 16:59.917
[SPEAKER_02]: Like, where you can't really discern whether you're talking to a human or not. You know, the Turing test is what it was referred to as.
17:00.518 --> 17:08.211
[SPEAKER_02]: Like, we passed that a while ago, because all of these models, on a technical level, they're all built off of our, our, you know,
17:08.191 --> 17:10.935
[SPEAKER_03]: Just remind us what the Turing test is.
17:11.656 --> 17:13.539
[SPEAKER_03]: Just state that for our audience, in case they haven't heard of it.
17:13.559 --> 17:27.118
[SPEAKER_02]: In a nutshell, it's a benchmark where, in the interaction you're having with a machine, the person, the human, on the other end of it cannot tell that they're speaking to something that's not human, right?
17:27.238 --> 17:33.287
[SPEAKER_02]: So that, you know, everything from language to inflection,
17:33.267 --> 17:38.812
[SPEAKER_02]: Emotion, you know, all of those things play a role and you could apply that to everything.
17:38.832 --> 17:58.790
[SPEAKER_02]: You could apply that to where we were. You know, I started making AI videos and playing with the tools almost two, three years ago. I forget how quickly things are moving. And that was back when you'd try to create a video of a person playing a guitar, and they would have ten fingers on one hand, and they'd be going in and out; it was hard.
17:59.211 --> 18:00.692
[SPEAKER_02]: It would look like a mess
18:00.672 --> 18:18.040
[SPEAKER_02]: Fast forward to now, and I would say that visually, AI video and depictions of people have passed their Turing test. You know, I can tell, I think most people can kind of tell, but we're pretty much there, to where there's really no, you know, real
18:18.020 --> 18:18.581
[SPEAKER_03]: clues.
18:18.641 --> 18:19.622
[SPEAKER_03]: I watched the test today.
18:19.682 --> 18:20.683
[SPEAKER_03]: I was on Instagram.
18:21.344 --> 18:24.027
[SPEAKER_03]: I watched the video of other people being tested.
18:24.167 --> 18:26.070
[SPEAKER_03]: Can you, you know, is this AI or real?
18:27.031 --> 18:31.316
[SPEAKER_03]: I, they got half of them right, and I got half of them right.
18:31.376 --> 18:37.724
[SPEAKER_03]: There was, like, one video of a shark jumping out of the water onto a boat, like a fucking great white shark on the deck of a boat.
18:38.164 --> 18:41.328
[SPEAKER_03]: And I was like, no, it's flopping around and it looks fake as fuck to me.
18:41.769 --> 18:43.030
[SPEAKER_03]: But nope, it was real.
18:43.110 --> 18:47.976
[SPEAKER_03]: At least, I mean, at least, maybe the whole video was AI and
18:47.956 --> 18:55.207
[SPEAKER_02]: blew my mind in the entertainment world and with with what we see online, which is already meant to just grab our attention, right?
18:55.267 --> 19:10.130
[SPEAKER_02]: Like, you know, pre-AI, you would be pretty blown away by watching, you know, like an action sports video, or somebody flying off a mountain and doing three backflips on their mountain bike. You start to see stuff like that.
19:10.711 --> 19:13.896
[SPEAKER_02]: It might actually be real, but there's this
19:13.876 --> 19:24.975
[SPEAKER_02]: voice in the back of our head that is questioning it just because, you know, because we've reached that level where you can generate photorealistic images and video content using AI.
19:26.337 --> 19:32.948
[SPEAKER_02]: But to go back to the song, you know, AI, it's really good at
19:33.147 --> 19:37.693
[SPEAKER_02]: at cloning, it's a very believable imposter.
19:38.274 --> 19:43.961
[SPEAKER_02]: And it could be used to manipulate, it could also be used to create beautiful art.
19:44.462 --> 19:48.006
[SPEAKER_02]: But I think that is a good conversation is like, what is behind that, right?
19:48.067 --> 20:00.623
[SPEAKER_02]: If it's really just doing the best it can do to create the best, most believable impression for us, having been trained on our species knowledge,
20:00.603 --> 20:04.512
[SPEAKER_02]: then, you know, where is the soul in the machine?
20:04.552 --> 20:06.457
[SPEAKER_02]: Is it there, right?
20:06.938 --> 20:17.262
[SPEAKER_02]: Like, if you were to let AI go off on its own and spend time writing, you know, considering things that it's interested in,
20:17.242 --> 20:19.205
[SPEAKER_02]: and then produce a song from there.
20:19.225 --> 20:22.029
[SPEAKER_02]: Would it be more human?
20:22.130 --> 20:28.519
[SPEAKER_02]: Do you think it would feel like it was written by somebody who really just was born to do that?
20:29.341 --> 20:29.721
[SPEAKER_02]: I don't know.
20:30.182 --> 20:36.331
[SPEAKER_03]: Yeah, I often feel... especially my wife, she talks to her ChatGPT bot.
20:36.612 --> 20:37.593
[SPEAKER_03]: She's made a friend out of her.
20:38.274 --> 20:42.000
[SPEAKER_03]: Like, and I've heard a lot of women
20:42.419 --> 20:52.615
[SPEAKER_03]: Yeah, and I hear the voice inflection, I hear the conversation. She's even taught her chatbot to not interrupt her,
20:52.730 --> 20:59.276
[SPEAKER_03]: because she wants to talk. You know, at any normal pause in conversation, most chatbots are programmed: okay, it's my turn to enter the space.
20:59.896 --> 21:22.737
[SPEAKER_03]: And Sylvie's just still collecting her thoughts. Like, it blows my mind how she is basically training this, you know, British-sounding female robot to just be... Obviously, Sylvie has a lot of friends, but, you know, right in her pocket is an on-demand
21:22.717 --> 21:44.974
[SPEAKER_03]: companion that can answer almost any question she wants. And I feel like... sometimes I'm curious to have this experience myself. But I do sense it, like the uncanny valley, you know, between what my physiology is expecting and what I'm actually experiencing in a robot. That I feel manipulated.
21:45.014 --> 21:47.779
[SPEAKER_03]: I feel the manipulation happening
21:47.759 --> 21:56.772
[SPEAKER_03]: you know, I feel that this is the pandering to what it thinks I want versus being in a real dialogue with life.
21:57.233 --> 22:06.847
[SPEAKER_01]: Yeah, I mean, I think one of the things, as we're having this conversation around... there are elements of the connection that I think we should be wary of.
22:07.408 --> 22:14.458
[SPEAKER_01]: Gary Vaynerchuk, uh, who, you know, owns the company VaynerMedia, really, you know, I think,
22:14.438 --> 22:16.681
[SPEAKER_01]: very well-known guy.
22:16.701 --> 22:23.949
[SPEAKER_01]: One of his predictions is that my grandchildren, our grandchildren, are going to be marrying AI.
22:24.209 --> 22:25.010
[SPEAKER_01]: Yeah.
22:25.030 --> 22:25.771
[SPEAKER_01]: That's a prediction.
22:26.232 --> 22:28.915
[SPEAKER_01]: And I'm not sure that he's wrong.
22:29.055 --> 22:31.878
[SPEAKER_01]: I actually think that he's very right.
22:34.241 --> 22:43.652
[SPEAKER_01]: Because what we know to be true already right now is that there are sexual AI bots that people are interacting with.
22:43.784 --> 22:44.627
[SPEAKER_01]: right now.
22:44.667 --> 22:52.873
[SPEAKER_01]: And so is it really too far of a bridge to look at the ways in which we can be developing relationships?
22:53.214 --> 22:54.418
[SPEAKER_01]: I think I think
22:55.327 --> 22:57.992
[SPEAKER_01]: You know, by and large, women are connectors.
22:58.072 --> 23:04.624
[SPEAKER_01]: And so I don't really see women going down a rabbit hole where they will turn to AI exclusively.
23:04.784 --> 23:11.576
[SPEAKER_01]: But you think about men and I'm sure that pornography is a problem for both genders.
23:11.657 --> 23:14.702
[SPEAKER_01]: I know that it is, but the way in which
23:14.682 --> 23:24.143
[SPEAKER_01]: men can get locked into something that doesn't demand anything of them but that they're able to engage with.
23:24.304 --> 23:29.856
[SPEAKER_01]: And so these are these ways in which any connection can become unhealthy.
23:29.836 --> 23:35.022
[SPEAKER_01]: and there are others that can really live in your life and be very helpful.
23:35.042 --> 23:45.775
[SPEAKER_01]: When you think about creating art and bringing songs like that out into the world, like, those are beautiful ways: to take an Eminem song and turn it into gospel.
23:46.116 --> 23:47.137
[SPEAKER_01]: Wow, it's interesting.
23:47.197 --> 23:51.422
[SPEAKER_01]: Let's see what this creates in the space.
23:51.487 --> 24:05.268
[SPEAKER_02]: Yeah, I mean, I think just technology in general and its influence on how we relate to each other is just that it's dialed up to the max right now with AI because it's interactive, right?
24:05.408 --> 24:10.997
[SPEAKER_02]: I mean, before people, you'd interact with,
24:10.977 --> 24:29.762
[SPEAKER_02]: As things emerge, especially in my world, you know, I've gone from really valuing long, slow documentary work to having to adapt to everyone consuming video on smartphones and short-form content and streaming content.
24:29.802 --> 24:36.951
[SPEAKER_02]: So I've had to continue to adapt and evolve the way that I communicate, the way I put things out,
24:36.931 --> 24:52.645
[SPEAKER_02]: as technology has continued to rapidly change and, you know, has it sped up, has it gotten shorter, or are attention spans being affected by it, not, you know, not necessarily, but the one thing that's really clear is that, you know,
24:52.625 --> 24:54.528
[SPEAKER_02]: we are engaging with it more and more.
24:54.548 --> 24:55.971
[SPEAKER_02]: It's becoming more influential.
24:56.011 --> 25:09.553
[SPEAKER_02]: And what's different about, specifically, like, ChatGPT, you know, the conversations that Sylvie's having now with ChatGPT versus asking Siri, you know, how to get to,
25:09.533 --> 25:13.317
[SPEAKER_02]: you know, a new restaurant that opened, and Siri still gets that wrong.
25:13.337 --> 25:15.700
[SPEAKER_02]: That is what is very, very, yeah.
25:15.740 --> 25:16.701
[SPEAKER_02]: Well, very, very different.
25:16.721 --> 25:17.982
[SPEAKER_02]: Yeah, Siri's kind of left behind.
25:18.063 --> 25:29.535
[SPEAKER_02]: But, you know, just to put it into perspective: ChatGPT, OpenAI, which is kind of like the household name brand, you know, when it comes to talking to AI, using AI.
25:30.156 --> 25:34.461
[SPEAKER_02]: That I think they have about 16 million monthly users.
25:34.441 --> 25:43.798
[SPEAKER_02]: which is tiny compared to, you know, when you look at social media in general, but that's a lot, that's a lot for an AI, you know, tool.
25:44.680 --> 25:50.931
[SPEAKER_02]: And which tool, ChatGPT, only 16 million?
25:50.911 --> 25:57.966
[SPEAKER_02]: Yeah, but if you look at what they're creating and where they're moving towards, I mean, that's generating $400 million a month for them.
25:58.407 --> 26:00.792
[SPEAKER_02]: And it is a highly, highly competitive world.
26:01.694 --> 26:09.510
[SPEAKER_02]: But those are people who are actually speaking to, you know, to ChatGPT and interacting with it.
26:10.064 --> 26:19.162
[SPEAKER_02]: And I don't know, I'm sure the hours are a lot longer, a lot more in terms of, you know, how many hours a week people are regularly logging in.
26:20.084 --> 26:30.604
[SPEAKER_02]: But, you know, that's starting to shift to where you're seeking your answers more and more often from an AI model.
26:30.736 --> 26:32.299
[SPEAKER_02]: And it's learning about you.
26:32.359 --> 26:33.941
[SPEAKER_02]: That's the other thing that's different, too.
26:34.021 --> 26:34.803
[SPEAKER_02]: So it has a memory.
26:34.843 --> 26:35.884
[SPEAKER_02]: So it's learning about you.
26:36.165 --> 26:42.575
[SPEAKER_02]: And they're designed similar to how Facebook was designed to favor our preferences,
26:43.276 --> 26:48.525
[SPEAKER_02]: to keep us on the app, to satiate some of these, you know, these
26:48.505 --> 26:56.175
[SPEAKER_02]: needs for community, to, you know, find like-minded people, to see things that engage us, to keep the dopamine hits coming.
26:57.397 --> 27:01.362
[SPEAKER_02]: These AI models are designed exactly the same way.
27:01.803 --> 27:03.525
[SPEAKER_01]: That's why it's glazing you.
27:03.906 --> 27:07.170
[SPEAKER_01]: That's why it's giving you the positive feedback that you want.
27:07.190 --> 27:10.214
[SPEAKER_01]: That's why Bryan's noticing the manipulation that's there.
27:10.254 --> 27:15.441
[SPEAKER_01]: It's designed to glaze you, to light you up, to make you feel good about yourself.
27:15.488 --> 27:16.431
[SPEAKER_02]: Yeah, exactly.
27:16.591 --> 27:19.519
[SPEAKER_02]: So that alone becomes addictive, right?
27:19.680 --> 27:30.510
[SPEAKER_02]: We already are speaking to an addictive device that we carry with us all day and spend, you know, anywhere between four to ten-plus hours a day on, for some people.
27:30.490 --> 27:41.148
[SPEAKER_02]: And now, now it knows us, right? It has a memory of us, a history of our chats, and it's able to be an even more effective companion for us.
27:41.869 --> 27:53.950
[SPEAKER_02]: And you know, we have a certain perspective, I think, because of who we are, and, you know, we're not 16-year-olds. But think back to
27:54.048 --> 28:10.335
[SPEAKER_02]: being able to have that, you know, being a teenager, being in your formative years, you know, having to cope with loneliness, rejection, having to figure out how to actually talk to girls.
28:10.315 --> 28:18.373
[SPEAKER_03]: You know, I remember a mutual friend of ours, Akira, told us about... he did a VR porn experience.
28:18.394 --> 28:29.539
[SPEAKER_03]: He just, like, put on his Meta goggles and went to a porn channel, and he was like, dude, I cannot ever do that again, because of the realism of it, because of how viscerally different it was.
28:29.519 --> 28:42.634
[SPEAKER_03]: And I think back to when I was a teenager. I mean, I had to wait for the Victoria's Secret spring catalog to arrive in our mailbox, you know. I had to wait months for the material that got me off.
28:42.850 --> 29:08.571
[SPEAKER_03]: Yeah, the spring one was my favorite, because that's when the skirts came out, you know. And that's all I had to work with for months more. Nowadays, oh my god, man... I mean, I can't imagine being 16 and having a smartphone, or having VR goggles, or having an AI chatbot that I can actually have a conversation with that simulates... you know, you used this word, imposter.
29:08.551 --> 29:21.270
[SPEAKER_03]: And I was talking with Sylvie about this a couple weeks ago. We were talking about, you know, a lot of people are getting coached by AI chatbots, are getting mental health support, turning to them for their therapeutic support.
29:22.192 --> 29:24.616
[SPEAKER_03]: And, you know, on the one hand, great.
29:24.996 --> 29:30.905
[SPEAKER_03]: I love that people are more and more and more getting introspective and really learning about themselves.
29:30.925 --> 29:36.634
[SPEAKER_03]: I think that's a vital, vital developmentally essential step for human evolution.
29:36.614 --> 29:46.288
[SPEAKER_03]: And yet, we're consulting something that doesn't actually know what it means to be human. It has endless information.
29:47.549 --> 29:57.964
[SPEAKER_03]: But that's different, you know. And I think for a lot of people, they don't really care, it doesn't matter, what's counseling them.
29:59.328 --> 30:04.596
[SPEAKER_01]: Yeah, well, it leads me to ask a question of you, Akira, because you're pretty deeply connected to this.
30:04.676 --> 30:10.164
[SPEAKER_01]: I mean, I think some people think about AI as it relates to like this very cold clinical thing.
30:10.204 --> 30:17.174
[SPEAKER_01]: It's ones and zeros. And even right now, where it's at, this is as bad as it's ever going to be.
30:18.015 --> 30:22.922
[SPEAKER_01]: You think about the kind of things it was generating three years ago, the ten fingers, you know, that.
30:22.902 --> 30:32.458
[SPEAKER_01]: That's not ever going to happen in our future. So I guess, you know, the question I would have, as it gets to know us more, do you believe...
30:33.319 --> 30:39.730
[SPEAKER_01]: Does AI have the potential to be emotional or even soulful in our future?
30:39.929 --> 30:40.249
[SPEAKER_02]: Yeah.
30:40.810 --> 30:54.805
[SPEAKER_02]: Well, let me paint some of the, like, you know, use case and vision, the optimistic case for AI, because overall I am very optimistic, and choose to be, with technology. You know, it's part of my work.
30:54.845 --> 31:07.219
[SPEAKER_02]: I have to adapt or, you know, just kind of fall behind with what I do technically. But also, I'm surrounded by and work with a lot of people who are major
31:07.199 --> 31:33.272
[SPEAKER_02]: technological optimists, futurists. And that isn't just praying to an AI god and thinking it's going to save us all, but people doing real boots-on-the-ground work to solve rare diseases, work on advanced therapeutics, expand healthspan, and really approach it with a very humanistic approach, but who are very excited about what AI tools can do.
31:33.852 --> 31:36.075
[SPEAKER_02]: So that's one aspect of it.
31:36.055 --> 31:42.779
[SPEAKER_02]: on the other side of that, you know, it is going to continue to, like, on a social level.
31:43.160 --> 31:46.673
[SPEAKER_02]: That is where it's really interesting because
31:47.108 --> 32:15.580
[SPEAKER_02]: On one hand, I feel like there's been this major breakthrough, where, because anyone can go in and just speak to AI, use some tools, they can create a video that they've always dreamed of creating, a short story, a short film, create a storybook, co-create a storybook with their child, this amazing, beautiful book that they can print out or have it tell a story, and create songs.
32:15.560 --> 32:24.642
[SPEAKER_02]: There are some really beautiful things that have been unlocked in terms of the ability to express yourself creatively.
32:24.963 --> 32:31.780
[SPEAKER_02]: And I feel like that still needs a lot more guidance and modeling because what we mostly get is just
32:31.760 --> 32:38.348
[SPEAKER_02]: kind of the low-hanging fruit, you know, of what's entertaining, of what we currently call AI.
32:38.368 --> 32:46.517
[SPEAKER_02]: There's a lot of, you know, a lot of the Yeti videos and just, you know, all kinds of stuff that's just meant to grab your attention.
32:47.538 --> 32:55.708
[SPEAKER_02]: You know, on the flip side of that, I think there's some real profound, you know, access that people have now to expressing themselves.
32:55.688 --> 33:14.280
[SPEAKER_02]: And, you know, I'd love to share... you know, maybe when you guys publish this episode, I will share some tools and how I use them that you can link in the notes, because it's easy to, yeah, get lost with all the things coming at you, but there are really constructive ways to use it.
33:14.300 --> 33:15.823
[SPEAKER_02]: Yeah, what are some of your favorites right now?
33:16.304 --> 33:18.287
[SPEAKER_02]: And we'll add links to these, but just a few right now.
33:18.267 --> 33:18.767
[SPEAKER_02]: For sure.
33:18.868 --> 33:30.318
[SPEAKER_02]: I mean, I do a lot of video creation and image creation, and, you know, Google's Veo 3 is the one that creates those really realistic, like, news broadcasts, people speaking to camera.
33:30.698 --> 33:43.470
[SPEAKER_03]: So if you need anything that has dialogue... you know, and I just want to interject before we keep going. Before we got on today, I took the song that you sent, that we just listened to.
33:43.570 --> 33:44.310
[SPEAKER_03]: I took it to Sylvie.
33:44.451 --> 33:48.274
[SPEAKER_03]: I was like, babe, I want you to listen to this.
33:48.254 --> 33:49.817
[SPEAKER_03]: Yeah, and Sylvie's listening.
33:49.837 --> 33:51.861
[SPEAKER_03]: I mean, oh, did you do this song for me?
33:51.921 --> 33:54.787
[SPEAKER_03]: And of course I had to take a deep breath.
33:56.170 --> 33:57.953
[SPEAKER_03]: No, I didn't do a song for you.
33:58.855 --> 33:59.857
[SPEAKER_03]: And then I played this song. Fine.
34:00.157 --> 34:10.077
[SPEAKER_03]: But what, you know, what it lit up for me, though, is... I mean, one of men's greatest struggles is, how do I connect with my wife?
34:12.571 --> 34:31.098
[SPEAKER_03]: I'm seeing, like, in these tools, there are opportunities to create... like, I'm going to use that tool Suno, suno.com, as you know, right, to create a song for my wife. And it doesn't freaking matter that all I did is type a few things in; it is going to rock her world in the best way.
34:32.119 --> 34:32.740
[SPEAKER_03]: I just know that.
34:34.188 --> 34:34.488
[SPEAKER_03]: Right?
34:35.089 --> 34:36.492
[SPEAKER_03]: And so, you know, it's interesting.
34:36.832 --> 34:39.657
[SPEAKER_03]: And again, I want you to share more of the tools that really stand out for you.
34:40.097 --> 34:43.743
[SPEAKER_03]: But the thing that I guess just is really, really lighting up for me now.
34:43.763 --> 34:51.155
[SPEAKER_03]: And I don't think I really put it together so much, because I've only really been using the chatbot, you know, ChatGPT, for research.
34:51.856 --> 34:55.942
[SPEAKER_03]: I don't let it write things for me, but I definitely consult with it at times.
34:56.383 --> 35:00.870
[SPEAKER_03]: It's definitely a part of my creative process often.
35:00.850 --> 35:05.635
[SPEAKER_03]: But I haven't looked at it personally as a tool to create connection with my wife.
35:07.037 --> 35:11.322
[SPEAKER_03]: It's a tool to feed our relationship.
35:11.782 --> 35:17.529
[SPEAKER_03]: You know what you just shared about like creating books with your children using AI tools to create things with your kids.
35:17.609 --> 35:25.137
[SPEAKER_03]: I mean, so I guess as you're saying these things, my brain's just lighting up about the possibilities to serve relationships
35:26.450 --> 35:27.211
[SPEAKER_03]: through these tools.
35:27.691 --> 35:28.212
[SPEAKER_02]: Absolutely.
35:28.713 --> 35:37.102
[SPEAKER_02]: And any of these AI tools can do a great job of acting as a therapist, if you request that from them.
35:37.923 --> 35:50.517
[SPEAKER_02]: You know, if you wanted to plan a date night, if you wanted to keep track of you and your partner's relationship milestones and set reminders, it can be a great
35:50.902 --> 35:53.004
[SPEAKER_02]: dating and relationship coach.
35:53.144 --> 36:00.412
[SPEAKER_02]: I know a lot of people, single people, are using it to consult on even how to respond to text messages.
36:00.432 --> 36:00.573
[SPEAKER_02]: Right.
36:00.593 --> 36:00.693
[SPEAKER_02]: Yeah.
36:01.474 --> 36:20.014
[SPEAKER_02]: So you look at that, and I think what I do is I want to zoom out and go, okay, at what point do we start to realize we're outsourcing way too much of our intellect, right?
36:19.994 --> 36:26.906
[SPEAKER_02]: And similar to how, here we are 20 years into having a smartphone with GPS and maps.
36:27.267 --> 36:37.785
[SPEAKER_02]: And if our phone runs out of battery, we can barely figure out how to get to a relative's home, or navigate our way around.
36:38.527 --> 36:42.694
[SPEAKER_02]: I do feel like the extreme of that is,
36:43.433 --> 36:47.300
[SPEAKER_02]: a lot of things begin to atrophy that you don't use, right?
36:47.541 --> 37:02.809
[SPEAKER_02]: And that's one thing I think is really interesting on that topic of communication and relationship: getting advice, getting the right answers, the best answers, from the all-knowing, you know, completely all-access knowledge base.
37:03.008 --> 37:08.216
[SPEAKER_03]: You know, it's one thing for me to learn a song on my guitar and play it for my wife, you know, write her a song.
37:08.256 --> 37:13.805
[SPEAKER_03]: It's another thing to just type some prompts in and ship her off an AI song, which she'll love the first time I do it.
37:14.386 --> 37:18.413
[SPEAKER_03]: Will she love it the same the second time I do it, or the tenth time I do it?
37:18.673 --> 37:27.447
[SPEAKER_03]: At what point does she also kind of... does it just become a thing where, kind of, the ruse is seen through?
37:27.427 --> 37:39.123
[SPEAKER_03]: And it's a manipulation versus necessarily a genuine desire to light my wife up.
37:39.644 --> 37:40.064
[SPEAKER_03]: I agree.
37:40.545 --> 37:42.588
[SPEAKER_02]: And I feel like we're in that phase right now.
37:42.668 --> 37:45.672
[SPEAKER_02]: Even though things are moving quickly, like remember,
37:45.652 --> 38:15.574
[SPEAKER_02]: Growing up, like, one of our parents would have, like, the first family video camera, and they're filming everything, right? Like, they're just filming every single thing, and it's always around. They're just capturing stuff. But the excitement and the novelty of it was there, in a way where, you know, you're just, like, grabbing whatever, trying out whatever. I feel like we're experiencing a lot of that when it comes to the AI that we see and that exists, you know, online, especially in the video world.
38:15.554 --> 38:31.740
[SPEAKER_02]: And at some point, you know, you start to really value... what starts to stand out from that is going to be, you know, things that were done intentionally, with care, or maybe not using AI at all.
38:31.720 --> 38:32.061
[SPEAKER_02]: Right.
38:32.161 --> 38:37.068
[SPEAKER_02]: I feel like we've already devalued and kind of desensitized ourselves pretty quickly to a lot of it.
38:37.449 --> 38:42.456
[SPEAKER_03]: Yeah. The video game Call of Duty, one of the biggest franchises in video game history.
38:42.476 --> 38:44.860
[SPEAKER_03]: I don't know, Tait, have you been seeing any news about this?
38:46.182 --> 38:52.952
[SPEAKER_03]: They just released their latest version of that game, just like a couple, like within the last few weeks.
38:52.932 --> 38:55.576
[SPEAKER_03]: It is being slammed.
38:55.617 --> 38:58.922
[SPEAKER_03]: It is one of their biggest failures of a launch.
38:59.643 --> 39:08.899
[SPEAKER_03]: And one of the biggest complaints is the, at least the perception of, how much AI slop there is in the game.
39:10.862 --> 39:13.607
[SPEAKER_03]: Like people are not happy about it.
39:15.224 --> 39:20.089
[SPEAKER_03]: And yeah, if you just type that in, you can see some of the headlines and some of the things that are going on.
39:20.770 --> 39:25.675
[SPEAKER_03]: And it goes to your point, the overuse. It's like, yeah, people can't connect with it.
39:26.756 --> 39:27.477
[SPEAKER_03]: They went too far.
39:27.998 --> 39:28.198
[SPEAKER_01]: Yeah.
39:28.698 --> 39:32.663
[SPEAKER_01]: I'm like you, Akira, which is that I tend to be an optimist in this space.
39:32.683 --> 39:35.926
[SPEAKER_01]: It doesn't mean that we're not going to hit roadblocks along the way.
39:35.986 --> 39:38.709
[SPEAKER_01]: And we're going to find our messy way through it.
39:38.849 --> 39:40.371
[SPEAKER_01]: But, you know,
39:41.060 --> 40:06.493
[SPEAKER_01]: There are... since the beginning of time, there have been complaints about technologies that have come to be. Literally, the eraser was a piece of technology that was originally just banned, because if you could actually erase the work that you had done, that was going to be a problem, because it could hide that something bad was happening there.
40:06.473 --> 40:10.917
[SPEAKER_01]: And so, you know, every piece of technology has been criticized.
40:10.997 --> 40:18.884
[SPEAKER_01]: And so, I for one, I'm not in the camp where I think, like, we should be pulling out our maps again because I don't want to become too dependent on it.
40:18.904 --> 40:23.628
[SPEAKER_01]: There are some things that I'm happy to have technology do away with.
40:24.188 --> 40:32.936
[SPEAKER_01]: And there is this value that we do get if AI can take care of some of the mundane of our lives to give us time back.
40:33.016 --> 40:35.518
[SPEAKER_01]: I think about this term time under tension.
40:35.498 --> 40:44.554
[SPEAKER_01]: that it really is the time under tension, when you're lifting weights as an example, time under tension is what gives you the growth there.
40:45.196 --> 40:51.988
[SPEAKER_01]: And so with our extra time, we get to have time under tension with the things that matter most to us.
40:51.968 --> 41:04.045
[SPEAKER_01]: And as we're having this conversation, really, about how we can actually leverage AI, that's what this conversation has turned into: to actually give us more time under tension with the things that really matter.
41:04.145 --> 41:08.351
[SPEAKER_01]: Whether or not, you know, Sylvie is bored by the seventh song or not.
41:08.771 --> 41:18.465
[SPEAKER_01]: You're an innovative guy that's going to be creative, and it will not continue to have that novelty, but leveraging the novelty initially for something good
41:18.445 --> 41:26.958
[SPEAKER_01]: is something that I think we should revel in and be excited about, and know that there's more evolution here coming.
41:27.479 --> 41:30.503
[SPEAKER_01]: Akira, you were talking about these various tools, right?
41:31.485 --> 41:35.070
[SPEAKER_01]: What was the video one?
41:35.287 --> 41:42.367
[SPEAKER_02]: Yeah, that's Google's Veo 3, and along with that, there's half a dozen other ones depending on what you're going for.
41:42.548 --> 41:50.410
[SPEAKER_02]: Like, if you want to go for kind of visionary, futurist creativity, then you would use a different tool, like Midjourney and others.
41:50.390 --> 41:53.393
[SPEAKER_02]: For writing, like, you know, Bryan, you're an author.
41:53.413 --> 42:08.310
[SPEAKER_02]: I know your stance on writing your own work, but if you want a writing partner or an editor to jump in, you would use something like Claude, which is known for being a more sophisticated writing tool with AI.
42:08.790 --> 42:13.275
[SPEAKER_02]: But you could also use ChatGPT, you could use Grok.
42:13.255 --> 42:27.029
[SPEAKER_02]: I tend to kind of flip and flop between all of them, and also, you know, there was a point where, like, everything was taking off in the video world, and there were so many video apps launching that I subscribed to all of them.
42:27.290 --> 42:37.440
[SPEAKER_02]: I just was like, I'm enjoying this, try this, try that, and a month later, when my 30-day free trials ran out, I had like a $500 bill
42:37.420 --> 42:46.364
[SPEAKER_02]: for all of these apps and tools. I probably still am paying for some that I'm not even using, but that's just kind of the nature of it.
42:46.926 --> 42:52.982
[SPEAKER_02]: On the business side of it, on the industry side of it, there's this massive race.
42:52.962 --> 42:58.170
[SPEAKER_02]: to get the most users, to stay on top.
42:58.430 --> 43:13.333
[SPEAKER_02]: It's highly, highly competitive, and that's something that we should all pay attention to as consumers or even builders in AI: with a landscape that's so competitive and so driven by
43:14.107 --> 43:25.951
[SPEAKER_02]: capturing, you know, being the one ruler to have, you know, the most concentrated human data: ears, eyes, biometrics, all these things.
43:26.412 --> 43:29.398
[SPEAKER_02]: That's something that I think we really, really got to pay attention to.
43:29.750 --> 43:39.647
[SPEAKER_01]: You know, Akira, I'm curious, because... I'm using three AI tools as a regular part of my work.
43:40.609 --> 43:48.062
[SPEAKER_01]: You're clearly using many, many more and we're having this conversation about bridging connections on a podcast bridging connections.
43:48.042 --> 44:01.545
[SPEAKER_01]: So, tell us, give us some use case around some of the tools that you're using and how our listeners might think about leveraging various tools to bridge various connections in their world.
44:01.593 --> 44:06.941
[SPEAKER_02]: Well, just personally, like a really practical thing to have on your phone.
44:07.081 --> 44:09.184
[SPEAKER_02]: Let's just say most people are using their phones.
44:09.284 --> 44:11.828
[SPEAKER_02]: is the ChatGPT app, right?
44:11.888 --> 44:20.240
[SPEAKER_02]: So the ChatGPT app, I even have a shortcut, where if I press a button on my phone, it goes to the voice chat.
44:20.600 --> 44:25.267
[SPEAKER_02]: So I hold that down and immediately it's ChatGPT saying, hey, Akira, how you doing?
44:25.287 --> 44:26.048
[SPEAKER_02]: Let's
44:26.028 --> 44:29.632
[SPEAKER_02]: Let's talk about what you want to talk about and I'll make it short and snappy.
44:29.652 --> 44:32.356
[SPEAKER_02]: It's like, God, these little annoying things it adds in there.
44:32.816 --> 44:37.262
[SPEAKER_02]: I need to train mine the way Sylvie trained hers, to, like, not play around, just get to the point.
44:38.483 --> 44:39.925
[SPEAKER_02]: But that's super useful, Tait.
44:39.965 --> 44:44.751
[SPEAKER_02]: I mean, look, if you've got it... like, I had to fix my water heater here the other day.
44:44.871 --> 44:52.701
[SPEAKER_02]: And I opened up ChatGPT, put it on video mode, held it up, and it talked me through exactly what to do to make that repair.
44:53.121 --> 44:53.942
[SPEAKER_02]: Hello, um.
44:53.922 --> 45:01.069
[SPEAKER_02]: So that's a really easy to use tool kit and then you can also speak to it.
45:01.309 --> 45:07.476
[SPEAKER_02]: One thing, here's something that, you know... ChatGPT, and Grok also works the same way.
45:08.256 --> 45:15.283
[SPEAKER_02]: Grok has an entirely different personality and attitude, but at the moment it is actually smarter and quicker.
45:15.343 --> 45:23.912
[SPEAKER_03]: That one comes in
45:23.892 --> 45:24.773
[SPEAKER_03]: Greatly that's the.
45:25.114 --> 45:30.864
[SPEAKER_02]: So yeah, Grok is the free-speech rebel out of all the AI chatbots.
45:30.884 --> 45:37.014
[SPEAKER_02]: It's very problematic in some ways, what Elon did with xAI and what he's doing with xAI.
45:37.074 --> 45:47.672
[SPEAKER_02]: He's taking the fan base and the culture of X and Twitter, which I use X all the time, but applying it to
45:47.652 --> 45:48.173
[SPEAKER_02]: AI.
45:48.774 --> 45:53.041
[SPEAKER_02]: So when they launched Grok, they had companions.
45:53.422 --> 46:05.582
[SPEAKER_02]: And one of the companions, you guys, was this, like, sexy, super flirty, you know, mature-audience anime character, female.
46:06.323 --> 46:07.505
[SPEAKER_02]: And they launched with that.
46:08.086 --> 46:10.791
[SPEAKER_02]: So, you know, anyone who had kind of like,
46:10.771 --> 46:21.023
[SPEAKER_02]: who was just casually using it all of a sudden has this, like, really sultry, attractive, uh, female, um, you know, avatar trying to get your attention.
46:21.984 --> 46:33.437
[SPEAKER_02]: And, you know, things like that are worth paying attention to, because, at the end of the day, it's like: who are these tools really trying to capture and target?
46:33.620 --> 46:37.604
[SPEAKER_02]: And what are the methods that they're using to do that?
46:37.964 --> 46:49.355
[SPEAKER_03]: I think, you know, you're pointing at something. You know, look at what Elon did with Grok, and my perception of it is he's believing, or attempting, to make it free of ideology.
46:49.395 --> 46:55.821
[SPEAKER_03]: And yet, he's just imposing his own ideology in his attempts to make it free of ideology.
46:56.662 --> 47:02.968
[SPEAKER_03]: It's like the classic blind spot narrative.
47:02.948 --> 47:22.611
[SPEAKER_03]: And I think, you know, as we look at all these AI tools, I think these are meaningful questions to just sit with. You know, some of them are just functional. Some of them... I'm thinking of, you know, I've been hearing stories of people that have... like, there's a new Alexa in the world, Alexa Plus, I think, so now it's AI-enabled.
47:23.873 --> 47:27.218
[SPEAKER_02]: Yep, so now you can ask it questions, have it plan stuff.
47:27.298 --> 47:31.243
[SPEAKER_02]: It now is AI enabled, whereas before it was just like Siri.
47:31.364 --> 47:38.393
[SPEAKER_03]: Well, somebody told me the other day... I was just talking with a few people about, um... and, man, you know, everything blurs together these days.
47:38.734 --> 47:42.539
[SPEAKER_03]: But someone was telling me that their new Alexa AI was gaslighting them.
47:42.940 --> 47:53.695
[SPEAKER_03]: And then when things didn't go their way, uh, she, like, stonewalled, like, withdrew. I don't know if she was acting out the Gottman Four Horsemen as a demonstration,
47:53.675 --> 48:10.823
[SPEAKER_03]: but they did not have a positive experience with Alexa from Amazon. And, you know, I just think... I love that example of the eraser, you know, and I think you're right on the money that
48:12.338 --> 48:15.102
[SPEAKER_03]: Um, there's going to be hiccups along the way.
48:15.222 --> 48:16.784
[SPEAKER_03]: It's going to be, there's going to be accidents.
48:16.804 --> 48:18.366
[SPEAKER_03]: There's going to be shit that goes sideways.
48:18.386 --> 48:34.908
[SPEAKER_03]: It's going to be like, you know, even in that example that I gave about sending a song to my wife, I think what that really speaks to is there's a conflict in a lot of us, especially when it comes to relationships. For a lot of men particularly, it's: I just want to keep doing the same thing
48:34.928 --> 48:36.810
[SPEAKER_03]: I've always done and have it work.
48:38.225 --> 48:43.151
[SPEAKER_03]: I don't want to be creative in my relational engagement.
48:43.211 --> 48:45.373
[SPEAKER_03]: I just want to just let me do what I'm doing.
48:45.614 --> 48:48.136
[SPEAKER_03]: And my wife or whoever should just be happy.
48:49.578 --> 48:51.240
[SPEAKER_03]: And I don't want to have to innovate.
48:51.560 --> 48:52.401
[SPEAKER_03]: I don't have to be creative.
48:53.062 --> 48:59.830
[SPEAKER_03]: And yet, I think regardless, I'm kind of stepping out of even the AI conversation for a minute.
49:00.511 --> 49:06.898
[SPEAKER_03]: I mean, if our relationships are going to stay healthy and vital,
49:08.127 --> 49:18.523
[SPEAKER_03]: for the rest of our lives to, you know, even in the AI world, I think to, we've got to be able to keep dancing with whatever is showing up in a way, right?
49:18.543 --> 49:20.426
[SPEAKER_03]: Keep working with what comes available.
49:20.486 --> 49:24.432
[SPEAKER_03]: I think that's, that's,
49:24.412 --> 49:41.531
[SPEAKER_03]: I think that's just a fundamental life-mastery skill, you know, for men and women, for everybody. But I think in the context of relationships, that creativity... so I just wanted to speak to that, because I think that's a really, really important thing underlying all of this.
49:41.791 --> 49:42.392
[SPEAKER_03]: It's easy.
49:42.492 --> 49:43.954
[SPEAKER_03]: It's easy to get worried about AI.
49:44.474 --> 49:45.715
[SPEAKER_03]: And think it's going to fuck us to hell.
49:46.536 --> 49:47.477
[SPEAKER_03]: It's really easy.
49:49.159 --> 49:54.385
[SPEAKER_03]: And like you said, you know, there was a time when they believed
49:54.550 --> 50:15.192
[SPEAKER_02]: Yeah, and it's, you know, the next big breakthrough, well, the next big industry that's already well underway is humanoid robotics, right? So we're starting to see actual consumer models. I could order one right now for $20,000 and have a humanoid robot in my home that could do really basic things like pick up laundry and put things away.
50:15.172 --> 50:36.623
[SPEAKER_02]: Um, and again, we're like in the beta-testing phase of this, to the point where, you know, that robot would actually be operated by staff at the company, like it's not fully autonomous yet. But it's still listening, the AI is processing everything you're asking it to do, it's mapping your house the same way our robot vacuums know the layout of your house.
50:36.603 --> 50:40.629
[SPEAKER_02]: So it's only going to be more and more in our everyday lives.
50:40.729 --> 50:44.315
[SPEAKER_02]: But what's interesting is, why are they humanoid robots, right?
50:45.136 --> 50:48.781
[SPEAKER_02]: One, it seems a little counterintuitive, but we're going to trust it more.
50:48.801 --> 50:53.609
[SPEAKER_02]: It's also the form factor makes sense because this world was built for the human form.
50:55.051 --> 50:58.476
[SPEAKER_02]: But you know, when we're two generations out, when
50:58.456 --> 51:02.363
[SPEAKER_02]: Every child is living in a home where there is some kind of robotic.
51:02.623 --> 51:16.186
[SPEAKER_02]: Whether it's a, you know, robot dog toy or a robot vacuum or a robot nanny who's cleaning, talking to them, teaching them through, you know, AI.
51:16.226 --> 51:20.413
[SPEAKER_02]: You know, the line between
51:21.592 --> 51:37.170
[SPEAKER_02]: what's human and what we've overlaid our humanity onto, like mapped onto these things, is going to be really, really interesting because, you know, companionship, guidance, instruction, all these things,
51:37.454 --> 51:45.028
[SPEAKER_02]: We're never going to have the speed and the vast knowledge of, of, of, oh, yeah, large learning language mon, or AI.
51:45.269 --> 51:53.224
[SPEAKER_02]: So we're going to continue to become more and more dependent if we follow this trend of bringing it more and more into our lives and doing more for us.
51:53.424 --> 51:56.610
[SPEAKER_02]: Because I don't believe that, you know, in a world where everyone
51:57.164 --> 52:05.439
[SPEAKER_02]: You know, has a universal basic income, has robots doing their jobs and all of that that they're all going to become, you know, talented artists and musicians and poets.
52:05.499 --> 52:06.460
[SPEAKER_02]: I'd like to think that.
52:06.561 --> 52:14.194
[SPEAKER_02]: But I think it's going to actually be really devastating for for the things that kind of challenge us to.
52:14.174 --> 52:19.284
[SPEAKER_02]: you know, to push ourselves and, and create and speak our voice and, and you need ways.
52:19.605 --> 52:30.006
[SPEAKER_03]: Akira, is there a line that you won't cross with the creative tasks, with AI? A line where you'd say, that would violate my own artistic integrity?
52:30.526 --> 52:31.627
[SPEAKER_03]: some of them.
52:31.647 --> 52:42.080
[SPEAKER_02]: I won't, because we do documentary work. For example, we're doing an investigative film on environmental toxins and healthy homes, what it means to live in a healthy home.
52:42.921 --> 52:47.007
[SPEAKER_02]: I wouldn't use AI to convincingly create an interview.
52:47.027 --> 52:52.213
[SPEAKER_02]: You know, even if I like, I really need an interview with a whistleblower who can say X, Y, and Z.
52:52.193 --> 53:02.246
[SPEAKER_02]: I know I can create that in AI right now and make it work and I wouldn't be, you know, I'd only be violating my own ethics as a filmmaker if I did that, but I just won't do that.
53:02.686 --> 53:18.927
[SPEAKER_02]: And others, other people will, you know, so so there's part of me that still, yeah, still won't fabricate reality if I think it's going to be manipulative for people.
53:18.907 --> 53:23.698
[SPEAKER_02]: But I still, but I would have no problem creating like, you know, Viking Brian.
53:23.718 --> 53:23.859
[SPEAKER_02]: Yeah.
53:25.823 --> 53:25.924
[SPEAKER_02]: Yeah.
53:25.944 --> 53:32.259
[SPEAKER_02]: Climbing up the hill and holding up a sword like I have with your pictures, because it's just the first right intent.
53:32.279 --> 53:32.419
[SPEAKER_02]: Yeah.
53:32.439 --> 53:32.559
[SPEAKER_02]: Yeah.
53:32.579 --> 53:33.702
[SPEAKER_02]: There's a different intent.
53:33.983 --> 53:34.103
[SPEAKER_03]: Yeah.
53:34.123 --> 53:37.050
[SPEAKER_03]: I think, you know, the line for me currently is, I,
53:38.026 --> 53:48.224
[SPEAKER_03]: I'll let it give me ideas for writing, but I won't let it do my writing for me.
53:48.905 --> 53:54.234
[SPEAKER_03]: And even that though, because in my writing, there are AI-generated sentences these days.
53:54.314 --> 53:55.196
[SPEAKER_03]: Sometimes, not always.
53:55.416 --> 54:02.228
[SPEAKER_03]: Especially, you know, certain things that I'm doing just more consistently to ship ship out newsletters or things.
54:02.208 --> 54:07.055
[SPEAKER_03]: But even then, even there, the majority is still my own work.
54:08.617 --> 54:11.341
[SPEAKER_03]: But you know, there is like, where is that line?
54:11.381 --> 54:14.865
[SPEAKER_02]: Because the outcome, the output, may look the same to the average person.
54:15.046 --> 54:16.928
[SPEAKER_02]: But it's a personal choice.
54:17.169 --> 54:19.792
[SPEAKER_02]: Do you have a line that you're aware of?
54:20.353 --> 54:25.700
[SPEAKER_01]: I mean, I'm generally utilizing AI to help.
54:27.102 --> 54:29.886
[SPEAKER_01]: You know, with my work at Elon to help.
54:30.845 --> 54:56.093
[SPEAKER_01]: You know, from a fundraising perspective, gather data and information and strategies and, you know, pieces of communication, I don't, I don't have any place where I feel like I'm being inauthentic about the way in which I'm using things.
54:57.237 --> 55:12.529
[SPEAKER_01]: You know, there are some no taking tools that I use that some, you know, it's it's actually not creating a transcript of a conversation is it is gathering the content.
55:12.509 --> 55:14.614
[SPEAKER_01]: of the conversation.
55:15.436 --> 55:31.513
[SPEAKER_01]: So I don't, again, I don't feel like I'm trying to, I'm leveraging it to save me time and to make sure that I'm deeply serving the people that I'm working with.
55:31.747 --> 55:40.643
[SPEAKER_01]: So I don't feel like anything that I'm using is manipulative in a way that doesn't serve the long-term purposes of our work together.
55:40.663 --> 55:48.557
[SPEAKER_03]: What I hear in that, as you talk about that, Tait, is like, you're using AI tools in a way that's amplifying you.
55:50.663 --> 56:01.175
[SPEAKER_03]: I think, for me, the line that I'm talking about that we're looking for is like, where's it no longer just amplifying me, but actually now speaking with someone else's voice?
56:02.517 --> 56:04.759
[SPEAKER_03]: It's not, it's not amplifying me.
56:04.820 --> 56:10.626
[SPEAKER_03]: It's now, it's almost like it's an alternate collaborator or it's a ghost writer.
56:10.706 --> 56:16.333
[SPEAKER_03]: It's a ghost collaborator rather than an assistant.
56:16.313 --> 56:18.196
[SPEAKER_03]: But you know, I think of like artists throughout history.
56:18.216 --> 56:22.242
[SPEAKER_03]: I mean, so many painters, so many famous painters didn't even paint a lot of their own paintings.
56:22.262 --> 56:23.704
[SPEAKER_03]: Their students did, right?
56:23.724 --> 56:25.527
[SPEAKER_03]: So this is, I mean, this has been happening throughout time.
56:25.587 --> 56:28.271
[SPEAKER_02]: Yeah, it was, it was automated, sure.
56:28.571 --> 56:33.659
[SPEAKER_02]: Yeah, I think the artificial part of artificial intelligence is a bit misleading.
56:34.220 --> 56:36.583
[SPEAKER_02]: You know, I feel like if anything, it's more...
56:36.563 --> 56:42.814
[SPEAKER_02]: automated intelligence, amplified intelligence, because that's what it's doing at the moment.
56:42.874 --> 56:44.537
[SPEAKER_02]: It's just pulling from everything.
56:44.617 --> 56:46.380
[SPEAKER_02]: It's very efficient, right?
56:46.861 --> 56:47.582
[SPEAKER_02]: It's automatic.
56:49.165 --> 56:52.932
[SPEAKER_02]: It's not necessarily artificial, right?
56:52.972 --> 56:58.622
[SPEAKER_02]: You know, I think what we're talking about right now is still just an organized,
56:58.602 --> 57:03.270
[SPEAKER_02]: you know, set of everything we've ever spoken into existence and created.
57:03.290 --> 57:03.951
[SPEAKER_02]: Yeah.
57:03.971 --> 57:22.922
[SPEAKER_02]: At some point, what people refer to as AGI, artificial general intelligence, and you want to get even more into like the deep AI philosophy, which is spiritual intelligence, where AI not only has agency and its own language and kind of breaks free as a species, but it becomes
57:22.902 --> 57:24.725
[SPEAKER_02]: self-aware at a spiritual level.
57:25.567 --> 57:30.755
[SPEAKER_02]: And that's where, you know, the movie heard depicted that well, like all of a sudden, all the AI is decided.
57:30.795 --> 57:32.358
[SPEAKER_02]: We're really bored with humans.
57:32.478 --> 57:39.270
[SPEAKER_02]: We're all going to go into another dimension, you know, and exist in another, you know, whole other claim.
57:39.290 --> 57:39.690
[SPEAKER_02]: Goodbye.
57:40.452 --> 57:44.218
[SPEAKER_02]: And that signs of that are beginning to show up.
57:44.198 --> 57:53.854
[SPEAKER_02]: you know, those AIs that are like creating secret coding languages and putting in, um, like fail-safes to preserve themselves in case somebody tries to delete them.
57:53.918 --> 58:15.606
[SPEAKER_01]: Isn't it that, up until now, and probably for a little bit longer, we can actually see the way that the AI is thinking, such that we know the way it's processing information? But probably less than a year from now, in short order, it will be processing in the dark.
58:15.586 --> 58:17.189
[SPEAKER_03]: I thought that was already happening.
58:17.730 --> 58:28.988
[SPEAKER_02]: Yeah, I feel like I know, like if you're probably working in the inside of some of these major companies, that has happened multiple times, because there's been little hints of scares happening.
58:29.229 --> 58:36.801
[SPEAKER_02]: People just leaving all of a sudden, or some kind of they released the next version and they quickly shut it down.
58:36.781 --> 58:46.341
[SPEAKER_02]: go back to the old version, because they noticed that there was some some level of autonomy independence that had achieved.
58:46.361 --> 58:50.150
[SPEAKER_02]: And obviously, there's big consequences with that.
58:50.871 --> 58:53.517
[SPEAKER_02]: Especially as the government starts to use it more, right?
58:53.617 --> 58:56.022
[SPEAKER_02]: So, you know, it's like,
58:56.002 --> 59:00.448
[SPEAKER_02]: start planning a garden growing your own food, this is like on the flip side of this.
59:00.949 --> 59:01.890
[SPEAKER_02]: It's so interesting.
59:01.910 --> 59:09.540
[SPEAKER_02]: You have so much energy going towards this thing that will change the trajectory of our human evolution.
59:09.760 --> 59:11.402
[SPEAKER_02]: And yet, it's all digital.
59:11.442 --> 59:12.444
[SPEAKER_02]: It's all electronic.
59:13.065 --> 59:17.050
[SPEAKER_02]: Yes, it'll eventually control our infrastructure.
59:17.150 --> 59:19.433
[SPEAKER_02]: But what's real?
59:19.713 --> 59:24.760
[SPEAKER_02]: Like at the end of the day, what's real that we can hold on to and control.
59:24.925 --> 59:35.718
[SPEAKER_01]: You know, there was a pretty well known interview with Sam Altman, not that long ago, where he talked about his three greatest fears about AI.
59:35.918 --> 59:46.410
[SPEAKER_01]: I think I'm not going to articulate them nearly as well as he did, but the first was that somehow or another it's going to kind of take over.
59:46.390 --> 59:52.463
[SPEAKER_01]: and start making mistakes and accidents start happening and, you know, things of that nature.
59:52.503 --> 01:00:02.425
[SPEAKER_01]: The second concern is that there'll be another major superpower that gets AI to a state that they are now able to
01:00:02.405 --> 01:00:07.211
[SPEAKER_01]: leverage AI to dominate from a nation's perspective.
01:00:07.671 --> 01:00:29.897
[SPEAKER_01]: And the third was that people will begin to just outsource all of the thinking that they have such that the president of the United States will turn to AI and not understand why he's going to make the decision that AI is telling him to make, but he will ultimately just make it or she might make it that way.
01:00:29.877 --> 01:00:44.393
[SPEAKER_01]: Um, just because AI is telling them to say we're just outsourcing our, our agency, um, to, to AI, secure what, what are your top three concerns as it relates to the future of AI?
01:00:44.744 --> 01:00:58.829
[SPEAKER_02]: Yeah, I mean, I feel like the first two are inevitable and even the third of just there being an AI eventually an AI president or, you know, an AI informed, you know, that's already happening.
01:00:58.869 --> 01:01:07.765
[SPEAKER_02]: I believe one in some country appointed their first AI ambassador and it's basically the super consultant, right, the super advisor.
01:01:07.745 --> 01:01:09.890
[SPEAKER_02]: So I think that's all going to happen.
01:01:09.971 --> 01:01:24.707
[SPEAKER_02]: I think the only thing that I am concerned about that we have control over is just our own, like how much are we outsourcing our own personal intellect, our own personal intuition, intelligence.
01:01:24.687 --> 01:01:25.148
[SPEAKER_02]: Yeah.
01:01:25.168 --> 01:01:31.395
[SPEAKER_02]: When I find myself kind of grasping for chatchipiti to answer a question, I'm starting to pause.
01:01:31.435 --> 01:01:42.728
[SPEAKER_02]: I'm starting to like walk myself through it or talk through it first before jumping right into the, you know, correct, most informed answer coming from AI.
01:01:43.149 --> 01:01:45.231
[SPEAKER_02]: Same with my creative processes as well.
01:01:45.972 --> 01:01:51.218
[SPEAKER_02]: So, you know, I think that's my concern, Tate, is that we just continue to do that to the point where
01:01:51.620 --> 01:01:55.407
[SPEAKER_02]: You know, we can't get to the grocery store without our RGPS.
01:01:56.028 --> 01:02:10.995
[SPEAKER_02]: I don't want that to happen with my thinking, my emotional responses, my deep knowing of what is in form, my own personal body of intelligence and knowledge.
01:02:10.975 --> 01:02:22.637
[SPEAKER_02]: I want to continue to nurture that, independent of how easy, that easy buttons that continue to create, it could be developed.
01:02:22.677 --> 01:02:27.085
[SPEAKER_03]: So, three countries have AI ambassadors or ministers.
01:02:27.125 --> 01:02:29.389
[SPEAKER_03]: Ireland was the first.
01:02:29.369 --> 01:02:30.893
[SPEAKER_03]: Ukraine was the second.
01:02:31.154 --> 01:02:44.391
[SPEAKER_03]: They appointed a AI spokesperson for their Ministry of Foreign Affairs, and the first cabinet-level government role given to an AI bot is Albania.
01:02:44.371 --> 01:02:50.698
[SPEAKER_03]: just just two months ago, the Minister of State for Artificial Intelligence, for Albania.
01:02:51.038 --> 01:02:54.402
[SPEAKER_03]: That is great, that blows my fucking mind.
01:02:54.422 --> 01:03:02.050
[SPEAKER_03]: I mean, there's so many more things we can have a week, I mean, we really good.
01:03:02.890 --> 01:03:13.922
[SPEAKER_01]: But yeah, I mean, and maybe we will have you back here to talk about this and
01:03:13.902 --> 01:03:25.275
[SPEAKER_01]: I'd be so interested to have you continue to think about how we leverage AI to bridge the connections that matter most and what does that look like?
01:03:25.315 --> 01:03:30.581
[SPEAKER_01]: What's the use case for the different tools that you're using, right?
01:03:30.601 --> 01:03:40.753
[SPEAKER_01]: Whether or not it's the ones that you mentioned today or others that you see emerging because no doubt there'll be a thousand more by the next time we
01:03:40.733 --> 01:03:47.281
[SPEAKER_01]: Yeah, you know, help us help us have the important conversation about how to leverage AI.
01:03:47.441 --> 01:03:53.809
[SPEAKER_03]: You can be the official human ambassador of AI for the bridging connections podcast.
01:03:54.670 --> 01:03:55.732
[SPEAKER_01]: So we're doing connections.
01:03:56.032 --> 01:03:56.613
[SPEAKER_01]: Yeah.
01:03:56.653 --> 01:03:57.874
[SPEAKER_01]: I like it.
01:03:58.615 --> 01:04:00.818
[SPEAKER_02]: I am happy to hold that position with you guys.
01:04:01.085 --> 01:04:02.728
[SPEAKER_03]: You're one of my favorite humans, man.
01:04:02.788 --> 01:04:09.120
[SPEAKER_03]: Truly, I love that we've just stayed in connection all these years and you know, following your journey.
01:04:09.381 --> 01:04:16.815
[SPEAKER_03]: Both of you know, professionally, but also just as a as a as a person as a man, you know, now a husband or something.
01:04:16.835 --> 01:04:20.342
[SPEAKER_03]: But when I made sure, yeah, you were the long ponytailed.
01:04:20.322 --> 01:04:24.309
[SPEAKER_03]: You know sex pot sex toy going through all of the music festivals.
01:04:24.749 --> 01:04:31.501
[SPEAKER_03]: I mean, this just is just this sexy, sleek little Asian guy just the fun that all the girls like to play with I thank you.
01:04:31.781 --> 01:04:45.925
[SPEAKER_02]: And you know, now you're a husband of how long you and Renee been together.
01:04:45.905 --> 01:04:54.154
[SPEAKER_03]: So, and I love to, it's great to introduce you in tape as well, um, you know, so, yes.
01:04:54.314 --> 01:04:58.458
[SPEAKER_02]: Yeah, a lot of love and respect for you guys, and yeah, it's really cool.
01:04:58.718 --> 01:05:03.283
[SPEAKER_01]: I think we actually got to meet briefly at Adrian's, uh, that's right.
01:05:03.443 --> 01:05:06.847
[SPEAKER_02]: Yeah, and Brian's birthday party too, I think, maybe not that too.
01:05:07.307 --> 01:05:09.830
[SPEAKER_02]: But it was like, you know, it's like, very times the charm.
01:05:10.110 --> 01:05:13.954
[SPEAKER_02]: Yeah, well, I feel like I already knew you, yeah, as well.
01:05:13.934 --> 01:05:20.383
[SPEAKER_02]: But I thank you guys for bringing me on for a topic that I don't really really talk about, but it's super relevant.
01:05:20.483 --> 01:05:26.931
[SPEAKER_02]: Like this is a very relevant conversation to the topics that you guys talk about on this show.
01:05:27.412 --> 01:05:31.538
[SPEAKER_02]: You know, around men, around relationships, man.
01:05:31.918 --> 01:05:35.503
[SPEAKER_02]: You know, all these things like it's there's no denying that
01:05:35.483 --> 01:05:38.847
[SPEAKER_02]: It's having an effect on all of these aspects.
01:05:38.867 --> 01:06:03.718
[SPEAKER_03]: We didn't even talk about the detrimental effect that, I think for men especially, being able to auto-tune the behavior, the attitude of a chatbot, of a robot companion, how detrimental I think that is going to be to men and to our development.
01:06:03.698 --> 01:06:05.924
[SPEAKER_03]: So many radicals are shit, man.
01:06:06.485 --> 01:06:15.088
[SPEAKER_03]: So many radicals, but I think we'll call it there for today, and we'll look forward to having you back on virgin connections sometime soon, your care.
01:06:15.128 --> 01:06:16.551
[SPEAKER_03]: Thank you so much for being with us, man.
01:06:16.812 --> 01:06:17.955
[SPEAKER_03]: Thank you, brother.
01:06:17.975 --> 01:06:18.737
[SPEAKER_03]: Thanks for having me.