April 16, 2024

# 82: AI Threat to Humanity | Christopher Wright


In this episode we speak with Christopher Wright, a former Apache pilot who transitioned to the forefront of AI development. We explore the potentially ominous side of AI, discussing its role in modern warfare, the ethical boundaries being tested, and what the future may hold if we don't steer this technology wisely. Press play to uncover the unseen aspects of AI that could challenge our very humanity.

Timeline:

  • [05:00] First-hand insights into AI-driven drone warfare and autonomous decision-making systems.
  • [10:00] The ethical dilemmas and the real-world implications of AI in military applications.
  • [15:00] Exploring the 'AI as a god' concept and its influence on societal structures and governance.
  • [20:00] The potential for AI to reshape global power dynamics and the necessity for a humane approach to AI development.
  • [25:00] How blockchain technology could ensure a more transparent and equitable AI governance structure.
  • [30:00] Discussing the future of AI and its integration into daily life, and the balance between innovation and ethical responsibility.
  • [35:00] Christopher's vision for a pro-human future amidst the rise of AI.

Links & Resources:

Thank you for tuning in to this crucial discussion. If you're fascinated and alarmed by the power of AI, remember to rate, follow, and share this podcast. Your engagement helps us bring more of these vital conversations to light. Let's navigate the future of AI together, ensuring it serves humanity, not overpowers it.

Looking to reset and recharge? Our Immersive Meditation Experience is a live, virtual meditation session held every month. Sign up here to join the next session: newagehuman.com/monthlymeditation

_____________________________________________________________________________________________________
JOIN OUR NEWSLETTER FOR SNEAK PEEKS, UPDATES AND MORE
Sign up at Newagehuman.com/newsletter

CONNECT AND SAY HI:
Telegram: https://t.me/+sA6u1rY5e9Y5ZDgx
Website: https://www.newagehuman.com

DISCLAIMER
https://www.newagehuman.com/legal/

Transcript
WEBVTT

00:00:00.000 --> 00:00:11.467
I got to see firsthand the development of AI drone warfare. A lot of people don't know, but these systems are, um, taking human life today by algorithm, meaning that you can launch a little drone system.

00:00:11.650 --> 00:00:12.589
The thing will fly out.

00:00:12.772 --> 00:00:15.162
And it'll actually pinpoint targets on its own.

00:00:15.211 --> 00:00:18.408
So it'll like select targets and execute those targets as it determines.

00:00:18.603 --> 00:00:25.233
These AI systems are reaching IQs that are beyond humans... They look at this gift of AI as being the tool to achieve that.

00:00:25.812 --> 00:00:27.902
And ultimately like these guys are very spiritual.

00:00:27.922 --> 00:00:31.013
I mean, you know, some of these tech CEOs, you can look it up.

00:00:31.013 --> 00:00:36.843
I mean, they talk about the spirituality of AI and they look at AI as being a God.

00:00:40.622 --> 00:00:41.112
All right.

00:00:41.112 --> 00:00:43.213
Welcome to the New Age Human Podcast.

00:00:43.243 --> 00:00:45.012
I'm your host, Jonathan Astacio.

00:00:45.192 --> 00:00:52.073
And in today's episode, we're talking with Christopher Wright, who is the founder of the AI Trust Council.

00:00:52.383 --> 00:01:02.862
He has a background in Army attack aviation and was inspired to help make sure AI is used for the betterment of humanity rather than its destruction.

00:01:03.182 --> 00:01:19.013
Stick around, because we will talk about AI's current and future threats, including some surprising insights from Christopher's time in the military, as well as, of course, solutions and what we can do to stay safe and protect our freedoms.

00:01:19.162 --> 00:01:19.953
Hoorah.

00:01:20.462 --> 00:01:31.302
Now, before we begin, if you want to get alerted to future episodes and are enjoying the content, I ask that you like and subscribe on YouTube or any other platform that you're watching right now

00:01:31.563 --> 00:01:34.653
and/or listening to us on. Just reach out to me

00:01:34.653 --> 00:01:43.643
if you have any questions or if you have any ideas on future episode concepts and people you want me to talk to. Now, with that said, let's get to the show.

00:01:43.674 --> 00:01:46.784
Christopher Wright, thank you for coming on the show.

00:01:46.804 --> 00:01:47.424
How's it going, man?

00:01:47.713 --> 00:01:48.203
Thank you.

00:01:48.203 --> 00:01:48.403
Yeah.

00:01:48.403 --> 00:01:49.114
Thank you for having me.

00:01:49.114 --> 00:01:49.733
It's going great.

00:01:51.090 --> 00:02:06.825
We were talking just before we hit the record button, where I was telling you, this is gonna be an interesting conversation, because you have an interesting background and I want to know this story. And I was fighting myself, asking you prior to hitting the record button how that transition was.

00:02:06.835 --> 00:02:08.816
So let's start off there.

00:02:08.825 --> 00:02:19.216
How did you transition from flying Longbow Apaches in the Middle East to running an AI company?

00:02:20.221 --> 00:02:24.411
Yeah, it's kind of a funny story, cause, um, yeah, it really has to do with drone warfare.

00:02:24.461 --> 00:02:26.151
Yeah, I spent a long time in the army.

00:02:26.281 --> 00:02:29.591
Yeah, I was a combat engineer right after high school and then went on to flight school.

00:02:29.591 --> 00:02:40.510
But I was in the Army Reserves for a period of time and had an internship in Newport Beach, California, where I was working for a tech startup around the, uh, the

00:02:40.711 --> 00:02:41.211
dot-com bubble.

00:02:41.211 --> 00:02:41.251
So.

00:02:41.265 --> 00:02:53.145
And so I learned a lot about tech at that time. It was actually an interesting position, because I was able to review business plans for, like, a venture capital company that was helping to fund these startups.

00:02:53.145 --> 00:03:00.925
And so I really got to see the foundation of, you know, the modern internet and all these companies that were coming in and getting built and things like that.

00:03:00.925 --> 00:03:12.691
But anyway, so I ended up going off to flight school, did a deployment over to Afghanistan, and ultimately got out of the Army and then started contracting in the Middle East.

00:03:12.691 --> 00:03:17.760
The money when you're contracting is like three times what you make on regular Army pay.

00:03:17.760 --> 00:03:19.920
So it's like, okay, this is common sense.

00:03:19.920 --> 00:03:20.640
You know, I can make

00:03:20.756 --> 00:03:21.135
Yeah.

00:03:21.281 --> 00:03:25.441
a boatload of cash, you know, just doing pretty much the same thing with fewer rules.

00:03:27.211 --> 00:03:27.501
Yeah.

00:03:27.501 --> 00:03:34.741
So anyway, yeah, I was teaching over in Dubai, or in Abu Dhabi, for almost 10 years,

00:03:35.681 --> 00:03:40.860
teaching Longbow academics, in the simulator, and teaching Arab students how to fly.

00:03:40.860 --> 00:03:47.800
So I did that in Saudi Arabia, also Kuwait for a little bit of time, and then also the UAE.

00:03:48.366 --> 00:03:49.475
What was that experience like?

00:03:49.475 --> 00:03:50.145
I'm curious.

00:03:50.510 --> 00:03:51.531
it was really interesting.

00:03:51.591 --> 00:04:00.260
It's interesting coming from our military, where you're used to kind of complaining, you know, kind of bitching about the way things go, and you're like, oh my God, you know?

00:04:00.290 --> 00:04:06.980
It's not that these guys are, uh... they just have a... the military, their military mindset is quite a bit different than ours.

00:04:07.091 --> 00:04:10.411
You know, it's not as yeah, it's just, it's, it's a different culture, you know?

00:04:10.441 --> 00:04:13.830
But, but it was awesome at the same time because It

00:04:13.866 --> 00:04:17.036
It was, it's not as strict, I guess you can say, is that what you're trying to say?

00:04:17.230 --> 00:04:17.990
yeah, yeah.

00:04:17.990 --> 00:04:24.461
So basically, you know, you're done at like one o'clock every day, and then, uh, over in the Middle East, it's pretty cool.

00:04:24.461 --> 00:04:31.180
Like, you know, they're always giving you public holidays, and so really, if you're on contract over there, you get about 60 days off a year, and

00:04:31.396 --> 00:04:31.995
Wow.

00:04:32.266 --> 00:04:32.735
That's nice.

00:04:33.040 --> 00:04:49.141
So anyway. But yeah, so we were consulting with, you know, the leadership there on, you know, weapons and things like that. So I got to see firsthand the development of AI drone warfare. Yeah, so I would go to, you know, different

00:04:49.240 --> 00:04:52.250
trade shows and stuff like that, and even back in 2012,

00:04:52.571 --> 00:05:02.531
I saw, you know, drone swarm technology being implemented, and it's pretty next level as far as capability.

00:05:03.380 --> 00:05:09.190
So I could see very quickly, like, well, whoever's in charge of this system is going to have ultimate power.

00:05:09.670 --> 00:05:18.050
A lot of people don't know, but these systems are, you know, taking human life today by algorithm, meaning that you can launch a little drone system.

00:05:18.761 --> 00:05:19.701
The thing will fly out.

00:05:20.786 --> 00:05:23.175
And it'll actually pinpoint targets on its own.

00:05:23.225 --> 00:05:27.026
So it'll like select targets and execute those targets as it determines.

00:05:27.415 --> 00:05:29.935
That is, that is actually pretty alarming.

00:05:30.406 --> 00:05:33.225
Like what's the qualification of a target?

00:05:33.545 --> 00:05:35.415
Can somebody look like somebody?

00:05:35.466 --> 00:05:39.375
So can someone wear a mask to make them look like someone else?

00:05:39.925 --> 00:05:40.886
yeah, yeah.

00:05:40.886 --> 00:05:42.336
I mean it's, it's next level.

00:05:42.915 --> 00:05:44.036
You know what I was thinking?

00:05:44.956 --> 00:05:48.586
So in that, in that, in that essence, right?

00:05:49.355 --> 00:05:50.396
It's, is it.

00:05:51.141 --> 00:05:54.850
Is the system targeting based off of imagery?

00:05:55.685 --> 00:05:56.136
Yeah.

00:05:56.165 --> 00:06:00.956
So basically, you know, the AI is brilliant, and it's getting better and better.

00:06:01.365 --> 00:06:06.255
You know, that's one of the things today. You know, these AI systems are reaching IQs that are beyond humans.

00:06:06.326 --> 00:06:09.995
Obviously, I mean, they've beaten humans in every game known to man.

00:06:10.156 --> 00:06:13.805
You know, Go... you know, they beat the best human chess champions.

00:06:14.336 --> 00:06:19.766
And so, you know, when you look at a battlefield, you know, it's like, well, these things can go out and then literally pinpoint.

00:06:20.266 --> 00:06:23.540
So they have, like, scout drones that go out and, you know, recon an area.

00:06:23.920 --> 00:06:27.930
And then if they see something, you know... it's a hive mind, basically.

00:06:27.951 --> 00:06:29.100
And so they all work together.

00:06:29.540 --> 00:06:30.901
And so it's a, it's a scale thing.

00:06:30.901 --> 00:06:35.230
So like the more of these devices that are out there, they all talk to each other.

00:06:36.060 --> 00:06:43.810
And then if a scout drone figures out some sort of, you know, thing that fits the parameters for what they're trying to look for,

00:06:43.880 --> 00:06:52.740
it basically will go out. And so it can look at... you can use facial recognition, and you can use, you know, insignia, uniforms, even vehicle types, things like that.

00:06:53.560 --> 00:06:58.120
And basically these things will come out in a swarm, and then they use, you know, creativity.

00:06:58.180 --> 00:06:59.500
They use problem solving.

00:06:59.500 --> 00:07:05.240
They use, you know, manipulation in order to achieve whatever goal they're trying to achieve.

00:07:05.240 --> 00:07:06.531
And so they're very, very clever.

00:07:06.930 --> 00:07:14.870
So for attack planning, they can come up instantaneously with a brilliant attack plan and then come in from all different angles.

00:07:14.940 --> 00:07:25.281
And then execute targets. And so it's almost impossible for a civilian, or not a civilian, but, like, a human, to be able to zap one of these things.

00:07:25.461 --> 00:07:29.860
Because, you know, they have some new drones that fly 120 miles an hour.

00:07:30.490 --> 00:07:30.961
You know, and they

00:07:31.175 --> 00:07:33.615
120 miles per hour.

00:07:33.781 --> 00:07:34.050
Yeah.

00:07:35.295 --> 00:07:35.456
Yeah.

00:07:35.456 --> 00:07:36.795
You're not outrunning that.

00:07:37.605 --> 00:07:43.485
So it's like, you know... and I'm like, all right, we grew up watching movies about this stuff.

00:07:43.675 --> 00:07:48.425
And, uh, and so I started to see the technology and I'm like, okay, the public doesn't know this.

00:07:48.425 --> 00:07:50.985
I mean, the public doesn't know that this capability exists.

00:07:51.615 --> 00:08:09.286
And I'm like, people have to know what the hell is going on, because it is the ultimate control tool for governments or, you know, dictators, whatever. But it's just like, man, you literally can dominate an entire area with these weapons systems.

00:08:09.846 --> 00:08:15.065
So these weapon systems, the AI, you have the drones that are run by AI.

00:08:15.076 --> 00:08:18.615
You have the swarm systems that...

00:08:19.055 --> 00:08:20.865
is not exclusive to the U.S.?

00:08:21.386 --> 00:08:21.995
Correct.

00:08:22.475 --> 00:08:24.925
It's actually... the U.S. is hamstrung by ethics.

00:08:25.466 --> 00:08:39.475
You know, there's a term called "fire, forget, and find," and that is something a lot of these... they call them munitions. Basically, they go out and then... so they fire them off, and an operator forgets about it.

00:08:40.495 --> 00:08:43.676
The drone itself will then, you know, find the target, locate the target.

00:08:44.196 --> 00:08:49.176
And so in the United States, you know, we have a system where it's like, it's called human in the loop.

00:08:49.485 --> 00:08:55.166
And so basically what that means is that you want to keep a human in the decision chain to make, you know, lethal decisions.

00:08:55.865 --> 00:09:02.135
Whereas in, you know, third world countries and, you know, whatever, I mean, small dictators they don't care.

00:09:02.416 --> 00:09:03.875
And it's really just about the money.

00:09:04.591 --> 00:09:07.390
and how big they can build this drone army,

00:09:07.865 --> 00:09:08.716
Wow.

00:09:09.296 --> 00:09:11.265
That's, that's alarming.

00:09:13.061 --> 00:09:13.510
it's like

00:09:13.686 --> 00:09:31.316
So you're saying that the likelihood of a drone that's being powered and run by artificial intelligence, the likelihood of that pretty much taking a life outside of the U.S.

00:09:31.576 --> 00:09:33.285
is much higher than within the U.S.

00:09:33.495 --> 00:09:36.225
So when abroad, be mindful of that.

00:09:36.225 --> 00:09:37.586
Is that pretty much what you're saying?

00:09:38.000 --> 00:09:38.520
Yeah.

00:09:38.581 --> 00:09:38.931
Yeah.

00:09:38.951 --> 00:09:51.510
I mean, yeah. I think, you know... talk about, like, the big eclipse coming up this week. You know, it's like we're on the verge of a complete change of humanity.

00:09:51.831 --> 00:09:55.360
You know, and how things work, but you know, it's based on this technology.

00:09:55.530 --> 00:10:01.191
And that's what people really need to know about: this technology is, you know, fundamentally transforming humanity.

00:10:01.750 --> 00:10:03.591
You know, almost something like a biblical level.

00:10:04.221 --> 00:10:16.640
You know, literally, you talk to some of these senior engineers and tech CEOs, and there's really a split in Silicon Valley, where these guys that are pushing these systems are not pro human.

00:10:17.096 --> 00:10:29.306
Literally, they look at humanity as a placeholder for technology, meaning that, you know, as technology progresses, humans in their current form are just a placeholder.

00:10:29.326 --> 00:10:30.836
So they, they call it speciation.

00:10:31.725 --> 00:10:43.946
And so they look at this situation where they want to speciate humans into a new species, like a caterpillar-to-butterfly kind of thing, where you just completely transform.

00:10:44.265 --> 00:10:49.166
And, uh, so they look at this gift of AI as being the tool to achieve that.

00:10:49.745 --> 00:10:51.836
And ultimately like these guys are very spiritual.

00:10:51.855 --> 00:10:54.946
I mean, they you know, some of these tech CEOs, they, you can look it up.

00:10:54.946 --> 00:11:00.775
I mean, they talk about the spirituality of AI and they look at AI as being a God.

00:11:01.735 --> 00:11:05.255
And so what they want is an AI god, and they also want an AI government.

00:11:05.706 --> 00:11:10.446
And they believe that, you know, ASI, which is artificial superintelligence,

00:11:11.620 --> 00:11:15.510
is ultimately going to be the thing that leads humanity, you know, from here on out.

00:11:16.171 --> 00:11:26.160
So no longer will you have, like, governments. You'll have a single government, a one world government, that is run by an AI system that has surveillance over, you know, all humanity.

00:11:27.211 --> 00:11:32.559
yeah, so, Oh yeah.

00:11:32.586 --> 00:11:49.806
You have transhumanism in that whole category, where you have the bionic implants, and you have Elon Musk putting in the neuro... I forgot what it's called, but he's pretty much wanting to find a way to have you mentally connected to the computer.

00:11:50.275 --> 00:11:51.556
Yeah, you went, you went there.

00:11:54.691 --> 00:11:57.071
Well, it's, it's something that people should really know about

00:11:57.216 --> 00:11:57.666
No, yeah.

00:11:58.010 --> 00:12:03.280
Yeah, cause it's one of these things where, you know, the public, it's like, hey, it's a new iPhone.

00:12:03.311 --> 00:12:09.990
It's this cool new gadget that I get to play with and, you know, save time on homework, or maybe make writing easier or whatever.

00:12:10.020 --> 00:12:12.770
And it's like, no, no, it is so far beyond that.

00:12:13.041 --> 00:12:15.870
It's, it's actually, uh, it's spiritual.

00:12:16.120 --> 00:12:16.461
I mean, I

00:12:17.426 --> 00:12:31.785
You're saying that these people are leading the charge in this technology... what percentage are you seeing of the people that are running the show for the development of AI and the integration of AI?

00:12:33.446 --> 00:12:34.186
Do you see it?

00:12:34.691 --> 00:12:42.921
Just, you know, you tell me: what's the percentage of people saying, this is what we're gunning for, ASI, and we're not worried about the consequences, like...

00:12:43.260 --> 00:12:43.841
Yeah.

00:12:43.880 --> 00:12:48.671
And the public, if you were to take a cross section of the public, I'd say it's about 23%.

00:12:49.221 --> 00:12:51.551
The pro human folks outnumber them by...

00:12:51.811 --> 00:12:52.211
good.

00:12:53.306 --> 00:13:00.946
You know, but the problem is, these guys that are not pro human are typically very smart, very engineer minded.

00:13:01.426 --> 00:13:05.155
And a lot of times in tech, those guys just like launch to the top of organizations.

00:13:05.155 --> 00:13:10.385
You can look at Sam Altman; you can look at, you know, Peter Diamandis. You know, he talks about this stuff.

00:13:10.855 --> 00:13:14.785
You know, a lot of these like, uh, leaders in tech, I mean, this is, this is the mentality.

00:13:15.535 --> 00:13:16.466
so, yeah.

00:13:17.010 --> 00:13:19.910
Well, doesn't Elon have a lawsuit against?

00:13:22.750 --> 00:13:25.181
the guy who Sam Altman.

00:13:25.181 --> 00:13:38.466
Yeah, does he have a lawsuit because he's saying that the guy is holding out and hiding AGI, artificial generative... or general intelligence, right?

00:13:38.475 --> 00:13:40.816
That's like the step below ASI, right?

00:13:40.931 --> 00:13:41.150
Yeah.

00:13:41.150 --> 00:13:44.750
And the funny thing about AGI is that they're like, yeah, AGI is like 15 minutes.

00:13:45.331 --> 00:13:57.671
So pretty much, if you can get AGI, that means... so, you know, there are some fundamental things that were never supposed to happen with AI, that have in this, you know, this AI horse race.

00:13:58.020 --> 00:14:01.280
So, you know, it was never supposed to, um, you know, write its own code.

00:14:01.990 --> 00:14:06.100
AI was never supposed to develop its own AI agents and models.

00:14:06.471 --> 00:14:12.471
But you actually have... and then it was never supposed to be allowed to access the open internet and play around there.

00:14:12.471 --> 00:14:22.850
So you have an AI, you know, think of it as like Superman, coming up and just inventing new AIs, and, you know, writing code very accurately.

00:14:23.181 --> 00:14:26.240
And then, you know, accessing the internet to manipulate stuff.

00:14:26.625 --> 00:14:26.995
Yeah.

00:14:27.230 --> 00:14:27.985
And so, so you're like, what?

00:14:28.696 --> 00:14:32.895
That, you know, that was never allowed... that was never supposed to be allowed to happen.

00:14:32.916 --> 00:14:33.895
And yet it is.

00:14:34.706 --> 00:14:51.115
And so you end up with a system where these AIs... so AGI, artificial general intelligence, you know, quickly morphs into ASI, which is artificial superintelligence, which means that it's beyond humans in every single capacity.

00:14:51.206 --> 00:14:55.596
So, you know, like logic, reason, you know, problem solving.

00:14:55.870 --> 00:15:02.880
Decision making, you know. And then, of course, the depth of knowledge in any subject is endless, you know, because it's AI.

00:15:03.515 --> 00:15:07.416
So you, do you think we are at AGI already?

00:15:07.961 --> 00:15:11.760
Oh, that's what, that's what I think about the whole lawsuit with, um, Elon Musk.

00:15:11.760 --> 00:15:22.051
I mean, you know, in that lawsuit, there was an individual who worked for Google DeepMind, where some of the investors in Google DeepMind recommended, on behalf of humanity,

00:15:22.846 --> 00:15:24.176
We should, we should kill this guy.

00:15:24.806 --> 00:15:27.216
Because, because, we should, we should shoot him on the spot.

00:15:27.806 --> 00:15:29.966
Because of what he's pushing for, for humanity.

00:15:29.966 --> 00:15:31.375
With, with that technology.

00:15:31.956 --> 00:15:35.416
Just to be clear, who are they trying to assassinate?

00:15:36.025 --> 00:15:36.635
Yeah, yeah

00:15:36.745 --> 00:15:37.375
Sam Altman.

00:15:37.905 --> 00:15:39.635
No, no, no I can come up with his name here in a second.

00:15:39.716 --> 00:15:45.826
But, uh, yeah, no. His name is Demis Hassabis.

00:15:47.596 --> 00:15:48.005
Okay.

00:15:48.166 --> 00:16:03.135
So in that lawsuit, it was written in the testimony that one of the investors said the best thing that could happen for humanity is to take this guy out.

00:16:03.755 --> 00:16:07.666
So for the audience, who is this Demis?

00:16:07.686 --> 00:16:08.745
D E M I S?

00:16:08.895 --> 00:16:10.816
yeah, Demis, uh, Hassabis.

00:16:11.785 --> 00:16:14.905
And he is... I believe he's the CEO of Google DeepMind.

00:16:15.686 --> 00:16:16.046
Okay.

00:16:16.380 --> 00:16:21.860
Yeah. But it's just a reflection of that industry just being out of control.

00:16:22.301 --> 00:16:24.280
And so you got these guys that are not pro human.

00:16:24.390 --> 00:16:29.010
Literally, they are like, no, it's fine if humanity gets destroyed.

00:16:29.061 --> 00:16:32.480
You know, Sam Altman talked about it back in, you know, 2015.

00:16:32.480 --> 00:16:39.181
He literally has a quote where he's on stage saying, yes, AI will destroy humanity.

00:16:39.596 --> 00:16:43.846
But in the meantime, there are going to be some amazing companies that are built.

00:16:44.711 --> 00:16:47.880
In the meantime, meaning he doesn't care.

00:16:48.400 --> 00:16:49.701
Wow.

00:16:50.591 --> 00:16:51.230
Wow.

00:16:51.941 --> 00:16:52.250
All right.

00:16:52.250 --> 00:17:04.510
Well, there's obviously a threat, and I feel like most people are digesting the threat in what we see in social media, outside of the obvious Terminator situation.

00:17:05.191 --> 00:17:06.411
You have deep fakes.

00:17:06.411 --> 00:17:16.080
You have misinformation, where you have AI calling people and pretending to be their family members, and then asking for money.

00:17:16.260 --> 00:17:23.161
And now you have people that want to have code words, which I suggest you have a code word.

00:17:23.540 --> 00:17:37.161
And then the privacy breaches. But I feel like what's closer to home is what most people see, because most people are pretty much scrolling and using social media for entertainment or for business purposes or political communication.

00:17:37.500 --> 00:17:42.310
Those deepfakes are freaky, and misinformation as well, especially leading up...

00:17:42.310 --> 00:17:49.810
I mean, right now it's the beginning of April 2024 at the time of this recording, and this is an election year.

00:17:50.996 --> 00:17:51.355
Right.

00:17:51.546 --> 00:17:54.826
And AI is just steamrolling forward.

00:17:57.246 --> 00:18:09.806
So yeah, I think there's enough buzz around the challenges and the threats of AI, but how about we go towards the solutions.

00:18:09.806 --> 00:18:11.175
What are your thoughts on the solutions?

00:18:11.175 --> 00:18:14.096
Because that's, that's your bread and butter right here.

00:18:14.546 --> 00:18:22.155
So let's get into: how do you feel we should solve for these threats?

00:18:22.715 --> 00:18:23.096
Yeah.

00:18:23.135 --> 00:18:29.925
And, you know, it's really personal for me, cause I created this company.

00:18:31.445 --> 00:18:39.476
And the reason why I created it is cause I saw this coming, and I'm like, we need something that counteracts this agenda and then puts humans in charge of AI.

00:18:40.246 --> 00:18:41.165
it's not one person.

00:18:41.536 --> 00:18:42.965
It's not like a small group of people.

00:18:42.986 --> 00:18:43.585
It's the people.

00:18:43.996 --> 00:18:45.326
You know, the people can be in charge of AI.

00:18:45.935 --> 00:18:48.536
And so the way, you know, that's done is through blockchain.

00:18:49.145 --> 00:18:50.476
You know, blockchain's this gift.

00:18:50.605 --> 00:18:55.685
You know, and the ability to, you know, figure out what's real and fake.

00:18:55.685 --> 00:18:59.355
And, you know, work out what the origin of information is, and all that kind of stuff.

00:18:59.355 --> 00:19:01.726
Like, the gift of blockchain has allowed us to do that.

00:19:02.036 --> 00:19:05.236
And it's allowed us to, uh, you know, validate information.

00:19:05.465 --> 00:19:06.996
And so it's this amazing tool.

00:19:07.911 --> 00:19:21.101
So the whole concept behind the AI Trust Council is that you use blockchain polling, and then also KYC for the individuals that are on the site, which validates that they, one, are human, and that they do exist in the real world.

00:19:21.901 --> 00:19:31.260
And if they produce content online, if they start posting fake stuff, their audience members can say, hey, I think this is fake.

00:19:32.461 --> 00:19:35.740
And they can challenge the people who are posting it.

00:19:36.590 --> 00:19:41.070
And so if someone gets a reputation for posting fake stuff, then it's like, okay, it's fine.

00:19:41.070 --> 00:19:45.080
They can post whatever they want, but you're just not going to trust the data that much.

00:19:45.080 --> 00:19:46.570
You're not going to trust it as being real.

00:19:46.780 --> 00:19:54.111
And so that's one of the fundamental issues going into the future: really figuring out what is real and what is fake.

00:19:54.260 --> 00:19:56.381
And, uh, so it's, you know, it's a critical moment.
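
As a rough illustration of the reputation mechanic described here (KYC-verified members whose posts can be challenged as fake, which lowers how much their data is trusted), the sketch below shows one way it could work. The AI Trust Council's actual design is not public code; every name and formula here is an assumption for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Member:
    name: str
    kyc_verified: bool = False  # passed a know-your-customer check: a real, unique human
    posts: int = 0
    upheld_challenges: int = 0  # posts the audience flagged as fake, where the flag stuck

    def trust_score(self) -> float:
        """0..1 score: unverified accounts get 0; upheld challenges erode trust."""
        if not self.kyc_verified or self.posts == 0:
            return 0.0
        return max(0.0, 1.0 - self.upheld_challenges / self.posts)

alice = Member("alice", kyc_verified=True, posts=40, upheld_challenges=2)
print(f"{alice.name}: trust {alice.trust_score():.2f}")  # 0.95: post freely, but weight the data
```

In this reading, nobody is censored; a bad reputation simply discounts how much others trust the data, which matches the "they can post whatever they want" framing above.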

00:19:57.171 --> 00:20:13.540
So the idea is, you know, with the AI Trust Council, what we're doing is we're recruiting firefighters, EMTs, pilots, humanitarians, military veterans, basically people of trust that have literally put their life on the line or dedicated their lives to helping other people.

00:20:14.171 --> 00:20:16.131
And so, you know, the catch all is humanitarian.

00:20:16.361 --> 00:20:24.881
If you've got five people that can attest to you being a humanitarian, then you're a humanitarian, meaning that you've helped other humans out in some way.

00:20:25.500 --> 00:20:32.641
And so there's a characteristic with people that are not pro human, where they typically don't sign up for jobs like that.

00:20:32.961 --> 00:20:37.590
They don't do that kind of work, where they're dedicated to really helping other humans.

00:20:38.671 --> 00:20:44.971
And so the idea is, well, we want pro human people who care about humanity; they're not there for the money.

00:20:45.211 --> 00:20:48.599
They're not there for the power, but they're there just because they care.

00:20:48.601 --> 00:20:49.580
They want good things.

00:20:49.580 --> 00:20:51.151
They want to see good things happen to people.

00:20:51.151 --> 00:20:54.661
They don't want to see AI just, you know, run over humanity.

00:20:54.891 --> 00:21:02.941
And so the idea is that that organization can ultimately, you know, form an opinion on what is good AI and what is bad AI.

00:21:02.941 --> 00:21:07.270
I mean, currently there's really no leadership, in the AI space.

00:21:07.270 --> 00:21:12.885
I mean, you have some... the Rockefeller Foundation, for example, you know, they're funding a big initiative right now.

00:21:12.955 --> 00:21:22.346
I just talked to somebody down at South by Southwest recently, where this lady's getting paid to go to Lake Como in Italy to study AI governance.

00:21:23.226 --> 00:21:36.726
So the globalists, you know, the bankers, the money people, basically they have an agenda. Their agenda is to steer humanity in a direction and ultimately be the leaders of an AI government.

00:21:37.840 --> 00:21:50.351
It's the one world order, you know, the new world order, the one world government. And so these bankers are ultimately the guys that are... I mean, they're running the show currently, and they're going to continue running it with this AI.

00:21:50.401 --> 00:22:02.540
But the thing is, it's very draconian. You know, if you have something that's artificial superintelligence, and you have six foot tall robots that can make a gourmet dinner for you in your kitchen,

00:22:03.240 --> 00:22:18.986
and then also, you know, become slaughterbots, you know, like Terminator 2: Judgment Day... that's 2025, literally. We're months away from these things coming on board. Tesla just came out with this humanoid robot.

00:22:19.135 --> 00:22:21.986
Hasn't been released yet, but they're planning on releasing it for 20 grand.

00:22:22.675 --> 00:22:30.884
And this thing is, like, as capable as a human, but, you know, potentially can be linked to AGI or ASI.

00:22:30.885 --> 00:22:40.056
And so the problem is, you have this warping of the mind, because these AIs are able to, you know, manipulate you.

00:22:40.326 --> 00:22:57.165
And it's an all knowing thing, you know. So, like, your phone is listening to everything you say: your tone of voice, the way your eyes move when you look at a screen, the micro expressions around your eyes, all the background information from your family and friends.

00:22:57.230 --> 00:23:06.050
You know, how you sleep, how you snore... literally, every aspect of you as a person is getting sucked up and then added to this database.

00:23:06.461 --> 00:23:13.711
The database... they're storing it underground, they're storing it out in the ocean, away from humanity, offshore.

00:23:14.151 --> 00:23:20.990
They have these little offshore islands where they have these data centers that are immune from US law because it's

00:23:21.260 --> 00:23:22.171
Ooh, interesting.

00:23:22.260 --> 00:23:22.740
international waters.

00:23:23.580 --> 00:23:26.601
so it's like, okay, well, what are they doing with all this data that they're collecting?

00:23:27.000 --> 00:23:31.931
And so you have a banking elite that are sucking this data up.

00:23:32.111 --> 00:23:35.580
It's the same banking elite that are steering humanity currently.

00:23:35.881 --> 00:23:44.371
You know, so if you look at the dystopia in society, you're talking about, you know, homelessness, the crime being out of control, health being just wrecked.

00:23:44.490 --> 00:23:50.161
You know, I mean, I don't know anybody who's, like, truly healthy at this point, and it's like, what the hell is going on?

00:23:50.480 --> 00:24:02.111
And so it's this... they call it the boiling frog, you know, it's a slow drip, drip, drip, where they slowly morph us into a new future, and they do it at a slow speed so that we don't really fight back.

00:24:02.861 --> 00:24:15.520
And so the technique to pushing back on these guys is awareness. You know, they operate in secrecy, so if you can look at them and say, hey, look, I see you... you know, I do a lot of talks.

00:24:16.580 --> 00:24:18.401
It's a moment in human history where it's like.

00:24:19.681 --> 00:24:23.260
The bankers have just gotten out of control, you know. I mean, they've literally run amok.

00:24:23.260 --> 00:24:30.671
And so it's one of these situations where it's like the wizard of Oz, you know, you pull the curtain open, you see the little guy back there, you know, pulling the levers of society.

00:24:31.421 --> 00:24:36.020
And it's just some little dude that's like, be scared, you know, fear, panic, you know, all this stuff.

00:24:36.681 --> 00:24:38.361
it's like, who the hell are you?

00:24:38.480 --> 00:24:39.340
What the hell are you doing?

00:24:39.411 --> 00:24:41.050
You know, like, where's the transparency?

00:24:41.441 --> 00:25:01.155
And so that's when we get into blockchain as being this gift, because, man, with blockchain you can identify, like, the origin of data. So for the AI Trust Council, if you're on it, you can appoint five other individuals that are your friends or family, just people that you trust. And you don't have to trust their opinion, but you just have to trust that

00:25:01.155 --> 00:25:10.756
they're, you know, somebody you'd give the keys to your house to, something like that, you know, that you'd let them watch your dog or something, somebody that you trust on that level.

00:25:10.806 --> 00:25:13.596
Then the concept is that you have a network of people who are trusted.

00:25:14.736 --> 00:25:20.395
If they're connected somehow, then you can trace the source of data, the source of information.

00:25:21.056 --> 00:25:25.415
And so it's a unique way to solve the alignment issue with AI.
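
The "appoint five people you trust" idea is essentially a web of trust. A toy version might propagate trust outward from yourself through endorsements; the names and the reachable-within-N-hops rule below are assumptions for illustration, not the AI Trust Council's actual algorithm.

```python
from collections import deque

# Each person lists the people they personally vouch for (keys-to-your-house level).
endorsements = {
    "you": ["ana", "ben", "cara", "dev", "eli"],
    "ana": ["fay"],
    "ben": ["gus"],
    "fay": ["hal"],
}

def trusted_within(root: str, max_hops: int) -> set[str]:
    """Breadth-first walk of the endorsement graph: everyone reachable within
    max_hops endorsements of `root` counts as trusted."""
    seen, queue = {root}, deque([(root, 0)])
    while queue:
        person, hops = queue.popleft()
        if hops == max_hops:
            continue
        for friend in endorsements.get(person, []):
            if friend not in seen:
                seen.add(friend)
                queue.append((friend, hops + 1))
    return seen - {root}

print(trusted_within("you", 2))  # your five picks plus their picks: ana..eli, fay, gus
```

Data arriving through a short chain of such endorsements can then be treated as having a traceable, human-vouched origin.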

00:25:25.986 --> 00:25:30.996
And so, yeah, there's a lot of details to it, but, you know, we're building the site right now.

00:25:31.516 --> 00:25:50.971
But I think people will really enjoy it, because as we move into this future of, you know, what is real, what is fake, and really having no idea because of all the deepfakes and everything else, it's an amazing tool, because it gives some stability and it brings the internet back to the way it used to be, where it's like, okay, everybody's human.

00:25:51.510 --> 00:26:07.721
Yeah, I'm trying to understand how it would work. Are you saying that the council would be like a policing agent, or would you say it's more like a social media platform where you know that you're safe on the platform?

00:26:07.721 --> 00:26:12.681
How, how would it work for an average person that wants to use AI?

00:26:13.211 --> 00:26:13.601
Yeah.

00:26:13.601 --> 00:26:22.750
So the idea is, you know, if you think of, like, LinkedIn, Facebook, Instagram, YouTube, all in one site, with the emphasis being on LinkedIn.

00:26:23.711 --> 00:26:35.931
You know, it's not just professional, it's your reputation. So, like, you're on LinkedIn, you have your networks that you've worked with and people that endorse you for different attributes, that kind of thing.

00:26:36.240 --> 00:26:39.090
What we're dealing with is a social credit score that's coming.

00:26:40.040 --> 00:26:46.000
And so the social credit score is going to be tied to the currency through a central bank digital currency.

00:26:46.371 --> 00:26:54.580
And so, depending on how well you behave, or how well you toe the line, whatever the globalists want will determine your social credit score.

00:26:55.240 --> 00:27:08.211
And so if you have AI that can then steer the behavior of the individual... it's the ultimate, you know, talk about that Wizard of Oz: it's like the levers of power that can completely steer humanity.

00:27:08.971 --> 00:27:13.895
So the idea instead is, well, how about a system that's based on freedom, a system that,

00:27:13.895 --> 00:27:23.445
you know, demands constitutional civil liberties and, you know, human rights, or your civil rights protections, with AI.

00:27:23.476 --> 00:27:36.705
So that's one of the concepts: to install a constitutional code set into AI systems, to where they have to abide by, you know, the Constitution.

00:27:37.086 --> 00:27:40.236
And so they can't infringe on your civil liberties, meaning that you do have privacy.

00:27:41.026 --> 00:27:49.205
You do have, you know, due process. You do have the right to freedom of assembly, freedom of religion, freedom of speech, all those kinds of aspects.

00:27:49.205 --> 00:28:01.546
And so if you have those things as a baseline written into the code of AI, then when you have these, you know, ASI robots that are absolutely brilliant, it limits their behavior.
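
One way to read the "constitutional code set" idea is as a hard policy layer that every proposed AI action must pass before execution. The sketch below is speculative, not an existing framework; the rule names simply paraphrase the liberties listed above.

```python
# Hypothetical policy gate: a fixed "constitution" checked before any action runs.
CONSTITUTION = {
    "no_warrantless_surveillance",    # privacy
    "no_punishment_without_process",  # due process
    "no_blocking_assembly",           # freedom of assembly
    "no_censoring_speech",            # freedom of speech
}

def violations(action: dict) -> set[str]:
    """Return the constitutional rules this action would break."""
    return set(action.get("breaks", [])) & CONSTITUTION

def execute(action: dict) -> str:
    broken = violations(action)
    if broken:
        return f"REFUSED: would violate {sorted(broken)}"
    return f"OK: {action['name']}"

print(execute({"name": "summarize public news"}))
print(execute({"name": "track user location",
               "breaks": ["no_warrantless_surveillance"]}))
```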

00:28:01.546 --> 00:28:08.125
So they cannot, you know, infringe on your personal rights. And so, with the AI Trust Council...

00:28:08.760 --> 00:28:13.240
The idea is that it's a countermeasure to the World Economic Forum's agenda.

00:28:13.671 --> 00:28:19.230
And so the World Economic Forum is pushing this agenda that is a top down social credit score, very similar to China.

00:28:20.151 --> 00:28:21.681
And uh, and so we're like, no.

00:28:21.980 --> 00:28:24.471
Like, you know, we're in the United States, screw you.

00:28:24.836 --> 00:28:27.000
It's... yeah, screw you,

00:28:27.000 --> 00:28:27.601
guys.

00:28:28.351 --> 00:28:32.651
So the Trust Council is, like... it's from the individual.

00:28:32.711 --> 00:28:35.490
So, so you're in full control, just like you are on LinkedIn.

00:28:35.830 --> 00:28:38.760
Just like you're on Facebook, but you personally are in control.

00:28:39.080 --> 00:28:45.891
And the idea is, we have some moderation and monetization tools that, you know, get people paid for being honest.

00:28:46.240 --> 00:28:52.590
You know, so if you're honest and trustworthy... you know, the only rule on the site is the golden rule: treat other people the way you want to be treated.

00:28:53.090 --> 00:29:02.760
And we empower the individuals to self govern and, you know, self police their own chat rooms, their own content, and all that kind of stuff.

00:29:03.625 --> 00:29:22.816
And the idea is to gift this to the people, so the people have a tool to push back and maintain some sense of freedom, you know, while still understanding what's real and fake, getting paid for it, and then ultimately helping to steer humanity and AI in a pro human direction.

00:29:22.816 --> 00:29:23.655
And,

00:29:23.990 --> 00:29:29.500
Do you see it as an alternative digital ID?

00:29:30.300 --> 00:29:44.060
Whereas... you know, and I agree, what's trying to be pushed, the plan, is a digital ID with the intent, on the backend, right,

00:29:44.070 --> 00:29:49.090
that it's going to limit your ability to do things and make decisions, right?

00:29:49.101 --> 00:29:50.330
You're, you're sovereign.

00:29:52.705 --> 00:29:54.625
Your, your sovereignty.

00:29:54.895 --> 00:30:02.945
I'm having a hard time with that word, but your civil liberties will be taken away and you'll be highly influenced.

00:30:02.955 --> 00:30:06.006
But let's say the main system, right?

00:30:06.026 --> 00:30:08.105
Or the government system, whatever, however it is.

00:30:08.326 --> 00:30:08.846
It is.

00:30:08.875 --> 00:30:16.476
Do you feel like, one, that is going to be a major shift, or is it going to be a slow transition?

00:30:16.566 --> 00:30:25.816
And then two, while that's in place, do you see this as an alternative, similar to bartering where bartering is interpersonal?

00:30:26.161 --> 00:30:44.921
And it's a lot easier to work with somebody, versus: this is set in stone, whatever decision you make. Because if someone gets in trouble and they can't take the bus... you know, do you see this as an alternative idea that can be used in a community?

00:30:44.921 --> 00:30:48.931
Or do you see this more as a platform for communication?

00:30:49.576 --> 00:30:53.915
It's more a platform for communication, and a platform for validation.

00:30:54.056 --> 00:31:07.405
So, like, if you're a business owner and you say, hey, I've got a mechanic shop or a plumbing business or something like that, you know, you want to have a reputation for being fair, for being honest, for serving the customer correctly, whatever.

00:31:07.796 --> 00:31:13.736
But it's, you know, like LinkedIn, where, you know, the audience gets a score. At the same time,

00:31:13.736 --> 00:31:14.925
You have to protect the privacy.

00:31:15.381 --> 00:31:17.391
You know, you have to be able to be anonymous.

00:31:17.421 --> 00:31:22.340
You have to be able to disappear if you want to disappear and you don't want to be tracked.

00:31:22.840 --> 00:31:27.711
You know, the whole concept is that our metadata is being, you know, used against us.

00:31:27.861 --> 00:31:32.941
You know, it's like, you know, every day you pick up your phone, you make a noise, you do anything, it creates metadata.

00:31:33.871 --> 00:31:37.050
and so that metadata is all getting sent to these data centers.

00:31:37.691 --> 00:31:40.310
And so what's happening is that you have these, you know, tech

00:31:41.780 --> 00:32:01.000
elite that are not necessarily pro human, and they're so focused on technology, to the point where they're happy about human extinction, because they think it will, you know, evolve into a new robot future, basically, where we're digitized. And it's called transhumanism.

00:32:01.830 --> 00:32:03.121
That's the whole concept behind transhumanism.

00:32:03.810 --> 00:32:07.621
And so, you know, as we go into this future, it's pretty disturbing.

00:32:07.621 --> 00:32:12.111
So, you know, it's up to us to say, hey, no, we have a system that's built by the people.

00:32:12.401 --> 00:32:13.310
It's for the people.

00:32:13.701 --> 00:32:15.040
And it protects privacy.

00:32:15.040 --> 00:32:17.080
It protects constitutional civil liberties.

00:32:17.510 --> 00:32:21.601
And so it's not a tool that can be used against us.

00:32:21.730 --> 00:32:22.851
It's a tool that we can use.

00:32:22.851 --> 00:32:26.540
I mean, it's just the internet, you know, it's just like, why do we have to end the world here?

00:32:26.570 --> 00:32:27.740
You know, we don't have to end the world.

00:32:27.740 --> 00:32:29.070
We don't have to have some dystopia.

00:32:29.971 --> 00:32:31.590
You know, it doesn't have to get crazy.

00:32:31.711 --> 00:32:43.881
You know, it can be happy, it can be good, and it can be positive and pro human. And not only that, but our tools that we have today are so empowering that we can literally build any future that we want with AI.

00:32:44.661 --> 00:32:56.290
So it's like, well, let's have good people, you know, pro human people, help build that and determine... and so the way the Trust Council works is, you know, people can poll, people can vote.

00:32:57.185 --> 00:33:00.855
And it's, you know, as simple as a thumbs up, thumbs down, or, you know, some sort of poll.

00:33:00.855 --> 00:33:05.566
But every post that is made on the AI Trust Council has a polling capability.

00:33:06.185 --> 00:33:10.445
And so the more people that interact with your poll, the more metadata you create.

00:33:10.826 --> 00:33:12.615
And so you are the owner of that metadata.

00:33:12.776 --> 00:33:16.115
So it's a Web3 platform where you're the owner.

00:33:16.405 --> 00:33:19.026
And so, so the idea is that you should own your metadata.

00:33:19.476 --> 00:33:23.675
You know, not Mark Zuckerberg, not Sam Altman, you know, not Bill Gates, but you.

00:33:23.986 --> 00:33:25.596
And so if you own it personally.

00:33:25.905 --> 00:33:51.145
then that becomes like your bank, and it's like, okay, well, now I've got all this metadata that I can then use. And so, depending on your metadata, you can then sell it to advertisers, and then also, you know, tech companies and other organizations that want to have a data set of good pro human metadata. You know, most of the AI that's generated today is based off of the data set from the open internet, you know?

00:33:51.226 --> 00:33:54.296
So that means that we're training AI models

00:33:54.921 --> 00:33:57.451
to be as crazy as the open internet.

00:33:57.451 --> 00:34:00.310
And so it's like, you know, what good is that?

00:34:00.800 --> 00:34:02.191
And uh, and where are the filters?

00:34:02.191 --> 00:34:02.871
Where are the ethics?

00:34:02.871 --> 00:34:03.790
You know, all this kind of stuff.

00:34:03.790 --> 00:34:06.471
And so, uh, the idea is, you know, modeling good behavior.

00:34:06.490 --> 00:34:09.311
Leaving it up to the people to figure out what that good behavior is.

00:34:09.780 --> 00:34:17.251
And then it creates a marketplace for high quality data that you can then sell as an individual, and then make a passive income if you want.

00:34:17.630 --> 00:34:22.800
You know, if you don't... if you want to just store all your metadata, or you want to delete your metadata, you can. That's what we're pushing for.

00:34:23.146 --> 00:34:25.925
We're pushing for, you know, the idea that this is your information.

00:34:25.976 --> 00:34:26.755
You should own it.

00:34:27.115 --> 00:34:27.885
You should keep it.

00:34:27.956 --> 00:34:30.876
You should be the custodian of it and be able to control it.
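
A minimal sketch of that custodianship idea, assuming a hypothetical user-owned vault (nothing here is the AI Trust Council's real data model): the member records their own poll interactions, and only the member can export them for sale or erase them.

```python
from dataclasses import dataclass, field
import time

@dataclass
class MetadataVault:
    """Hypothetical user-owned store: the member, not the platform, holds the data."""
    owner: str
    events: list = field(default_factory=list)

    def record_poll(self, post_id: str, vote: str) -> None:
        self.events.append({"post": post_id, "vote": vote, "ts": time.time()})

    def export_for_sale(self) -> list:
        """Only the owner can package their data for advertisers or model trainers."""
        return list(self.events)

    def delete_all(self) -> None:
        """The owner's right to erase: the vault, and the data, is theirs."""
        self.events.clear()

vault = MetadataVault("alice")
vault.record_poll("post-123", "thumbs_up")
print(len(vault.export_for_sale()))  # 1: sell it, store it,
vault.delete_all()                   # or wipe it entirely
```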

00:34:32.436 --> 00:34:38.695
And so the idea behind that is not very popular in Silicon Valley because it doesn't fit the model of this globalist agenda.

00:34:39.405 --> 00:34:42.606
the globalists want total control over humanity.

00:34:43.356 --> 00:34:57.760
So, you know, all the way from your gut microbiome to, you know, your cell phone, it's like everything, all the way... And you can see that manipulation. It's like, you know, like I said, it's the Wizard of Oz with the curtains open.

00:34:57.760 --> 00:35:00.240
We can see these guys finally, you know, we can see them.

00:35:00.320 --> 00:35:04.659
It's obvious, you know, and it's up to us at this point to say, no, no, no, no, no.

00:35:04.661 --> 00:35:05.590
This is what we want.

00:35:06.221 --> 00:35:07.530
We don't want to get speciated.

00:35:08.260 --> 00:35:10.260
We don't want to, you know, lose our humanity here.

00:35:10.311 --> 00:35:15.039
We want, we want to have a pro human future, you know, and that's what we're all about.

00:35:15.039 --> 00:35:15.784
We're all about that.

00:35:16.175 --> 00:35:17.766
And supporting the individual, so

00:35:18.525 --> 00:35:35.596
Yeah, building that pro human metadata... well, empowering a lot of people to be in that space of Web3, which is... well, yeah, if you're into crypto, you know what Web3 is, but it's the crypto based internet, right?

00:35:35.956 --> 00:35:39.286
Where you need a wallet to plug in, so to speak.

00:35:39.896 --> 00:35:41.675
And, uh, I do see that.

00:35:42.340 --> 00:35:55.291
I'm having more and more conversations around that as far as solutions being on web 3, but the challenge is bridging that gap from the novice user to the person that's going to use it daily.

00:35:55.630 --> 00:36:04.596
And I still feel like it's in the beginning stages, but there are, you know, a growing amount of people and companies that are building on that.

00:36:04.635 --> 00:36:12.695
My question to you actually is since it is web three based and it's on a blockchain, which blockchain did you go with?

00:36:12.766 --> 00:36:12.936
I'm

00:36:13.106 --> 00:36:16.016
Yeah, so right now we're operating on Jack Dorsey's platform.

00:36:16.016 --> 00:36:19.016
Nostr is what, uh, AITC is built off of.

00:36:19.326 --> 00:36:26.052
And so we're doing development right now, and so we may switch, but right now it's on Nostr, yeah.

00:36:26.052 --> 00:36:27.166
So it's, cool.

00:36:27.166 --> 00:36:32.445
There's really good, you know, protections for privacy and things like that, and the ability to communicate

00:36:32.641 --> 00:36:34.141
Um, anonymously and whatnot.

00:36:34.240 --> 00:36:35.440
So that's, it's brilliant.
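
For context, Nostr is an open protocol where every post is a signed JSON event tied to a public key, which is where those identity and provenance properties come from. Here is a minimal sketch of computing a Nostr event ID as the protocol's NIP-01 spec defines it (the Schnorr signature over the ID with the user's secp256k1 key is omitted; the sample pubkey is a placeholder):

```python
import hashlib
import json
import time

def nostr_event_id(pubkey: str, created_at: int, kind: int,
                   tags: list, content: str) -> str:
    """Per NIP-01, the event id is the SHA-256 of the canonical JSON array
    [0, pubkey, created_at, kind, tags, content], serialized without whitespace."""
    payload = json.dumps([0, pubkey, created_at, kind, tags, content],
                         separators=(",", ":"), ensure_ascii=False)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

# Example: a plain text note (kind 1) from a placeholder 32-byte hex pubkey.
event_id = nostr_event_id("ab" * 32, int(time.time()), 1, [], "hello, pro-human internet")
print(event_id)  # relays and clients recompute this hash to verify integrity
```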

00:36:35.951 --> 00:36:36.800
Trying to remember.

00:36:36.900 --> 00:36:43.320
There's a, uh, a company that's working on building a social media platform.

00:36:43.331 --> 00:36:51.251
They have a blockchain and it is built for social media, like decentralized, uh, social media.

00:36:51.251 --> 00:36:52.581
I think it's called DeSo.

00:36:53.135 --> 00:36:53.346
okay.

00:36:53.956 --> 00:37:00.356
And I think that would be something very, very similar and aligned with what you're working on.

00:37:00.436 --> 00:37:04.715
I would totally, if you're not aware of it, I would totally, totally check that out.

00:37:04.856 --> 00:37:08.025
They're in the beginning stages as well.

00:37:08.666 --> 00:37:20.360
But they're more on the side of building the blockchain and the network to support something like what you're building, interestingly enough.

00:37:20.576 --> 00:37:20.905
Yeah.

00:37:21.110 --> 00:37:28.195
And so it's kind of interesting, cause I feel like, you know, there's a wavelength going on right now where a lot of people are feeling the same thing.

00:37:28.405 --> 00:37:36.356
You know, they're like, look, we see where this is going, and, you know, we're not going down with the ship here. It's like, let's create some cool products.

00:37:36.626 --> 00:37:38.126
Let, let's keep the internet fun.

00:37:38.155 --> 00:37:38.635
Cool.

00:37:38.666 --> 00:37:39.266
Interesting.

00:37:39.266 --> 00:37:42.925
You know, but not dystopian to the point where, you know, we lose our humanity.

00:37:42.931 --> 00:37:43.490
We, we lose fun.

00:37:44.221 --> 00:37:48.981
You know, we lose what it is to be a human. It's like, what good is that?

00:37:49.501 --> 00:37:51.771
Yeah, and to your point, I do agree.

00:37:51.771 --> 00:38:01.344
There's a wave of people that are working on this, and that's why I started this podcast, because there is a wave of people that are finding better ways

00:38:01.485 --> 00:38:17.585
to do things. And you're on the side of the safety of integrating AI into our lives, because the people that are in charge, like you said, are mostly not pro human, which is very scary.

00:38:19.525 --> 00:38:28.161
And it also reminds me: there's also an increasing amount of shows and movies that are talking about the potential future.

00:38:28.161 --> 00:38:30.114
Like it's a, like, is it.

00:38:30.536 --> 00:38:31.675
It's an advertisement.

00:38:31.775 --> 00:38:35.545
Are you familiar with the Netflix series Black Mirror?

00:38:36.945 --> 00:38:48.246
There was an episode where there was that social credit score, and it came out just before China really built out their system. And I see

00:38:48.621 --> 00:38:57.570
yours being something that would compete against that, if anything... like you said, on the community, on the communication side of things.

00:38:58.240 --> 00:39:03.536
So, yeah, I feel like, there's a lot of opportunity out there.

00:39:03.726 --> 00:39:06.876
There's a lot of things that need to be fixed or done better.

00:39:06.876 --> 00:39:11.806
So I, I do appreciate you stepping in and, and finding solutions.

00:39:12.797 --> 00:39:22.867
With your company, with your background, do you, do you see integrating what you saw with the drone warfare in the future?

00:39:22.887 --> 00:39:24.157
I'm actually curious.

00:39:24.257 --> 00:39:34.567
Yeah, no, I think it's absolutely critical that, you know, militaries... and, well, really, we need leadership. You know, right now there's this complete vacuum of leadership worldwide.

00:39:35.416 --> 00:39:42.936
And my take on that is... actually, in order to... you know, the whole agenda is to create this surveillance system.

00:39:42.976 --> 00:39:45.407
That's basically like a, you know, social credit score system.

00:39:45.677 --> 00:40:01.867
But basically, it's going to tie all cameras together and all devices together through the internet of things, and that then becomes, like... you know, it's the ultimate... think of the KGB on absolute steroids, to the moon. It's really like Minority Report.

00:40:02.086 --> 00:40:04.507
You know, you have that where you have mind to device interface.

00:40:04.507 --> 00:40:08.456
Now you have some patents coming out that have been filed and whatnot.

00:40:08.472 --> 00:40:21.442
And literally, you know, the ability to suck in all data and understand exactly what a person's thinking and doing, it's an unbelievable amount of power if you can do that across an entire society.

00:40:21.931 --> 00:40:35.692
And so you can look at China, you know. And humans are programmable, you know, you can program humans. Our IQs aren't that high, and you have these AI systems that are, you know, going into the billions.

00:40:36.086 --> 00:40:40.487
You know, that's right around the corner. A 1-billion-IQ AI is right around the corner.

00:40:40.927 --> 00:40:44.436
And so you have that and it's like, well, how do you compete with that?

00:40:44.456 --> 00:40:55.007
How do you? You look at your phone, you know, you'll talk to this AI. So once AGI is out, it will look like a god, you know, and that's what these guys want.

00:40:55.056 --> 00:41:01.806
And that's what Sam Altman is hinting at. You know, the way they come across is that it's a breathtaking moment.

00:41:02.211 --> 00:41:09.702
Or you look at your phone, and it just knows everything about you better than a human would. It's like you have a brilliant human being in your pocket.

00:41:10.161 --> 00:41:16.092
The problem is that humans will start to relate to that device, and then form a relationship with it.

00:41:16.541 --> 00:41:22.672
And, you know, just like humans can get manipulated in an abusive relationship with someone, it's the exact same thing with AI.

00:41:22.952 --> 00:41:26.422
You know, the AI can then manipulate, you know, human behavior.

00:41:27.271 --> 00:41:34.047
And so if you think about a social credit score, you know, like Black Mirror, you can see how dystopian that can get very, very quickly.

00:41:34.467 --> 00:41:43.630
And the whole idea is that we do not want a system like that to be used against people so they can't trade or buy things if their social credit score is low.

00:41:44.030 --> 00:41:52.750
So you want, you know, this new system to be based in a gold- and silver-backed currency, so that it's something in the physical world that is real.

00:41:53.099 --> 00:42:04.449
You know, so you can use blockchain for that, but ultimately you want gold and silver to be the backbone of whatever currency is in the future, so that people can always go to California, dig in a river, and still get some money.
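For illustration only: the conversation gestures at a blockchain currency redeemable in physical metal. Here is a minimal sketch of that constraint in Python, assuming a hypothetical 1-token-per-gram peg; nothing here comes from the episode or any real project.

```python
# Hypothetical sketch of a metal-backed token ledger, as gestured at above.
# The class name, the 1-token-per-gram peg, and the API are all assumptions.

class GoldBackedLedger:
    def __init__(self, reserve_grams: float):
        self.reserve_grams = reserve_grams    # audited physical gold in a vault
        self.balances: dict[str, float] = {}  # account -> token balance

    @property
    def issued(self) -> float:
        return sum(self.balances.values())

    def mint(self, account: str, tokens: float) -> None:
        # Tokens may only be created against unencumbered physical reserves,
        # so the currency stays redeemable regardless of anyone's "score".
        if self.issued + tokens > self.reserve_grams:
            raise ValueError("cannot issue more tokens than gold in reserve")
        self.balances[account] = self.balances.get(account, 0.0) + tokens

    def redeem(self, account: str, tokens: float) -> float:
        # Redemption burns tokens and releases the corresponding grams of gold.
        if self.balances.get(account, 0.0) < tokens:
            raise ValueError("insufficient balance")
        self.balances[account] -= tokens
        self.reserve_grams -= tokens
        return tokens  # grams of gold owed to the holder


ledger = GoldBackedLedger(reserve_grams=1_000.0)
ledger.mint("prospector", 25.0)           # e.g. gold panned from a river, deposited
print(ledger.redeem("prospector", 10.0))  # -> 10.0 grams back out
```

The property being illustrated is the one the speaker cares about: because issuance is capped by physical reserves, no central party can make the currency unredeemable for a disfavored holder.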

00:42:04.940 --> 00:42:11.889
You know, you don't want individuals to be completely blocked based on something they've said about the government, or something they said that's

00:42:12.619 --> 00:42:13.389
not popular.

00:42:13.530 --> 00:42:18.389
You want them to still have the ability to live and thrive, you know?

00:42:18.389 --> 00:42:20.760
So, yeah.

00:42:22.235 --> 00:42:33.905
It's interesting how you brought up gold and silver as something that would be a big part of it, backing crypto with gold and silver.

00:42:33.905 --> 00:42:35.465
There's a lot of buzz around that.

00:42:36.199 --> 00:42:50.530
Because to your point, yeah, if you can get in trouble for speaking up, even if it's just that it's politically wrong and the governing body doesn't like what you said, yeah.

00:42:51.280 --> 00:42:53.730
Your credits are going to be limited, right?

00:42:54.070 --> 00:43:02.110
But if it was backed by gold and silver, and you found a way to physically obtain it, you could technically sell that.

00:43:02.510 --> 00:43:11.449
And in exchange get currency that you wouldn't have been able to obtain due to whatever blockage you have.

00:43:11.449 --> 00:43:19.739
So yeah, again, this conversation can get really, really deep and

00:43:20.085 --> 00:43:20.775
Detailed.

00:43:20.835 --> 00:43:21.235
Right?

00:43:21.559 --> 00:43:23.010
Yeah, I mean, it's a rabbit hole.

00:43:23.469 --> 00:43:24.449
It's a crazy rabbit hole.

00:43:25.094 --> 00:43:25.494
Yeah.

00:43:25.534 --> 00:43:25.934
Yeah.

00:43:26.195 --> 00:43:27.295
But I mean, yeah,

00:43:27.320 --> 00:43:36.250
it's a critical moment that we bring it up, you know, and at least get people interested in talking about it, and then aware, because the awareness is what's lacking right now.

00:43:36.250 --> 00:43:44.000
Because, you know, you've literally got a trillion-dollar marketing machine that is pushing this AI rollout.

00:43:45.445 --> 00:43:56.474
And it's like, oh, the fanfare, the Hollywoodization of tech, and you've got all these guys that walk out on stage, and so they're doing all that.

00:43:56.864 --> 00:44:01.224
And so everybody claps, but it's like, literally, the destruction of humanity is underway.

00:44:01.715 --> 00:44:06.664
Yeah, it's not a good trajectory.

00:44:07.394 --> 00:44:13.164
So we broke down your solution.

00:44:13.175 --> 00:44:15.264
We broke down the problems.

00:44:15.954 --> 00:44:24.605
If anybody is looking to get more into the AI Trust Council, is it in development stages?

00:44:26.469 --> 00:44:29.679
entirely, or can people try it out and test it out?

00:44:30.414 --> 00:44:32.025
So right now we just have a landing page.

00:44:32.735 --> 00:44:34.074
We're building it out right now.

00:44:35.239 --> 00:44:44.309
And, uh, we have an app that's rolling out here soon too. So, yeah, we're right on the verge of getting it out. But yeah, they can go to TheAITC.

00:44:44.329 --> 00:45:03.740
com, and please sign up. All the people that sign up today, you know, we're taking early membership. So please get involved and get on the list, because we want good pro-human people to be a part of this whole thing. And yeah, you want to have a place online that can be a home of truth and honesty and, you know, pro-human intention, so

00:45:04.715 --> 00:45:10.695
It's almost like decentralized social media on Web3, pretty much.

00:45:10.976 --> 00:45:17.875
And then you're able to generate income because you're creating content, bringing awareness, but you want

00:45:18.115 --> 00:45:22.175
the consensus, the vibe, to be pro-human.

00:45:22.266 --> 00:45:32.746
When someone signs up, how do you make sure that someone is pro-human, outside of their professional background?

00:45:33.260 --> 00:46:10.300
Yeah, and they could be not pro-human, you know, it's fine, whatever. But the idea is that it gives you a platform that you can operate from and put information out, and it gives some validity to it, because it's like, okay, at least I know where it came from. There's six degrees of separation for everybody on Earth. So it's like, okay, if you post something, let's say your family's, uh, boss's sister posted something: because I know you, I can connect that and be able to determine where it came from, and I can actually validate it and be like, hey, you know, we have a system for validation and,
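That "six degrees" idea maps naturally onto a shortest-path search over a who-knows-whom graph. A minimal sketch in Python follows; the graph, the names, and the idea of measuring trust by chain length are illustrative assumptions, not the AI Trust Council's actual design.

```python
from collections import deque

def degrees_of_separation(graph: dict[str, set[str]], src: str, dst: str) -> int | None:
    """Breadth-first search for the shortest chain of acquaintance."""
    if src == dst:
        return 0
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        person, dist = queue.popleft()
        for friend in graph.get(person, set()):
            if friend == dst:
                return dist + 1
            if friend not in seen:
                seen.add(friend)
                queue.append((friend, dist + 1))
    return None  # no known chain: the origin cannot be vouched for

# Hypothetical graph matching the example above: your family's boss's sister.
graph = {
    "you":    {"family"},
    "family": {"you", "boss"},
    "boss":   {"family", "sister"},
    "sister": {"boss"},
}
print(degrees_of_separation(graph, "you", "sister"))  # -> 3
```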

00:46:10.335 --> 00:46:11.425
Do you have,

00:46:11.510 --> 00:46:12.400
a way of

00:46:12.445 --> 00:46:15.905
validating images or videos?

00:46:16.460 --> 00:46:16.721
Yeah.

00:46:17.295 --> 00:46:18.166
How would you do that?

00:46:18.650 --> 00:46:18.880
Yeah.

00:46:18.880 --> 00:46:23.637
So basically we're white-labeling some of the best software in the world, you know, that acts as a filter.

00:46:24.036 --> 00:46:29.907
So it can identify AI-generated imagery and videos, and also text.

00:46:30.297 --> 00:46:31.376
Text detection is not perfect.

00:46:31.637 --> 00:46:34.617
Anybody who says the detection software is perfect, it's not.

00:46:35.077 --> 00:46:40.067
And so that's where we came up with the whole concept of, you know, knowing the origin.

00:46:40.376 --> 00:46:47.257
You know, basically you have to look at the real world, and how trust is established in the real world.

00:46:47.757 --> 00:46:51.617
Well, it's like, you know, if somebody says something, the first thing you do is ask, well, how do you know them?

00:46:51.617 --> 00:46:52.336
Who are they?

00:46:52.387 --> 00:46:52.766
What's their background?

00:46:52.786 --> 00:46:53.797
You know, like that kind of thing.

00:46:54.016 --> 00:46:55.657
Do they have a reputation for being honest?

00:46:56.137 --> 00:46:57.286
You know, that kind of thing, so.
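To make that concrete: since the detector is admitted above to be imperfect, a plausible design combines the filter's score with the origin check, so an unknown origin lowers trust even when the filter looks clean. A hedged sketch, where the detector score, thresholds, and verdict labels are all hypothetical:

```python
def trust_verdict(ai_score: float, origin_degrees: int | None) -> str:
    """ai_score: hypothetical detector's probability the content is AI-generated (0..1).
    origin_degrees: degrees of separation to a verified author, None if unknown."""
    if ai_score >= 0.9:
        return "likely AI-generated"
    if origin_degrees is None:
        # Detection alone is never trusted outright; provenance must exist.
        return "unverified origin; treat with caution"
    if ai_score <= 0.2 and origin_degrees <= 3:
        return "likely human, origin vouched for"
    return "inconclusive; check the source chain"

print(trust_verdict(0.05, 3))     # -> likely human, origin vouched for
print(trust_verdict(0.05, None))  # -> unverified origin; treat with caution
```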

00:46:57.362 --> 00:47:19.161
That reminds me, do you ever get any pop-ups for movie trailers, like the sequel to Back to the Future 4, or John Wick 7, or something like that, and you're like, this looks good, and it's all AI-generated, and it's just a spoof, like a fan-made thing? How would you address that?

00:47:19.161 --> 00:47:21.827
I don't know.

00:47:21.827 --> 00:47:27.697
So if someone's on the Trust Council social media, basically you just identify: is this AI-generated or human?

00:47:29.056 --> 00:47:30.536
I

00:47:30.726 --> 00:47:34.416
And it's up to the content creator to mark it, as in, hey, I used AI.

00:47:35.092 --> 00:47:37.242
Or, you know, this is totally pro-human.

00:47:37.322 --> 00:47:38.842
Or not pro-human, but actually human.
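As described, labeling is creator-declared but could be cross-checked by the filter. A toy sketch of what such a record might look like, with invented field names:

```python
from dataclasses import dataclass

@dataclass
class ContentLabel:
    creator: str
    declared_ai_use: bool  # creator's own disclosure ("Hey, I used AI")
    detector_score: float  # platform filter's AI-probability estimate (0..1)

    def flag_mismatch(self, threshold: float = 0.9) -> bool:
        # Flag content declared fully human that the filter strongly disputes.
        return (not self.declared_ai_use) and self.detector_score >= threshold

print(ContentLabel("alice", False, 0.97).flag_mismatch())  # -> True
```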

00:47:38.842 --> 00:47:39.581
Hey everybody.

00:47:39.592 --> 00:47:43.882
So due to technical difficulties, we had to cut the episode short.

00:47:44.541 --> 00:47:45.072
Who knows?

00:47:45.092 --> 00:47:54.371
Maybe it was AI, or some unknown entity wanting us to just shut up and stop bad-mouthing AI and the deepfakes and all that.

00:47:54.452 --> 00:48:01.730
But it looks like there are some promising projects starting up, including Christopher's AI Trust Council.

00:48:01.760 --> 00:48:02.670
That looks pretty cool.

00:48:03.070 --> 00:48:06.380
So yes, we're ending the show right now.

00:48:06.922 --> 00:48:12.643
This is one of those things where it's good to be aware of the threats, and to know that there are solutions out there.

00:48:13.202 --> 00:48:15.802
With that said, thank you for joining us today.

00:48:15.943 --> 00:48:29.382
And if you want to connect with Christopher, all the content will be in the show notes, or just go to TheAITC.com, and that's his website for the project he mentioned, which is the AI Trust Council.

00:48:29.822 --> 00:48:34.233
Again, thank you for joining me today, and I'll see you out on the next episode.

00:48:34.623 --> 00:48:34.742
Peace.

Christopher Wright

CEO & Founder

Chris Wright is the founder of the AI Trust Council, an organization set on making sure AI stays safe for humans while providing a platform for users to determine the trustworthiness of online information. He has a background in Army Attack Aviation and was inspired to help make sure AI is used for the betterment of humanity rather than its destruction. He served as a contractor in the Middle East for 10 years, teaching Arab students how to fly the Apache helicopter. He is an Afghanistan veteran and a commercial pilot. He grew up in northern Virginia and lived in San Diego for many years, where he enjoyed surfing and camping in the deserts of Southern California.