Decentralized AI Discussion: Interview with Emad Mostaque Days After His Resignation as Stability CEO

by Video Man (@videoman)

1 min read, 2024/04/02

TL;DR

Emad's resignation as CEO of Stability AI, his next steps toward decentralized AI, and why it is so urgent to work on decentralized AI. Full text below.

STORY'S CREDIBILITY

Interview

Between Two Computer Monitors: This story includes an interview between the writer and guest/interviewee.


[00:00:00] Emad: These organizations are telling you that they're building something that could kill you, and something that could remove all our freedom and liberty, and they're saying it's a good thing, you should back them because it's cool. They don't care about the revenue; they have the political power; people are scared of them. Power should not be invested in any one individual. If I can accelerate this over the next period, I don't have to make an impact. I should not have any power, whereas again you see everyone else trying to get more and more power. The only way that you can beat it, to create the standard that represents humanity, is decentralized intelligence. It's collective intelligence. The data sets and norms from that will be ones that help children, that help people suffering, that reflect our moral upstanding and the best of us, and gather the best of us to do it.

[Music]

it a week ago EOD stock was on my stage

[00:00:58] : [00:01:00]

at abundance 360 talking about the

[00:01:00] : [00:01:03]

future of open- source ai democratized

[00:01:03] : [00:01:07]

decentralized ai the day after a360 he

[00:01:07] : [00:01:09]

stepped down as CEO of stability now 5

[00:01:09] : [00:01:12]

days later I've sat down with iMat to

[00:01:12] : [00:01:14]

talk about why he's stepping down what

[00:01:14] : [00:01:16]

he's doing next the future of AI he

[00:01:16] : [00:01:18]

takes the gloves off he talks about the

[00:01:18] : [00:01:21]

dangers of centralized Ai and the

[00:01:21] : [00:01:24]

potential for decentralized democratized

[00:01:24] : [00:01:26]

AI to be the only Avenue that truly

[00:01:26] : [00:01:29]

uplifts all of humanity all right uh if

[00:01:29] : [00:01:32]

you like this episode please subscribe

[00:01:32] : [00:01:34]

let's jump in if you're a mood shot

[00:01:34] : [00:01:36]

entrepreneur this is an episode you're

[00:01:36] : [00:01:38]

not going to want to miss all right now

[00:01:38] : [00:01:41]

on to emod good morning emod good to see

[00:01:41] : [00:01:44]

you my friend that's as always pizza so

[00:01:44] : [00:01:47]

you and I were on stage literally last

[00:01:47] : [00:01:50]

week at the 2024 abundance Summit

[00:01:50] : [00:01:55]

talking about uh the whole open- Source

[00:01:55] : [00:01:57]

AI movement you were beginning to talk

[00:01:57] : [00:02:00]

about decentralized AI you talking about

[00:02:00] : [00:02:02]

where stability was the speed of the

[00:02:02] : [00:02:05]

development of the different uh products

[00:02:05] : [00:02:08]

and the day after the abundance Summit

[00:02:08] : [00:02:12]

was over uh the news hit that you had

[00:02:12] : [00:02:16]

stepped down as CEO of stability and

[00:02:16] : [00:02:19]

stepped off the board um so let's begin

[00:02:19] : [00:02:22]

with the obvious question uh why what

[00:02:22] : [00:02:26]

happened um and I I have huge respect

[00:02:26] : [00:02:28]

for you and I know a lot of the issues

[00:02:28] : [00:02:29]

in the past but I'd like you to have a

[00:02:29] : [00:02:33]

chance to share with entrepreneurs out

[00:02:33] : [00:02:36]

there and folks interested in AI exactly

[00:02:36] : [00:02:38]

your side of what's

[00:02:38] : [00:02:40]

Happening yeah thanks I think that uh

[00:02:40] : [00:02:44]

Elon Musk once characterize being a CEO

[00:02:44] : [00:02:47]

as uh staring into the abyss and chewing

[00:02:47] : [00:02:50]

glass uh because you are looking at a

[00:02:50] : [00:02:52]

very uncertain future having to make

[00:02:52] : [00:02:54]

decisions and the chewing glass is all

[00:02:54] : [00:02:55]

the problems that come to you all the

[00:02:55] : [00:02:58]

time and it's required to steer the ship

[00:02:58] : [00:03:01]

when things are incredibly be uncertain

[00:03:01] : [00:03:04]

and stability it's a pretty unique

[00:03:04] : [00:03:06]

company at a unique time like we hired

[00:03:06] : [00:03:08]

our first developer and researcher two

[00:03:08] : [00:03:11]

years ago and then in those two years we

[00:03:11] : [00:03:13]

built the best models of almost every

[00:03:13] : [00:03:15]

type except for large language image

[00:03:15] : [00:03:19]

audio 3D Etc and had over 300 million

[00:03:19] : [00:03:21]

downloads of the various models we

[00:03:21] : [00:03:23]

created and supported which was a bit

[00:03:23] : [00:03:26]

crazy and then generative AI is crazy in

[00:03:26] : [00:03:29]

terms of usually in a startup you don't

[00:03:29] : [00:03:31]

have to deal with global leaders and

[00:03:31] : [00:03:33]

policy debates about the future of

[00:03:33] : [00:03:37]

humanity and AGI and everything else at

[00:03:37] : [00:03:40]

the same time with building code yeah at

[00:03:40] : [00:03:42]

the same time as building code yes and

[00:03:42] : [00:03:44]

um especially be building code a take

[00:03:44] : [00:03:47]

fraction of the resources of our

[00:03:47] : [00:03:50]

competitors um like we had uh certain

[00:03:50] : [00:03:53]

teams who offered triple their entire

[00:03:53] : [00:03:56]

packages to move to other companies I

[00:03:56] : [00:03:58]

was grateful that only a couple of

[00:03:58] : [00:04:00]

researchers before

[00:04:00] : [00:04:02]

N I was going to leave left for other

[00:04:02] : [00:04:05]

companies and that was just startups no

[00:04:05] : [00:04:07]

one left for another big company which I

[00:04:07] : [00:04:08]

think is Testament to their kind of

[00:04:08] : [00:04:11]

loyalty in the mission but you know what

[00:04:11] : [00:04:13]

we've seen over the last year is or last

[00:04:13] : [00:04:15]

half year in particular is the question

[00:04:15] : [00:04:17]

of governance in AI is something that's

[00:04:17] : [00:04:19]

incredibly

[00:04:19] : [00:04:24]

important and who manages owns controls

[00:04:24] : [00:04:25]

this technology and how is it

[00:04:25] : [00:04:28]

distributed so we saw you know

[00:04:28] : [00:04:31]

everything from kind of open AI

[00:04:31] : [00:04:31]

because we've had this

[00:04:31] : [00:05:34]

conversation uh because it was there was

[00:05:34] : [00:05:36]

a lot of pressure asking whether for you

[00:05:36] : [00:05:38]

to step down as

[00:05:38] : [00:05:43]

CEO um and I think Founders typically

[00:05:43] : [00:05:45]

want to see themselves or or feel they

[00:05:45] : [00:05:48]

need to be the CEO and I I've heard you

[00:05:48] : [00:05:50]

say recently you know that you view

[00:05:50] : [00:05:52]

yourself more as a founder and

[00:05:52] : [00:05:55]

strategist than a CEO is that a fair

[00:05:55] : [00:05:56]

assessment yeah I think everyone's got

[00:05:56] : [00:05:59]

their own skill sets right so I'm

[00:05:59] : [00:06:01]

particular great at taking creatives uh

[00:06:01] : [00:06:03]

developers researchers others and

[00:06:03] : [00:06:04]

achieving their full potential and

[00:06:04] : [00:06:07]

designing systems but I should not be

[00:06:07] : [00:06:11]

dealing with you know HR and operations

[00:06:11] : [00:06:12]

and business development and other

[00:06:12] : [00:06:14]

elements they're far better people than

[00:06:14] : [00:06:17]

me to do that so now for example our

[00:06:17] : [00:06:19]

most popular thing stable diffusion and

[00:06:19] : [00:06:21]

comf UI the system around it is the most

[00:06:21] : [00:06:24]

widely used image software models in the

[00:06:24] : [00:06:26]

world there are great media CEOs that

[00:06:26] : [00:06:28]

can take that and amplify that to make

[00:06:28] : [00:06:29]

hundreds of millions of Revenue so

[00:06:29] : [00:06:33]

should come in and me on that so why now

[00:06:33] : [00:06:35]

pal what is there anything that

[00:06:35] : [00:06:40]

specifically tipped for you that has I

[00:06:40] : [00:06:43]

mean because you know it has you know

[00:06:43] : [00:06:45]

you have done an extraordinary job this

[00:06:45] : [00:06:46]

has been your

[00:06:46] : [00:06:50]

baby I mean how you have to

[00:06:50] : [00:06:53]

feel a a whole slew of emotional

[00:06:53] : [00:06:56]

elements and and I've had to step down

[00:06:56] : [00:06:59]

as CEO uh on two occasions over the 27

[00:06:59] : [00:07:00]

companies

[00:07:00] : [00:07:02]

I've had to sell a company for pennies

[00:07:02] : [00:07:06]

on the dollar and it takes an emotional

[00:07:06] : [00:07:10]

um hardship on you yeah I've um had

[00:07:10] : [00:07:13]

calls for me to step down as CEO since

[00:07:13] : [00:07:17]

the well since 2022 you know uh but I

[00:07:17] : [00:07:18]

always thought you know what's best for

[00:07:18] : [00:07:20]

the company in the mission and where I

[00:07:20] : [00:07:21]

look at the world right now there's a

[00:07:21] : [00:07:24]

few things say the company has momentum

[00:07:24] : [00:07:26]

it has spread it's turning into a

[00:07:26] : [00:07:28]

business like last year I said let's not

[00:07:28] : [00:07:31]

enter into large new contracts because

[00:07:31] : [00:07:33]

technology isn't mature yet and our

[00:07:33] : [00:07:34]

processes aren't mature yet and you have

[00:07:34] : [00:07:36]

to deliver so we did a lot of

[00:07:36] : [00:07:37]

experimental things we're setting up and

[00:07:37] : [00:07:39]

again now it's ramping on a business

[00:07:39] : [00:07:41]

side on a technology side the technology

[00:07:41] : [00:07:44]

is maturing diffusion Transformers such

[00:07:44] : [00:07:46]

as stable diffusion three and Sora are

[00:07:46] : [00:07:47]

going to be the next big thing and again

[00:07:47] : [00:07:50]

stability has got a great place there

[00:07:50] : [00:07:52]

but I think there's also the macro on

[00:07:52] : [00:07:55]

this so if you look at the open aai um

[00:07:55] : [00:07:58]

CEO for thing you know Sam mman said the

[00:07:58] : [00:08:00]

Bor conf fire anytime this is the

[00:08:00] : [00:08:02]

governments of open Ai and then they

[00:08:02] : [00:08:07]

fired him and then he is back on and he

[00:08:07] : [00:08:09]

appoints himself back on the board

[00:08:09] : [00:08:10]

there's clearly no governance opening I

[00:08:10] : [00:08:12]

mean I respect the people on the board

[00:08:12] : [00:08:13]

greatly you I think there're some great

[00:08:13] : [00:08:16]

individuals but who should manage the

[00:08:16] : [00:08:18]

technology that drives humanity and

[00:08:18] : [00:08:20]

teaches every child and manages our

[00:08:20] : [00:08:22]

government who's really leading on that

[00:08:22] : [00:08:24]

that can build these models and do those

[00:08:24] : [00:08:26]

things as you know I've always wanted to

[00:08:26] : [00:08:29]

build the science models and the the

[00:08:29] : [00:08:30]

health teams done that I want been doing

[00:08:30] : [00:08:33]

the education work and then my concept

[00:08:33] : [00:08:35]

of a national model for every country

[00:08:35] : [00:08:38]

owned by the people of the country all

[00:08:38] : [00:08:39]

tied together I think it needs to be by

[00:08:39] : [00:08:43]

a web3 not crypto or necessary token

[00:08:43] : [00:08:45]

framework that's something that's a

[00:08:45] : [00:08:47]

brand new kind of Challenge and one that

[00:08:47] : [00:08:49]

I think there's only a window of a year

[00:08:49] : [00:08:52]

or two to do if you have highly capable

[00:08:52] : [00:08:54]

models let's put aside AI for now which

[00:08:54] : [00:08:57]

you can discuss later really

[00:08:57] : [00:08:59]

accelerating then no one will be able to

[00:08:59] : [00:09:01]

keep up with that unless you build in a

[00:09:01] : [00:09:03]

decentralized manner and distributed

[00:09:03] : [00:09:06]

manner for data talents distribution

[00:09:06] : [00:09:08]

standards and more so there's only a

[00:09:08] : [00:09:10]

small window of time here to do that and

[00:09:10] : [00:09:12]

realistically again successful companies

[00:09:12] : [00:09:16]

and these things are all great but geni

[00:09:16] : [00:09:18]

is a bit bigger than the classical knobs

[00:09:18] : [00:09:20]

just like the whole life cycle of the

[00:09:20] : [00:09:21]

company was a lot faster than the

[00:09:21] : [00:09:23]

classical knobs so that's why I felt you

[00:09:23] : [00:09:26]

know now is the right time to make that

[00:09:26] : [00:09:28]

change and hopefully play my part in

[00:09:28] : [00:09:31]

making sure technology is distributed as

[00:09:31] : [00:09:34]

widely as possible and governed properly

[00:09:34] : [00:09:35]

as pretty much I think I'm the only real

[00:09:35] : [00:09:38]

independent agent that has built

[00:09:38] : [00:09:39]

state-of-the-art models in the world

[00:09:39] : [00:09:43]

right now yeah we've seen a lot of

[00:09:43] : [00:09:47]

turbulence with open AI we just saw

[00:09:47] : [00:09:51]

Mustafa um from inflection become part

[00:09:51] : [00:09:55]

of Microsoft um and I am curious I mean

[00:09:55] : [00:09:58]

you you had a a now famous conversation

[00:09:58] : [00:10:00]

with Satia

[00:10:00] : [00:10:02]

uh a couple of days after stepping down

[00:10:02] : [00:10:05]

um was that investigatory on your part

[00:10:05] : [00:10:08]

or was that just a touch base with an

[00:10:08] : [00:10:11]

old friend that's just ch trolling

[00:10:11] : [00:10:12]

actually you know like to let off some

[00:10:12] : [00:10:15]

steam that picture was I think from year

[00:10:15] : [00:10:19]

or two year or so ago

[00:10:19] : [00:10:22]

okay but you know um I think Sati is an

[00:10:22] : [00:10:25]

amazing CEO and uh you know he responds

[00:10:25] : [00:10:26]

again like the top CEOs incredibly

[00:10:26] : [00:10:30]

quickly when you message um and he's got

[00:10:30] : [00:10:32]

a great vision but there is again this

[00:10:32] : [00:10:34]

concern about consolidation in Tech we

[00:10:34] : [00:10:36]

didn't take money from any trillion

[00:10:36] : [00:10:39]

dollar companies at stability you know

[00:10:39] : [00:10:41]

we remained and retained full

[00:10:41] : [00:10:45]

Independence you know um and you know

[00:10:45] : [00:10:47]

the jman of some of the elements that we

[00:10:47] : [00:10:48]

could have taken very big checks and

[00:10:48] : [00:10:52]

other things um and even though you have

[00:10:52] : [00:10:54]

good intention R room that companies are

[00:10:54] : [00:10:57]

like slow Dumb AIS that over optimize

[00:10:57] : [00:10:59]

the various things that's not certainly

[00:10:59] : [00:11:01]

in the best interest of humanity when

[00:11:01] : [00:11:02]

you have

[00:11:02] : [00:11:05]

infrastructure it's like this is the

[00:11:05] : [00:11:07]

airports the railways the roads of the

[00:11:07] : [00:11:10]

future AI is an infrastructure G is an

[00:11:10] : [00:11:12]

infrastructure but it should be and

[00:11:12] : [00:11:14]

should it be Consolidated under the

[00:11:14] : [00:11:16]

control of a few private companies with

[00:11:16] : [00:11:18]

unclear objective functions again the

[00:11:18] : [00:11:20]

people in the companies may be great I

[00:11:20] : [00:11:23]

don't think so and this is a key concern

[00:11:23] : [00:11:25]

and part of that was that commentary

[00:11:25] : [00:11:26]

again he's doing an amazing job he's

[00:11:26] : [00:11:29]

consolidating a lot of power for the the

[00:11:29] : [00:11:31]

good of the company and also he has a I

[00:11:31] : [00:11:33]

think genuinely good heart of Mission to

[00:11:33] : [00:11:34]

bring technology to the

[00:11:34] : [00:11:38]

world but it is a bit concerning right

[00:11:38] : [00:11:39]

especially with the new types of

[00:11:39] : [00:11:41]

structure you're speaking of Sam in this

[00:11:41] : [00:11:45]

case Satia here oh SAA like like again I

[00:11:45] : [00:11:46]

think uh the commentary was always

[00:11:46] : [00:11:48]

interesting like you know Satia playing

[00:11:48] : [00:11:50]

4D chess you know assembling the AI

[00:11:50] : [00:11:52]

Avengers yeah I mean he is he's building

[00:11:52] : [00:11:54]

an amazing massive Talent covering the

[00:11:54] : [00:11:56]

bases and Microsoft is doing incredibly

[00:11:56] : [00:11:59]

well here right if you asked who's doing

[00:11:59] : [00:12:02]

gen to people would say Microsoft um but

[00:12:02] : [00:12:03]

there is has to be concerns about

[00:12:03] : [00:12:05]

consolidation of talent and power and

[00:12:05] : [00:12:08]

reach before I get to your vision going

[00:12:08] : [00:12:10]

forward because it's so important and

[00:12:10] : [00:12:14]

what you're doing next um I just again

[00:12:14] : [00:12:19]

as a Founder as a CEO of a moonshot

[00:12:19] : [00:12:22]

company we have a lot of those listening

[00:12:22] : [00:12:25]

here can I ask how are you feeling right

[00:12:25] : [00:12:28]

now because the decision to step down

[00:12:28] : [00:12:30]

has to have huge emotional are you

[00:12:30] : [00:12:34]

feeling relief are you feeling anxiety

[00:12:34] : [00:12:36]

um what's what's the feeling after

[00:12:36] : [00:12:39]

making that momentous

[00:12:39] : [00:12:43]

decision uh I was a big feeling of

[00:12:43] : [00:12:46]

relief um you know

[00:12:46] : [00:12:48]

because there's the there's that

[00:12:48] : [00:12:51]

Japanese concept of iy I I know iy and I

[00:12:51] : [00:12:54]

love it yes yeah do what you good at do

[00:12:54] : [00:12:55]

what you like and do what you believe

[00:12:55] : [00:12:57]

you're adding value and other people do

[00:12:57] : [00:13:01]

too you know like realistically again I

[00:13:01] : [00:13:03]

think I was an excellent research leader

[00:13:03] : [00:13:05]

strategist other things but I didn't

[00:13:05] : [00:13:07]

communicate properly or hire the right

[00:13:07] : [00:13:09]

other leaders in certain other areas of

[00:13:09] : [00:13:10]

the company and they're better people to

[00:13:10] : [00:13:13]

do that and so I wasn't doing what I was

[00:13:13] : [00:13:15]

best at a lot or I could have the most

[00:13:15] : [00:13:18]

measurable value and it was tying down

[00:13:18] : [00:13:19]

you know there's a lot of Legacy you

[00:13:19] : [00:13:22]

know technical organizational other debt

[00:13:22] : [00:13:25]

especially when you grow so so so fast

[00:13:25] : [00:13:26]

and you know we were lucky that we had

[00:13:26] : [00:13:28]

high retention in the important areas

[00:13:28] : [00:13:30]

and we could execute in spite of all of

[00:13:30] : [00:13:32]

that in spite of going the big company

[00:13:32] : [00:13:34]

route I think you know moonot Founders

[00:13:34] : [00:13:35]

you have to do it because you don't have

[00:13:35] : [00:13:37]

the resources at the start and you have

[00:13:37] : [00:13:39]

to guide the ship you know as it goes

[00:13:39] : [00:13:41]

out from Port but there does come that

[00:13:41] : [00:13:44]

transition point there and there is a

[00:13:44] : [00:13:46]

competing thing where you typically take

[00:13:46] : [00:13:48]

on VC money which has its own objective

[00:13:48] : [00:13:51]

function versus your overall mission so

[00:13:51] : [00:13:53]

again if you look at the generative AI

[00:13:53] : [00:13:55]

world right now how many

[00:13:55] : [00:13:58]

credible intelligent independent voices

[00:13:58] : [00:14:01]

are there there theyve had the ability

[00:14:01] : [00:14:03]

to build models and design things and

[00:14:03] : [00:14:05]

make an impact you know there's not many

[00:14:05] : [00:14:07]

so I was like that's where I can add my

[00:14:07] : [00:14:10]

most leverage and also the design space

[00:14:10] : [00:14:13]

is again is unprecedentedly huge because

[00:14:13] : [00:14:15]

the entire Market has just been

[00:14:15] : [00:14:19]

created like where does geny not fit and

[00:14:19] : [00:14:21]

where does it not touch and what needs

[00:14:21] : [00:14:22]

to be built there we need to actually

[00:14:22] : [00:14:24]

have the agency to go and build that and

[00:14:24] : [00:14:29]

so I felt tired relieved um I felt that

[00:14:29] : [00:14:32]

now there's a million options I want to

[00:14:32] : [00:14:34]

rather than taking a long break get on

[00:14:34] : [00:14:35]

with things I've just done the first

[00:14:35] : [00:14:37]

thing in kind of web 3 and I've got a

[00:14:37] : [00:14:38]

whole bunch of other things we're going

[00:14:38] : [00:14:42]

to discuss kind of coming and catalyze

[00:14:42] : [00:14:44]

stuff that can make an exponential

[00:14:44] : [00:14:46]

benefit because you know like massively

[00:14:46] : [00:14:49]

transformative purpose here is I want

[00:14:49] : [00:14:51]

every kid to achieve their potential and

[00:14:51] : [00:14:53]

give them the tools to do that and and I

[00:14:53] : [00:14:55]

love you for that um because you've been

[00:14:55] : [00:14:58]

true to that that vision and I know on

[00:14:58] : [00:15:00]

the heels of your announcement you've

[00:15:00] : [00:15:02]

been reached out to by national leaders

[00:15:02] : [00:15:06]

by corporate you know by CEOs and major

[00:15:06] : [00:15:08]

investment groups and um you have a lot

[00:15:08] : [00:15:12]

of opportunity ahead of you so let's

[00:15:12] : [00:15:15]

talk about where you want to go next um

[00:15:15] : [00:15:18]

you mentioned publicly and you discussed

[00:15:18] : [00:15:21]

on our abundant stage the idea of of uh

[00:15:21] : [00:15:25]

democratized and decentralized AI um

[00:15:25] : [00:15:27]

let's define that first what is that why

[00:15:27] : [00:15:29]

is it important and what do you want to

[00:15:29] : [00:15:33]

do there yeah I think that when I said

[00:15:33] : [00:15:35]

I'm going to move to do my part in

[00:15:35] : [00:15:37]

decentralizing people like isn't that

[00:15:37] : [00:15:39]

just open source you give the technology

[00:15:39] : [00:15:41]

right and then anyone can use it but it

[00:15:41] : [00:15:44]

isn't a decentralizing AI has a few

[00:15:44] : [00:15:46]

important components one is availability

[00:15:46] : [00:15:48]

and accessibility everyone should be

[00:15:48] : [00:15:51]

able to access this technology the foods

[00:15:51] : [00:15:52]

the labor and there's some very

[00:15:52] : [00:15:54]

interesting political and other elements

[00:15:54] : [00:15:57]

around that number two is the governance

[00:15:57] : [00:15:58]

of this

[00:15:58] : [00:15:59]

technology you have centralized

[00:15:59] : [00:16:01]

governance because the models are the

[00:16:01] : [00:16:04]

data there's a recent data bricks thing

[00:16:04] : [00:16:05]

uh model where they show that you have

[00:16:05] : [00:16:06]

massive improvements from data we all

[00:16:06] : [00:16:09]

know that you know who governs the data

[00:16:09] : [00:16:11]

that teaches your child or manages your

[00:16:11] : [00:16:13]

health or runs your

[00:16:13] : [00:16:15]

government that's an important question

[00:16:15] : [00:16:16]

I think too few are asking and we need

[00:16:16] : [00:16:18]

dat transparency and other things like

[00:16:18] : [00:16:22]

that um so accessibility you know you've

[00:16:22] : [00:16:25]

got the um governance aspect of that and

[00:16:25] : [00:16:27]

then finally you have how does it all

[00:16:27] : [00:16:30]

come together is it a single package or

[00:16:30] : [00:16:33]

is it a modulized

[00:16:33] : [00:16:35]

infrastructure that people can build on

[00:16:35] : [00:16:37]

and is available kind of everywhere you

[00:16:37] : [00:16:39]

know does it require monoliths and

[00:16:39] : [00:16:41]

Central servers where if it goes down

[00:16:41] : [00:16:43]

and you have an outage on gp4 you're a

[00:16:43] : [00:16:45]

bit messed up or someone can attack and

[00:16:45] : [00:16:48]

corrupt it I think that those are kind

[00:16:48] : [00:16:49]

of the key elements that I was looking

[00:16:49] : [00:16:50]

at when I was talking about

[00:16:50] : [00:16:53]

decentralizing Ai and you know I've come

[00:16:53] : [00:16:56]

up with an infrastructure to do that I

[00:16:56] : [00:16:59]

hope um as well so if if you don't mind

[00:16:59] : [00:17:02]

let's double click on it even further so

[00:17:02] : [00:17:05]

um you mentioned we don't have long to

[00:17:05] : [00:17:09]

get there if that's a true statement why

[00:17:09] : [00:17:12]

don't we have long to get there and then

[00:17:12] : [00:17:15]

what does getting there look like if you

[00:17:15] : [00:17:18]

had all the capital available and If the

[00:17:18] : [00:17:20]

right national leaders were hearing

[00:17:20] : [00:17:23]

about this because a lot of this is

[00:17:23] : [00:17:27]

supporting um supporting the populace of

[00:17:27] : [00:17:30]

a Nation um to have have ai that serves

[00:17:30] : [00:17:35]

them versus uh top down what's it look

[00:17:35] : [00:17:39]

like 25 10 years from now yeah I think

[00:17:39] : [00:17:41]

you'll have both proprietary and open

[00:17:41] : [00:17:43]

source Ai and they'll work in

[00:17:43] : [00:17:46]

combination the Practical example I give

[00:17:46] : [00:17:48]

is that sayi is like graduates right

[00:17:48] : [00:17:50]

very talented slightly of enthusiastic

[00:17:50] : [00:17:52]

graduates you got those and

[00:17:52] : [00:17:55]

Consultants um but I believe you know on

[00:17:55] : [00:17:56]

stage with mat Freedman last week at

[00:17:56] : [00:17:58]

a360 when we're there he said it's like

[00:17:58] : [00:18:00]

we discovered this New Concept what do

[00:18:00] : [00:18:03]

you call it AI Atlantis Atlantis yes yes

[00:18:03] : [00:18:04]

with 100 billion graduates that will

[00:18:04] : [00:18:07]

work for free yes I love that analogy it

[00:18:07] : [00:18:09]

was it was a brilliant analogy yeah we

[00:18:09] : [00:18:11]

need to figure out how to say Atlantis

[00:18:11] : [00:18:14]

you know um but but there's a few things

[00:18:14] : [00:18:16]

here first of all is the defaults you

[00:18:16] : [00:18:18]

know once a government Embraces

[00:18:18] : [00:18:20]

centralized technology it's very

[00:18:20] : [00:18:22]

difficult to decentralize it and every

[00:18:22] : [00:18:26]

country needs an AI strategy a year ago

[00:18:26] : [00:18:30]

one year ago was GPT 4 yeah crazy how

[00:18:30] : [00:18:33]

crazy is that you know at the AI safety

[00:18:33] : [00:18:37]

Summit uh in the UK the king of England

[00:18:37] : [00:18:40]

came on stage or came via video call and

[00:18:40] : [00:18:42]

he said that this is the biggest thing

[00:18:42] : [00:18:45]

since fire you know and that was like

[00:18:45] : [00:18:47]

what six seven months later where are we

[00:18:47] : [00:18:49]

going to be in here yeah I think he took

[00:18:49] : [00:18:53]

that from from uh uh from uh uh the

[00:18:53] : [00:18:57]

found the CEO of Google um AI is as

[00:18:57] : [00:19:00]

powerful as as fire and electricity yeah

[00:19:00] : [00:19:02]

yeah yeah I've heard the same from like

[00:19:02] : [00:19:03]

Jeff Bezos and a bunch of others you

[00:19:03] : [00:19:06]

know uh not Kindle Fire proper fire you

[00:19:06] : [00:19:09]

know um but then if you think about it

[00:19:09] : [00:19:11]

Norms are going to be set in this next

[00:19:11] : [00:19:13]

period like you know I'm in California

[00:19:13] : [00:19:15]

La at the moment if you don't set Norms

[00:19:15] : [00:19:18]

on rights for actors and the movie

[00:19:18] : [00:19:20]

industry then you could have a massive

[00:19:20] : [00:19:22]

disruption just occurring as fulllength

[00:19:22] : [00:19:24]

Hollywood features come in a year or two

[00:19:24] : [00:19:26]

generated you know if you don't have

[00:19:26] : [00:19:29]

Norms around open models and earn sh and

[00:19:29] : [00:19:30]

governance by the people it'll be top

[00:19:30] : [00:19:32]

down governance because governments

[00:19:32] : [00:19:34]

can't allow that to be out of control if

[00:19:34] : [00:19:36]

they don't have a reasonable

[00:19:36] : [00:19:39]

alternative um and I think the window is

[00:19:39] : [00:19:40]

only a year or two because every

[00:19:40] : [00:19:42]

government must have a strategy by the

[00:19:42] : [00:19:44]

end of the year and so I think if you

[00:19:44] : [00:19:46]

provide them a good solution that has

[00:19:46] : [00:19:48]

this element of democratic governance

[00:19:48] : [00:19:49]

and others that will be immensely

[00:19:49] : [00:19:53]

beneficial I think also it's urgent

[00:19:53] : [00:19:55]

because we have the ability to make a

[00:19:55] : [00:19:57]

huge difference you know as we kind of

[00:19:57] : [00:19:59]

May probably discuss later having we W

[00:19:59] : [00:20:01]

the knowledge of cancer longevity autism

[00:20:01] : [00:20:03]

at your fingertips we have the

[00:20:03] : [00:20:04]

technology for that right now we have

[00:20:04] : [00:20:06]

the technology that no one ever ever

[00:20:06] : [00:20:08]

alone again on those things or to give

[00:20:08] : [00:20:10]

every child a superior education

[00:20:10] : [00:20:13]

literally in a couple of years like

[00:20:13] : [00:20:15]

there is an urgency both from there's a

[00:20:15] : [00:20:17]

small window but also from we must do

[00:20:17] : [00:20:20]

this now because it can scale and make

[00:20:20] : [00:20:21]

that impact we have dreamed of for so

[00:20:21] : [00:20:23]

long the enabling technology is finally

[00:20:23] : [00:20:25]

you know it's finally good enough fast

[00:20:25] : [00:20:26]

enough and cheap enough everybody I want

[00:20:26] : [00:20:28]

to take a short break from our episode

[00:20:28] : [00:20:29]

to talk about company that's very

[00:20:29] : [00:20:32]

important to me and could actually save

[00:20:32] : [00:20:34]

your life or the life of someone that

[00:20:34] : [00:20:37]

you love company is called Fountain life

[00:20:37] : [00:20:38]

and it's a company I started years ago

[00:20:38] : [00:20:41]

with Tony Robbins and a group of very

[00:20:41] : [00:20:43]

talented Physicians you know most of us

[00:20:43] : [00:20:45]

don't actually know what's going on

[00:20:45] : [00:20:48]

inside our body we're all optimists

[00:20:48] : [00:20:50]

until that day when you have a pain in

[00:20:50] : [00:20:52]

your side you go to the physician or the

[00:20:52] : [00:20:54]

emergency room and they say listen I'm

[00:20:54] : [00:20:56]

sorry to tell you this but you have this

[00:20:56] : [00:20:59]

stage three or four going on and you

[00:20:59] : [00:21:01]

know it didn't start that morning it

[00:21:01] : [00:21:03]

probably was a problem that's been going

[00:21:03] : [00:21:06]

on for some time but because we never

[00:21:06] : [00:21:09]

look we don't find out so what we built

[00:21:09] : [00:21:12]

at Fountain life was the world's most

[00:21:12] : [00:21:14]

advanced diagnostic Centers we have four

[00:21:14] : [00:21:17]

across the us today and we're building

[00:21:17] : [00:21:19]

20 around the world these centers give

[00:21:19] : [00:21:22]

you a full body MRI a brain a brain

[00:21:22] : [00:21:25]

vasculature an AI enabled coronary CT

[00:21:25] : [00:21:28]

looking for soft plaque a DEXA scan a

[00:21:28] : [00:21:30]

Grail blood cancer test a full executive

[00:21:30] : [00:21:33]

blood workup it's the most advanced

[00:21:33] : [00:21:37]

workup you'll ever receive 150 GB of

[00:21:37] : [00:21:39]

data that then go to our AIs and our

[00:21:39] : [00:21:42]

physicians to find any disease at the

[00:21:42] : [00:21:45]

very beginning when it's solvable you're

[00:21:45] : [00:21:47]

going to find out eventually might as

[00:21:47] : [00:21:49]

well find out when you can take action

[00:21:49] : [00:21:51]

Fountain life also has an entire side of

[00:21:51] : [00:21:53]

Therapeutics we look around the world

[00:21:53] : [00:21:55]

for the most Advanced Therapeutics that

[00:21:55] : [00:21:57]

can add 10 20 healthy years to your life

[00:21:57] : [00:22:00]

and we provide them to you at our

[00:22:00] : [00:22:03]

centers so if this is of interest to you

[00:22:03] : [00:22:06]

please go and check it out go to

[00:22:06] : [00:22:08]

Fountain

[00:22:08] : [00:22:11]

life.com Peter when Tony and I wrote Our

[00:22:11] : [00:22:14]

New York Times bestseller life force we

[00:22:14] : [00:22:17]

had 30,000 people reach out to us for

[00:22:17] : [00:22:19]

Fountain life memberships if you go to

[00:22:19] : [00:22:21]

fountainlife.com/peter we'll put

[00:22:21] : [00:22:24]

you to the top of the list really it's

[00:22:24] : [00:22:27]

something that is um for me one of the

[00:22:27] : [00:22:28]

most important things I offer for my

[00:22:28] : [00:22:31]

entire family the CEOs of my companies

[00:22:31] : [00:22:34]

my friends it's a chance to really add

[00:22:34] : [00:22:38]

decades onto our healthy lifespans go to

[00:22:38] : [00:22:40]

fountainlife.com

[00:22:40] : [00:22:42]

it's one of the most important

[00:22:42] : [00:22:44]

things I can offer to you as one of my

[00:22:44] : [00:22:46]

listeners all right let's go back to our

[00:22:46] : [00:22:48]

episode so the let's talk about the

[00:22:48] : [00:22:49]

objective function of democratized and

[00:22:49] : [00:22:53]

decentralized AI um is it that the

[00:22:53] : [00:22:56]

compute is resident um in countries

[00:22:56] : [00:22:59]

around the world is it that the models

[00:22:59] : [00:23:02]

are owned by the citizens of the world

[00:23:02] : [00:23:06]

is it that data is owned um and how do

[00:23:06] : [00:23:09]

you get there from here I think that um

[00:23:09] : [00:23:11]

you can think of the supercomputers like

[00:23:11] : [00:23:13]

universities M you don't need many

[00:23:13] : [00:23:15]

universities honestly if someone's

[00:23:15] : [00:23:16]

building good quality models that's one

[00:23:16] : [00:23:19]

of the things at Stability and we did the

[00:23:19] : [00:23:20]

hard task we could have just stuck with

[00:23:20] : [00:23:22]

image we said no we're going to have the

[00:23:22] : [00:23:25]

best 3D image audio biomedical all these

[00:23:25] : [00:23:27]

models and no one else managed that

[00:23:27] : [00:23:29]

apart from OpenAI today in fact I

[00:23:29] : [00:23:31]

think we have more modalities than open

[00:23:31] : [00:23:33]

AI again kind of want to kind of

[00:23:33] : [00:23:35]

describe this is

[00:23:35] : [00:23:38]

accessibility and governance and a few

[00:23:38] : [00:23:40]

of these other factors so I think what

[00:23:40] : [00:23:42]

it means is that this technology is

[00:23:42] : [00:23:45]

available to everyone but you see now

[00:23:45] : [00:23:46]

that you don't necessarily need giant

[00:23:46] : [00:23:49]

supercomputers to even run it you know

[00:23:49] : [00:23:50]

we showed you a language model

[00:23:50] : [00:23:53]

running on the laptop Stable LM 2 will run

[00:23:53] : [00:23:56]

in a gigabyte on a MacBook Air faster than

[00:23:56] : [00:23:57]

you can read you know we're writing some

[00:23:57] : [00:24:01]

poems various things you know um we see

[00:24:01] : [00:24:03]

stable diffusion now at 300 images a

[00:24:03] : [00:24:06]

second on a consumer graphics card our

[00:24:06] : [00:24:09]

video model was like 5 gigabytes of VRAM this

[00:24:09] : [00:24:11]

really changes the equation because in

[00:24:11] : [00:24:13]

web 2 all the intelligence was

[00:24:13] : [00:24:14]

centralized on these giant servers and

[00:24:14] : [00:24:17]

big data now you have big supercomputers

[00:24:17] : [00:24:19]

I think you'll need less with better

[00:24:19] : [00:24:21]

data training these graduates that can

[00:24:21] : [00:24:23]

go out and be customized to each country

[00:24:23] : [00:24:25]

but they must reflect the culture of

[00:24:25] : [00:24:27]

that country like the Japanese stable

[00:24:27] : [00:24:29]

diffusion model we had if you typed in

[00:24:29] : [00:24:31]

salaryman it gave you a very sad person

[00:24:31] : [00:24:33]

versus the base model giving you a very

[00:24:33] : [00:24:36]

happy person right so you must have

[00:24:36] : [00:24:38]

graduates that reflect the local culture

[00:24:38] : [00:24:40]

and then reflect the local knowledge and

[00:24:40] : [00:24:45]

then Global models again that reflect

[00:24:45] : [00:24:48]

our Global Knowledge and can be accessed

[00:24:48] : [00:24:50]

by anyone but who decides what goes in

[00:24:50] : [00:24:52]

there these are some very important

[00:24:52] : [00:24:55]

questions and who vouches for the

[00:24:55] : [00:24:58]

quality as well what's your advice to a

[00:24:58] : [00:25:00]

a national leader because you know we're

[00:25:00] : [00:25:03]

now starting to see Ministers of AI in

[00:25:03] : [00:25:06]

different nation states and what what's

[00:25:06] : [00:25:09]

your advice to them right now in this

[00:25:09] : [00:25:12]

area I think my advice to them would be

[00:25:12] : [00:25:15]

to start collecting the data sets that

[00:25:15] : [00:25:17]

they would teach a graduate that was

[00:25:17] : [00:25:19]

very smart through school and kind of

[00:25:19] : [00:25:21]

other things this is National broadcast

[00:25:21] : [00:25:24]

data this is the curriculum this is

[00:25:24] : [00:25:26]

their accounting legal and others and

[00:25:26] : [00:25:27]

note that those data sets are

[00:25:27] : [00:25:29]

infrastructure

[00:25:29] : [00:25:32]

they will enable the local populace and

[00:25:32] : [00:25:34]

others to create these models because

[00:25:34] : [00:25:35]

models are just data wrapped in

[00:25:35] : [00:25:37]

algorithms with a bit of compute that's

[00:25:37] : [00:25:40]

the recipe compute algorithms and data

[00:25:40] : [00:25:43]
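Emad's recipe here, that models are just data wrapped in algorithms with a bit of compute, can be made concrete with a toy sketch (a bigram counter in Python; purely illustrative, not anything from Stability's actual stack):

```python
from collections import Counter, defaultdict

def train(corpus: str):
    # "Data wrapped in algorithms with a bit of compute":
    # the algorithm is bigram counting, the compute is this loop,
    # and the resulting counts ARE the model.
    model = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def predict(model, word: str):
    # Query the trained artifact: most frequent follower of `word`.
    followers = model.get(word)
    return followers.most_common(1)[0][0] if followers else None

data = "the model is the data the data is the recipe"
model = train(data)
print(predict(model, "the"))  # -> data
```

Swap in a real corpus and the same three ingredients scale up; the point Emad is making is that the data ingredient is the one nations should collect and own.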

and it's not going to be as hard as you

[00:25:43] : [00:25:45]

think to train these models but you have

[00:25:45] : [00:25:46]

to build them to good standards so by

[00:25:46] : [00:25:49]

the end of next year probably year after

[00:25:49] : [00:25:53]

I would estimate that a Llama 70B level model

[00:25:53] : [00:25:55]

or a stable diffusion model so these are

[00:25:55] : [00:25:58]

two leading models in image and language

[00:25:58] : [00:26:01]

will cost about under $10,000 probably

[00:26:01] : [00:26:04]

even $1,000 to train and then it comes

[00:26:04] : [00:26:06]

all about the data and then it becomes

[00:26:06] : [00:26:08]

about the

[00:26:08] : [00:26:10]

standards you know it's it's it's

[00:26:10] : [00:26:13]

interesting um there is so much

[00:26:13] : [00:26:16]

knowledge in the world that

[00:26:16] : [00:26:19]

will vaporize sublimate over the decade

[00:26:19] : [00:26:23]

ahead as people die you know uh cultural

[00:26:23] : [00:26:25]

data locked up in people's minds and

[00:26:25] : [00:26:27]

stories and so forth that's never been

[00:26:27] : [00:26:29]

recorded it's an interesting time to

[00:26:29] : [00:26:31]

actually capture that data and

[00:26:31] : [00:26:34]

permanently store it into uh the

[00:26:34] : [00:26:35]

national

[00:26:35] : [00:26:38]

models yeah and again I think people

[00:26:38] : [00:26:40]

over focus on the models versus the data

[00:26:40] : [00:26:42]

sets I mean is data set yeah yeah yeah

[00:26:42] : [00:26:45]

with the exponential compute you can

[00:26:45] : [00:26:47]

recalibrate and improve the data as well

[00:26:47] : [00:26:49]

so right now a lot of the improvements

[00:26:49] : [00:26:51]

and models are actually synthetically

[00:26:51] : [00:26:54]

improving data and data quality um as

[00:26:54] : [00:26:55]

you said there's so much that can be

[00:26:55] : [00:26:57]

lost but now we can actually capture

[00:26:57] : [00:26:59]

this and the concepts and the other

[00:26:59] : [00:27:02]

guidance and have cross checks like you

[00:27:02] : [00:27:04]

can deconstruct laws you know you can

[00:27:04] : [00:27:08]

translate between contexts you can make

[00:27:08] : [00:27:10]

expert information available to everyone

[00:27:10] : [00:27:12]

because again you have this new

[00:27:12] : [00:27:14]

continent of AI Atlantis and all these

[00:27:14] : [00:27:16]

graduates soon to be Specialists that

[00:27:16] : [00:27:20]

are on your phone and that's incredibly

[00:27:20] : [00:27:23]

democratizing you know um because

[00:27:23] : [00:27:25]

otherwise the knowledge is throughout

[00:27:25] : [00:27:26]

history knowledge has always been gatekept

[00:27:26] : [00:27:29]

always

[00:27:29] : [00:27:31]

I want to get to uh health and education

[00:27:31] : [00:27:34]

next but before we go there I know you

[00:27:34] : [00:27:38]

were meeting with uh a mutual friend uh

[00:27:38] : [00:27:42]

Jules Urbach the other day um and uh

[00:27:42] : [00:27:44]

and Stability announced a deal with Otoy

[00:27:44] : [00:27:47]

Endeavor and render Network um are you

[00:27:47] : [00:27:50]

still an adviser to that Venture yeah no

[00:27:50] : [00:27:52]

this is part of the whole thing it's the

[00:27:52] : [00:27:55]

first of many web 3 kind of elements

[00:27:55] : [00:27:59]

there I think web 3 is 95% I'd say 90%

[00:27:59] : [00:28:00]

I'll be

[00:28:00] : [00:28:05]

generous um speculative and rubbish but

[00:28:05] : [00:28:07]

there is that 5 or 10% of genuine people that

[00:28:07] : [00:28:09]

have been thinking about questions of

[00:28:09] : [00:28:12]

governance coordination and others and

[00:28:12] : [00:28:14]

have built things that are proper so

[00:28:14] : [00:28:16]

Otoy is the bridge to the creative

[00:28:16] : [00:28:19]

industry that's why Ari Emanuel and

[00:28:19] : [00:28:21]

you know Eric Schmidt and others are on the

[00:28:21] : [00:28:23]

board um and the render network is a

[00:28:23] : [00:28:26]

million gpus largely from creative

[00:28:26] : [00:28:28]

professionals that are available and so

[00:28:28] : [00:28:31]

the first uh thing I announced there was

[00:28:31] : [00:28:34]

the initial 10 million gaps 250 million

[00:28:34] : [00:28:37]

of distributed compute to create the

[00:28:37] : [00:28:40]

best 3D data sets like at stability we

[00:28:40] : [00:28:42]

funded and worked with the Allen Institute

[00:28:42] : [00:28:44]

and others on Objaverse-XL which was 10

[00:28:44] : [00:28:46]

million high quality 3D assets we're

[00:28:46] : [00:28:48]

going to a billion distributed you don't

[00:28:48] : [00:28:51]

need giant supercomputers but then that

[00:28:51] : [00:28:52]

is a community good that is owned by the

[00:28:52] : [00:28:55]

people of the network and accessible to

[00:28:55] : [00:28:58]

non academic and others as well why

[00:28:58] : [00:29:00]

because you need high quality assets to

[00:29:00] : [00:29:02]

create better 3D models we have a new 3D

[00:29:02] : [00:29:04]

model TripoSR that can generate a 3D

[00:29:04] : [00:29:07]

image from a 2D image in 0.5 seconds and

[00:29:07] : [00:29:09]

that 3D model feeds into better 3D

[00:29:09] : [00:29:11]

assets and then what does that mean it

[00:29:11] : [00:29:12]

means we're heading towards the holodeck

[00:29:12] : [00:29:14]

without the data you're not going

[00:29:14] : [00:29:17]

to get there and Jules Jules wants the

[00:29:17] : [00:29:19]

holodeck for sure yeah so you know

[00:29:19] : [00:29:21]

Jules and I are on the same page you

[00:29:21] : [00:29:23]

know and you're not going to get there

[00:29:23] : [00:29:26]

without again a Commons of data that can

[00:29:26] : [00:29:28]

train the graduates that then becomes

[00:29:28] : [00:29:31]

specialized with Star Trek or you know

[00:29:31] : [00:29:33]

Star Wars or any of these other IPs and

[00:29:33] : [00:29:35]

then also setting standards

[00:29:35] : [00:29:39]

around monetization IP rights all sorts

[00:29:39] : [00:29:41]

of other things um and so a network like

[00:29:41] : [00:29:44]

render is really good for that but you

[00:29:44] : [00:29:45]

know I've been talking to a lot of

[00:29:45] : [00:29:46]

people in web 3 about the different

[00:29:46] : [00:29:49]

elements of the stack uh because what I

[00:29:49] : [00:29:51]

basically see is that we have the

[00:29:51] : [00:29:53]

opportunity to build almost a human

[00:29:53] : [00:29:56]

operating system uh models and data sets

[00:29:56] : [00:29:59]

for every nation every sect coordinated

[00:29:59] : [00:30:02]

through proper web 3 principles again

[00:30:02] : [00:30:04]

not speculative tokens or anything like

[00:30:04] : [00:30:07]

that you know making it so that every

[00:30:07] : [00:30:10]

child in the world or adult can create

[00:30:10] : [00:30:12]

anything they can imagine they can be

[00:30:12] : [00:30:14]

protected against the harms and they

[00:30:14] : [00:30:16]

have access to the right information at

[00:30:16] : [00:30:18]

the right time to thrive and again

[00:30:18] : [00:30:20]

that's infrastructure for everyone it's

[00:30:20] : [00:30:24]

a common good access to gpus has been

[00:30:24] : [00:30:27]

sort of The Limited fuel um do you think

[00:30:27] : [00:30:31]

decentralized GPU structures like render

[00:30:31] : [00:30:33]

is part of that future is an important

[00:30:33] : [00:30:36]

part of that future I think so right now

[00:30:36] : [00:30:39]

it's far more efficient to train models

[00:30:39] : [00:30:41]

on these again big supercomputers the

[00:30:41] : [00:30:43]

university but the rate of exponential

[00:30:43] : [00:30:46]

growth there again is insane last year to

[00:30:46] : [00:30:49]

train llama 2 cost $10 million in a year

[00:30:49] : [00:30:50]

it'll cost

[00:30:50] : [00:30:53]

$10,000 a thousand times improvement from

[00:30:53] : [00:30:56]

algorithms data super compute

[00:30:56] : [00:30:58]

speeds and that's crazy if you think

[00:30:58] : [00:31:00]

about it right um so I don't think this

[00:31:00] : [00:31:02]

will be the limiting factor I think the

[00:31:02] : [00:31:04]

GPU overhang for language models

[00:31:04] : [00:31:06]

probably lasts until the end of the year

[00:31:06] : [00:31:08]

but then there's plentiful Supply

[00:31:08] : [00:31:10]

because what you have is NVIDIA makes

[00:31:10] : [00:31:14]

amazing gpus at an 83 or 87% margin

[00:31:14] : [00:31:16]

right but the actual calculations aren't

[00:31:16] : [00:31:19]

complicated like we took Intel gpus and

[00:31:19] : [00:31:21]

we ran the stable diffusion 3 diffusion

[00:31:21] : [00:31:23]

transformer training so this is the same

[00:31:23] : [00:31:25]

technology that's used in Sora and

[00:31:25] : [00:31:27]

stable diffusion 3 is multimodal so it

[00:31:27] : [00:31:29]

can train Sora model with enough compute

[00:31:29] : [00:31:30]

and I think us and them were the only

[00:31:30] : [00:31:32]

people kind of doing this I think M

[00:31:32] : [00:31:36]

pixel as well um and then it ran faster

[00:31:36] : [00:31:39]

on the Intel GPUs than on the Nvidia

[00:31:39] : [00:31:42]

gpus but we know that it can run even

[00:31:42] : [00:31:44]

faster because it's not optimized for

[00:31:44] : [00:31:46]

either it's still running fast so what

[00:31:46] : [00:31:48]

you'll see is a commoditization of the

[00:31:48] : [00:31:50]

hardware once the architectures get

[00:31:50] : [00:31:53]

stabilized because GPT-4 is just a

[00:31:53] : [00:31:56]

research artifact Stable Diffusion was just

[00:31:56] : [00:31:58]

a research artifact you're not in the

[00:31:58] : [00:32:00]

engineering phase yet and you've got to

[00:32:00] : [00:32:03]

the point whereby this runs on MacBooks

[00:32:03] : [00:32:05]

it runs on other things so I think it's

[00:32:05] : [00:32:08]

a short-term phenomenon of the next year

[00:32:08] : [00:32:10]

because people were taking a point in

[00:32:10] : [00:32:12]

time and extrapolating it without taking

[00:32:12] : [00:32:15]

into account efficiencies

[00:32:15] : [00:32:17]

optimizations and the fact that models

[00:32:17] : [00:32:19]

that work on the edge and can go to your

[00:32:19] : [00:32:22]

private data will be more impactful than

[00:32:22] : [00:32:24]

generalized intelligence everyone's over

[00:32:24] : [00:32:26]

indexing on generalized intelligence and

[00:32:26] : [00:32:28]

building AI God

[00:32:28] : [00:32:30]

versus Amplified human intelligence

[00:32:30] : [00:32:31]

shall we

[00:32:31] : [00:32:35]
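The training-cost claim above, roughly $10 million for Llama 2 last year heading toward about $10,000, implies a combined ~1,000x gain; treating it as multiplicative factors gives a quick sanity check (the 10x/10x/10x split below is an illustrative assumption, not a breakdown Emad gives):

```python
# Hypothetical per-axis gains; only their product (the ~1,000x Emad cites)
# comes from the interview, the individual 10x figures are assumptions.
algorithm_gain = 10   # better architectures and training recipes
data_gain = 10        # curated/synthetic data needing fewer tokens
hardware_gain = 10    # faster, cheaper accelerator-hours

combined = algorithm_gain * data_gain * hardware_gain
cost_last_year = 10_000_000           # ~$10M to train Llama 2
cost_next_year = cost_last_year / combined
print(combined, int(cost_next_year))  # -> 1000 10000
```

Because the gains multiply rather than add, even modest simultaneous improvements on each axis compound into the thousand-fold drop he describes.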

say before we leave stability it's now

[00:32:35] : [00:32:40]

in the hands of uh of the chair and your

[00:32:40] : [00:32:42]

uh past CTO what's what do you imagine

[00:32:42] : [00:32:45]

the future of stability is going to be

[00:32:45] : [00:32:46]

going forward I know that you're not

[00:32:46] : [00:32:48]

involved anymore it's under different

[00:32:48] : [00:32:50]

leadership and there's a what's your

[00:32:50] : [00:32:52]

advice to them or where do you think

[00:32:52] : [00:32:55]

they're going to go you know I gave

[00:32:55] : [00:32:56]

the very basic advice I didn't want any

[00:32:56] : [00:32:58]

conflicts or anything because I'll be

[00:32:58] : [00:33:01]

setting up lots of new companies and you

[00:33:01] : [00:33:03]

know being a founder and a shareholder

[00:33:03] : [00:33:05]

and at Stability I'm a founder

[00:33:05] : [00:33:07]

shareholder in fact you're still the

[00:33:07] : [00:33:09]

majority shareholder I think as of right

[00:33:09] : [00:33:12]

now yeah just about just about yeah that

[00:33:12] : [00:33:14]

will change I'm sure money will come in

[00:33:14] : [00:33:17]

like we saw Cohere yesterday on I think

[00:33:17] : [00:33:19]

10 or 20 million of Revenue run rate

[00:33:19] : [00:33:21]

they're raising at five billion money

[00:33:21] : [00:33:23]

it's plentiful yes yeah with the

[00:33:23] : [00:33:27]

right leadership I think that um it can

[00:33:27] : [00:33:29]

again have an amazing part to play in

[00:33:29] : [00:33:30]

media and that's what I've suggested to

[00:33:30] : [00:33:33]

it um and again there's a great team

[00:33:33] : [00:33:34]

that continues to ship great models so

[00:33:34] : [00:33:36]

last week there was an amazing code

[00:33:36] : [00:33:38]

model next week amazing language audio

[00:33:38] : [00:33:40]

and other models are coming out so you

[00:33:40] : [00:33:41]

know you continue shipping and great

[00:33:41] : [00:33:45]

products around that too um so that was

[00:33:45] : [00:33:47]

kind of my advice to them let's focus on

[00:33:47] : [00:33:50]

media and take that forward but you know

[00:33:50] : [00:33:52]

I'm not the expert on the business side

[00:33:52] : [00:33:53]

of things I did the best I could my

[00:33:53] : [00:33:55]

expertise on setting this up and taking

[00:33:55] : [00:33:56]

it from zero to one not to

[00:33:56] : [00:34:01]

10 and uh yeah zero to one is is

[00:34:01] : [00:34:05]

definitely um a role that that you've

[00:34:05] : [00:34:08]

played here and allow it allow someone

[00:34:08] : [00:34:11]

else to take it the rest of the way uh

[00:34:11] : [00:34:15]

but the area that I know if I actually

[00:34:15] : [00:34:16]

there something I wanted to kind of

[00:34:16] : [00:34:18]

discuss here I think is quite important

[00:34:18] : [00:34:19]

please for again the founders listening

[00:34:19] : [00:34:21]

and the new startup

[00:34:21] : [00:34:24]

companies um there is an imbalance of

[00:34:24] : [00:34:26]

power when you

[00:34:26] : [00:34:28]

have very visionary very highly

[00:34:28] : [00:34:31]

competent leaders there like what I

[00:34:31] : [00:34:33]

found at stability is that everyone

[00:34:33] : [00:34:35]

would be waiting for me no matter how

[00:34:35] : [00:34:36]

competent because I was the one that

[00:34:36] : [00:34:39]

could see around the corners and I was a

[00:34:39] : [00:34:41]

bit good at everything even if I hired

[00:34:41] : [00:34:42]

people that had built billion dollar

[00:34:42] : [00:34:44]

startups or were leaders and research at

[00:34:44] : [00:34:47]

Google or kind of whatever um because

[00:34:47] : [00:34:49]

you have the outsize thing like what Bezos

[00:34:49] : [00:34:51]

says you have to speak last in some

[00:34:51] : [00:34:53]

cases and in some of the meetings cuz

[00:34:53] : [00:34:54]

otherwise everyone just does everything

[00:34:54] : [00:34:57]

you say and they also wait on you now

[00:34:57] : [00:34:59]

now what I find and what I told the team

[00:34:59] : [00:35:01]

is that you're flat as a power Dynamic

[00:35:01] : [00:35:02]

you're all on the same page you're all

[00:35:02] : [00:35:05]

kind of relatively equal owners and

[00:35:05] : [00:35:06]

it'll be interesting see how it evolves

[00:35:06] : [00:35:09]

from that uh given that there's actually

[00:35:09] : [00:35:10]

a business and again I think this is

[00:35:10] : [00:35:11]

something that you probably had a

[00:35:11] : [00:35:14]

challenge with more than other founders here

[00:35:14] : [00:35:17]

whereby they put more on your plate

[00:35:17] : [00:35:18]

because you are so Visionary and because

[00:35:18] : [00:35:21]

you're like up there in the future and

[00:35:21] : [00:35:23]

they're always waiting on you so you're

[00:35:23] : [00:35:24]

always like well my schedule's

[00:35:24] : [00:35:26]

completely packed my schedule now is

[00:35:26] : [00:35:27]

actually quite free which is also quite

[00:35:27] : [00:35:28]

LS

[00:35:28] : [00:35:29]

I've had a chance to speak with you

[00:35:29] : [00:35:31]

every day for the last few days so

[00:35:31] : [00:35:32]

that's been a pleasure to have extra

[00:35:32] : [00:35:35]

time on your schedule you know so so we

[00:35:35] : [00:35:39]

we do have a world of uh Visionary

[00:35:39] : [00:35:41]

founder-led CEO companies right so

[00:35:41] : [00:35:43]

you've got musk and you've got Bezos

[00:35:43] : [00:35:45]

historically and you had Steve Jobs and

[00:35:45] : [00:35:48]

you have and and that's both powerful

[00:35:48] : [00:35:51]

and dangerous the the power is the

[00:35:51] : [00:35:53]

ability for that because we don't ever

[00:35:53] : [00:35:56]

have a company that is pre-existing a

[00:35:56] : [00:36:02]

new CEO comes in has the same um uh both

[00:36:02] : [00:36:05]

chutzpah and and also the power of their

[00:36:05] : [00:36:08]

of their Vision

[00:36:08] : [00:36:12]

um the danger there you're concern

[00:36:12] : [00:36:15]

you're saying is not not allowing your

[00:36:15] : [00:36:18]

team to step up with their own Vision or

[00:36:18] : [00:36:20]

being over overly indexed on your on

[00:36:20] : [00:36:23]

your vision yeah I think that that can

[00:36:23] : [00:36:24]

be the issue and that's why I wanted

[00:36:24] : [00:36:27]

stability to again reach the point of

[00:36:27] : [00:36:29]

spread and revenue run rate increase and

[00:36:29] : [00:36:32]

other things before I did anything um

[00:36:32] : [00:36:34]

and I felt again this external pressure

[00:36:34] : [00:36:37]

in that if nobody or there's very few

[00:36:37] : [00:36:38]

people in the world actually thinking

[00:36:38] : [00:36:40]

properly about governance and spread and

[00:36:40] : [00:36:42]

others and a very small window given

[00:36:42] : [00:36:43]

the pace of this to make a difference and

[00:36:42] : [00:36:43]

a dent I believed I had a reasonable

[00:36:45] : [00:36:48]

approach to that but I couldn't while

[00:36:48] : [00:36:50]

remaining CEO of this company and again

[00:36:50] : [00:36:52]

it's a pretty unique scenario because

[00:36:52] : [00:36:54]

you've never seen a sector move this

[00:36:54] : [00:36:56]

fast that has such wide reaching human

[00:36:56] : [00:36:58]

implications

[00:36:58] : [00:37:02]

and regrettably there's too few people I

[00:37:02] : [00:37:05]

think with the right alignment and

[00:37:05] : [00:37:08]

approach in this area I've been very

[00:37:08] : [00:37:10]

disappointed like usually what happens

[00:37:10] : [00:37:12]

is you have power maximization equations

[00:37:12] : [00:37:13]

and this is what we're seeing from the

[00:37:13] : [00:37:14]

industry

[00:37:14] : [00:37:18]

consolidation how many people want to

[00:37:18] : [00:37:20]

genuinely bring this technology to kids

[00:37:20] : [00:37:24]

in Nigeria or to the global south or to

[00:37:24] : [00:37:26]

help those leaders build their own

[00:37:26] : [00:37:28]

models you know and believe all in a

[00:37:28] : [00:37:30]

positive-sum game that was actually my

[00:37:30] : [00:37:31]

biggest surprise from the discussion

[00:37:31] : [00:37:34]

Silicon Valley almost entirely they all

[00:37:34] : [00:37:35]

believe

[00:37:35] : [00:37:39]

in uh flat or negative sum you know zero

[00:37:39] : [00:37:40]

sum or negative sum things where there

[00:37:40] : [00:37:42]

has to be a winner versus everyone's a winner

[00:37:42] : [00:37:43]

in

[00:37:43] : [00:37:45]

this and again I was just very

[00:37:45] : [00:37:47]

disappointed seeing that I've been

[00:37:47] : [00:37:50]

asking you for a while to write and

[00:37:50] : [00:37:53]

distribute your vision white paper cuz

[00:37:53] : [00:37:54]

I've heard you describe it in detail and

[00:37:54] : [00:37:57]

it's brilliant and I still hope that the

[00:37:57] : [00:37:59]

world will see it soon enough everybody I

[00:37:59] : [00:38:00]

want to take a break from our episode to

[00:38:00] : [00:38:02]

tell you about an amazing company on a

[00:38:02] : [00:38:04]

mission to prevent and reverse chronic

[00:38:04] : [00:38:07]

disease by decoding your biology the

[00:38:07] : [00:38:10]

company is called viome and they offer

[00:38:10] : [00:38:12]

cuttingedge tests and personalized

[00:38:12] : [00:38:14]

products that help you optimize your gut

[00:38:14] : [00:38:17]

microbiome your oral microbiome and your

[00:38:17] : [00:38:19]

cellular Health as you probably know

[00:38:19] : [00:38:21]

your microbiome is a collection of

[00:38:21] : [00:38:24]

trillions of microbes that live in your

[00:38:24] : [00:38:26]

gut and mouth and these microbiomes

[00:38:26] : [00:38:28]

influence everything your

[00:38:28] : [00:38:30]

digestion immunity mood weight and many

[00:38:30] : [00:38:33]

other aspects of your health but not all

[00:38:33] : [00:38:36]

microbes are good for you some can cause

[00:38:36] : [00:38:38]

inflammation toxins and actually lead to

[00:38:38] : [00:38:41]

chronic diseases like diabetes heart

[00:38:41] : [00:38:43]

disease obesity and even cancer viome

[00:38:43] : [00:38:47]

uses Advanced mRNA technology and AI to

[00:38:47] : [00:38:50]

analyze your microbes and your cells and

[00:38:50] : [00:38:52]

give you personalized nutrition

[00:38:52] : [00:38:54]

recommendations and products designed

[00:38:54] : [00:38:56]

specifically for your genetics

[00:38:56] : [00:38:58]

specifically for your biology you can

[00:38:58] : [00:39:00]

choose from different tests depending on

[00:39:00] : [00:39:03]

your goals and needs ranging from

[00:39:03] : [00:39:04]

improving your gut health your oral

[00:39:04] : [00:39:07]

health cellular function or all of them

[00:39:07] : [00:39:09]

I've been using viome for the past 3

[00:39:09] : [00:39:11]

years I can tell you that it has made a

[00:39:11] : [00:39:13]

huge difference in my health and because

[00:39:13] : [00:39:16]

of the data they collect and the AI

[00:39:16] : [00:39:18]

engine they've built it gets better

[00:39:18] : [00:39:21]

every single day I love getting Health

[00:39:21] : [00:39:23]

scores and seeing how my diet and

[00:39:23] : [00:39:25]

lifestyle affects my microbiome and my

[00:39:25] : [00:39:27]

cells and I love getting precision

[00:39:27] : [00:39:30]

supplements and probiotics tailored for

[00:39:30] : [00:39:32]

my specific needs if you want to join me

[00:39:32] : [00:39:34]

on this journey of Discovery and improve

[00:39:34] : [00:39:37]

your health From the Inside Out viome

[00:39:37] : [00:39:39]

has a special offer for you for a

[00:39:39] : [00:39:42]

limited time you can get up to 40% off

[00:39:42] : [00:39:46]

any viome test using the code moonshots

[00:39:46] : [00:39:49]

just go to viome.com/moonshots and

[00:39:49] : [00:39:51]

order your test today trust me you

[00:39:51] : [00:39:52]

won't regret it all right let's go

[00:39:52] : [00:39:55]

back to our episode before we jump into

[00:39:55] : [00:39:57]

into health and education let's talk

[00:39:57] : [00:39:58]

about governance a second because we've

[00:39:58] : [00:40:01]

seen governance

[00:40:01] : [00:40:04]

um complicate this what is the right

[00:40:04] : [00:40:05]

governance structure for this super

[00:40:05] : [00:40:07]

powerful technology we have

[00:40:07] : [00:40:09]

representation of democracy that I think

[00:40:09] : [00:40:10]

can be improved by this like I don't

[00:40:10] : [00:40:12]

think democracy survives this technology

[00:40:12] : [00:40:14]

in its current form it will either

[00:40:14] : [00:40:17]

improve or it will end I don't see

[00:40:17] : [00:40:19]

anything else like um yesterday they

[00:40:19] : [00:40:22]

what does end what does end mean here a

[00:40:22] : [00:40:25]

a benign dictatorship a driven by an AI

[00:40:25] : [00:40:27]

Overlord yeah like yesterday there were

[00:40:27] : [00:40:31]

was an announcement of an app called Hume

[00:40:31] : [00:40:33]

which had emotionally intelligent speech

[00:40:33] : [00:40:35]

and it can understand your emotions and

[00:40:35] : [00:40:37]

talk with emotion you and I have to

[00:40:37] : [00:40:39]

discuss this yes you know where that's

[00:40:39] : [00:40:42]

going right full speech it's incredibly

[00:40:42] : [00:40:44]

powerful and governments have a tendency

[00:40:44] : [00:40:47]

I mean if Government but but say it say

[00:40:47] : [00:40:49]

it here it's important for you to State

[00:40:49] : [00:40:51]

what it means because we've

[00:40:51] : [00:40:54]

discussed it but help people here understand and

[00:40:54] : [00:40:57]

be ready for this democracy is all

[00:40:57] : [00:40:59]

about representation and you see the

[00:40:59] : [00:41:00]

questions of deep fakes and things

[00:41:00] : [00:41:02]

speech is one of the most impactful

[00:41:02] : [00:41:04]

elements there but now you can't believe

[00:41:04] : [00:41:08]

anything you see or hear everything so one

[00:41:08] : [00:41:12]

path that we have is a 1984 on steroids

[00:41:12] : [00:41:15]

panopticon you know where life is

[00:41:15] : [00:41:17]

gamified and you listen to whatever the

[00:41:17] : [00:41:18]

government says and they're incredibly

[00:41:18] : [00:41:20]

convincing and you're happy and you've

[00:41:20] : [00:41:21]

always been happy and you've always been

[00:41:21] : [00:41:24]

at war with Eurasia you know propaganda

[00:41:24] : [00:41:26]

on steroids the other part that you have

[00:41:26] : [00:41:28]

is things like citizen assemblies

[00:41:28] : [00:41:31]

consultative democracy the ability to

[00:41:31] : [00:41:33]

take right now you can take any of the

[00:41:33] : [00:41:35]

bills in Congress and completely

[00:41:35] : [00:41:38]

deconstruct them and find what the

[00:41:38] : [00:41:40]

motivations are you know you can check

[00:41:40] : [00:41:42]

laws against the constitution in a

[00:41:42] : [00:41:45]

second seconds this is incredibly

[00:41:45] : [00:41:47]

powerful empowering technology from a

[00:41:47] : [00:41:49]

democratic perspective so I see two

[00:41:49] : [00:41:51]

routes unfortunately because I think

[00:41:51] : [00:41:54]

that once the thing goes It goes really

[00:41:54] : [00:41:56]

fast centralized government control

[00:41:56] : [00:41:58]

increasing because the governments want

[00:41:58] : [00:42:00]

to protect themselves as an organization

[00:42:00] : [00:42:02]

and you know every party says the other

[00:42:02] : [00:42:04]

party is crap we've seen the increasing

[00:42:04] : [00:42:07]

polarization in America already and you

[00:42:07] : [00:42:08]

know fundamentally come on you can do

[00:42:08] : [00:42:10]

better than those two leaders that are

[00:42:10] : [00:42:13]

currently competing I'm saying this is

[00:42:13] : [00:42:15]

clearly our system is sclerotic across

[00:42:15] : [00:42:16]

this like democracy is the worst of all

[00:42:16] : [00:42:18]

systems except for everyone else we can

[00:42:18] : [00:42:20]

have a better

[00:42:20] : [00:42:21]

democracy where it's actually

[00:42:21] : [00:42:25]

representative and empowers the people

[00:42:25] : [00:42:27]

or we will have the end of democracy where

[00:42:27] : [00:42:30]

it is a 1984 panopticon in my opinion

[00:42:30] : [00:42:32]

because the momentum will go there of

[00:42:32] : [00:42:33]

course you will start using this

[00:42:33] : [00:42:35]

technology you're already seeing it

[00:42:35] : [00:42:37]

being used but not at scale and not

[00:42:37] : [00:42:40]

intelligently yet which is

[00:42:40] : [00:42:42]

scary I think we finally have the

[00:42:42] : [00:42:46]

technology for a direct uh democracy

[00:42:46] : [00:42:47]

versus a representative democracy right

[00:42:47] : [00:42:52]

where I can have my my desires directly

[00:42:52] : [00:42:56]

represented on any specific law or but I

[00:42:56] : [00:42:58]

think the point point you've made before

[00:42:58] : [00:43:01]

is that

[00:43:01] : [00:43:05]

speech um if you look back to everybody

[00:43:05] : [00:43:07]

from Hitler to some of the most

[00:43:07] : [00:43:09]

persuasive

[00:43:09] : [00:43:13]

politicians um is is a powerful tool and

[00:43:13] : [00:43:16]

AI can become the most persuasive

[00:43:16] : [00:43:19]

speaker out there it can take anyone's

[00:43:19] : [00:43:20]

speech and make it far more persuasive

[00:43:20] : [00:43:22]

like I think my voice is a bit whiny I

[00:43:22] : [00:43:24]

can remove the whine you know I can go in

[00:43:24] : [00:43:26]

a very polished British accent and other

[00:43:26] : [00:43:28]

things like that right you know must

[00:43:28] : [00:43:30]

fight them on the hills and the harriers

[00:43:30] : [00:43:32]

and whatever public speaking makes a big

[00:43:32] : [00:43:34]

difference someone took Hitler's

[00:43:34] : [00:43:36]

speeches and put them through an AI and

[00:43:36] : [00:43:39]

took them into English because when

[00:43:39] : [00:43:41]

we're not in the German context and we

[00:43:41] : [00:43:42]

listen to them it sounds like he's

[00:43:42] : [00:43:44]

shouting like what this crazy thing you

[00:43:44] : [00:43:46]

hear him in English it is very different

[00:43:46] : [00:43:49]

in his own voice just like someone took

[00:43:49] : [00:43:52]

Javier Milei's one in the United Nations and

[00:43:52] : [00:43:54]

put him into English again he sounds a

[00:43:54] : [00:43:57]

bit shouty but then he sounds very

[00:43:57] : [00:43:59]

reasonable when it's in English and you can

[00:43:59] : [00:44:01]

take the phonemes of Obama's best

[00:44:01] : [00:44:03]

speech and a bit of Trumpism and a bit

[00:44:03] : [00:44:05]

of Churchill and you'll have full

[00:44:05] : [00:44:07]

modulation wave control over all of this

[00:44:07] : [00:44:10]

people are already using this technology

[00:44:10] : [00:44:12]

to the point that everyone should have a passcode

[00:44:12] : [00:44:14]

with their loved ones because people are

[00:44:14] : [00:44:15]

getting calls from their

[00:44:15] : [00:44:18]

mother saying help I'm in an emergency I

[00:44:18] : [00:44:21]

need to send money right now and you

[00:44:21] : [00:44:22]

cannot tell it and it pulls at the

[00:44:22] : [00:44:24]

emotional strings and if you look at

[00:44:24] : [00:44:27]

something like US radio and you know one

[00:44:27] : [00:44:29]

side of the political divide is taking

[00:44:29] : [00:44:31]

over imagine if you're hearing optimized

[00:44:31] : [00:44:33]

speech every single

[00:44:33] : [00:44:35]

day that will have a huge impact and

[00:44:35] : [00:44:36]

then they control the visuals and they

[00:44:36] : [00:44:39]

control the other things we're not set

[00:44:39] : [00:44:41]

up for defenses yeah if it's if it's

[00:44:41] : [00:44:43]

optimized speech for you specifically

[00:44:43] : [00:44:46]

for you right for the kids you have the

[00:44:46] : [00:44:48]

age group they have where you live your

[00:44:48] : [00:44:50]

historical background and so forth an N

[00:44:50] : [00:44:54]

of one persuasive speech coming at you

[00:44:54] : [00:44:58]

um the brain is not set up for defenses

[00:44:58] : [00:45:00]

we're not and you know we can take this

[00:45:00] : [00:45:02]

as an example of the YouTube algorithm

[00:45:02] : [00:45:04]

like YouTube as an organization is not

[00:45:04] : [00:45:05]

an evil

[00:45:05] : [00:45:07]

organization but it's an organization

[00:45:07] : [00:45:09]

optimized for engagement which optimized

[00:45:09] : [00:45:11]

for more extreme content so there's some

[00:45:11] : [00:45:12]

dark place in YouTube which then

[00:45:12] : [00:45:16]

optimized for ISIS so ISIS videos spread

[00:45:16] : [00:45:18]

viral I don't know sometimes viral is

[00:45:18] : [00:45:19]

good sometimes viral is bad that one was

[00:45:19] : [00:45:23]

bad because it was extreme and they

[00:45:23] : [00:45:25]

didn't understand why and if you look at

[00:45:25] : [00:45:28]

it two of our largest generative AI

[00:45:28] : [00:45:32]

companies are Google and Meta and their

[00:45:32] : [00:45:34]

business is advertising their business

[00:45:34] : [00:45:37]

is manipulation and they are both amoral

[00:45:37] : [00:45:39]

companies because why would you expect a

[00:45:39] : [00:45:42]

company to have morality our governments

[00:45:42] : [00:45:45]

are also amoral and again you can view

[00:45:45] : [00:45:46]

these things as slow Dumb AI so you can

[00:45:46] : [00:45:49]

see the way they will optimize unless we

[00:45:49] : [00:45:51]

do something about it and they will have

[00:45:51] : [00:45:53]

full control like again you put on your

[00:45:53] : [00:45:56]

Vision Pro headset with your spatial

[00:45:56] : [00:46:00]

audio that is full sensory control not

[00:46:00] : [00:46:02]

full but you know what I mean a level we

[00:46:02] : [00:46:05]

full immersion full immersion full full

[00:46:05] : [00:46:08]

immersion and so we have to be aware of

[00:46:08] : [00:46:10]

this and there's obviously other tools

[00:46:10] : [00:46:11]

that can be used like in the wake of the

[00:46:11] : [00:46:13]

Arab Spring you know governments

[00:46:13] : [00:46:15]

targeted everyone that was on social

[00:46:15] : [00:46:16]

media we can do that on a again hyper

[00:46:16] : [00:46:21]

personalized basis like we need to set

[00:46:21] : [00:46:23]

some defaults and standards here to

[00:46:23] : [00:46:26]

protect democracy but again why

[00:46:26] : [00:46:28]

democracy

[00:46:28] : [00:46:29]

we're not really trying to protect

[00:46:29] : [00:46:31]

democracy you know again people have

[00:46:31] : [00:46:32]

different definitions that what we're

[00:46:32] : [00:46:34]

trying to protect is individual liberty

[00:46:34] : [00:46:35]

freedom and

[00:46:35] : [00:46:38]

agency education should be about

[00:46:38] : [00:46:39]

enhancing the education of every child

[00:46:39] : [00:46:42]

is not you know healthcare is sickcare

[00:46:42] : [00:46:44]

our government should uplift us but how

[00:46:44] : [00:46:46]

many people believe our governments do

[00:46:46] : [00:46:48]

that rather than put us down because

[00:46:48] : [00:46:51]

they couldn't encapsulate and cater to

[00:46:51] : [00:46:54]

the Brilliance of each

[00:46:54] : [00:46:56]

individual because they didn't have the

[00:46:56] : [00:46:58]

tools until now so that's why I said

[00:46:58] : [00:47:00]

which way western man yes which way

[00:47:00] : [00:47:04]

agency or massive control these are the

[00:47:04] : [00:47:06]

two ways do we control the technology or

[00:47:06] : [00:47:08]

do these organizations control the

[00:47:08] : [00:47:11]

technology that controls us you know

[00:47:11] : [00:47:13]

when we were on the stage at at the

[00:47:13] : [00:47:15]

abundance Summit we talked about a

[00:47:15] : [00:47:17]

future of digital superintelligence

[00:47:17] : [00:47:22]

right and a future in which we've got AI

[00:47:22] : [00:47:24]

a billion times more capable than a

[00:47:24] : [00:47:27]

human which looking at it just from a

[00:47:27] : [00:47:30]

ratio of neurons is the ratio of a

[00:47:30] : [00:47:34]

hamster to a human um y

[00:47:34] : [00:47:38]

uh do you believe that someday we could

[00:47:38] : [00:47:43]

have a a benign super intelligence that

[00:47:43] : [00:47:44]

is supporting

[00:47:44] : [00:47:47]

Humanity yes and I believe that it

[00:47:47] : [00:47:49]

should be a collective

[00:47:49] : [00:47:52]

intelligence that is made up of

[00:47:52] : [00:47:53]

Amplified human intelligence as

[00:47:53] : [00:47:56]

amplifying all of us copilots that contain

[00:47:56] : [00:47:59]

our Collective knowledge and culture and

[00:47:59] : [00:48:01]

the best of us and data sets that are

[00:48:01] : [00:48:03]

built from helping and augmenting US

[00:48:03] : [00:48:06]

versus a collected intelligence an AGI

[00:48:06] : [00:48:08]

that is top down and designed to

[00:48:08] : [00:48:10]

effectively control us again if you look

[00:48:10] : [00:48:12]

at OpenAI's statements on the road to AGI

[00:48:12] : [00:48:14]

they say this technology will end

[00:48:14] : [00:48:17]

democracy end capitalism and maybe kill

[00:48:17] : [00:48:20]

us all I don't like that I I remember I

[00:48:20] : [00:48:22]

remember seeing that you you you texted

[00:48:22] : [00:48:25]

me you said read this does this sound

[00:48:25] : [00:48:27]

the same as it does to me yeah so what

[00:48:27] : [00:48:28]

I'd prefer instead is for this to be

[00:48:28] : [00:48:30]

distributed like if you have data sets

[00:48:30] : [00:48:32]

from Nations that are built on enhancing

[00:48:32] : [00:48:34]

the capability of the nation that

[00:48:34] : [00:48:36]

reflect the local cultures and you push

[00:48:36] : [00:48:38]

for data transparency on models which I

[00:48:38] : [00:48:40]

believe we must have you know especially

[00:48:40] : [00:48:42]

language models then you're more likely

[00:48:42] : [00:48:44]

to have a positive thing and again the

[00:48:44] : [00:48:46]

human Collective can achieve anything

[00:48:46] : [00:48:48]

from splitting the atom to going to space

[00:48:48] : [00:48:50]

if we put our minds to it but we have

[00:48:50] : [00:48:53]

lacked in coordination

[00:48:53] : [00:48:55]

mechanisms they've not been good enough

[00:48:55] : [00:48:57]

so if you create the human Colossus and

[00:48:57] : [00:48:59]

every single person has an AI That's

[00:48:59] : [00:49:01]

just looking out for them to enhance

[00:49:01] : [00:49:04]

their potential and coordination AIS

[00:49:04] : [00:49:06]

that is a far more positive view of the

[00:49:06] : [00:49:09]

future and that is the AGI that is a

[00:49:09] : [00:49:11]

general intelligence that's the hive

[00:49:11] : [00:49:13]

mind general intelligence not a bee style

[00:49:13] : [00:49:15]

hive mind but one that's really thinking

[00:49:15] : [00:49:17]

again every child should achieve their

[00:49:17] : [00:49:20]

potential versus this embodied concept

[00:49:20] : [00:49:22]

of an AGI that's a very Western concept

[00:49:22] : [00:49:23]

you see that as well like you know you

[00:49:23] : [00:49:25]

look at the Japanese concept of a robot

[00:49:25] : [00:49:28]

the robot is your equal and your helper

[00:49:28] : [00:49:29]

you look at the Western concept of the

[00:49:29] : [00:49:32]

robot it's Terminator and Skynet and all

[00:49:32] : [00:49:34]

of that and again I think this is again

[00:49:34] : [00:49:36]

where cultural norms become very

[00:49:36] : [00:49:37]

interesting and what do we want to build

[00:49:37] : [00:49:40]

do we want to build AI God or do we want

[00:49:40] : [00:49:42]

to build that AI helper that helps us

[00:49:42] : [00:49:44]

and we help it you

[00:49:44] : [00:49:48]

know um those listening now I mean you

[00:49:48] : [00:49:52]

can see Emad's brilliance and why I'm so

[00:49:52] : [00:49:54]

enamored with the way you think about

[00:49:54] : [00:49:56]

this because there are very few

[00:49:56] : [00:49:59]

individuals who are looking at this from

[00:49:59] : [00:50:01]

uh an objective function of what's best

[00:50:01] : [00:50:03]

for Humanity what's best for every

[00:50:03] : [00:50:06]

nation state uh out there let's talk

[00:50:06] : [00:50:10]

about your going forward future um are

[00:50:10] : [00:50:12]

you going to build something in the uh

[00:50:12] : [00:50:17]

in the decentralized um uh side of AI

[00:50:17] : [00:50:19]

the democratized side of AI is there a

[00:50:19] : [00:50:22]

company there or Fund in your future for

[00:50:22] : [00:50:25]

that yes so uh you know doing the white

[00:50:25] : [00:50:27]

paper finally getting there with a bit

[00:50:27] : [00:50:29]

of help from AI uh I

[00:50:29] : [00:50:32]

can't wait to uh to help uh broadcast

[00:50:32] : [00:50:35]

that white paper yeah but look I think

[00:50:35] : [00:50:36]

the basic thing is this what I want to

[00:50:36] : [00:50:39]

do is set up an AI champion in every

[00:50:39] : [00:50:41]

nation with the brightest people of each

[00:50:41] : [00:50:43]

Nation working with the organizations of

[00:50:43] : [00:50:45]

each nation to help guide them through

[00:50:45] : [00:50:47]

this next period because there will be

[00:50:47] : [00:50:48]

massive job displacement from the

[00:50:48] : [00:50:51]

graduates going massive uplifts in

[00:50:51] : [00:50:53]

productivity from the technology being

[00:50:53] : [00:50:55]

implemented and again that organization

[00:50:55] : [00:50:57]

can help govern

[00:50:57] : [00:50:59]

and create these data sets and these

[00:50:59] : [00:51:01]

models that are so important and every

[00:51:01] : [00:51:03]

Nation should have that but then I also

[00:51:03] : [00:51:05]

believe that every sector should have a

[00:51:05] : [00:51:07]

generative AI first infrastructure

[00:51:07] : [00:51:09]

company that builds this and helps the

[00:51:09] : [00:51:11]

healthcare companies finance companies

[00:51:11] : [00:51:12]

and others through that and to

[00:51:12] : [00:51:15]

coordinate all of that you need to have

[00:51:15] : [00:51:17]

a web3 type protocol what is the

[00:51:17] : [00:51:19]

protocol for intelligence so what is a

[00:51:19] : [00:51:21]

web what is a web three type protocol

[00:51:21] : [00:51:24]

Define that for folks listening again

[00:51:24] : [00:51:25]

people talk about web 3 it's not about

[00:51:25] : [00:51:27]

the tokens or the coins or anything like

[00:51:27] : [00:51:30]

that what a web three protocol is is

[00:51:30] : [00:51:32]

that everyone should have like AIs first

[00:51:32] : [00:51:34]

of all aren't going to have bank

[00:51:34] : [00:51:36]

accounts they're going to need some way

[00:51:36] : [00:51:39]

to pay each other or exchange value and

[00:51:39] : [00:51:41]

again web 3 has done a lot of work in that

[00:51:41] : [00:51:43]

there needs to be some sort of identity

[00:51:43] : [00:51:45]

attribution and other format because

[00:51:45] : [00:51:47]

you'll have this Mass influx of

[00:51:47] : [00:51:49]

information and so again web three

[00:51:49] : [00:51:52]

concepts are very useful there there

[00:51:52] : [00:51:53]

needs to be an identity concept because

[00:51:53] : [00:51:56]

you'll have real and digital people web

[00:51:56] : [00:51:58]

three concepts are very useful there

[00:51:58] : [00:52:00]

so data attestation all these other

[00:52:00] : [00:52:03]

things verifiability so when I look at

[00:52:03] : [00:52:05]

it if you've got sectorally my plan is

[00:52:05] : [00:52:07]

to launch a company for every major

[00:52:07] : [00:52:09]

sector and we can talk about health and

[00:52:09] : [00:52:10]

education and bring the smartest people

[00:52:10] : [00:52:12]

in the world to solve that challenge of

[00:52:12] : [00:52:14]

the infrastructure for the future every

[00:52:14] : [00:52:16]

Nation but you need to have some sort of

[00:52:16] : [00:52:18]

coordinating protocol for all of that

[00:52:18] : [00:52:20]

that becomes a standard and that's the

[00:52:20] : [00:52:22]

substrate for this Amplified human

[00:52:22] : [00:52:24]

collective

[00:52:24] : [00:52:26]

intelligence and and is that where you

[00:52:26] : [00:52:27]

want to play

[00:52:27] : [00:52:29]

and focus your energy next yeah it's

[00:52:29] : [00:52:31]

setting up these organizations and

[00:52:31] : [00:52:33]

bringing the brightest smartest people

[00:52:33] : [00:52:35]

that really want to make a difference

[00:52:35] : [00:52:37]

there because there's massive network of

[00:52:37] : [00:52:39]

folks in doing this but again I just need

[00:52:39] : [00:52:41]

to be the founder and architect I don't

[00:52:41] : [00:52:42]

want to run the day-to-day of any of

[00:52:42] : [00:52:46]

these things um and then because the the

[00:52:46] : [00:52:48]

most scarce Talent there's three types

[00:52:48] : [00:52:50]

of capital as I view it there's

[00:52:50] : [00:52:53]

Financial Capital human capital and

[00:52:53] : [00:52:55]

political capital and in order to affect

[00:52:55] : [00:52:56]

a change in the world you actually need

[00:52:56] : [00:52:58]

all three but the financial Capital

[00:52:58] : [00:53:00]

actually comes with the people capital

[00:53:00] : [00:53:03]

and the political capital and the

[00:53:03] : [00:53:04]

smartest people in the world in every

[00:53:04] : [00:53:06]

sector from Healthcare to education to

[00:53:06] : [00:53:08]

finance to

[00:53:08] : [00:53:11]

agriculture almost all believe that gen AI

[00:53:11] : [00:53:12]

is the biggest thing they've ever seen

[00:53:12] : [00:53:15]

in the last year everyone's asking you

[00:53:15] : [00:53:17]

all the smartest people okay what's next

[00:53:17] : [00:53:18]

right and you know many of the smartest

[00:53:18] : [00:53:20]

people in the world so I want to create

[00:53:20] : [00:53:22]

an organization that they can come to, the

[00:53:22] : [00:53:25]

chefs and the cooks the thinkers and the

[00:53:25] : [00:53:26]

doers and think what is the future of

[00:53:26] : [00:53:29]

finance what's the future of education

[00:53:29] : [00:53:31]

and then the national champions that

[00:53:31] : [00:53:32]

should be owned by the people of each

[00:53:32] : [00:53:35]

country become the distribution for the

[00:53:35] : [00:53:37]

amazing infrastructure that they build

[00:53:37] : [00:53:39]

and there's a nice kind of vice

[00:53:39] : [00:53:40]

Versa but then again you need the

[00:53:40] : [00:53:43]

coordination function so I'm trying to

[00:53:43] : [00:53:44]

bring together people in each of these

[00:53:44] : [00:53:46]

and you know there'll be public calls

[00:53:46] : [00:53:48]

and things like that to build that

[00:53:48] : [00:53:49]

infrastructure of the future because as

[00:53:49] : [00:53:51]

mentioned AI is an infrastructure where

[00:53:51] : [00:53:53]

it should be maybe it's the rocket

[00:53:53] : [00:53:54]

ship of the mind

[00:53:54] : [00:53:57]

right I love that I love that analogy my

[00:53:57] : [00:53:59]

friend it it it is the most important

[00:53:59] : [00:54:01]

infrastructure that Humanity will have

[00:54:01] : [00:54:04]

going forward across everything it does

[00:54:04] : [00:54:05]

and I look forward to helping you build

[00:54:05] : [00:54:08]

it exciting right like again I think one

[00:54:08] : [00:54:10]

of the things I got I got lots of

[00:54:10] : [00:54:12]

messages they're like I'm so sorry for

[00:54:12] : [00:54:15]

your loss it was like my dog died after

[00:54:15] : [00:54:18]

I left as CEO I was like what is that

[00:54:18] : [00:54:21]

you know they're like it's nice it's

[00:54:21] : [00:54:22]

nice that people care right but I'm

[00:54:22] : [00:54:24]

generally excited kind of about what's

[00:54:24] : [00:54:26]

next like you know again it was like

[00:54:26] : [00:54:28]

staring into the abyss and chewing glass

[00:54:28] : [00:54:31]

every single day and that's not what I'm

[00:54:31] : [00:54:33]

best at or where I could have the most

[00:54:33] : [00:54:35]

impact but I want it to be a point

[00:54:35] : [00:54:37]

whereby if I can accelerate this over

[00:54:37] : [00:54:39]

the next period I don't have to make an

[00:54:39] : [00:54:42]

impact I should not have any power on

[00:54:42] : [00:54:45]

this whereas again you see everyone else

[00:54:45] : [00:54:46]

trying to get more and more power I want

[00:54:46] : [00:54:48]

to make sure it's set up properly but I

[00:54:48] : [00:54:50]

want to give it all away because power

[00:54:50] : [00:54:53]

is obligation it's dragging and again it

[00:54:53] : [00:54:55]

should not be invested in any one

[00:54:55] : [00:54:57]

individual we should not have to rely on

[00:54:57] : [00:55:00]

anyone being nice or good for this

[00:55:00] : [00:55:02]

technology I was talking to uh Michael

[00:55:02] : [00:55:04]

sailor during the abundance Summit uh

[00:55:04] : [00:55:06]

that evening and you know talking about

[00:55:06] : [00:55:10]

the fact that because Satoshi uh when he

[00:55:10] : [00:55:13]

set it up uh did not retain any power

[00:55:13] : [00:55:16]

and did not trade on the founding blocks

[00:55:16] : [00:55:18]

and so forth that that's the reason it's

[00:55:18] : [00:55:21]

been able to succeed because there

[00:55:21] : [00:55:23]

wasn't that centralized power and you

[00:55:23] : [00:55:25]

know Bitcoin had been tried he said

[00:55:25] : [00:55:27]

Bitcoin had been tried many times before

[00:55:27] : [00:55:30]

but because it didn't have that uh

[00:55:30] : [00:55:33]

initial anonymity and and the and the

[00:55:33] : [00:55:35]

dissolution of founding power that

[00:55:35] : [00:55:38]

that's the reason it didn't succeed yeah

[00:55:38] : [00:55:40]

I mean again I think you need to have it

[00:55:40] : [00:55:41]

accelerate and you see this with

[00:55:41] : [00:55:43]

movements right the movement starts but

[00:55:43] : [00:55:46]

then it goes once you've got the DNA and

[00:55:46] : [00:55:48]

the story there right you know you see

[00:55:48] : [00:55:50]

the prophets you see the leaders you see

[00:55:50] : [00:55:52]

the others but then it's about setting

[00:55:52] : [00:55:54]

the framework correctly and reframing

[00:55:54] : [00:55:56]

the concept

[00:55:56] : [00:55:58]

this technology is not beyond us look

[00:55:58] : [00:56:00]

stability is a company that started two

[00:56:00] : [00:56:03]

years ago above a chicken shop in London

[00:56:03] : [00:56:06]

right you know my first 20 employees I

[00:56:06] : [00:56:07]

went to the job center and I said bring

[00:56:07] : [00:56:10]

me people that have overcome adversity

[00:56:10] : [00:56:12]

and I will train them young graduates

[00:56:12] : [00:56:14]

and six of them are still at stability

[00:56:14] : [00:56:16]

you know um like because it was a

[00:56:16] : [00:56:17]

program and they're doing things from

[00:56:17] : [00:56:19]

cyber security to running super

[00:56:19] : [00:56:22]

computers we only had like 16 17 PhDs

[00:56:22] : [00:56:24]

yet we built the state-of-the-art models

[00:56:24] : [00:56:26]

in every modality we built mind reading

[00:56:26] : [00:56:28]

models like MindEye you know I remember

[00:56:28] : [00:56:32]

that contributed to all these things yet

[00:56:32] : [00:56:34]

you're told it's impossible to compete

[00:56:34] : [00:56:36]

we have shown it's not impossible to

[00:56:36] : [00:56:38]

compete that's a reframing the reframing

[00:56:38] : [00:56:40]

is data versus models it's you don't

[00:56:40] : [00:56:43]

need giant supercomputers for everyone

[00:56:43] : [00:56:45]

you just need to have a trusted entity

[00:56:45] : [00:56:49]

to build it right yeah you know and so I

[00:56:49] : [00:56:50]

hope to kind of convey this and then

[00:56:50] : [00:56:52]

figure out this organizational structure

[00:56:52] : [00:56:54]

that can proliferate so I can take a

[00:56:54] : [00:56:57]

holiday so before we go further let's talk

[00:56:57] : [00:57:00]

about one area of your next chapter in

[00:57:00] : [00:57:03]

life that uh we both have as a passion

[00:57:03] : [00:57:05]

uh which is the use of generative Ai and

[00:57:05] : [00:57:06]

health it's an area that you've given a

[00:57:06] : [00:57:09]

huge amount of thought to uh and I think

[00:57:09] : [00:57:11]

uh you're excited about can you share

[00:57:11] : [00:57:14]

what your vision is there yeah so I got

[00:57:14] : [00:57:17]

into AI 13 years ago gosh I was a

[00:57:17] : [00:57:19]

programmer before for 23 years building

[00:57:19] : [00:57:21]

large scale systems as a hedge fund manager

[00:57:21] : [00:57:23]

and other things when my son was

[00:57:23] : [00:57:25]

diagnosed with autism and then I built

[00:57:25] : [00:57:27]

an NLP to analyze all the clinical

[00:57:27] : [00:57:29]

literature and then looked at

[00:57:29] : [00:57:31]

biomolecular pathway analysis of

[00:57:31] : [00:57:33]

neurotransmitters GABA and glutamate in the brain

[00:57:33] : [00:57:35]

to repurpose drugs for him and he went

[00:57:35] : [00:57:36]

to mainstream school which was great an n equals

[00:57:36] : [00:57:39]

one and then I was lead

[00:57:39] : [00:57:41]

architect on one of the COVID AI

[00:57:41] : [00:57:42]

projects of the United Nations with

[00:57:42] : [00:57:44]

Stanford and others and then because I

[00:57:44] : [00:57:46]

didn't get the technology I was like oh

[00:57:46] : [00:57:47]

we got to build it

[00:57:47] : [00:57:51]

ourselves but what is health you know

[00:57:51] : [00:57:52]

again I think we've had this discussion

[00:57:52] : [00:57:55]

a lot Healthcare is sickcare we don't

[00:57:55] : [00:57:56]

have all the information that we should

[00:57:56] : [00:57:59]

have at our fingertips Health assumes

[00:57:59] : [00:58:01]

ergodicity a thousand tossers of the

[00:58:01] : [00:58:03]

coin the same as a coin tossed a

[00:58:03] : [00:58:06]

thousand times but we are all individual

[00:58:06] : [00:58:07]

and across the world there are amazing

[00:58:07] : [00:58:10]

data sets that could be better because

[00:58:10] : [00:58:12]

when you write down a clinical trial or

[00:58:12] : [00:58:15]

your own kind of experiences you lose so

[00:58:15] : [00:58:18]

much information at the same time you

[00:58:18] : [00:58:20]

don't have all the information on cancer

[00:58:20] : [00:58:22]

autism multiple sclerosis at your

[00:58:22] : [00:58:23]

fingertips in a comprehensive

[00:58:23] : [00:58:25]

authoritative and up-to-date way so when I

[00:58:25] : [00:58:27]

look at the health operating

[00:58:27] : [00:58:30]

system we're going to build a GPT-4 open

[00:58:30] : [00:58:32]

for

[00:58:32] : [00:58:34]

cancer and it's going to mean that

[00:58:34] : [00:58:36]

nobody is alone again on that journey

[00:58:36] : [00:58:38]

and loses that agency because they know

[00:58:38] : [00:58:40]

comprehensive authoritative up-to-date all their

[00:58:40] : [00:58:44]

knowledge but AI models today already

[00:58:44] : [00:58:46]

outperform human doctors in empathy so

[00:58:46] : [00:58:47]

they're not going to be alone on that

[00:58:47] : [00:58:49]

anymore yeah can I just double click on

[00:58:49] : [00:58:51]

what you just said because it's really

[00:58:51] : [00:58:53]

important I have so many people

[00:58:53] : [00:58:54]

because of my role as chairman of

[00:58:54] : [00:58:57]

Fountain life who reach out say I just

[00:58:57] : [00:58:59]

got diagnosed with this cancer or my

[00:58:59] : [00:59:03]

brother my sister or my wife and and

[00:59:03] : [00:59:05]

they're left with this

[00:59:05] : [00:59:09]

devastating news and they're left Googling

[00:59:09] : [00:59:14]

um but a model that's able to have the

[00:59:14] : [00:59:16]

most Cutting Edge information and then

[00:59:16] : [00:59:18]

incorporate all their medical data and

[00:59:18] : [00:59:21]

give them advice in empathic fashion how

[00:59:21] : [00:59:24]

far is that a couple of years if we

[00:59:24] : [00:59:27]

focus maybe even like next year and

[00:59:27] : [00:59:29]

that's amazing because for all of these

[00:59:29] : [00:59:32]

topics that again we will have diagnosis

[00:59:32] : [00:59:34]

that is superior we will have research

[00:59:34] : [00:59:35]

augmentation because again even

[00:59:35] : [00:59:37]

researchers don't have all that

[00:59:37] : [00:59:39]

knowledge at their fingertips and again

[00:59:39] : [00:59:40]

this is public infrastructure and a

[00:59:40] : [00:59:43]

public good you know from primary care

[00:59:43] : [00:59:45]

all the way through that what is the

[00:59:45] : [00:59:47]

open infrastructure of the future where

[00:59:47] : [00:59:49]

this technology can come again to your

[00:59:49] : [00:59:51]

own data as well you have uh things like

[00:59:51] : [00:59:53]

MELLODDY and other things around

[00:59:53] : [00:59:55]

homomorphic encryption Federated

[00:59:55] : [00:59:56]

learning that were trying to figure out

[00:59:56] : [00:59:59]

how to preserve privacy we can run a

[00:59:59] : [01:00:01]

language model on a smartphone right now

[01:00:01] : [01:00:03]

that can analyze all your data and then

[01:00:03] : [01:00:05]

just feed back stuff to a global

[01:00:05] : [01:00:07]

Collective but people are people so when

[01:00:07] : [01:00:09]

I look at Healthcare I see amazing data

[01:00:09] : [01:00:11]

sets that we can activate by taking the

[01:00:11] : [01:00:13]

models to the data an infrastructure

[01:00:13] : [01:00:15]

that we can build like we had CheXagent

[01:00:15] : [01:00:18]

with Stanford the top x-ray

[01:00:18] : [01:00:21]

radiology model to build gold standard

[01:00:21] : [01:00:23]

things across the entire gamut of

[01:00:23] : [01:00:25]

healthcare so we can actually get to

[01:00:25] : [01:00:27]

healthcare versus sit so that we can

[01:00:27] : [01:00:29]

make it so that everyone is empowered to

[01:00:29] : [01:00:30]

make the best decisions either as

[01:00:30] : [01:00:33]

experts or individuals and make it so

[01:00:33] : [01:00:35]

nobody is alone again as well as

[01:00:35] : [01:00:37]

increasing the data quality that will

[01:00:37] : [01:00:39]

then feed better models that will then

[01:00:39] : [01:00:42]

save lives save suffering and again

[01:00:42] : [01:00:44]

increase our potential like you've got a

[01:00:44] : [01:00:47]

longevity book behind you right why don't

[01:00:47] : [01:00:49]

you have all the latest knowledge of

[01:00:49] : [01:00:52]

longevity at your fingertips at GPT-4

[01:00:52] : [01:00:55]

level right now that will happen over

[01:00:55] : [01:00:56]

the next year we will launch stable

[01:00:56] : [01:00:58]

health or whatever we decide to call it and

[01:00:58] : [01:01:00]

there will be the smartest people in

[01:01:00] : [01:01:02]

each of these areas working on that so

[01:01:02] : [01:01:03]

again you know like it doesn't matter

[01:01:03] : [01:01:05]

if you're worth $100 billion

[01:01:05] : [01:01:08]

and your kid has autism

[01:01:08] : [01:01:10]

ASD there's no cure there's no treatment

[01:01:10] : [01:01:12]

there's nothing doesn't matter how rich

[01:01:12] : [01:01:15]

you are yet with just a little bit of

[01:01:15] : [01:01:17]

effort right now we can build it as an

[01:01:17] : [01:01:20]

open infrastructure for the 5% of people

[01:01:20] : [01:01:21]

in the world that know someone with

[01:01:21] : [01:01:24]

autism the 50% of people in the world

[01:01:24] : [01:01:26]

that receive a cancer diagnosis themselves

[01:01:26] : [01:01:28]

or of someone they love and they feel that

[01:01:28] : [01:01:30]

loss of agency so we're going to return

[01:01:30] : [01:01:33]

agency to humanity that way and again it

[01:01:33] : [01:01:35]

needs to be an open infrastructure that

[01:01:35] : [01:01:37]

they can then access private data sets

[01:01:37] : [01:01:39]

and compensate them appropriately so

[01:01:39] : [01:01:41]

everyone is incentivized we need that

[01:01:41] : [01:01:44]

fast yeah and and that's a beautiful

[01:01:44] : [01:01:47]

Vision it is again infrastructure and

[01:01:47] : [01:01:48]

one of the things that's so beautiful

[01:01:48] : [01:01:51]

about it is guess what all 8 billion

[01:01:51] : [01:01:54]

people, we're all human, all running the

[01:01:54] : [01:01:57]

same software and the

[01:01:57] : [01:01:58]

breakthroughs and the knowledge

[01:01:58] : [01:02:01]

accumulated in you know in Kazakhstan is

[01:02:01] : [01:02:05]

going to be as useful in Kansas yeah but

[01:02:05] : [01:02:07]

this is the thing: this

[01:02:07] : [01:02:08]

is the biggest upgrade to the human

[01:02:08] : [01:02:11]

operating system we can imagine because

[01:02:11] : [01:02:14]

we're going from analog to digital text

[01:02:14] : [01:02:16]

is black and white whereas these

[01:02:16] : [01:02:19]

models only understand context you know

[01:02:19] : [01:02:21]

Daniel Kahneman just passed you know

[01:02:21] : [01:02:24]

amazing kind of guy but you know he did

[01:02:24] : [01:02:25]

have this concept of type one type two

[01:02:25] : [01:02:28]

thinking and so we had type one which is

[01:02:28] : [01:02:30]

these big data things that can only

[01:02:30] : [01:02:31]

extrapolate but now we have these models

[01:02:31] : [01:02:33]

that understand context and so we have

[01:02:33] : [01:02:35]

the missing parts of the brain and that

[01:02:35] : [01:02:37]

will allow us to extrapolate allow us to

[01:02:37] : [01:02:39]

have more rainbows you know have the

[01:02:39] : [01:02:41]

context of each individual push

[01:02:41] : [01:02:43]

intelligence to the edge and that's why

[01:02:43] : [01:02:45]

again there is this imperative to do

[01:02:45] : [01:02:47]

this now because there's a window on the

[01:02:47] : [01:02:50]

freedom agency democracy side but the

[01:02:50] : [01:02:52]

other imperative is no one should have

[01:02:52] : [01:02:54]

to suffer as they're suffering

[01:02:54] : [01:02:57]

now amazing and how much does it actually

[01:02:57] : [01:02:59]

need? It doesn't need that much which is the

[01:02:59] : [01:03:01]

really amazing thing. The total

[01:03:01] : [01:03:03]

amount spent on generative AI I think I

[01:03:03] : [01:03:05]

said at the conference is less than the

[01:03:05] : [01:03:07]

total amount spent on the Los Angeles

[01:03:07] : [01:03:09]

San Francisco

[01:03:09] : [01:03:11]

Railway which hasn't even started yet

[01:03:11] : [01:03:14]

and and in building stable Health again

[01:03:14] : [01:03:15]

if if that's what it's called I mean the

[01:03:15] : [01:03:18]

amount of capital required to build that

[01:03:18] : [01:03:20]

is de minimis compared to what's spent on

[01:03:20] : [01:03:24]

a single human trial of any drug

[01:03:24] : [01:03:26]

yeah it is but then you know you build

[01:03:26] : [01:03:29]

it and you get to that 80/20 incredibly

[01:03:29] : [01:03:31]

quickly that will change hundreds of

[01:03:31] : [01:03:32]

millions of lives and that will attract

[01:03:32] : [01:03:34]

the smartest people in each of these

[01:03:34] : [01:03:35]

areas thinking about what is the open

[01:03:35] : [01:03:37]

infrastructure of multiple sclerosis of

[01:03:37] : [01:03:40]

longevity of cancer and more but then

[01:03:40] : [01:03:43]

you can amp that because the value is so

[01:03:43] : [01:03:46]

so huge and, you know, I hope to build a

[01:03:46] : [01:03:48]

trusted organization as part of this

[01:03:48] : [01:03:50]

whole human operating system upgrade

[01:03:50] : [01:03:51]

yeah that's what I want to build I want

[01:03:51] : [01:03:53]

to build human OS or at least catalyze

[01:03:53] : [01:03:56]

it again I don't want to run or control

[01:03:56] : [01:03:58]

own anything I want to figure out how to

[01:03:58] : [01:04:00]

give back that control because who

[01:04:00] : [01:04:02]

should decide what cancer knowledge goes

[01:04:02] : [01:04:03]

in there who should decide what

[01:04:03] : [01:04:04]

education

[01:04:04] : [01:04:07]

etc. Let's talk about the second half of

[01:04:07] : [01:04:10]

your vision which is how we

[01:04:10] : [01:04:12]

originally met um when you were one of

[01:04:12] : [01:04:14]

the winners of the Global Learning X

[01:04:14] : [01:04:18]

prize that uh Elon and Tony Robbins

[01:04:18] : [01:04:21]

had co-funded your vision

[01:04:21] : [01:04:25]

around education um speak to us about

[01:04:25] : [01:04:27]

that yeah you know like we're deploying

[01:04:27] : [01:04:29]

it um kind of the windows are kind of

[01:04:29] : [01:04:32]

separate but every child my entire

[01:04:32] : [01:04:35]

operating system is like if you think

[01:04:35] : [01:04:36]

about things in terms of the rights of

[01:04:36] : [01:04:38]

the child today they have no agency and so

[01:04:38] : [01:04:41]

we must respect their rights climate

[01:04:41] : [01:04:43]

everything becomes a lot simpler now

[01:04:43] : [01:04:45]

that we have language models on a laptop

[01:04:45] : [01:04:48]

like I said you can go to LM Studio,

[01:04:48] : [01:04:51]

download Stable LM and it will run on

[01:04:51] : [01:04:54]

your MacBook faster than you can

[01:04:54] : [01:04:57]

read. It's crazy, we could have a GPT-4

[01:04:57] : [01:04:59]

level AI from us or someone else on a

[01:04:59] : [01:05:02]

smartphone or a tablet by next year One

[01:05:02] : [01:05:03]

Laptop per Child was too

[01:05:03] : [01:05:06]

early you know now we have this

[01:05:06] : [01:05:07]

transformative technology you have an AI

[01:05:07] : [01:05:09]

that teaches the child and learns from the

[01:05:09] : [01:05:11]

child are you visual auditory dyslexic

[01:05:11] : [01:05:13]

that's the best data in the world for a

[01:05:13] : [01:05:15]

national model but also to teach these

[01:05:15] : [01:05:20]

models how to be optimistic how to this

[01:05:20] : [01:05:23]

really is this really is uh the Young

[01:05:23] : [01:05:25]

Lady's Illustrated Primer this really is

[01:05:25] : [01:05:28]

is Neal Stephenson's vision in that

[01:05:28] : [01:05:31]

regard yeah but Nell shouldn't have had to

[01:05:31] : [01:05:33]

find the Primer she should have had it

[01:05:33] : [01:05:37]

from day one as a human right as a human

[01:05:37] : [01:05:41]

right yes our schools education system

[01:05:41] : [01:05:43]

is child care mixed with a social

[01:05:43] : [01:05:46]

status game mixed with a petri dish you

[01:05:46] : [01:05:49]

know they teach our kids not to have

[01:05:49] : [01:05:51]

agency yes whereas they should be

[01:05:51] : [01:05:52]

telling the kids yeah they should be

[01:05:52] : [01:05:55]

teaching it's a relic of the Industrial

[01:05:55] : [01:05:57]

age where everyone had to be counted and

[01:05:57] : [01:05:59]

you can't manage what you can't measure

[01:05:59] : [01:06:01]

so you manage the creativity and belief

[01:06:01] : [01:06:04]

out of people everyone in the world can

[01:06:04] : [01:06:06]

do anything why because even if you

[01:06:06] : [01:06:08]

don't have that Talent you can convince

[01:06:08] : [01:06:10]

someone else who does have that

[01:06:10] : [01:06:12]

talent but they don't believe it so they

[01:06:12] : [01:06:14]

can't do it so what happens if we have

[01:06:14] : [01:06:18]

an entire nation of children that have

[01:06:18] : [01:06:19]

this helper that brings the right

[01:06:19] : [01:06:20]

information at the right time and tells

[01:06:20] : [01:06:22]

them they can always believe that

[01:06:22] : [01:06:24]

supports them, an entire

[01:06:24] : [01:06:27]

world what can't you do you know then

[01:06:27] : [01:06:29]

they have all of the counsel knowledge

[01:06:29] : [01:06:30]

at their fingertips and all of the

[01:06:30] : [01:06:31]

engineering knowledge at their

[01:06:31] : [01:06:33]

fingertips and it's a constantly

[01:06:33] : [01:06:36]

learning adaptive and improving system

[01:06:36] : [01:06:39]

again right now almost the entire AGI

[01:06:39] : [01:06:42]

and AI debate is about these machine

[01:06:42] : [01:06:44]

gods trained on giant supercomputers that

[01:06:44] : [01:06:46]

bestow their beneficence down or may

[01:06:46] : [01:06:48]

kill us or whatever what about that

[01:06:48] : [01:06:51]

human operating system upgrade that is a

[01:06:51] : [01:06:52]

decentralized intelligence where that

[01:06:52] : [01:06:55]

kid in Mongolia or Malawi or wherever

[01:06:55] : [01:06:57]

can make a real difference to humanity

[01:06:57] : [01:06:58]

some of the contributors to our open

[01:06:58] : [01:07:01]

codebases for our models are 15 years

[01:07:01] : [01:07:03]

old, they just taught themselves and

[01:07:03] : [01:07:05]

just happen to be wizards you

[01:07:05] : [01:07:08]

know in this new age right and

[01:07:08] : [01:07:09]

again they should contribute to the

[01:07:09] : [01:07:11]

whole because once something goes into

[01:07:11] : [01:07:13]

this model of this system and again it

[01:07:13] : [01:07:15]

needs the verification and other things

[01:07:15] : [01:07:17]

that can be dynamic they can proliferate

[01:07:17] : [01:07:20]

to everyone using that system do you

[01:07:20] : [01:07:23]

think that once this capability is built

[01:07:23] : [01:07:27]

it will run into blocks in different

[01:07:27] : [01:07:29]

nations uh or do you imagine that this

[01:07:29] : [01:07:33]

will become again a human right uh in some

[01:07:33] : [01:07:36]

and not all nations I mean listen there's no

[01:07:36] : [01:07:39]

greater gift and no greater asset you

[01:07:39] : [01:07:42]

can give to a nation's populace than

[01:07:42] : [01:07:44]

intelligence and education but I'm not

[01:07:44] : [01:07:47]

sure every national leader wants to see

[01:07:47] : [01:07:50]

that that's why I think again there is a

[01:07:50] : [01:07:53]

gap here there is a year maybe where you

[01:07:53] : [01:07:56]

can go to any national leader and say

[01:07:56] : [01:07:57]

I will bring this technology to your

[01:07:57] : [01:07:59]

people and I will empower the smartest of

[01:07:59] : [01:08:00]

your people and I want it to be owned by

[01:08:00] : [01:08:03]

the people and what option do they have

[01:08:03] : [01:08:05]

this is positive for them what happens

[01:08:05] : [01:08:06]

is that a lot of the corruption in the

[01:08:06] : [01:08:08]

world is because of local

[01:08:08] : [01:08:10]

Maxima you know actually it's weird

[01:08:10] : [01:08:12]

because unpredictable corruption is the

[01:08:12] : [01:08:13]

worst predictable corruption is a bit

[01:08:13] : [01:08:15]

like tax you know there's a good book

[01:08:15] : [01:08:18]

by fusser J at Harvard about this and

[01:08:18] : [01:08:19]

then you have taxation kicking in at

[01:08:19] : [01:08:22]

14% um if you can show them something

[01:08:22] : [01:08:24]

bigger and this is clearly big they will

[01:08:24] : [01:08:26]

Embrace this technology set new norms

[01:08:26] : [01:08:28]

and if you create the same across all

[01:08:28] : [01:08:29]

these countries with talented

[01:08:29] : [01:08:31]

individuals in each of those groups and

[01:08:31] : [01:08:33]

talented individuals in each of those

[01:08:33] : [01:08:34]

sectors with a shared Mission even

[01:08:34] : [01:08:36]

though they're separate

[01:08:36] : [01:08:38]

organizations that's how you set amazing

[01:08:38] : [01:08:40]

standards that's how you build a network

[01:08:40] : [01:08:42]

effect and if you tie them all together

[01:08:42] : [01:08:45]

with an intelligent protocol and again we

[01:08:45] : [01:08:46]

talk about tokens or speculation or

[01:08:46] : [01:08:48]

ramps or anything like that but taking

[01:08:48] : [01:08:50]

the best of thinking around

[01:08:50] : [01:08:53]

coordination that can work that can

[01:08:53] : [01:08:56]

break this open you know um

[01:08:56] : [01:08:58]

but it's not going to be everywhere and

[01:08:58] : [01:09:00]

also when you look at the current debate

[01:09:00] : [01:09:02]

the current debate is for example we

[01:09:02] : [01:09:05]

can't let China have this

[01:09:05] : [01:09:06]

technology and you're like what about

[01:09:06] : [01:09:09]

the kids in China like well you know

[01:09:09] : [01:09:11]

it's dangerous they can't have AGI so

[01:09:11] : [01:09:13]

under what circumstance would China ever

[01:09:13] : [01:09:15]

have this technology never you know

[01:09:15] : [01:09:17]

Pakistan when should they have the tech

[01:09:17] : [01:09:19]

never that's really what they're kind of

[01:09:19] : [01:09:20]

saying it's also self-defeating because

[01:09:20] : [01:09:22]

China has 100 million people they can

[01:09:22] : [01:09:24]

use to create data sets and lots of

[01:09:24] : [01:09:26]

supercomputers let's put that aside again

[01:09:26] : [01:09:28]

it's a very Western oriented debate

[01:09:28] : [01:09:29]

whereas actually if you go to these

[01:09:29] : [01:09:32]

countries and you talk to the leaders

[01:09:32] : [01:09:34]

and the family offices that have power

[01:09:34] : [01:09:36]

and the people they will leapfrog in

[01:09:36] : [01:09:39]

the global South to intelligence

[01:09:39] : [01:09:41]

augmentation like they leapfrogged to

[01:09:41] : [01:09:43]

mobile they want to embrace this

[01:09:43] : [01:09:45]

technology and again you can set Norms

[01:09:45] : [01:09:48]

Now versus what's going to happen is you

[01:09:48] : [01:09:49]

know they will get a centralized

[01:09:49] : [01:09:51]

solution they'll adopt that instead if

[01:09:51] : [01:09:54]

you don't right now for hundreds of

[01:09:54] : [01:09:56]

millions billions of people

[01:09:56] : [01:09:59]

that's why I think again it's a

[01:09:59] : [01:10:02]

Crossroads um is there anybody else

[01:10:02] : [01:10:03]

working towards this position that you

[01:10:03] : [01:10:07]

know of? No, in the large AI labs, yeah,

[01:10:07] : [01:10:10]

no, certainly none with credibility and again

[01:10:10] : [01:10:11]

that's why I had to build these models

[01:10:11] : [01:10:14]

you know and I had to kind of do this

[01:10:14] : [01:10:16]

everyone's working on Tiny parts of this

[01:10:16] : [01:10:18]

but they're expecting emergence build it

[01:10:18] : [01:10:20]

and somehow it will spread and again

[01:10:20] : [01:10:21]

this is why I found a passion in the

[01:10:21] : [01:10:23]

web 3 Community there are good people in

[01:10:23] : [01:10:25]

there and I hope to be able to unite

[01:10:25] : [01:10:26]

them just like hope to unite the people

[01:10:26] : [01:10:28]

in health and others again Peter you've

[01:10:28] : [01:10:30]

seen people working on Tiny parts of

[01:10:30] : [01:10:33]

this but this isn't a Manhattan Project

[01:10:33] : [01:10:35]

where we're facing an enemy unless the

[01:10:35] : [01:10:37]

enemy is ourselves you know but this

[01:10:37] : [01:10:40]

does require this big Global coordinated

[01:10:40] : [01:10:42]

push and that's why I've tried to design

[01:10:42] : [01:10:44]

this system that I believe will work

[01:10:44] : [01:10:46]

because it's all about the talent and it

[01:10:46] : [01:10:51]

is multiplicative. Is the race against

[01:10:51] : [01:10:54]

uh overly powerful centralized AI

[01:10:54] : [01:10:57]

systems that achieve some version of AGI

[01:10:57] : [01:11:00]

is that what we're racing

[01:11:00] : [01:11:00]

against yeah again we're racing against

[01:11:00] : [01:11:08]

ourselves like um humans could scale

[01:11:08] : [01:11:09]

through stories you have organizations

[01:11:09] : [01:11:11]

you know come and join abundance come

[01:11:11] : [01:11:14]

and go to Oxford come and do this but

[01:11:14] : [01:11:17]

then when we scaled through text text

[01:11:17] : [01:11:18]

was a lossy information format and

[01:11:18] : [01:11:21]

there's this poem by Ginsberg, Howl, about

[01:11:21] : [01:11:23]

this Carthaginian demon of sorts, Moloch,

[01:11:23] : [01:11:26]

that comes in; Moloch comes in through

[01:11:26] : [01:11:28]

the data loss our organizations are slow

[01:11:28] : [01:11:30]

dumb AIs but now what's happening is

[01:11:30] : [01:11:32]

they're configuring to achieve their

[01:11:32] : [01:11:34]

thing of getting more and more power

[01:11:34] : [01:11:36]

again corporations are technically

[01:11:36] : [01:11:39]

people under law but they're not fully

[01:11:39] : [01:11:40]

formed people they eat our hopes and

[01:11:40] : [01:11:43]

dreams so I believe the competition here

[01:11:43] : [01:11:45]

is against those organizations

[01:11:45] : [01:11:47]

consolidating too much power and

[01:11:47] : [01:11:48]

creating Norms that are almost

[01:11:48] : [01:11:50]

impossible to break so we're almost

[01:11:50] : [01:11:53]

competing against ourselves and again

[01:11:53] : [01:11:55]

the question is this do you believe in

[01:11:55] : [01:11:56]

amplified human intelligence or do you

[01:11:56] : [01:11:58]

believe in artificial general

[01:11:58] : [01:11:59]

intelligence do you believe in

[01:11:59] : [01:12:00]

collective intelligence or do you

[01:12:00] : [01:12:03]

believe in collected

[01:12:03] : [01:12:05]

intelligence who decides is this

[01:12:05] : [01:12:09]

infrastructure or is this a product

[01:12:09] : [01:12:12]

like so it's not like a Manhattan

[01:12:12] : [01:12:14]

Project against you know the Soviets or

[01:12:14] : [01:12:17]

anything like that but this does

[01:12:17] : [01:12:20]

require us all to come together or at

[01:12:20] : [01:12:22]

least the smartest people in each of

[01:12:22] : [01:12:24]

these areas from coordination to

[01:12:24] : [01:12:26]

governance systems to healthcare to

[01:12:26] : [01:12:28]

education with a blank slate of how do

[01:12:28] : [01:12:30]

we upgrade the human operating system

[01:12:30] : [01:12:33]

the time is now it's our last chance to

[01:12:33] : [01:12:36]

do it and I and I I love you for it

[01:12:36] : [01:12:38]

because I think you're you're right you

[01:12:38] : [01:12:43]

were there when Elon beamed in on X

[01:12:43] : [01:12:44]

video over

[01:12:44] : [01:12:47]

starlink from his airplane which is

[01:12:47] : [01:12:49]

which was a fun moment and we were

[01:12:49] : [01:12:53]

talking about uh the rate of growth and

[01:12:53] : [01:12:55]

his statement um because you know Ray

[01:12:55] : [01:12:57]

Kurzweil was there talking about still

[01:12:57] : [01:13:01]

his prediction of AGI by 2029 and

[01:13:01] : [01:13:04]

Elon saying we'll have AGI

[01:13:04] : [01:13:07]

whatever that means by next year and and

[01:13:07] : [01:13:09]

the intelligence of the entire Human

[01:13:09] : [01:13:14]

Race by 2029 so I I am I am curious what

[01:13:14] : [01:13:17]

just to close out what you think about

[01:13:17] : [01:13:21]

that those timelines and that potential

[01:13:21] : [01:13:24]

for a super intelligent uh AI system

[01:13:24] : [01:13:27]

that is centralized because that's the

[01:13:27] : [01:13:29]

people who are building that level of

[01:13:29] : [01:13:32]

Power are building centralized systems

[01:13:32] : [01:13:34]

they're building centralized single

[01:13:34] : [01:13:36]

systems that again take our collective

[01:13:36] : [01:13:38]

intelligence like all of YouTube In the

[01:13:38] : [01:13:41]

case of OpenAI clearly and other things

[01:13:41] : [01:13:43]

and they package it up sell it back to

[01:13:43] : [01:13:44]

us but they don't give they don't

[01:13:44] : [01:13:47]

care you know these organizations are

[01:13:47] : [01:13:48]

trying to build a system that will take

[01:13:48] : [01:13:50]

away our freedom Liberty and potentially

[01:13:50] : [01:13:52]

kill us all let's be kind of fair about

[01:13:52] : [01:13:55]

that, right, and sell it to us uh on an

[01:13:55] : [01:13:58]

incremental basis. The selling to us is a

[01:13:58] : [01:14:00]

complete canard they don't care about the

[01:14:00] : [01:14:03]

revenue of this again let's kind of call a

[01:14:03] : [01:14:05]

spade a spade they are telling you that

[01:14:05] : [01:14:06]

they're building something that could

[01:14:06] : [01:14:10]

kill you and something that could remove

[01:14:10] : [01:14:12]

all our freedom and liberty and they're

[01:14:12] : [01:14:13]

saying it's a good thing you should back

[01:14:13] : [01:14:15]

them because it's cool it's not it's

[01:14:15] : [01:14:18]

actually shameful if you think about it

[01:14:18] : [01:14:19]

and we should not stand for it anymore

[01:14:19] : [01:14:21]

this is another reason I wanted to step aside

[01:14:21] : [01:14:22]

you see because you can't say things like

[01:14:22] : [01:14:24]

that I got canceled in Silicon Valley so many

[01:14:24] : [01:14:27]

times but realistically it's ridiculous

[01:14:27] : [01:14:28]

and it should not be stood

[01:14:28] : [01:14:31]

for but they're going to do it anyway

[01:14:31] : [01:14:32]

because they have the political power

[01:14:32] : [01:14:34]

people are scared of

[01:14:34] : [01:14:37]

them so there has to be an alternative

[01:14:37] : [01:14:38]

and the alternative has to be

[01:14:38] : [01:14:40]

distributed intelligence when I resigned

[01:14:40] : [01:14:43]

I said you can't beat centralized

[01:14:43] : [01:14:44]

intelligence with centralized

[01:14:44] : [01:14:45]

intelligence you're not going to beat it

[01:14:45] : [01:14:47]

with a Stability; this is a great

[01:14:47] : [01:14:49]

organization it's going to do well the

[01:14:49] : [01:14:51]

only way that you can beat it is to create

[01:14:51] : [01:14:54]

the standard that represents humanity is

[01:14:54] : [01:14:56]

decentralizing intelligence it's

[01:14:56] : [01:14:59]

collective intelligence and the data

[01:14:59] : [01:15:01]

sets and Norms from that will be ones

[01:15:01] : [01:15:04]

that help children that help people

[01:15:04] : [01:15:06]

suffering that reflect our moral

[01:15:06] : [01:15:09]

understanding and the best of us and

[01:15:09] : [01:15:12]

gathers the best of us to do it because

[01:15:12] : [01:15:14]

if you work in healthcare if you work in

[01:15:14] : [01:15:15]

education if you work in finance if you

[01:15:15] : [01:15:17]

work in any of these things there's no

[01:15:17] : [01:15:19]

Organization for you to come and join or

[01:15:19] : [01:15:23]

partner with on this there's no kind of

[01:15:23] : [01:15:24]

centralized mission I have looked I

[01:15:24] : [01:15:26]

want to help other people I don't want

[01:15:26] : [01:15:28]

to do this myself and I don't want it to

[01:15:28] : [01:15:29]

be about me very very quickly which is

[01:15:29] : [01:15:31]

why I'm kind of getting it out there now

[01:15:31] : [01:15:33]

I hope I can catalyze something that

[01:15:33] : [01:15:36]

then people will take forward and time

[01:15:36] : [01:15:38]

is now for that because AGI when it

[01:15:38] : [01:15:39]

comes if it comes again there's various

[01:15:39] : [01:15:42]

definitions of this why on Earth do you

[01:15:42] : [01:15:43]

need any knowledge workers anything that

[01:15:43] : [01:15:47]

can be done via a laptop doesn't need

[01:15:47] : [01:15:50]

humans and so you have concepts of UBI

[01:15:50] : [01:15:52]

here you have concepts I think when the

[01:15:52] : [01:15:54]

AGI comes you don't need money money is a

[01:15:54] : [01:15:56]

common story, a common good, we head

[01:15:56] : [01:15:59]

towards a post-capitalist society yeah

[01:15:59] : [01:16:00]

yeah I think the example I think you

[01:16:00] : [01:16:03]

said was uh Star Trek versus Mad Max you

[01:16:03] : [01:16:06]

know I'm like Star Trek versus Star Wars

[01:16:06] : [01:16:08]

I think is maybe a better

[01:16:08] : [01:16:11]

one you know and so you got the Sith

[01:16:11] : [01:16:14]

Lords and all of that um but again if

[01:16:14] : [01:16:16]

you kind of look at this I don't think

[01:16:16] : [01:16:18]

we need money like it's cross contextual

[01:16:18] : [01:16:20]

like bartering with our AI systems

[01:16:20] : [01:16:23]

representing us or it's you don't need

[01:16:23] : [01:16:25]

money because you're told what to do

[01:16:25] : [01:16:28]

again our governments the definition of

[01:16:28] : [01:16:30]

a government is the entity with Monopoly

[01:16:30] : [01:16:32]

on political violence and an AGI can

[01:16:32] : [01:16:34]

overtake any government that can then

[01:16:34] : [01:16:35]

control the

[01:16:35] : [01:16:37]

people because again listen to it

[01:16:37] : [01:16:39]

whispering look at the kind of human

[01:16:39] : [01:16:41]

thing so we have this opportunity to set

[01:16:41] : [01:16:44]

Norms right now the way that the big

[01:16:44] : [01:16:46]

labs are going to AGI is likely to kill

[01:16:46] : [01:16:49]

us all Elon and I signed that six-month

[01:16:49] : [01:16:50]

pause letter because even though people were

[01:16:50] : [01:16:52]

like Emad you're an accelerationist you

[01:16:52] : [01:16:54]

put all this open source AI out you have

[01:16:54] : [01:16:55]

to think about the other and who's

[01:16:55] : [01:16:57]

involved in that discussion and again if

[01:16:57] : [01:17:00]

we build an AGI as a centralized thing

[01:17:00] : [01:17:04]

is Windows or Linux safer as

[01:17:04] : [01:17:06]

infrastructure our entire internet

[01:17:06] : [01:17:09]

infrastructure is built on open open can

[01:17:09] : [01:17:10]

be challenged open can be

[01:17:10] : [01:17:14]

augmented a monolith is likely to be crazy

[01:17:14] : [01:17:17]

and the way that I put this is you and I

[01:17:17] : [01:17:18]

both know so many

[01:17:18] : [01:17:21]

Geniuses you know side effect of Genius

[01:17:21] : [01:17:24]

is insanity honestly

[01:17:24] : [01:17:25]

geniuses are not mentally

[01:17:25] : [01:17:28]

stable why would you expect an AGI to be

[01:17:28] : [01:17:29]

so and you're putting all your eggs in

[01:17:29] : [01:17:32]

one basket versus creating a complex

[01:17:32] : [01:17:36]

hierarchical system that is a hive mind

[01:17:36] : [01:17:37]

that's intelligence that represents us

[01:17:37] : [01:17:39]

all we should be working towards

[01:17:39] : [01:17:41]

building that because it's safer it's

[01:17:41] : [01:17:44]

better it achieves all the benefits that

[01:17:44] : [01:17:45]

people are talking about and it's

[01:17:45] : [01:17:47]

possible today do you think Elon shares

[01:17:47] : [01:17:49]

in this vision of a decentralized AI do

[01:17:49] : [01:17:51]

you think he would play in that area and

[01:17:51] : [01:17:53]

do you think any of the national leaders

[01:17:53] : [01:17:55]

that you've been speaking to would

[01:17:55] : [01:17:58]

support that kind of a vision as

[01:17:58] : [01:18:00]

well um yeah I can't speak for Elon

[01:18:00] : [01:18:02]

I'll speak to him and see what he thinks

[01:18:02] : [01:18:04]

and I'll get back to you you know he

[01:18:04] : [01:18:06]

always says what he thinks um but you

[01:18:06] : [01:18:08]

know he's immensely concerned he was one

[01:18:08] : [01:18:09]

of the leaders in this area saying

[01:18:09] : [01:18:12]

originally why Google you know now why

[01:18:12] : [01:18:14]

Microsoft and OpenAI like it can't be

[01:18:14] : [01:18:16]

centralized but it's difficult this is a

[01:18:16] : [01:18:18]

difficult question how many people have

[01:18:18] : [01:18:19]

a feasible solution or have even thought

[01:18:19] : [01:18:21]

about this properly you and I both know

[01:18:21] : [01:18:23]

just not many and that's very sad it should

[01:18:23] : [01:18:25]

be everyone thinking about this

[01:18:25] : [01:18:27]

on the leader side all the leaders I've

[01:18:27] : [01:18:29]

met are super happy you know because

[01:18:29] : [01:18:33]

they again leaders want Power they want

[01:18:33] : [01:18:36]

control and all of this but genuinely like

[01:18:36] : [01:18:38]

they want to see

[01:18:38] : [01:18:40]

abundance they're not happy with where

[01:18:40] : [01:18:42]

their countries are and embracing this

[01:18:42] : [01:18:43]

technology they know that they can leap

[01:18:43] : [01:18:46]

ahead and you know they will still have

[01:18:46] : [01:18:48]

a say in all of this it's not like it's

[01:18:48] : [01:18:50]

kicking them out or removing them there

[01:18:50] : [01:18:53]

were still various kind of mechanisms

[01:18:53] : [01:18:55]

there and ultimately improving the

[01:18:55] : [01:18:57]

health education and capability of your

[01:18:57] : [01:19:00]

people is not a bad thing I mean like

[01:19:00] : [01:19:01]

obviously I haven't talked to the

[01:19:01] : [01:19:03]

completely oppressive leaders you know

[01:19:03] : [01:19:05]

maybe that'll be an interesting thing

[01:19:05] : [01:19:07]

but honestly I don't want to even be

[01:19:07] : [01:19:08]

talking to leaders I want to create

[01:19:08] : [01:19:10]

again a system the people of the country

[01:19:10] : [01:19:12]

coming together with the franchise

[01:19:12] : [01:19:14]

system can then build this technology

[01:19:14] : [01:19:16]

for the good of their people in the open

[01:19:16] : [01:19:18]

and not be reliant on anyone politically

[01:19:18] : [01:19:19]

or any other type of thing like that we

[01:19:19] : [01:19:21]

don't need giant super computers for

[01:19:21] : [01:19:22]

where we're going we need

[01:19:22] : [01:19:24]

coordination need a few giant super

[01:19:24] : [01:19:27]

computers yeah what's your timeline for

[01:19:27] : [01:19:29]

putting out this white

[01:19:29] : [01:19:31]

paper I'm working as hard as I can you

[01:19:31] : [01:19:33]

know put it together

[01:19:33] : [01:19:35]

I've held you to this a number of times

[01:19:35] : [01:19:37]

I've said get the vision out there

[01:19:37] : [01:19:39]

it's getting there though we're about to go

[01:19:39] : [01:19:41]

off this call to a four-hour session to

[01:19:41] : [01:19:43]

dictate all the various bits and pieces

[01:19:43] : [01:19:44]

and again it was impossible when I was a

[01:19:44] : [01:19:46]

CEO of stability there was always

[01:19:46] : [01:19:47]

another fire there was always another

[01:19:47] : [01:19:50]

thing I didn't have time to think you

[01:19:50] : [01:19:51]

know and I hope people can take that

[01:19:51] : [01:19:53]

white paper and make it better I I don't

[01:19:53] : [01:19:54]

have all the answers I'm just trying to

[01:19:54] : [01:19:56]

catalyze something man I think after I

[01:19:56] : [01:19:58]

heard you stepped down I wrote you a text

[01:19:58] : [01:19:59]

saying

[01:19:59] : [01:20:01]

congratulations yeah exactly not

[01:20:01] : [01:20:05]

commiserations time to feel

[01:20:05] : [01:20:09]

unleashed yeah um yeah uh Emad thank you

[01:20:09] : [01:20:13]

my friend uh thank you for sharing uh

[01:20:13] : [01:20:16]

where you are what led up to this where

[01:20:16] : [01:20:18]

you're going next and uh and really

[01:20:18] : [01:20:20]

pulling the gloves off on discussing the

[01:20:20] : [01:20:24]

idea of centralized um closed AI systems

[01:20:24] : [01:20:27]

and their dangers uh and the

[01:20:27] : [01:20:29]

importance of of the vision that you

[01:20:29] : [01:20:32]

portrayed because I'm I'm fully

[01:20:32] : [01:20:34]

supportive and fully believe that what

[01:20:34] : [01:20:38]

you laid out um is probably one of the

[01:20:38] : [01:20:41]

most sane visions of AI in the future

[01:20:41] : [01:20:42]

that I've

[01:20:42] : [01:20:44]

heard I hope other people agree you know

[01:20:44] : [01:20:47]

and they can take it forward other real

[01:20:47] : [01:20:51]

heroes thank you thank you

[01:20:51] : [01:20:51]

pal

[01:20:51] : [01:20:56]


