Tune in to Listen to Tech Stories from Hacker Noon 2-3 times a week!
Is decentralization fair for everyone? This Week On Planet Internet, Utsav Jaiswal, Sidra Ijaz, and Amy Tom talk about the decentralization movement. What's the latest with DeFi? Can AI & decentralization be friends? 🥺
🌍 THIS WEEK ON PLANET INTERNET:
🗒️ SHOW NOTES:
[00:00:00] Utsav: So hello and welcome to the Hacker Noon podcast. It's This Week On Planet Internet. Today I have Amy Tom with me, hello little lady. And we have
[00:00:20] Amy: Oh my goodness.
[00:00:22] Utsav: It is a callback to the US presidential debates or whatever. In between the debates, Joe Biden said "inshallah," and I was like, okay, this is our guy. It's a very pretty language. But today we are here to discuss decentralization: what it means to somebody, or some organizations, who might not have been digging down the mines of cryptocurrencies and blockchain for years and years.
But it's more from the perspective of where the next million, or the next billion, people coming to blockchain technologies and decentralization in general are going to come from. For that, we want to get a fresh perspective on what decentralization means and where the road is taking us. So without further ado, I'd like to pass the mic to Amy and let her tell us
what decentralization means to her. Wow.
[00:01:28] Amy: What a question, that is a loaded one, Utsav. I have a lot of thoughts on decentralization. I think that it's definitely important, and I am all about more power to the people and less power to the Jeff Bezoses of the world. So I am all for decentralization of businesses, of money, of power.
Yeah. Love it. I'm here for this movement. I'm on board, right?
[00:01:58] Utsav: I have a small joke to make: decentralization is like the movie Gone in 60 Seconds, right? It has so many actors, all of them equally,
[00:02:11] Amy: and the greatest actor of all
[00:02:12] Utsav: time. No, most are bought in,
[00:02:22] Sidra: and that too.
Didn't I tell you? There was this big screen in front of me and I couldn't do anything, I couldn't help it. It was about the other guy, the peace guy, the person who...
And each of those faces. Oh,
[00:02:55] Amy: he's great. Actually, Nicolas Cage is definitely for the decentralization movement, because he's like the most average actor of all time, just really striving for everyone to be in the middle. And if everyone was more average, like Nicolas Cage, the world would be a better place.
[00:03:13] Utsav: Let's all be Nicolas Cages, right?
Billions of dollars? I would say no to that. But moving on. So Sidra, what does decentralization mean to you? Where does it rank for you?
[00:03:26] Sidra: I dabble a little bit in philosophy, and I've written something about decentralization. For me, it is a socialist dream. It is a promise to transfer power from the tech titans back to the people.
And it is freedom, trust, and, what do you say, giving strength back to the people. And that is a great thing.
[00:03:48] Utsav: This is very interesting now, because I have really smart people on my panel. For those of you who don't know, I used to be a PhD. You didn't have to say that. We will get that edited out.
[00:04:04] Amy: I was going to say, wait, how do you "used to" have a PhD?
[00:04:07] Utsav: The way I see it, it's something they give you, but if you don't practice it... For example, I have a PhD, basically, but I don't practice it. So do I still call myself from the industry? No, I don't.
Exactly. I failed my parents: an Indian decides not to be a doctor.
But yeah, coming back. What I wanted to say was that we have really smart people on this panel. So how do you feel when the power is equally distributed among everybody? Because then there is no meritocracy. Imagine that what you have to say has the same weight as somebody else. I don't want to go into their education, but let's say that they have a very big point to make,
and they don't really understand what they want to talk about.
[00:04:54] Sidra: That's the point that we have to think about, because there are many questions that arise with that, questions of censorship. We could not censor hate speech, or other things like child pornography. So yes,
giving power to everyone means malicious people can take advantage of that. But technology might enable decentralization in a way that we can still control things like that, in a certain way. There needs to be a little bit of censorship, or control, in some sense. In some way, I believe.
[00:05:33] Amy: Got it. And that's tough. Listen, the people who already are evil and have power? It doesn't matter if everyone else gets the power, because they're already there anyway.
[00:05:48] Utsav: Wow.
For those of you who don't know, Amy is the one who usually brings a lot of energy to the meeting. I don't know what broke her today, but hopefully she'll be back. Sidra brought up censorship and how things work. Take the case of artificial intelligence: what they basically do is feed a lot of data points to, let's say, an algorithm or a product, and that then uses its own algorithms to create certain things. Like we had those chatbots. Microsoft made one called Tay.
The account is now protected.
[00:06:38] Sidra: People actually poisoned Tay into a racist Nazi.
[00:06:43] Utsav: Exactly.
Tay was built off of a technology called Xiaoice. Xiaoice was built, again by Microsoft, in China, and they are using it to give out weather reports. They don't have weather girls or weather men; in China, at least one channel uses this one AI bot, Xiaoice, and it works over there.
Why does it work over there? Because of, let's say, state censorship. The second somebody mentions something bad, it goes out. Tiananmen Square goes out. And because I mentioned it, this podcast might now be censored in China. Anyway, the point that I'm trying to make is that massive censorship can have an effect very similar to what you would want.
On the other hand, no censorship could lead to a very dystopian future where nobody wants to live. How do we harmonize that? Amy, let's hear you on that.
[00:07:55] Amy: No, there's no dystopia. Everything is beautiful, everyone loves each other and everyone gets along, and it's all great.
[00:08:05] Utsav: You just said that there are bad people in the world, on this very panel. You said that, right? Yeah,
[00:08:12] Amy: yes, this is true, but how do we coexist with bad people today? They just integrate into society.
[00:08:21] Utsav: I don't know. All right. Okay, let's move on to the next thing: the biggest reason why decentralization took hold, why Bitcoin became something.
It stems from the 2008 housing loan crisis, or the subprime mortgage crisis, or whatever you want to call it. That was when people realized that what they thought of as personal finance, my savings, my deposits, what I have squirrelled away, was pretty much being used to gamble by the people in the high towers, the people in the banks, or whatnot.
So this movement came up which said: we don't really need banks. Let's take the power back. I become the arbiter of my own fate. And that leads to something called social finance, where really smart people like you guys can pool your money together, or decide together. There are things known as a DAO, a decentralized autonomous organization. What it does, basically, is take decisions based on what its members vote.
So it wouldn't be the entire world, it wouldn't be the bad people, as we call them, but only those people in the same DAO, people who have similar interests. So that leads us to something called social finance. Now, which side of the fence would you guys want to be on?
Would you want to be on the side of personal finance, where, let's say, you deposit your money in the bank, and 90% of that is given away by the banks to people who want to borrow money, and they do it with reckless abandon, to put it mildly? Or would you want to be in a system like a DAO, where people like you take the decisions,
where you have an actual say? Wow.
[00:10:19] Amy: That
[00:10:19] Utsav: question again, one last time.
[00:10:26] Sidra: I'm for social finance, really. The first example they have given in this article is rotating savings, in which they describe the history of social finance. Actually, such things have been happening in Pakistan for centuries. Women do this thing that is called a committee, where we combine our money.
Yeah. It's the best thing that has happened to women, because something that we cannot afford alone, we can get the money together for. And every month, one woman gets her turn to spend that money. And that's great, I believe.
[00:11:06] Utsav: It's totally a very nice thing. I have seen credit being used in such manners.
My mother belongs to a lot of these committees, or kitty parties, and a lot of these women, they rotate this money among themselves. Exactly. They use it to start their own businesses, buy stuff for the house, get a new car, and so on. So really,
[00:11:32] Amy: you put in some money every month?
[00:11:34] Utsav: Yeah.
[00:11:35] Sidra: So the women of a community usually come together for purchases using these committees.
[00:11:43] Amy: Oh, nice. What's it called? A kitty?
[00:11:49] Sidra: A committee.
[00:11:50] Amy: A committee. Okay.
[00:11:51] Utsav: That's nice, because the money goes to something or somebody close to the people who use it, and there's a structure that you need to follow. So it's nice, it's not wasteful. It's not a wasteful expense, as some other people might make.
And in India we have something very similar, where it says that whatever government grants need to be given, they will be given to the women of the family. They won't be given to the men, because when you give men money, they spend it on drugs or drink. So they said, okay, we will give it to the people who are more responsible, and there has been a very good effect from that.
What happens is that women are less likely to cheat, basically. These are facts and figures, I'm not pulling stuff out based on preconceived notions. Moreover, they are less likely to cheat when there is a social construct around it, just like this one. What do you have to say about that?
Amy, are you on the side of personal finance, or are you on the side of social finance?
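The committee Sidra and Utsav describe is what economists call a rotating savings and credit association (ROSCA). A minimal sketch of one round, with invented names and amounts:

```python
# Sketch of a "committee" (rotating savings, ROSCA).
# Names and contribution amounts are illustrative, not from the episode.

def committee_schedule(members, monthly_contribution):
    """Each month every member pays in; one member takes the whole pot."""
    pot = monthly_contribution * len(members)
    # One full round: each member receives the pot exactly once.
    return [(month + 1, member, pot) for month, member in enumerate(members)]

schedule = committee_schedule(["Aisha", "Beena", "Chanda"], 100)
for month, recipient, payout in schedule:
    print(f"Month {month}: {recipient} receives {payout}")
```

Over a round, each member contributes exactly what she receives; the benefit is access to a lump sum earlier than she could save it alone, which is why it works for starting businesses or making big purchases.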
[00:12:57] Amy: How could I be on the side of personal finance, when the banks are giving away all of our money carelessly to people? Theoretically, I am on the side of social finance. For the movement, I'm all there, but all of my money is in the bank.
So actually, I guess I'm on the side of personal finance, because I'm voting with where my money is. And yeah, that's really got to change, though.
[00:13:26] Utsav: That makes sense. It leads us to this: Amy lives in Canada and I live in India. Where there is more personal freedom, where there is more transparency, where banks are less likely to play truant because of all of these controls, checks and balances, people are more likely to trust these organizations.
Then maybe, just maybe, the DAOs of the world are very similar to banks based out of Canada. I might be getting it wrong, but I guess the direction that I'm going towards might help in a particular manner.
But with that, let's talk about social credit. When there is social finance, the amount of money you can access depends on social credit. Would you want to be held to a social credit of sorts, or would you want to be trusted on your own merit, first among equals, stuff like that? Sidra?
[00:14:26] Sidra: I have no expertise on the topic. So,
[00:14:29] Amy: I don't know. I don't know what you mean by social credit.
[00:14:33] Utsav: So basically, everybody has a credit score, right? It could go by whatever name it has in the places where we live. And every credit score basically says that you are eligible to take out loans up to a certain amount.
Now take this into social finance. Instead of one person, you have a pod of people, and instead of each having their own credit score, it is the society that has this credit score. So you might be an outlier, you might be somebody who is brilliant, exceptional, but your credit score might be lower just because you live in a bad neighborhood.
Is that acceptable? And if not, how can we fix it?
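To make the question concrete: here is a toy comparison of a loan limit based on an individual score versus one inherited from the surrounding group. All scores and the limit formula are invented purely for illustration:

```python
# Hypothetical illustration of individual vs. group ("social") credit.
# Scores, the x100 limit rule, and the neighborhood are all made up.

def personal_limit(score):
    """Loan limit proportional to the individual's own score."""
    return score * 100

def social_limit(neighborhood_scores):
    """Everyone in the group shares a limit based on the group average."""
    avg = sum(neighborhood_scores) / len(neighborhood_scores)
    return avg * 100

brilliant_outlier = 800
bad_neighborhood = [300, 350, 400, brilliant_outlier]

print(personal_limit(brilliant_outlier))   # on their own merit
print(social_limit(bad_neighborhood))      # dragged down by the group average
```

The outlier's limit drops sharply under the group scheme, which is exactly the fairness problem Utsav is raising.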
[00:15:14] Sidra: Again, I guess there should be a little bit of checks and balances, but for me, if you say it's a personal choice, I'm okay with it. I'm okay with that, because I guess we have a limited life to live. Everyone has their own life priorities, and for me, perhaps money is not
the top priority.
[00:15:46] Utsav: I can see that. I understand that. But what do you have to say on that, Amy?
[00:15:49] Amy: Are the people that are deciding who can have the loan or not qualified to do so, or are they just regular, everyday people?
[00:16:01] Utsav: There is no such thing as a qualification now, because it's all based on your social standing.
[00:16:07] Amy: Oh, okay. That's quite alarming, because, okay, what if I say something dumb on the podcast, and then everyone is like, wow, she's dumb, let's cancel her, and then I'll never get a loan?
[00:16:21] Utsav: You cannot get cancelled because of decentralization. Think of it like that. Everybody makes mistakes.
If I were to count, I made five before this podcast started, and that's just the last hour, right? Should I feel bad about it, or should I learn from it and move on? Should that mistake be the defining factor for me?
[00:16:43] Amy: Me to you, I say no. Me to an internet stranger? I don't care. You don't get your loan,
[00:16:53] Utsav: Harsh.
[00:16:56] Amy: Bad people don't get money.
[00:16:59] Utsav: I hear you. I don't know what you'll do with that money, but I hear you. If these questions sounded hard, they were meant to be. I have a confession to make: these are the kind of questions that the decentralized world, the people who are building these DAOs, are struggling with.
They have been trying to find the answers to these questions for the last five or six years, some of the smartest people in the world, and they struggle with these questions daily. There have been cases where they thought that they had the right answer, and then some hacker came onto the internet and was like, okay, this looks like a mistake,
let me make some money out of it. Millions of dollars have been lost because they weren't able to get the right answer. So that is, in a nutshell, some of the questions that these decentralized companies are trying to answer. What exactly is decentralization? What exactly is censorship? How is it tied to the greater
good of the society, without falling into the trap of what is known as the tragedy of the commons? So hopefully people will listen to your thoughts on this podcast and find a solution somewhere down the line. Hopefully. Because I have a lot of money in crypto, so, yeah, not in the banks.
[00:18:29] Amy: What are your thoughts on this, then, on the DAOs and their benefits?
[00:18:35] Utsav: I have my experiences with DAOs, and my take on this matter is that DAOs move too slowly for my liking. Every decision needs to have a consensus. Every decision taken will have a lot of people questioning you, and there are no
objective facts that you can point to and say, hey, this is why that decision was taken. For every person saying yes, there will be an equal number of people saying no. People outside this industry, or from outside your society, can always come in, and voting can be swayed by people of influence. Now I have to worry about a lot more people than about one person.
When there is one figurehead, we can take them out. That's why we have elections, and in certain countries, like Guatemala, where they assassinated the president, they could do that. But when it is decentralized, it would take a genocide to get the bad actors out. A lot of people with good intentions, at least according to them, tried to make the world
a better place for their own people, and the work they ended up doing is what's called a genocide. So I guess we would have a similar problem when decentralization comes into the larger picture: we would need to have multiple figureheads, but also be able to remove those figureheads. And when we have that, is it even decentralization?
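The slowness Utsav describes comes from consensus rules. A hypothetical DAO proposal check, with made-up quorum and approval thresholds, might look like:

```python
# Toy sketch of DAO governance: a proposal needs both enough participation
# (quorum) and a supermajority of the token-weighted votes actually cast.
# The thresholds and vote weights are invented for illustration.

def proposal_passes(votes, total_supply, quorum=0.4, approval=0.66):
    """votes: list of (token_weight, voted_in_favor) pairs."""
    cast = sum(weight for weight, _ in votes)
    if cast / total_supply < quorum:        # not enough of the supply voted
        return False
    yes = sum(weight for weight, in_favor in votes if in_favor)
    return yes / cast >= approval           # supermajority of votes cast

votes = [(100, True), (80, True), (120, False), (60, True)]
print(proposal_passes(votes, total_supply=500))
```

Two separate gates (turnout and approval) mean a proposal can stall even with majority support, which is one concrete reason DAO decisions feel slow compared to a single decision-maker.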
[00:20:13] Amy: Yes. That... that is the question.
[00:20:16] Utsav: So that's what I'm saying. I am comfortable with the idea. I love the concept that I am not beholden to some person's power, probably because I don't believe in God or whatever. But at the end of the day, it boils down to this: I would still need to be beholden to somebody.
Now, who is that somebody? If I don't know them, if I can't influence them to take a decision in my favor, should I even worry about that? Why not opt out altogether, which is what most of these Bitcoiners do? They don't care about blockchain, they don't care about decentralization; that is the Ethereum side of the world. On the Bitcoin side of the world,
they are like: my money is now on my phone, and it does not land in any of these banks. I can do whatever I want with it.
[00:21:12] Amy: Yeah. Too many unanswered questions, too many
[00:21:16] Utsav: unanswered questions. That is the exact summary I should have gone with. Excellent. You were supposed to say, no, you have a few more minutes.
Now let's go to what I touched upon earlier, which was AI and decentralization, right? There's this study, and it asks part of that question. It says we live in a world that is very data-oriented, and the people who have the most data have also been fined
the most amount of money for breaking rules. You take that, I'm sorry,
[00:22:02] Sidra: The tech giants, they have the data, the science.
[00:22:04] Utsav: Yeah. And they have the control, exactly. So some of the smartest people in the world decide to break the law. First of all, why? They are already billionaires. What do they want, to be trillionaires? Or
is there a benefit to breaking the law? Because if they are being fined, they probably broke the law.
[00:22:27] Amy: I don't think that they're necessarily breaking the law on purpose, but cybersecurity is such a slippery slope, and these organizations are huge. So I would imagine that even one backdoor could open up a wide range of vulnerabilities.
[00:22:41] Utsav: I'll make it simple for you. You have a Facebook profile. Let's say it's 2010. You have a Facebook profile, and Facebook decides that whatever information you have posted on your profile can be accessed by a mobile app or some application built on Facebook. They say that, okay, if you check these boxes, you can become a Facebook app.
So they say, hey, play this game: what personality type am I? Am I Cinderella, or whatever other fairytale? You play that game, you answer five questions, and in return they get all of your data. They know what pages you like, what people you engage with. Is that ethical? Does it take a really smart person to say, hey, that sounds wrong?
Maybe you would say that this is wrong, and yet these very smart people said: that's okay, I got paid for it. The joke's on you for using my social media platform.
[00:23:46] Sidra: The thing is plain: nothing is free.
[00:23:49] Utsav: Nothing is free in this world, I get it, but whose responsibility is it? These are the kinds of things that led us down the path where we are taking decentralization to be a solution.
[00:24:06] Amy: You have just resurrected a deep memory in my brain from probably the 2010s of me using Facebook to do all of the quizzes. And I am so deeply ashamed.
[00:24:22] Utsav: It's okay. Everybody did.
[00:24:28] Amy: I forgot about that.
[00:24:30] Utsav: Yeah. When I got Facebook, it was 2008. You had to have a college ID; I had just gotten into college.
And you had to invite 20 friends through Facebook before you could use any of their apps. Every time there was an app, I had to do that. I did it twice, and then I was like, should I keep spamming my friends just to find out what personality type I am?
Right. But that brings us to decentralized artificial intelligence. Now imagine, instead of a Jeff Bezos, who, allegedly, is a person we all hate,
[00:25:13] Amy: or a legend,
[00:25:19] Utsav: or a person that people on a certain political spectrum hate. How does that reconcile with the fact that these are the people who are making all of this decentralized AI software? Because it needs the kind of resources that you and I might not have. So the smart people say, I'll build something that takes me out of the picture. But we have examples of these people not taking themselves out of the picture.
How is that something we can solve? How do we cure this syndrome?
[00:25:59] Sidra: I want to share another thought
here, and there's this idea, let me share it with you, that puts the blinders on it. I have written about it. It comes from Dr. Michael, a researcher and head of the Department of Computing at Imperial College London. He says the safest technology is edge AI. It is the processing of AI algorithms on the edge, that is, on users' devices.
This way, the data will remain on users' devices. It is a concept derived from edge computing, which starts from the same premise: the data is processed where it is produced and stored. So I guess this technology will be the answer. Data will remain at the source and will not be used or stored by these tech giants.
[00:26:59] Utsav: I'm sorry, before that: for the understanding of non-technical people, can you explain what edge computing is?
[00:27:09] Sidra: Okay. Let's start with how AI works. You have this data, and there is this algorithm to which the data is fed, and when the algorithm is trained on the data, it becomes a model.
And on that model, for example, recommendation systems, as you have mentioned already, like Netflix recommendations, or recommendations on Facebook or Amazon. Those recommendation models are trained on that data, and how the data is collected and stored is managed by the companies. They monitor people, collect the data, and store it.
That takes us to edge AI. The concept of edge AI is that the data will remain on people's devices, and the models will be trained over there, and then used there. So the data will no longer be accessible to companies who want to perform model training on it.
The data will remain at the source.
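The edge-AI idea Sidra outlines is close to what practitioners call federated learning: each device fits a model on its own data, and only the model weights leave the device. A toy sketch, with an invented one-parameter model and made-up data points:

```python
# Rough sketch of training at the edge (federated averaging, simplified).
# Each "device" fits y = w * x on its own data; the server only ever sees
# the fitted weights, never the raw points. Model and data are invented.

def local_fit(points):
    """Least-squares fit of y = w * x on one device: w = sum(xy) / sum(x^2)."""
    sxy = sum(x * y for x, y in points)
    sxx = sum(x * x for x, _ in points)
    return sxy / sxx

device_data = [
    [(1, 2.1), (2, 3.9)],   # device A keeps these points locally
    [(1, 1.9), (3, 6.3)],   # device B likewise
]
local_weights = [local_fit(points) for points in device_data]
global_w = sum(local_weights) / len(local_weights)  # server averages weights only
print(round(global_w, 2))
```

The raw `device_data` never crosses the device boundary; only the two fitted weights do, which is the privacy property Sidra is describing.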
[00:28:17] Amy: Okay. So the data stays on your computer, or on your home network. Okay.
[00:28:24] Utsav: They have this very nice quote from Francis Bacon: if the data will not come to the AI, then the AI will go to the data. There is a saying in our part of the world which is very similar,
and it says: if the mountain will not come to Muhammad, then Muhammad must go to the mountain.
Very nice. Francis Bacon took that, right?
Some very smart people stealing, as you can see, exhibit A right there. Now let's move over to the next thing: the applications of decentralization. There are many. On the finance side of things there's DeFi, decentralized finance. A lot of people might not be really
concerned about how finance works, or how they make money using it, but YouTube affects everybody. And there is a piece over here, a study, that argues the biggest competitors to YouTube will be coming from decentralized video platforms. What would a decentralized video platform look like? This person, Jeremy Kauffman, has some thoughts on it,
and for good reason. The YouTube algorithm means that, basically, if YouTube likes you, your story will do well, your episode will do well. But if you are somebody new, you need to climb all of the ladders, and it might be very late for you before you get there. So the algorithms are not objective.
Would decentralization make them objective? Do you guys think, based on the conversations that we have had, maybe with edge computing, can we create algorithms that are objectively fair?
[00:30:21] Sidra: I'll go with that. That's an entirely different research area again.
There are people working on that: on bias, on fair algorithms. And there are many ways to do that, actually.
[00:30:37] Utsav: You need to explain that to us in those smart words.
[00:30:43] Amy: I love it. Sidra's left brain is just doing the calculating, the analytical answer, and I was just going to be like...
[00:30:52] Utsav: Yeah. I did not know. Really, last minute,
[00:31:01] Sidra: There's a language barrier. I might have explained things better in my own
[00:31:05] Utsav: language.
[00:31:13] Sidra: There's this concept of algorithm bias, or AI bias. Companies have used it, used and abused it, tuning their algorithms to give answers that they like, to get the things that, how do you say, help them more. And then again, the data itself is such that when the algorithms are trained on it, the algorithms are biased, because the data is biased.
I'll give an example. There was a tweet on Twitter in which a person shared his professor, who was Black, and he said that with a virtual background on, his professor's face vanishes. It was quite a viral thing: the algorithm could not recognize the professor's face,
because the algorithm was not trained to recognize faces like his. And then, on top of this, there was this very famous tweet with a picture of Obama and another white man, and the Twitter cropping algorithm always picked the picture of the white man.
You can find it on the internet; we'll find it for you. So yes, algorithms are biased, and one of the reasons, as I'm telling you, is that the data they are trained on is actually biased.
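Sidra's point, that biased training data produces a biased algorithm, can be shown with a toy "detector" trained on a skewed sample. The feature values, groups, and threshold below are invented purely for illustration:

```python
# Toy illustration of "biased data in, biased algorithm out": a centroid-
# based "face detector" trained almost entirely on one group's samples.
# All feature values and the threshold are made up for this sketch.

def centroid(samples):
    return sum(samples) / len(samples)

# One-dimensional feature; group A is heavily over-represented in training.
group_a = [0.1, 0.15, 0.2, 0.12, 0.18, 0.14]   # many training samples
group_b = [0.8]                                 # a single training sample

face_centroid = centroid(group_a + group_b)     # dominated by group A

def detected(sample, threshold=0.3):
    """A face 'counts' only if it is close to the training centroid."""
    return abs(sample - face_centroid) < threshold

print(detected(0.15))  # a group A face
print(detected(0.85))  # a group B face, missed because training was skewed
```

No line of the detector mentions either group, yet it systematically fails on group B, mirroring the virtual-background and photo-cropping failures discussed above.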
[00:32:45] Amy: It depends on what data they feed them, doesn't it?
[00:32:54] Sidra: Exactly. I'll find the tweet in the meantime, I guess.
[00:33:03] Amy: Yeah, I... no, I lost my train of thought. Yeah.
[00:33:09] Utsav: I have a very simple example for that. Look at all of these tobacco companies. They are the ones commissioning studies, or at least they were doing that blatantly in the sixties and seventies, saying that smoking was actually good for your health.
They were doing all of these studies, and they were feeding their research to statisticians, because they did not have the kind of computing power that we have today. So they were getting these statisticians to present the same data in a way that gave out the impression that smoking might actually be beneficial for you.
[00:33:46] Amy: That kind of stuff still happens today, doesn't it? Just massaging the data to give out the impression that this is what's
[00:33:53] Utsav: best. Exactly. That's how marketing works. Every new car that comes out is always "the best ever," right? Yes, that's how marketing works: how you present data, how you make people think.
[00:34:10] Amy: I was going to say, on the decentralized YouTube platform: I feel like the algorithm could be made fair if done correctly, by people who are going to feed it non-biased data. But whether people are going to adopt it is another layer to it as well, because
YouTube is a central place that everybody can visit. They know where to search, they know what kind of content they're looking for, they know what they're generally going to get. But to learn a new platform would be another story, and people will have to use marketing to convince people to make the switch over.
[00:34:53] Utsav: Sidra, you'll have to explain this to us.
[00:34:57] Sidra: Open this. It is the face detection cropping algorithm, and
[00:35:02] Amy: it's not opening. Oh yes.
[00:35:04] Sidra: Open the pic,
[00:35:05] Amy: checking,
[00:35:08] Sidra: open it.
[00:35:12] Utsav: I had no idea.
[00:35:14] Amy: It's cropping the picture, the Twitter algorithm is cropping the picture.
Yes. It's cropping the photo to the white man instead of to Obama. Even though he was in a different position both times, it still brings the attention to the white man as opposed to the Black man. Interesting.
[00:35:42] Utsav: Yeah. There's
[00:35:42] Amy: like a whole... there are whole discussions on
[00:35:50] Sidra: that,
[00:35:51] Amy: but no, there are whole discussions and arguments on how the social aspect of algorithms is racist. I'm sorry, on how the aspects of social media algorithms are racist. So on Instagram, Twitter, or Facebook, the algorithms push down content creators who are Black, or LGBTQ+, or talking about Black Lives Matter or racial movements, or things like that; they are getting pushed out of the algorithm. And Instagram recently rolled out something about disabling sensitive content that struck a really big chord with the people of color and LGBTQ+ communities of content creators on Instagram, because they're essentially trying to limit the quote-unquote sensitive content that people see, but what they deem as sensitive content is a lot of what these people's platforms are based on,
but in more of a positive way. And they're still limiting the reach that these people have. When they're trying to show trans people that they are accepted, or whatever the case is, the Instagram algorithm will take that out as well. So, really interesting.
[00:37:10] Utsav: So I've always wanted to ask this question.
I am of the opinion that all of these tech giants are openly slightly too far down the line on the left, right? Jeff Bezos owns The Washington Post, one of the most left-leaning newspapers, and yet he is hated by them. And if I talk about Twitter itself, Jack Dorsey himself has said it has a left-leaning bias, and it always will.
I don't have a problem with that. What I have a problem with is this: why do all these algorithms that these guys create, who say that they are left-wing, or who are liked by the people on the left wing, why is it that their algorithms paint a picture that somebody on the right wing might?
[00:38:07] Sidra: Maybe white people are more represented in the data that they get.
[00:38:12] Utsav: But they are trying, they are trying to remove all of these biases, aren't they? I think there should
[00:38:19] Sidra: be proper testing of these, but are
[00:38:21] Amy: they trying to remove the bias? Yeah, are they
[00:38:23] Utsav: trying to remove it? Are they qualified?
Because these are the people... Facebook had a lot of their employees resigning during the Black Lives Matter movement, saying you cannot do that while I am working over here; there is a side, right? So these are the kinds of people who feel strongly about this subject.
These are the people who are building these algorithms. Are they missing it deliberately, in your opinion, because nobody knows what's happening, or is it something more pernicious?
[00:39:00] Sidra: I would say that.
[00:39:03] Amy: I think it's deliberate as well. I don't know what pernicious means, but I think it's deliberate.
[00:39:06] Utsav: So basically, pernicious. I know the word from "pernicious anemia," which means it creates problems at a smaller level and then keeps on growing. Yeah.
[00:39:17] Sidra: So again, we might be biased against them as well, right?
[00:39:27] Utsav: Because they have a leftist bias, and I hate them for that, not because they have money. But yeah.
Yeah. So that is bad. These questions around decentralization and AIs would need to be answered. I do believe that these guys are trying to do the right thing. They are trying to fish out all of these biases, these racisms, whatever isms might be creeping into the system.
Because of the system itself. For example, let's say they want to create a credit score based on a person's history, whatever. And because a particular country has had a history of imprisoning a particular group of people, and they take the data from who has been in prison, obviously that particular group would have a very high chance of getting rejected, and whatnot.
That makes sense. But they are trying to remove these biases, this systemic problem. Hopefully they get to solve it, but I haven't seen them move the needle a lot on that. I might be getting ahead of the picture, but they still have a long way to go.
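The credit-score example above can be sketched in a few lines. This is a hypothetical illustration with invented numbers, not any real scoring system: a model that never sees the applicant's group can still disadvantage one group, because a feature it does see (a prison record) is unevenly distributed between groups.

```python
import random

random.seed(0)

def make_history(n=20_000):
    """Synthetic historical data (all rates invented for the sketch).
    Group B was imprisoned at 3x the rate of group A, and a record
    makes later default more likely (lost jobs, denied housing, etc.)."""
    rows = []
    for _ in range(n):
        group = random.choice(["A", "B"])
        imprisoned = random.random() < (0.30 if group == "B" else 0.10)
        defaulted = random.random() < (0.35 if imprisoned else 0.15)
        rows.append((group, imprisoned, defaulted))
    return rows

def train(rows):
    """'Train' by measuring the default rate per prison-record bucket.
    Note: group is deliberately NOT a feature of the model."""
    counts = {True: 0, False: 0}
    defaults = {True: 0, False: 0}
    for _, imprisoned, defaulted in rows:
        counts[imprisoned] += 1
        defaults[imprisoned] += int(defaulted)
    return {k: defaults[k] / counts[k] for k in counts}

def approval_rates(rows, risk, threshold=0.25):
    """Approve applicants whose bucket's predicted risk is below the
    threshold, then report the approval rate per group."""
    approved = {"A": 0, "B": 0}
    total = {"A": 0, "B": 0}
    for group, imprisoned, _ in rows:
        total[group] += 1
        if risk[imprisoned] < threshold:
            approved[group] += 1
    return {g: approved[g] / total[g] for g in total}

history = make_history()
risk = train(history)
print(approval_rates(history, risk))
# Group B is approved noticeably less often, even though the model
# never saw "group" — the bias rode in on the historical prison data.
```

The point of the sketch is the one made in the conversation: removing the sensitive attribute from the inputs is not enough when the remaining features carry the system's history with them.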
Yes. Coming back to YouTube: should all videos be monetizable? Shouldn't we have the right to monetize our videos?
[00:40:59] Amy: No, I love content creators. All content creators should have the ability to monetize their video because they put work and effort into it.
[00:41:07] Utsav: Yeah. Check out what this guy is doing, a Hacker Noon writer who created LBRY and open-sourced everything: the source, the screens, a digital marketplace, but on the blockchain.
[00:41:21] Amy: It's LBRY.
[00:41:25] Utsav: LBRY.
[00:41:27] Amy: You know how startups and tech companies always have to have a special spelling of the company name? Yeah.
[00:41:34] Utsav: And now I want to move to the other side of the puzzle, which is the governments, because they have to deal with this as well. They are elected to take care of these things.
Countries such as Thailand are probably looking to have their own CBDC. A CBDC is basically a central bank digital currency. So they would say: Bitcoin is bad, but what if we create a cryptocurrency ourselves, and you can use it for payments or whatever? That is similar to how China has WeChat Pay, India has Paytm, the US has Venmo. Does Canada have one? No, Canada needs to catch up. What does Pakistan have?
We have "philosophy," right? "Philosophy" is shining; Indians are fans of it.
[00:42:28] Amy: I've never heard of those.
[00:42:31] Utsav: Never heard of "philosophy"? Nice. So going back to this: do you guys think that a central bank digital currency, with the kinds of censorship we talked about, and with algorithms created so that they do not have biases, would work?
Can it be sufficiently decentralized, tying back to where we all wanted to have social finance, but also free of the problems that creep in when we let the government run it?
[00:43:01] Sidra: First of all, I really like the featured image of this article. It's, like, exotic.
[00:43:07] Utsav: Exactly. What is there? A devi?
[00:43:13] Amy: I have no idea.
[00:43:14] Utsav: It's a lord.
[00:43:21] Sidra: Any hints? Buddha has two hands, I believe.
[00:43:25] Utsav: And gods, they have multiple hands, period. And if they don't, they can have them, left to the interpretation of the artist. So there are gods described as having 3,000 hands. Where do they come from? Where do they go? No one knows, but it exists.
So these are too few, probably a smaller god, but yeah, this is a lord. Yeah.
[00:43:51] Amy: I didn't know I was going to get cultured today.
[00:43:57] Utsav: You have an Indian and a Pakistani on the podcast. Culture. Excellent.
[00:44:03] Sidra: Teach us a little bit, though.
[00:44:06] Utsav: Yeah, we'll do that offline, after we wrap this podcast up. But coming back, I want to hear from you, Sidra. For your goal of social finance, of having social control over your funds, would you be willing to let the government train these algorithms or AIs in a manner where they can make you believe that it would be in your benefit?
[00:44:40] Sidra: I would love that. I would love that. But again, that's something where we have to believe that the algorithms are not biased, that they're working fine. In some way, we have to trust them.
But yes, I would love that.
[00:44:54] Utsav: Amy, for you, a small piece of information: Tay, the bot by Microsoft, was very nice for two hours. If it is nice to us long enough to make you believe, to get your vote and say, hey, this is nice, give us the power, what happens the second you give them the power?
[00:45:21] Amy: Yeah, no. No,
[00:45:23] Utsav: Nobody knows what they would do, but in the moment, it's more of a philosophy thing, more of a decision-making thing: whether you want to trust the government or whether you don't want to trust the government. What side would you want to be on?
[00:45:40] Amy: I want to trust the government.
I want to trust the government.
[00:45:45] Utsav: That makes a lot of sense. Yeah. That is, I guess, the goal that these governments have in terms of central bank digital currencies. The Bitcoin maximalists, or the toxic Bitcoiners, they don't like it at all, because they say this is fiat in a new bottle.
They might have their own arguments, but this is what I wanted to put across: for a lot of us, this is going to make a lot of sense, because the benefits outweigh whatever problems the system might have. And with that, I'd like to have some closing words from the two of you. Let's start with Amy first.
[00:46:29] Amy: Wow. What a beautiful day, what a beautiful podcast. Thank you so much for giving me the opportunity to say a few closing words, all the power to the decentralization movement. I'll see you on the internet.
[00:46:45] Utsav: That's fine. How about you, Sidra?
[00:46:47] Sidra: I would say that with great power comes great responsibility. And the question is how to maintain this balance between power and responsibility when you are distributing power to everyone. So that's the question we should be thinking about.
[00:47:05] Utsav: There's this saying that absolute power corrupts absolutely.
[00:47:18] Sidra: That's true.
[00:47:18] Utsav: True, but anyways. Thank you, Amy, and thank you, Sidra, for being on this podcast. On that note, I might sound like somebody who wants to break all the systems, break the government and all of these things, but it's very sobering for me to have these conversations with you guys.
So thank you for being on the podcast and hopefully we'll be talking soon again.
[00:47:41] Sidra: Thank you. Thank you so much.