Over the past few days I’ve been listening and talking to a lot of amazing and smart people at Web Summit. My experiences ranged from listening to the CTO of Microsoft Azure and CEO of Adyen to having half-drunk conversations on the street with a bunch of eager entrepreneurs who figured their start-up would be the next best thing.
All these talks have been very inspiring, but almost all of them missed an important aspect: apart from Evgeny Chereshnev, VP of Kaspersky Labs, none of the speakers I’ve listened to, or attendees I’ve met, spoke about the ethical implications of their business.
I’m not talking about the obvious ethical questions on safety and security, which seem to have become the standard interpretation of ethics in tech. Nor am I talking about sexism and hostility towards women or other minorities within the tech community. Although these are valid topics which need to be addressed, they remain within the closed circuit of the tech community.
I’m talking about the hardcore “is my product really doing anybody any good” kind of ethics. I am talking about morality.
Take Qualtrics, an online survey platform. Their website features statements like ‘The wisdom of all. The power of one’. The CEO of Qualtrics, Ryan Smith, spoke about how they are helping enterprises become more transparent, both in terms of customer satisfaction and employee satisfaction. Transparency here means that all the hard data they collect on performance metrics is publicly available, but also day-to-day activities (what did I do last week, and what am I planning to do this week) and even reimbursements for events like Web Summit.
Ryan Smith claims that there is no reason to withhold this data from employees, because they are more than capable of interpreting this information.
That same logic is also applied to consumers, for instance when it comes to healthcare applications. It seems that the popular thought is that if we, as a tech community, provide this seemingly amoral data to the consumer, we empower them to make their own informed decisions. The meaning of the data lies in the eyes of the beholder.
This is a common claim in tech, and it’s even applauded: look at us, we are modern day heroes giving a voice to consumers who were kept in the dark by banks, healthcare professionals, governments, etc, etc.
This modern day version of the tale of Robin Hood is used to conceal the fact that although some might truly believe in consumer empowerment, none of us would be doing any of this if it wasn’t for that same consumer throwing money at us for the fabricated notion of self-control.
What we’re basically doing is imposing our world view, the belief that we are all capable of making informed decisions on any aspect of life, on this notion of a generic consumer.
As a rather homogeneous and somewhat privileged group, the tech community tends to forget that in most developed countries, only 30–40% of the population has a tertiary education degree. And that’s a rather broad interpretation of the type of degree they received. Let’s just say they didn’t all go to Stanford, Berkeley or Harvard.
The fact that we ourselves, and those we know, might be able to correctly interpret the huge amount of data that all these applications gather and provide, and make informed decisions based on it (which I don’t believe is true), doesn’t mean that there isn’t a whole group of people who are most certainly not trained to understand complex notions like correlation or implication.
For instance, if you provide consumers with in-depth information about their health, and use machine learning to identify possible anomalies, they will run to their healthcare professional for every little scare. And they will distrust that same professional if their query is dismissed for lack of any medical evidence.
So we may have empowered this consumer with additional “information” about their health, but we completely destroyed the most basic element of healthcare, which is the patient-doctor relationship.
And even if it’s not about data interpretation, but just generic disruption, we should try and learn from the protesters who took over Airbnb HQ last week. They are protesting the way one of our lauded disruptive tech companies has not only disrupted hotels, but also disrupts communities and neighbourhoods. This is a side-effect of our business which we cannot dismiss. It is something the tech community needs to address and become accountable for.
Up until now, if I tried to have a discussion about ethics with people within the tech community, I’d be branded as patronising or conservative.
That’s unfair, as the question of ethics should never be dismissed by disqualifying your opponent. The impact of technological innovation on the world and our societies requires us to have this debate in an open and understanding way.
The tech community cannot hide behind the smoke screen of amorality. If your answer to the ethical question of “is this doing anybody any good” is to say that you only provide the data and that it’s up to the consumer to interpret it, you may want to think again.
And the questions should go beyond that. Surely, Airbnb is doing lots of people lots of good. I’m writing this article from an Airbnb rented room. But there is a negative side-effect when the less fortunate are driven from their homes to accommodate those who can afford to stay in a local “authentic” experience. We need to ask ourselves: is disintegrating communities worth the disruption of hotels?
Most of the pitches that I’ve seen at Web Summit are created for our kind of people. And I don’t mean that in a negative way. It’s just that we get our inspiration from our environment, our friends, the stuff we do and the people we meet. And in most cases, these are similar to ourselves and far removed from the average day of many other people (or consumers).
So I’d like to urge those who are invited on stages like Web Summit, who engage with start-ups as investors or advisors, to start asking these ethical questions and to reject the current amoral approach to ethics.
If you’re planning to change the world with your app, you’d better make damn sure it’s going to be a nice place afterwards.