Product Designer at BCG DV | Researching products for distributed intelligent ecosystems
Trust in governmental institutions is on the verge of collapse; as Nathan Kinch notes, people are turning towards businesses and other people to take care of social issues and to fix the economy. People distrust ‘the system’ and its leaders, and this is a somewhat global phenomenon. Broken trust is an area ripe for disruption and redesign. Technology is creating new ways for us to interact, ways that require placing our trust in people, companies and ideas that we don’t know. As Suzanne Ginsburg puts it in her article Designing for Automation, technology also brings new methods of communication and design into our everyday interactions, and these too require deepened trust. With more automation ahead, if trust is not properly embedded into intelligent systems, it will only create distrust towards new solutions, diminish people’s chances of adapting to change, and divide societies at scale.
More technology requires us to give up our privacy in exchange for better personalization. But how do we fix the ever-growing lack of trust in our society? More and more brands are asking people for trust based on their promises and by being transparent about their policies. But the psychology of trust works quite differently. There have been many attempts and debates around the black box of algorithms and being transparent about how they work. But I would like to ask: Is transparency enough? Is it an effective way to build a long-lasting relationship with a customer? Will it build trust in a brand and in a product?
For a long time, we trusted local people and their knowledge to make our choices. In a small village, reputation was everything people had; it was built over time on the quality of service and spread by word of mouth. But the internet has turned our villages into global villages full of strangers, complex technology and businesses that we don’t know. We are surfing the edge of an illusion of trust, one mastered to perfection through mouth-watering design, convincing content and first-customer bonuses. Maybe the question should be how to rebuild trust in society, since that is the alarming issue, and start from there? In this article, I am not going to talk about creating value for customers, which is irreplaceable for building a successful product and winning users’ long-lasting trust. Instead, I am going to dive deep into creating a mechanism of trust that uses human psychology to evoke the feeling of trust and win customers over.
Let’s recap why treating trust as a design issue is important:
In today’s society, we need to trust not only people, but institutions and systems. It’s not so much that I trusted the particular pilot who flew my plane this morning, but the airline that produces well-trained and well-rested pilots according to some schedule. And it’s not so much that I trusted the particular taxi driver, but instead the taxi licensing system and overall police system that produced him. Similarly, when I used an ATM this morning — another interesting exercise in trust — it’s less that I trusted that particular machine, bank, and service company — but instead that I trusted the national banking system to debit the proper amount from my bank account back home. Source
Let’s start by understanding how trust works in a product-driven economy. There are three major layers of trust that companies leverage in their customers to create revenue.
Trust has a chain effect. If other people use a service, it has presumably earned its reputation and must deliver what it promises. People usually choose a service that someone they know has recommended: it saves time, and they know they are not risking anything, or at least that the risk has been substantially minimized. Web search costs time and carries plenty of risk. Web-based reputation systems may also be unreliable, as reviews might be bot-generated or a company may have paid to appear in your search results.
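To make this chain effect concrete, here is a toy sketch of why a friend’s recommendation outweighs anonymous web ratings. This is my own illustration, not a model from any company mentioned here; the weights and the assumed share of fake reviews are arbitrary assumptions for the example.

```python
def trust_score(friend_recommends: bool, web_rating: float,
                suspected_fake_ratio: float = 0.3) -> float:
    """Return a rough 0..1 trust estimate for a service.

    friend_recommends: someone the user personally knows vouches for it
    web_rating: average public rating, normalized to 0..1
    suspected_fake_ratio: assumed share of bot-generated or paid reviews
    """
    personal = 1.0 if friend_recommends else 0.0
    # Discount public ratings by the assumed share of unreliable reviews.
    discounted_web = web_rating * (1.0 - suspected_fake_ratio)
    # Personal recommendation dominates: weight 0.7 vs 0.3 for web signals.
    return 0.7 * personal + 0.3 * discounted_web
```

Under these assumed weights, a friend’s recommendation alone scores higher than even a perfect web rating without one, which is the asymmetry the paragraph above describes.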
The crisis of trust is paradoxical: We increasingly entrust our wellbeing and security to institutions, technologies and strangers but, at the same time, we report greater feelings of mistrust or an erosion of trust in these same institutions, technologies and individuals. Source
On the other hand, our society is becoming more fragmented, and recommendations passed by word of mouth have become less common. The communities we live in have become more distributed and no longer afford the everyday interactions that enable knowledge exchange. These communities often live online, and so we lack verification methods attached to them: people online might not be directly related to us, and they can’t be vouched for by people close to us. Online reputation is based on different metrics than reputation within real communities, so we must rely on other sources of trust and search for other forms of validation.
…despite our growing need to trust others and institutions due to the complexity of modern life, trust is considerably harder to establish — we no longer have the same guarantees that others are trustworthy, nor the same recourse should our trust be betrayed, that we had with frequent opportunities for face-to-face interactions. Source
Lack of trust not only disturbs the social domain; it also doesn’t let innovation thrive, as new products and services may lack users willing to come and try them. Trust is seen as a “pivotal factor in determining how we progress as a society”: it lets us get used to doing new things, which keeps moving us forward.
(Six Ethical Principles For AI By Microsoft Design)
If we want a free market with healthy competition between organizations, we need to know how to win the user’s real trust. Consumers are getting smarter every day, and their trust was eroded long ago by the many privacy scandals circulating around the web. Companies are seizing the opportunity to demonstrate transparency and trustworthiness, as it speaks directly to consumers’ emotions and helps them make the right choices. In an ever more data-driven society, we will rely on our emotional responses more than ever, switching off the rational, analytical parts of our brains, because the information we would need to consume to make a choice will be overwhelming. We will be influenced by how we are targeted. Here lies the designers’ responsibility: to make sure that trust is targeted for value-creation reasons and not misused, and to bring back social trust among the peers using a product’s or brand’s services.
To achieve this, we must focus on emotions our product delivers, using intuitive information architecture and a delightful design to manage it. Explore your users to connect with them emotionally. UX Design Agency for Banking and Fintech
Transparency and privacy have become marketable aspects of our century; consumers are therefore not that easy to convince, especially when it comes to sensitive data, and they would rather place their trust in larger companies. But we want an economy in which new businesses thrive too. Most digital products currently on the market require us to fill in personal information before we can proceed with the service. How would we feel if a stranger on the street asked for our name, phone number, home address and social security number, promising better service in return? It would seem bizarre, and the promise wouldn’t be backed by anything concrete. Yet the reality is quite different: we voluntarily disclose our most personal information every day. While trusting these companies makes our lives easier, we shouldn’t forget that this trust can be broken at any moment, without warning.
Let’s look at some examples of how companies tackle transparency and what steps they take to communicate with users and win their trust.
Juro is very explicit about the use of the data that you leave on their website. They have listed the information about:
While this is a lot of information to process, it is a great step forward in educating users about the use of their data. But let’s go a step further. How actionable is this information? Can users opt in and out of specific points of the policy? Will transparency alone win a consumer’s trust at first contact, even before they sign up for a service? Should we guide users through the terms and conditions so that they work for the user’s benefit?
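To illustrate what “actionable” could mean in practice, here is a hypothetical sketch of a per-purpose consent model, where each data-use purpose is an individual toggle rather than a single accept-all checkbox. The purpose names are my own examples, not items from Juro’s actual policy.

```python
from dataclasses import dataclass, field


@dataclass
class ConsentPreferences:
    """Per-purpose consent: each data use is opted into individually."""
    purposes: dict = field(default_factory=lambda: {
        "account_management": True,   # required to provide the service
        "product_analytics": False,
        "marketing_emails": False,
        "third_party_sharing": False,
    })

    def opt_in(self, purpose: str) -> None:
        self.purposes[purpose] = True

    def opt_out(self, purpose: str) -> None:
        if purpose == "account_management":
            raise ValueError("required to provide the service itself")
        self.purposes[purpose] = False

    def allowed(self, purpose: str) -> bool:
        # Unknown purposes default to denied.
        return self.purposes.get(purpose, False)
```

The design choice here is default-deny: anything the user has not explicitly opted into, including purposes added later, is treated as refused.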
Greater than Experience, a studio led by Nathan Kinch, has been researching how to translate data practices for customers, basing product design on data transparency to create win-win solutions: winning users’ trust while maximizing revenue. Their approach gives organizations a competitive advantage: once consumers learn how their data is actually used and what they gain from sharing it, they are more willing to accept the terms and conditions, allowing organizations to gather more data. They have created a framework called Data Trust by Design, which you can see below:
So how do we go about gradually building trust when the user first encounters a product? While lasting trust is the most precious thing you can create for your business, building it requires examining how trust works in society and translating that into product design. The mechanism of trust will vary by type of business and must be adjusted to how your customers think and what they value most. They might not tell you that during user testing, so it requires further research into your industry to uncover common patterns. Below I present a few design solutions for integrating trust into onboarding experiences, or even earlier stages such as discovering a product.
Reputation Capital (2012) Rachel Botsman
(Evidence-Based Behavior Change In The Real World — Paul Cohen, 2019 Source)
At the recent Interaction ’19 conference, I had the chance to hear Paul M. Cohen of One Medical talk about turning behavioural economics, marketing science and cognitive psychology into lucid, practical lessons for designers. One part of Paul’s talk that helped shape this article was about designing for identifiability: people dramatically change their choices about taking a vaccine when they can more easily identify with the options presented. In that case, using a doctor’s real name led to more activity around that option than using the generic name of a medical team. People trust real people more easily than the names of organizations they have never heard of; it is hard to gauge the reputation of a medical team, as opposed to a single person they might have heard about from a friend. Similarly, when raising funds, people are more likely to donate to a named individual than to a nameless, faceless organization. Continuing that line of thought, Paul has presented more examples of social identifiability in product design on his website. Here are some of them:
(Technology adoption model by Sebastian Manrique)
I would assume that in the near future we will have AI embedded into our personal devices that preselects options for us from a wide range of choices, based on what is beneficial to us and subject to human approval. But since recommendations work differently for each user, depending on their perception of reality and society, an autonomous system might not be fully capable of making these choices. It might know our social circle to some extent, but it will not know our value system or which features of a brand appeal to us. Nor will it know which friends we trust and whose words mean more to us.
I would suggest that successful product adoption requires two systems of trust. The first builds on how trust has evolved in society over the decades and leverages it, for example by notifying the user about the choices of people they know, which increases the perceived reliability of the product and helps with making choices. The second reinvents or re-establishes the meaning and mechanics of trust for the specific product or service. But how can the user actually learn about the product?
The solutions below might not be easily adaptable to current search and discovery systems, as there is no single way to find a product or service; they are meant for the further development of such discovery services. Some might be displayed before the user even opens a product page or an app, and some might be part of the onboarding strategy. The trust roadmap has no definite beginning or end, as it is part of each individual business strategy.
1. Incorporate traditional ways trust works in society into the product
According to my research, it is safe to say that users completely trust our product if they recommend it to their friends. If users are trusting you with their personal information, they are more likely to trust you with their sensitive and financial information. Aimen Awan
(Importance of competence at early stages of trust Aimen Awan)
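As a minimal sketch of the “people you know” signal from this first approach, here is my own illustration (not from any product mentioned in the article) of surfacing which of a user’s contacts already use a product when recommending it:

```python
def social_proof_message(product: str, user_contacts: set,
                         product_users: set, max_names: int = 2) -> str:
    """Build a recommendation line naming mutual users of a product."""
    # Contacts of this user who are also known users of the product.
    mutual = sorted(user_contacts & product_users)
    if not mutual:
        # No social proof available: fall back to a plain prompt.
        return f"Try {product}"
    shown = ", ".join(mutual[:max_names])
    extra = len(mutual) - max_names
    suffix = f" and {extra} more" if extra > 0 else ""
    return f"{shown}{suffix} already use {product}"
```

Naming at most a couple of real contacts mirrors the identifiability finding above: a known name carries more weight than an anonymous count of users.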
2. Ask domain experts for recommendations — an idea for a platform, if you would like to scale the value of the individual and create revenue based on recommendations. These domain experts could be within the circle of people you follow or could be entirely new to you
3. Reinvent what trust means for your product and develop a mechanism to make it work for your service
If you are interested in learning more about my approach to designing an identity for the distributed autonomous economy you can read my other articles titled Designing a decentralized profile dApp and Distributed self-sovereign identity both published by UX Collective.
Don’t hesitate to get in touch via email.
Disclaimer: These are projects I do independently in my free time, out of passion; they are not connected to any organization, and I am not collaborating with the companies mentioned in this article.