At the core of many trades lie codes of conduct that guide workers toward standards largely grounded in respect for human rights. These codes may be understood tacitly and do not necessarily carry contractual force, but when the parties agree that one of these « rules » has been broken, the transgressor may face sanctions on ethical grounds.
One of the most striking examples in the field of ethics is the Hippocratic Oath, which has long helped medicine ensure the proper practice of the profession. There are, however, modern forms of such guidelines, such as the Declaration of Geneva, established in 1948, which sets out how medical practice should respect the principle of consent and the protection of confidential information. Yet today, in 2018, the world is changing in ways that radically alter how we voice our disagreement when a transgression occurs. Moreover, information can leak, and personal data have never been so vulnerable. The World Medical Association has therefore issued updated versions of the Hippocratic Oath over the years; the most recent is the Declaration of Taipei, which concerns research on health databases, big data and biobanks.
If we recognize that this kind of declaration can establish a list of statements requiring scientific research to regulate human experimentation in order to maintain an acceptable framework of practice (including the consent of the participants in the research concerned), then we should also recognize that we now face another form of human experimentation in need of substantial ethical consideration: the use of our personal data circulating within a virtual network.
It seems necessary to consider specific guidelines for this virtual domain, because the question now concerns the whole world. It is no longer a secret that the information we deliberately share on social networks such as Facebook becomes potential data for commercial or even political purposes, as was the case with Cambridge Analytica.
In this precise example, their “analysis” clearly belongs to the field of research. But since the use of personal data was carried out in an intrusive and non-consensual manner, we can compare it with an act that exceeds the ethical rules for the proper use of data belonging to private individuals. It is indeed possible to see a violation of the right of consent over the use of personal data, following the Declaration of Helsinki, adopted in 1964, which states that « it is the duty of physicians who are involved in medical research to protect the life, health, dignity, integrity, right to self-determination, privacy, and confidentiality of personal information of research subjects. »
The purpose of the various declarations issued by the World Medical Association was to enable researchers in the field of medicine to limit the possibilities of abuse in experimentation on human beings. It was therefore a moral means, for its time, of making respect for human dignity a priority over the techno-scientific possibilities that continue to grow today.
The problem we face today is, however, somewhat different. The collection of personal data is generally understood as a hidden trade: a free service in exchange for personal data, something also referred to as the economics of privacy. Yet according to a recent investigation, Facebook reportedly sought to collect medical data in order to associate it with profiles registered on the network. While this is an extreme situation that goes beyond any moral respect for the consensual use of privacy-related data, there is another, less obscure aspect of our world that deserves honest scrutiny as the consumption of these tools becomes more and more common: the Internet of Things (IoT). If such technological equipment is understood to function specifically as a data gatherer, what could happen when the tools collecting personal data serve no medical purpose, yet hold the greatest potential of private information?
From this reality, it would seem that we are dealing with a new need: the right of users to control the personal information they give to any device or network that collects personal data. Let us take a look at the Declaration of Taipei:
« Respecting the dignity, autonomy, privacy and confidentiality of individuals, physicians have specific obligations, both ethical and legal, as stewards protecting information provided by their patients. The rights to autonomy, privacy and confidentiality also entitle individuals to exercise control over the use of their personal data and biological material. »
To honor the liability that arises from their products, every company that wants to study its users' personal data should show the same respect for dignity as medical practitioners, especially because much of the information collected goes beyond what is consensually shared. Ideally, companies that collect personal data would practice moral transparency, allowing their users to control the circulation of that information and to know when it is used by a third party. In Europe, for example, Estonia has already put such a system in place: each individual is notified when their private data are queried by a third party, such as the police or the medical sector. The virtual industry should follow the honest transparency of this system (see SecurityPledge for more information on this type of project).
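As a purely illustrative sketch (not Estonia's actual implementation), such a transparency mechanism can be pictured as an append-only ledger of third-party queries, where every recorded access immediately notifies the data subject; all names and fields below are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable, List


@dataclass
class AccessEvent:
    subject_id: str   # whose data was queried
    requester: str    # e.g. "police", "hospital"
    purpose: str
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


class AccessLedger:
    """Append-only record of third-party data queries.

    Every access is logged and triggers a notification to the data
    subject, so the individual always knows who looked at their data.
    """

    def __init__(self, notify: Callable[[AccessEvent], None]):
        self._events: List[AccessEvent] = []
        self._notify = notify

    def record_access(self, subject_id: str, requester: str, purpose: str) -> AccessEvent:
        event = AccessEvent(subject_id, requester, purpose)
        self._events.append(event)       # the access is never silent
        self._notify(event)              # the data subject is informed immediately
        return event

    def history_for(self, subject_id: str) -> List[AccessEvent]:
        """Let individuals review every query made about them."""
        return [e for e in self._events if e.subject_id == subject_id]


# Usage: the "notification channel" here is just an in-memory inbox.
inbox: List[str] = []
ledger = AccessLedger(notify=lambda e: inbox.append(
    f"{e.requester} accessed your data for: {e.purpose}"))
ledger.record_access("citizen-42", "police", "traffic investigation")
```

The design choice worth noting is that notification happens inside `record_access` itself: in such a system, accessing data without informing its owner is simply not an available operation.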
Let us look at the current case as if a collection of data for a scientific or marketing research project had to be carried out, involving personal data shared on the Internet, voluntarily or not. In order to respect the « rights to autonomy, privacy and confidentiality [that] entitle individuals to exercise control over the use of their personal data », it would be necessary to obtain the consent of the person whose information will be used. Obtaining that consent should pose no problem, as users are generally willing to share their data when they know it could be used to improve their condition, for example: in the case of medical research, only 7% of people would refuse to share their personal data.
However, what makes the need for consent even more pressing today goes beyond medical necessity: personal data can also be collected via sex toys. A list of criteria respecting our legitimate right of control over our personal information, as stated in the World Medical Association declarations, should therefore become an unavoidable ethical priority in our world today. The certification of such principles is already in practice, for example with the Open Internet of Things Certification Mark.
When we consider the uncertain future of personal data circulating on the Internet, the objective is not to stop its collection, but to put in place a regulation under which a research project is validated only if the participants consciously consented to the use of their data. Just as with the ethical minimum of any scientific research, one should not be allowed to validate an experiment that used personal data without the official and informed consent of its participants. Any study that collects personal data beyond the informed consent of the owners of those data should be considered illegal or invalid.
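A minimal sketch of such a validation rule, assuming hypothetical record fields (`informed`, meaning the participant understood how their data would be used, and `consented`), could look like this; an actual regulation would of course involve far more than a boolean check:

```python
def validate_study(participants):
    """Validate a study only if every data subject gave explicit,
    informed consent. Raises ValueError otherwise, listing the
    participants whose consent is missing."""
    missing = [p["id"] for p in participants
               if not (p.get("informed") and p.get("consented"))]
    if missing:
        raise ValueError(f"Study invalid: no informed consent from {missing}")
    return True
```

The point of the sketch is the default: absence of a recorded, informed consent invalidates the whole study, rather than merely excluding one record.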
Cases where the use of personal data should be reconsidered:
- Facebook’s controversial collection of data
- Cambridge Analytica on Brexit
- Ashley Madison data dump
- Facebook’s use of advertisements
- Facebook’s non-consensual social experiment
- Big Data and politics
- Artificial intelligence and biased decisions
- Personal identity
- Services’ price based on personal data
A large amount of personal data is under our responsibility. Raising common awareness of this should therefore be the priority after all.