In a previous article, we outlined investment opportunities in companies utilizing decentralized software solutions such as blockchains.
While the number of projects in the space is growing steadily, at present these networks continue to fall short of the goal stated by Bitcoin’s pseudonymous inventor: to serve as “peer-to-peer cash”.
A much greater challenge still is the direct transfer of non-bearer financial products, as these frequently require the reliable identification and authentication of the entities involved in the transaction - not to mention the verification of these instruments on a peer-to-peer network utility.
The following will review the tenets essential for sustainable solutions aiming to enable peer-to-peer value transfer while addressing identity through technology.
Throughout this assessment, identity - and its digital correlate - is reviewed from a first-principles, science-based perspective, necessarily ignoring philosophical and ethical considerations, and further omitting legal constructs that have culminated in some jurisdictions in concepts such as ‘corporate personhood’.
Surprisingly, characterizations of identity intended to lend themselves to technical analysis tend to take a purely abstract form, ignoring aspects of human intentionality entirely.
The definition provided by the National Institute of Standards and Technology offers an example echoed, with slight variations, by many other linguistic constructs introduced by technologists: “Identity is a set of attributes that uniquely describe a person within a given context.”
As with most other failed linguistic efforts at clarifying the tenets of identity by citing “sets of attributes” - or, more often, information or data - this description lacks any reference to human-directed activity rising to a level of legal relevance. Instead, the definition merely refers to a collection of data more appropriately summed up as a ‘profile’.
The latter is frequently used to target individuals with information intended to elicit a particular behavior such as the purchase of a product or service.
'Digital Identity' is a metaphor typically ascribed to a set of data, and/or the manipulation thereof. The correct technical term for the latter is 'profile'.
Systems and companies assembling profiles are not concerned with identity or its management, but rather with the manipulation of human activity for profit ("profiling").
Especially when combined with behavioral data, this type of targeting breaches the threshold of human agency, crossing into the sphere of behavior directed, via deceptive manipulation, toward acting in a specific manner.
Headline-making examples such as Cambridge Analytica’s mass manipulation of voters using profiles harvested from Facebook’s platform should be considered merely the most egregious instances of this form of social engineering.
The data of billions of internet users are assembled into individual profiles, frequently optimized to consume advertising copy. While this type of manipulation prompted futurist Hazel Henderson to coin the term “attention economy” in 1999 (later popularized by Thomas Davenport in a book of the same title), it has thus far not risen to a level of awareness prompting regulators to address the externalities of these social-engineering-as-a-service enterprises. Even multi-billion-dollar fines (the FTC imposed a $5 billion fine on Facebook in 2019) are just a cost of doing business for companies peddling human attention, with no other business model in sight.
Somewhat ironically, it is the result of these manipulations, in the form of human-directed activity, which rises to a provable and legally relevant outcome - i.e., the purchase of goods or services. As such, the technologies that can most accurately be described as "identity management" to date are those which harvest human attention for profit.
However, as these platforms are designed to increase the shareholder value of for-profit companies posing as information providers or search engines, management functions rest almost exclusively with the systems’ engineers - a function which ultimately classifies users as products to be sold to paying clients, thereby denying agency to the individual.
Identity can be observed as human-directed activity - i.e., "attention" - in a legal sense, with agency. The latter is frequently absent in users of technology designed to direct human attention - most prevalent today in systems that blur the lines between effective advertising and social engineering.
Websites such as CryptoCompare.com document that blockchain-native assets such as bitcoin (Bitcoin’s mining reward) and ether (Ethereum’s fee currency) are exchanged daily for the equivalent of billions of US dollars.
However, a closer look at the site’s statistics shows that most of these trades are taking place on marketplaces with centralized order books operated via databases. The operators of these trading venues take control of their users’ assets, storing the associated access data in databases under their control.
And while the custodial nature of these exchanges all but invalidates the digital bearer functions of blockchain-based assets, operators further submit to supervisory burdens imposed by the many nation-states in which these marketplaces want to acquire customers.
As a result, even exchanges that do not take control of user assets - often referred to as ‘decentralized exchanges’ - have bent to the pressure of governmental agencies and well-meaning but misguided legal professionals intent on imposing regulatory requirements such as anti-money-laundering and know-your-customer regulations on these platforms.
Consequently, the exchange of assets does not happen peer-to-peer but is susceptible to censorship by a wide variety of state actors and their agencies, as well as arbitrary policies implemented by the solution providers.
Self-executing software code - often referred to as “smart contracts” - is the pivotal innovation of blockchains, in principle enabling autonomous organizations and digital bearer instruments.
While the former may ultimately prove to be the most impactful creation, the latter continues to pose a technical and intellectual challenge for many companies hoping to provide disruptive value transfer solutions.
Metaphorically, a collection of smart contracts can be understood as a digital vending machine, a concept first introduced in an article by Nick Szabo in 1997.
A user of this automaton may send an offer - such as a digital token - into the machine, which in return dispenses the desired product. The latter will usually have been deposited by the creator of the asset or the owner of the requested right.
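The vending-machine pattern can be sketched in a few lines of code. The following is a minimal, standard-library Python illustration of the concept - not an actual on-chain contract; the class and method names are hypothetical:

```python
class VendingMachine:
    """Toy model of Szabo's vending-machine metaphor for smart contracts.

    The 'machine' holds deposited products (assets or rights) and
    dispenses one automatically when a sufficient offer arrives.
    Once the rules are set, no operator mediates the exchange.
    """

    def __init__(self, price: int):
        self.price = price    # tokens required per product
        self.inventory = []   # products deposited by their creator/owner
        self.till = 0         # tokens collected from buyers

    def deposit(self, product: str) -> None:
        """The asset creator (or rights owner) stocks the machine."""
        self.inventory.append(product)

    def purchase(self, offered_tokens: int) -> str:
        """A buyer sends tokens; the machine dispenses or refuses."""
        if offered_tokens < self.price:
            raise ValueError("offer below price; nothing dispensed")
        if not self.inventory:
            raise ValueError("sold out")
        self.till += self.price
        return self.inventory.pop(0)


machine = VendingMachine(price=5)
machine.deposit("digital-ticket-001")
print(machine.purchase(5))  # dispenses "digital-ticket-001"
```

The point of the sketch is that the exchange logic is enforced by the code itself: once the owner has stocked the machine, any sufficient offer triggers delivery without further involvement of either party.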
This often neglected distinction is necessary as the enforcement of rights frequently requires the identification of the previous and/or new rights owner, while the direct transfer of blockchain-native bearer instruments - such as Bitcoin’s mining reward bitcoin - requires no disclosure of the transacting parties.
In a narrow sense, only these digitally native implementations can be considered digital assets, while external transfer requirements point to the derivative nature of the exchanged value, which takes on the legal character of a rights transfer. These conditions are regularly expressed in the form of Know-Your-Customer (KYC) rules.
While most of the providers of KYC solutions proclaim to provide identity solutions, these implementations are not concerned with human-directed activity but limited to the identification of a person, most often expressed through government-issued credentials (“fiat persona”).
This credentialing requirement voids the permissionless nature of peer-to-peer transfer, and regularly introduces new middlemen which in addition to maintaining databases of user records, may assert further custodial functions.
As with centralized exchanges dealing in cryptocurrencies, platforms facilitating the trading of products not native to blockchains reintroduce intermediaries into decentralized networks. Consequently, platforms providing these functions fail to deliver peer-to-peer value transfer, invalidating the claims of start-ups proclaiming to provide exchange functions for ‘security tokens’.
Decentralized networks carry the promise of un-intermediated value transfer from person to person through provable, public transactions.
However, current implementations of “identity technology” regularly negate the peer-to-peer, censorship-resistant exchange of value. This is particularly the case for platforms employing so-called know-your-customer "identity" solutions.
Approaches which introduce one or more custodians of participants’ assets or data not only invalidate the innovations introduced by decentralized systems but may further fail to provide the data-privacy rights afforded to individuals under new regulations - most significantly the right to "be forgotten".*
In the context of technology-facilitated commercial activity, however, the presence of agency is a prerequisite for the enforceable transfer of assets and rights from one entity to another.
While the latter may describe a legal construct or a person, agency is ultimately a uniquely human quality, indispensable to any definition hoping to address direct value transfer through technology.
As such, identity - in the context of technology hoping to enable peer-to-peer value transfer - is the ability to convey agency.
For technology to enable the direct transfer of rights and non-bearer assets, human agency must be conveyed - including the authorization, authentication, and identification of the buyer and/or seller, and in some instances “proof of life” - without a third party.
Viable approaches will enable users to convey whatever manner of agency is sufficient to deposit value into a smart-contract-based ‘digital vending machine’, where other users may retrieve it.
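Conveying agency without a third party can be illustrated with a cryptographic authorization that the contract code verifies itself. The sketch below is a simplifying assumption: it uses an HMAC over a secret key as a stand-in for the public-key signatures a real blockchain would use, so the example stays standard-library only; the function names are hypothetical.

```python
import hashlib
import hmac
import secrets

# A party conveys agency by cryptographically authorizing an action;
# the 'contract' verifies that authorization without any custodian.
# NOTE: HMAC (symmetric) stands in for a digital signature here purely
# for illustration; real systems use asymmetric signatures.

def sign(secret_key: bytes, message: bytes) -> bytes:
    """The acting party authorizes a specific transfer by signing it."""
    return hmac.new(secret_key, message, hashlib.sha256).digest()

def verify(secret_key: bytes, message: bytes, signature: bytes) -> bool:
    """The contract checks the authorization - no middleman involved."""
    return hmac.compare_digest(sign(secret_key, message), signature)

buyer_key = secrets.token_bytes(32)     # held only by the buyer
order = b"deposit 5 tokens; retrieve product"

authorization = sign(buyer_key, order)  # agency conveyed by the buyer
assert verify(buyer_key, order, authorization)                  # accepted
assert not verify(buyer_key, b"tampered order", authorization)  # rejected
```

The design point is that authorization, authentication, and identification all reduce to verifying who signed what: any attempt to alter the order invalidates the authorization, and no intermediary database of user credentials is required.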
*Note: The “Right to Erasure” or deletion - more famously, the “Right to be Forgotten” - is not a new right. Its origins stretch back to the pre-GDPR era, when Mario Costeja Gonzalez sued Google to suppress search results about him that described his earlier financial troubles. According to Mr. Costeja, the links were irrelevant and damaging to his reputation. The Court of Justice of the European Union (“CJEU”) held that Google was generally obligated to remove links that were inadequate, irrelevant, or excessive.