
Measuring Decentralization

by Lucius Meredith, August 7th, 2023

Too Long; Didn't Read

We applaud Congress for challenging Chair Gensler’s proposition that everything except Bitcoin is a security. That peculiar, and technologically illiterate, ideology is a special brand of pernicious nonsense. It not only harms consumers; it is already creating a severe brain drain of experts, who are leaving the US for Europe, the Middle East, and Asia. We want to help make their challenge robust and future-proof.

First and foremost, we would like to applaud Congressional thought leaders for taking the bull by the horns. The technology underlying decentralized digital assets poses significant challenges to the current legal frameworks and is difficult to get to grips with. Striking a balance between protecting the public – investors and customers – and encouraging innovation is not an easy task. Congress should be commended for this good faith effort to strike that balance.


That said, Congress has only one sitting member, Rep. Bill Foster (D-IL), fully proficient in the technicalities of the blockchain. As the old saying goes, it is just as important to do the thing right as to do the right thing. And to achieve its stated goals, the proposed legislation needs some good, hard tweaking.


So let’s do our best, in plain English, to enable the Congress to decode the mysteries and reengineer the proposed legislation. Like the legislators who have proposed the ambitious blockchain legislation, we recognize the enormous potential of decentralized digital assets, and their underlying technology, the blockchain, to change the commercial and social landscape in very beneficial ways.


The current dominance of “Web 2.0” architecture came about because of a simple trade-off made by the average consumer.  Let’s call her “Grandma.”


Grandma wants the ease, convenience, and utility of having her data, her content (everything from family photos and favorite songs to financial and healthcare records, and more) digitally available, anywhere, 24/7. But Grandma does not want to, nor is she equipped to, be responsible for the care and feeding of a basement full of servers to keep all this content available and completely private, or at least under her primary control.


So Grandma outsources that responsibility.  She lets her healthcare providers, Google, Dropbox, Amazon, and other companies store and handle her personal data for a nominal fee… or for the privilege of serving ads to her.


This choice comes at a cost. As any motorcyclist will tell you, it’s not a question of whether you will go down, it is a question of when. Thus it is prudent to retain as much control of your vehicle as practical. Likewise, it is not a question of whether these digital service providers will be hacked and Grandma’s, or your, personally identifiable and very sensitive data will fall into the wrong hands. It is a question of when.


The headlines noting these sorts of breaches are so common that the average consumer of digital services has become inured to them… until someone steals their identity, or charges a first-class ticket to Cancun to their credit card. Perhaps more consequentially, the economic model underlying Web 2.0 companies has had enormous and sometimes adverse social consequences.


Critical to the bottom lines of data service providers is the ability to sell off “anonymized” versions of aggregated personal data. To paraphrase The Princess Bride, “I do not think anonymous means what you seem to think it means.”


The well-documented antics of Cambridge Analytica, exploiting the data about us that Facebook made available to it, and, more recently, Twitter’s own missteps are just the tip of the iceberg. These abuses have caused concern among users and at the highest levels of government.


Perhaps even more to the point, the data providers, i.e. the consumer base, have never been offered a share of the revenue from the sale or use of their data. What if Big Tech gave consumers an option to participate in that revenue stream? With the partial exception of monetization of content channels, like YouTube, this is largely untapped economic potential.


Tapping into it requires a change of mindset, as well as a change of technology base. The omnibus legislation under consideration can be adapted to enable this change. Giving users a share of the revenue stream by affording intellectual property rights akin to copyright – ownership in their own personal data – could be very, very popular with voters while providing constructive incentives for the development of the technology.


As the technology for decentralized digital asset management services scales up, Congress can give consumers new options. Instead of trusting our personal data to a third party, like Google, we can use a decentralized infrastructure to manage it… without having to have a basement full of servers and other hardware.


In today’s decentralized digital asset technology base that proposition is largely limited to financial data. Yet soon, very soon, as the technology scales, it will increasingly apply to all data.


Grandma will be able to securely and efficiently store her private healthcare records on a public network that is not run or controlled by her doctor’s practice, hospital, or health insurance company. Rather, many parties, so-called miners or validators, provide the network with computational resources and make money doing so. And Grandma’s data will be protected by her private key(s), and considerably stronger cryptographic guarantees.


It will be solely under her or her designated agents’ control. In fact, her data will be opaque to the network service providers while her basement remains server-free – unless Grandma decides she wants to participate in the new economic engine afforded by the decentralized network and maintain her own server farm. In either case, she will enjoy the option of keeping her secrets to herself, or anonymizing them and “renting them” to those who wish to commercially exploit them.


This reconfiguration of who runs and controls the network will not change things just for Grandma. It will change things for every single digital provider, from Facebook, Twitter, and Google, to Aetna, Blue Cross Blue Shield, and Regents. It gives consumers more choice and more control. And it is a win-win scenario, also providing Big Data tech companies whole new economic opportunities to exploit.


This is one of the reasons why we applaud Congress for challenging Chair Gensler’s proposition that everything except Bitcoin is a security. That peculiar, and technologically illiterate, ideology is a special brand of pernicious nonsense. It not only harms consumers; it is already creating a severe brain drain of experts, who are leaving the US for Europe, the Middle East, and Asia.


In plain English, the SEC is driving billions, maybe trillions, of dollars of wealth offshore. Chair Gensler, whether negligently or cynically, is exporting tens or hundreds of thousands of high-paying jobs, the opposite of the Biden Administration’s stated policy. Bad regulatory policy is destroying vast economic opportunity while eroding American technological superiority.


To preserve wealth, opportunity, great jobs, and technological superiority based on emerging technologies such as blockchain, Congress must instead fit the legal mechanism to the mechanics of this technology. In this article we focus on some of the fundamental issues with the proposed definition of decentralization and make some modest proposals as to how to adapt it to fully achieve its potential.


Specifically, we will explain why using the draft legislation’s proposed asset distribution rule as one of the criteria for determining decentralization is neither practical nor likely to achieve the legislators’ stated goals. Additionally, we will discuss some of the challenges of using the proposed software governance models as a criterion for determining decentralization. The initial legislative criteria suggest a fundamental misunderstanding of how software works.


Readily fixed!


We also will urge the inclusion of a realistic safe harbor clause, an easily formulated provision currently absent from the proposed legislation. Many blockchain companies have worked in good faith while enduring regulatory ambiguity, only to be put at risk by an overreaching SEC. Good faith efforts to put the US at the forefront of this technological development should be rewarded, not punished.


Rather, to reestablish a reasonable climate for entrepreneurs and developers – to stem the brain drain, which is fast becoming a flood of Biblical proportions – we need to provide reasonable criteria under which various commercial enterprises can be on-shored rather than deported.


Finally, in this same spirit we wish to emphasize how important it is that entrepreneurs not just feel but actually be included in the political discourse. Currently, many, perhaps most, of our most valuable innovators are under the distinct impression that policy setting is a  politicians-and-bureaucrats-only discussion. That has induced  many key players to vote with their feet and take their business to Dubai, or Zug, or Berlin, or Singapore.


Demonstrating that the policy  process is not only open to their input, but that legislators are actively seeking the intelligence and wisdom of people who have actually done the work and reaped the rewards of experience – demonstrating this will go a long way to bringing the expatriated value back to US soil.

Proposed Definition of Decentralization

The current draft legislation attempts to provide practical observables for determining whether a decentralized digital asset is, in fact, decentralized. One of the observables proffered is asset distribution. The draft legislation proposes a 20% rule, arguing that to be considered decentralized a digital asset can’t be concentrated in the hands of a few. Instead, no one holder of the asset can have more than 20% of it.


This sounds great in theory.  It does not, and cannot, work in practice.


Challenges to the 20% Rule

Addresses ≠ Identity

Addresses on a blockchain are not the same as identity. This is true for Bitcoin. It is true for Ethereum. In fact, it is true for virtually every major blockchain. But, since this is the case, it is impossible to discern whether any one person controls more than 20% of the total distribution of assets. Determining asset distribution is a hard problem anyway, for which financial forensics experts are extremely well paid, because people often have multiple economic agents, from shell corporations to holding companies, managing assets on their behalf.
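

To make the problem concrete, here is a minimal sketch in Python (the addresses and balances are entirely hypothetical) of what an address-level 20% check can and cannot see. A single holder who splits one position across several addresses sails straight past it:

```python
from collections import Counter

def max_share_by_address(balances: dict[str, float]) -> float:
    """Largest share of total supply held by any single *address*."""
    total = sum(balances.values())
    return max(balances.values()) / total

# Hypothetical ledger: addr_1..addr_5 are all controlled by one person,
# but nothing on-chain reveals that.
ledger = {
    "addr_1": 150.0, "addr_2": 140.0, "addr_3": 130.0,
    "addr_4": 120.0, "addr_5": 110.0,   # one holder: 65% of supply
    "addr_6": 200.0, "addr_7": 150.0,   # two unrelated holders
}

print(max_share_by_address(ledger))     # 0.2 -> "passes" the 20% rule

# What the rule intends to measure needs an identity mapping that
# neither the chain nor the regulator possesses (hypothetical here):
true_owners = {"addr_1": "whale", "addr_2": "whale", "addr_3": "whale",
               "addr_4": "whale", "addr_5": "whale",
               "addr_6": "alice", "addr_7": "bob"}
by_owner = Counter()
for addr, balance in ledger.items():
    by_owner[true_owners[addr]] += balance
print(max(by_owner.values()) / sum(ledger.values()))  # 0.65 -> violates it
```

The first check, the only one observable on-chain, is satisfied; the second, which is what the rule actually intends, is violated. Nothing on the ledger distinguishes the two cases.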


But wait!  It gets worse!  As zero-knowledge technology matures, enclosing private transactional data in virtually impenetrable security, knowing who controls the assets will be virtually impossible.


Beyond the discussion of the impracticality of such an asset distribution rule, there is the question of values. The lead author of this article is decidedly left-leaning. Yet even he is queasy about imposing an asset distribution policy. Is it really a free market if the government is determining asset distribution?


Does centralized state involvement in asset distribution sound like capitalism? This superficially innocent rule is a great leap backward toward the central command and control economics that failed, spectacularly and repeatedly, in the 20th century and continues to fail today.


One would think that Republicans, at least, would know better.


Trades that Create Transitory Violations


Consider, for example, a situation in which a cohort of digital asset holders are all seeking to exit their positions, each for their own individual reasons. It might come about that a single buyer would be willing to buy them all out, not for reasons of control but for purely financial motivations.


If the buyer’s original position combined with the added ownership purchased from the cohort comes to more than 20%, suddenly the property he is buying magically converts from property into a security. But suppose, like the banking industry in 2008, it is the US government stepping in to support an asset “too big to fail” – does that act convert the asset to a security?


What if the action was only a temporary measure and the position quickly liquidated? Does its status revert to a non-security? Should it? What about transitory trades? What if an asset holder only has 20% of the distribution for a few seconds before they sell off some of their position? Does that cause the asset to convert to a security? What about a few minutes? A few days? Where is the dividing line? While the government is in the business of drawing such lines, in this case the lines are purely arbitrary… and capricious. Not good policy.


More importantly, who really wants the government to be wading into these market dynamics? Mao (the Great Leap Forward, which produced the most lethal famine in history) and Stalin (the Holodomor) killed countless millions with centralized economic policies. Who wants to take such a great leap backward?

Whales on Common Networks

There are certainly whale cohorts on Bitcoin who control more than 20% of that network. As previously mentioned, it is virtually impossible to know who controls these addresses. Moreover, if they move as a block, does it matter if they are different people?


Finally, it is possible that whoever controls the wallets of Bitcoin’s mysterious creator, Satoshi Nakamoto, controls roughly 63% of that blockchain. Thus, by the definitions in the proposed legislation, Bitcoin, the one digital asset Gary Gensler is certain is not a security, would be a security.

We are very much in favor of providing Chair Gensler a much needed sanity check in the form of a sensible conceptual framework. A definition that adds Bitcoin to the list of victims of Chair Gensler’s jihad is not such a framework.

Think Computing Resource as Commodity, not Percentage

To achieve the legislation’s very good goals, legislators should be thinking commodity, not percentage. In the current and contemplated markets digital assets are not like scheduled or toxic substances, the distribution of which needs to be carefully controlled to protect the public. Nor are they akin to commodities that might come under price control, like rent or fuel, the distribution of which is also often managed for the public good.


In most cases digital assets are really just proxies for computing resources.  There is no shortage of compute power, what with smart phones in everyone’s pockets and laptops on everyone’s desk and data centers the size of cities encircling the globe. In such circumstances, where the commodity is readily available, there is no justification for rationing. And as such, asset distribution is simply not a proper signal of decentralization.


Instead, the compute resources themselves that make up the network serving the digital assets are a somewhat better signal, though far from perfect, as we have argued in another article. The salient observable is: who controls the miners and validators that make the network possible? If the control of these programs and the machines they run on is concentrated in the hands of a single entity, then the network is not decentralized.


For example, if Google were in control of all of the validators on a fork of the Ethereum network, and all of the servers running those programs were running on Google Cloud Platform, then that fork of Ethereum simply cannot be called decentralized. If, on the other hand, we found a healthy diversity of validators, run by a relatively large community, some of which were run on machines provisioned on cloud providers and some of which were run on machines maintained and managed by private individuals and companies, then it would be correct to call that fork decentralized.


In a similar manner, people talk about the concentration of hashing power of the Bitcoin network. If the miners and the servers that run them come under the control of a relatively small community, then we view that network as centralized. This was a topic of much discussion when Chinese mining companies controlled some 65% of the Bitcoin hashrate.
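

One practical way to put a number on this observable is the so-called Nakamoto coefficient: the smallest number of distinct operators whose combined hash rate (or validating stake) exceeds the threshold needed to control the network. A minimal sketch, with made-up operator shares purely for illustration:

```python
def nakamoto_coefficient(shares: dict[str, float], threshold: float = 0.5) -> int:
    """Smallest number of distinct operators whose combined share of
    hash rate (or stake) exceeds the given control threshold."""
    total = sum(shares.values())
    running, count = 0.0, 0
    for share in sorted(shares.values(), reverse=True):
        running += share
        count += 1
        if running / total > threshold:
            break
    return count

# Illustrative, made-up shares of a network's mining/validating power.
hashrate = {"pool_a": 0.28, "pool_b": 0.22, "pool_c": 0.21,
            "pool_d": 0.17, "pool_e": 0.12}
print(nakamoto_coefficient(hashrate))  # 3 operators control a majority
```

The smaller the coefficient, the more centralized the network; a coefficient of one means a single operator can halt or rewrite it. The same calculation applies whether the shares are proof-of-work hash rate or proof-of-stake validating power.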


What muddies the water for a measure like this is that digital assets can be made platform or network independent. Many blockchain protocols, such as RChain, have shown that assets originally developed on the Ethereum network can successfully be migrated to a different layer-one platform. Further, this is, in some sense, the essence of both DeFi and protocol interoperability efforts.


They provide mechanisms to liberate digital assets from being tied to a given network. However, when there are limited or no paths for assets to migrate from one network to another, and the network's actual compute resources are controlled by a small community, we can safely call those assets centralized.


Effective policy will need to focus on this as a practical observable to measure decentralization of a digital asset. It’s what the technorati watch, and for good reason.

Challenges to Code Control


Complementary to the physical infrastructure necessary to run a network that serves a digital asset is the software that runs on the network, providing the logic for how the digital asset works. Again, it is a natural inclination to look to how that software is controlled as an observable for the decentralization of the network. Unfortunately, code, even open source code, runs at odd angles to this inclination.

Software Evolves

We are at the very beginning of this technological revolution. One thing that history teaches us is that, at this stage, more than 90% of the code deployed is probably wrong. This is less a matter of the many latent bugs hiding in the code, which will inevitably be discovered, than of not-yet-contemplated design and architectural improvements that we can be confident will be discovered. Consider the bug fixes regularly issued by the venerable Microsoft and Apple, as well as by the Linux communities!


As such, we cannot lock down the code underlying a digital asset. We must give it room to be improved, just as we need to give products like the iPhone or the Tesla room to be improved. Technical stagnation is death, especially at the beginning of a technological paradigm. The real Faustian bargain was not an exchange of Faust’s soul for knowledge. Faust would forfeit his soul to Mephistopheles only if at some moment he said, “Verweile doch, du bist so schön” – “But stay, thou art so fair!”


So, who can make these improvements? How are modifications accepted and deployed to the network? What is the process by which independent miners and validators adopt these changes? More to the point here, can we make an observable for decentralization from the size of the community who can modify the code, or the process by which a proposed change is accepted?


Software is Brittle


As all developers know, software is brittle. Changing a single character, say a plus sign to a minus sign, in a single line of code can result in dramatically different behavior of the software. Such a change could be hiding in otherwise innocuous changes in a pull request to an open source repo. This makes any one contributor potentially responsible for the behavior of the entire network.

Modularity as Antifragility

One defense against this brittleness is a trick that software developers have learned from Nature: modularity. Much like hardware, software that is assembled from relatively isolated, and somewhat redundant, components is less brittle (antifragile). The isolation of components tends to restrict a failure mode to a smaller subset of components.


Obviously, if the entire network loses its power supply, all the software components fail together. However, a problem in the logic of one component is less likely to be a problem in another, especially if they share little, if any, of the same code. Further, if components are made redundant, then one can fail while another takes its place. This means that an error, or even a number of errors can manifest and the system can still limp along.


Thus, while it might be somewhat surprising to folks who are not software developers, a precursor to decentralization of a digital asset is modularity of the software architecture underlying it. In particular, modularity not only makes the running system antifragile, it allows different communities to work independently on relatively isolated components.


Let’s consider an example. A common design pattern for a smart contracting blockchain architecture consists of four components.


  • Node discovery and communications layer (such as Kademlia)
  • Storage mechanism (such as a key-value store)
  • Compute mechanism (such as the EVM)
  • Consensus mechanism (such as Casper, or proof-of-work)


Each one of these is relatively isolated in function. For example, one can, in principle and practice, swap out one implementation of a key-value store (such as LMDB) for another (say Redis) – or even one type of store (say key-value) for another (say relational) – without noticeable difference in the behavior of the network. Likewise, various blockchains, such as RChain, are architected with the explicit design goal that the consensus mechanism can be swapped out. In fact, this ability to swap out components is the definition of modularity.
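

To make swappability concrete, here is a minimal sketch in Python (the interface and class names are ours, purely illustrative) of a storage component hidden behind a single contract, with a key-value implementation and a relational one interchangeable beneath it:

```python
from typing import Protocol
import sqlite3

class Storage(Protocol):
    """The only contract the rest of the node depends on."""
    def put(self, key: str, value: str) -> None: ...
    def get(self, key: str) -> str | None: ...

class InMemoryStore:
    """Trivial key-value implementation (stand-in for LMDB, Redis, ...)."""
    def __init__(self) -> None:
        self._data: dict[str, str] = {}
    def put(self, key: str, value: str) -> None:
        self._data[key] = value
    def get(self, key: str) -> str | None:
        return self._data.get(key)

class SqliteStore:
    """A relational implementation behind the very same interface."""
    def __init__(self) -> None:
        self._db = sqlite3.connect(":memory:")
        self._db.execute("CREATE TABLE kv (k TEXT PRIMARY KEY, v TEXT)")
    def put(self, key: str, value: str) -> None:
        self._db.execute("INSERT OR REPLACE INTO kv VALUES (?, ?)", (key, value))
    def get(self, key: str) -> str | None:
        row = self._db.execute("SELECT v FROM kv WHERE k = ?", (key,)).fetchone()
        return row[0] if row else None

def record_block(store: Storage, height: int, block_hash: str) -> None:
    # Node logic never knows (or cares) which supplier's storage it is using.
    store.put(f"block:{height}", block_hash)

for store in (InMemoryStore(), SqliteStore()):
    record_block(store, 42, "0xabc123")
    assert store.get("block:42") == "0xabc123"
```

The rest of the node only ever sees the Storage contract, which is exactly what lets one supplier’s component be replaced by another’s without touching the consensus, compute, or networking layers.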


Modularity like this limits exposure to dependency on any single component, and on the community that produces it. In the same way that an automotive manufacturer prefers to have multiple tire suppliers, so as not to be beholden to a single vendor, modular codebases can source the components of the design from different suppliers, as in the example of swapping the storage component.


Of course, much of this is simply common sense, with which we agree. We feel that the proposed legislation should apply a healthy dose of common sense and include measures of modularity of the software underlying a digital asset as part of the criteria for decentralization.

Community as Antifragility


Another critical defense against the inherent brittleness of software is community. No single proposed change to the code goes without peer review. Linus’s law: “Given enough eyeballs, all bugs are shallow.” This creates an entire complex of tensions: peer review slows down development and response to critical errors; a community of peers is often small; too many cooks spoil the soup.


A recent development in open source projects generally, and in blockchain projects specifically, is a community improvement process. It’s a code governance process by which a proposal for improvement to the codebase can be submitted; the pros and cons deliberated; and a decision reached as to whether the improvement should be developed, by whom, and by when.


Such a process can be subscribed to by a community of relatively decent size. Issues of credentials of participants come out in the wash of public debate. On the Internet no one cares if your dog found the bug fix. (Twitter will go nuts, but that’s an entirely different matter.)


A measure of maturity and decentralization for the software underlying a digital asset is whether there is a well-established community improvement process both for long-range changes to the codebase and for immediate, mission-critical bugs. Additionally, the size of the community that participates in these processes is a measure of decentralization.

Forking and licensing

One of the most important developments in the management of code, and open source code in particular, is the practice of forking. Essentially, forking is making a copy of the codebase at a particular state in its development and making modifications from that state. When a particular developer community feels that the evolution of the code would be better served by a different approach, they make a copy and pursue that development arc independently of the evolution of the original codebase.


It’s akin to the many-worlds hypothesis in quantum mechanics, except that nothing prevents forks from merging in the future. So, in this sense it might be more like honey bee hives when a new queen emerges and takes about 10,000 of the hive’s population with her to found a new hive. Genetically, bee hive “forks” can merge later during the so-called flight of the queens.


Analogies from the physical world aside, the importance of forking for decentralization is paramount. The fact that open source projects can be forked means that code is never under the central control of any single community, unless it is encumbered by licensing. Thus, one of the most important measures of decentralization is the licensing of the codebase.

Safe Harbor

As we mentioned in the introduction, many companies have worked in good faith to create US businesses at the forefront of this technology. LBRY is one. RChain is another. Yet these companies, among many others, have suffered from regulatory ambiguity, arbitrarily ignorant enforcement, and regulators’ misunderstanding of the technology.


We believe that these sorts of companies and their entrepreneurs and engineers should be treated as civic leaders, not as predators. Specifically, it is a practical imperative to offer a safe harbor clause giving companies time to achieve the decentralization of their networks, so that their work product in development is not regulated as securities. Without a safe harbor no ship can be launched and digital innovation will grind to a complete halt.


Three years to decentralization is simply too short for new tech

Gillibrand and Lummis’s prior draft legislation indeed offered a safe harbor clause. However, it provided only a three-year window to achieve decentralization. This cannot and does not work for companies pursuing more substantial engineering work and protocol development.


While Ethereum was developed in less than three years, Bitcoin took longer. Both are crippled by insurmountable design flaws that keep them from scaling. Moreover, the Bitcoin protocol has well documented energy inefficiencies that have known environmental impacts. Companies that pursue the rational course of providing scalable and sustainable designs need more time to bring the technology to fruition and decentralize it.


Based on extensive field experience, we recommend 10 years as a reasonable time frame to achieve decentralization. Of course, during that time it is reasonable to require that key milestones be achieved to maintain safe harbor status. Many of these milestones can be drawn from the points discussed above, such as adopting appropriate licensing, developing a community improvement process, etc.


Importance of Including Feedback from Experienced Entrepreneurs and Developers

We salute Congress for inviting industry representatives, such as the CEO of Avalanche, to testify and give their perspective. This is necessary but insufficient to stem the flow of talent and value from the US to foreign jurisdictions. There must be more robust give and take.


Feedback recently invited as to the practicalities of the proposed legislation must be taken seriously. A robust forum must be established for ongoing public-private collaboration.

Conclusions and Future Work

In the standard article format we would draw some conclusions and then point to possible future work. However, we want to leave legislators with a crisp statement of our feedback on the proposed legislation as the last thing they read. Therefore, before we provide that statement we want to foreshadow the arguments for a future  article regarding a fundamental issue in the design of digital assets.

Digital assets should be time bound

The issue is that digital assets predominantly represent computing resources. Specifically, they represent processor cycles and storage capacity. However, those resources cannot, in practice, be delivered just in time. That’s why the cloud providers adopt the pricing models they do. Having processor cycles and storage capacity at the ready costs electricity (as well as rent for warehousing the computing devices, etc.) even when there are no network transactions that might offset the utility costs.


Think about all the addresses on the Bitcoin and Ethereum networks that are in a HODLing pattern. The digital assets they contain are not generating transaction fees. Yet, the network resources needed to maintain those assets are still being consumed. Eventually, the value of any compensation originally collected for distributing those assets will be overrun by the electricity and other costs to maintain the network.


As a result, in the limit, there comes a time when any upfront payment for a digital asset does not offset the cost of having the processor cycles and storage capacity at the ready. As such, digital assets are in fact time-bound in their value. Just like an orange or a bushel of wheat, digital assets go stale.
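

A back-of-the-envelope sketch (all numbers hypothetical) shows the break-even point at which the upkeep of an idle, HODLed asset exhausts whatever was collected for it up front:

```python
def months_until_stale(upfront_payment: float,
                       monthly_upkeep: float,
                       monthly_fees_earned: float = 0.0) -> float:
    """Months until the cost of keeping compute and storage at the ready
    exceeds what was collected up front for the asset."""
    net_burn = monthly_upkeep - monthly_fees_earned
    if net_burn <= 0:
        return float("inf")  # fees cover upkeep: the asset never goes stale
    return upfront_payment / net_burn

# Hypothetical figures, purely for illustration.
print(months_until_stale(upfront_payment=1_000.0,
                         monthly_upkeep=25.0,
                         monthly_fees_earned=5.0))  # 50.0 months
```

Past that point the network is subsidizing the idle asset, which is precisely why metering in time, like the phone minutes and kilowatt-hours discussed next, is the natural pricing model.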


Network providers in an adjacent industry have long recognized this fact about similar network resources. That’s why phone minutes are measured in time and electric companies charge in kilowatt-hours. Much of the current speculation craze associated with digital tokens that has caused investors so much heartache and regulators so much headache  would simply fizzle out if the time bound nature of these assets were acknowledged.


Were this fact acknowledged and factored into the network economics, it would completely change the landscape. No one invests in phone minutes, or at least not with a HODL strategy. We recognize it is taking a long time for the digital asset markets to wake up to these facts. Yet, if they are to be economically sustainable, they will awaken to the real market imperatives.


In the meantime, we propose modifications to the proposed legislation that will do more good than harm and not have to be rolled back in less than a generation.


Summary of proposed changes to the legislation

  • Remove the asset distribution rule.
  • Instantiate a network compute resource decentralization rule.
  • Instantiate digital resource network independence rule(s).
  • Adopt software modularity measures.
  • Adopt software licensing measures.
  • Adopt community improvement process measures.




Measuring Decentralization

Comments on the Decentralized Digital Assets Draft Legislation


By

Lucius Gregory Meredith

Ralph Benko

Jun 21, 2023



Lucius Gregory Meredith, founder and CEO of F1R3FLY.io, is a mathematician, the discoverer of the rho-calculus, a co-inventor of OSLF (Operational Semantics in Logic Form), and the inventor of the ToGL approach to graph theory.


Ralph Benko, a former White House official, co-founder and General Counsel of F1R3FLY.io, is author or co-author of several critically acclaimed books, including, with Dawn Talbot, Redefining the Future of the Economy: Governance Blocks and Economic Architecture.