Decentralized storage is still far from mature. It faces three key obstacles: technical, regulatory, and adoption-related.
Decentralized cloud infrastructure’s support for different kinds of data is still underdeveloped. Today’s decentralized networks mainly handle cold data - data that’s seldom accessed - and offer some support for warm data, data that users retrieve on occasion. But decentralized networks aren’t yet capable of hosting hot data - data stored in a database and accessed frequently. Hot data represents a key piece of the storage puzzle: clients and users across industries need instant, reliable access to data, on demand and in real time. No support for hot data means no video streaming or other kinds of content delivery where speed is everything.
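To make the hot/warm/cold distinction concrete, here is a minimal sketch of how a storage network might tier objects by access frequency. The thresholds and function names are hypothetical illustrations, not taken from any real network:

```python
# Hypothetical tiering sketch: classify an object into a storage tier
# by how often it was accessed recently. The thresholds below are
# illustrative assumptions, not real network parameters.
def classify(accesses_last_30_days: int) -> str:
    """Classify an object into hot/warm/cold by recent access count."""
    if accesses_last_30_days >= 100:
        return "hot"    # needs low-latency, database-style access
    if accesses_last_30_days >= 1:
        return "warm"   # occasional retrieval is acceptable
    return "cold"       # archival; slow retrieval is fine

print(classify(500))  # hot
print(classify(3))    # warm
print(classify(0))    # cold
```

The point of the sketch is that today’s decentralized networks serve the bottom two tiers well but have no fast path for the top one.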
Decentralized networks need a fully functional web3 CDN (Content Delivery Network) that can support the kind of low-latency streaming offered by YouTube and TikTok. CDNs are crucial to accelerating streaming for many users at once. To date, no such mature CDN exists in any decentralized storage or compute network.
Indexing represents another weakness for today’s decentralized storage networks, whose dispersed storage resources (provided by nodes around the world) make indexing far too slow. Many apps are search-based, though, and without fast indexing they can’t run on decentralized networks.
As it stands, decentralized storage networks also lack database services - a feature offered by centralized storage providers like AWS. Slow data retrieval and writing speeds also hamper decentralized networks, and speed is the single most important factor in database performance: ideally, databases should be able to retrieve and write data in real time. Unstable bandwidth is another reason that decentralized networks haven’t yet been able to deploy database services.
The impact of regulation comes into sharp relief when we think about how highly regulated industries, like insurance, operate. In most cases, the government requires insurance companies to know exactly where their policyholders’ data is stored at all times, and may even require insurers to store multiple copies of that data at different data centers.
In some countries, like the US and China, many types of data must be stored within the relevant country’s borders; a company can’t simply distribute client data across thousands of anonymous nodes spread around the world. In short, big clients simply won’t use decentralized services as long as these rules are in place. Regulation, though, is constantly adapting - albeit slowly - to innovation. As decentralization continues to gain momentum, we can expect regulators to adjust their policies to allow insurance and other industries to store data in decentralized networks.
Decentralized networks are still immature; on account of their relatively small scale, they simply haven’t been subjected to the same intensity of testing as the established centralized providers. Low levels of adoption and scarce use cases make it hard, or even impossible, for decentralized storage networks to assess their capacity and performance, or foresee risks that could snowball into crises once more users come on board. Consider how often hackers target Facebook or Google - successfully or otherwise. The fact that most of these attacks fail is important because it suggests that the targets are plugging security gaps and improving their systems, and it helps them establish trust with their users over time.
Decentralized networks like Filecoin or Arweave, on the other hand, haven’t been tested to the same degree. Users - current and potential - have no reason to believe that these networks will be able to protect their assets because they’re still unproven. For now, these networks just don’t hold enough value to entice hackers. Ironically, the more decentralized networks are tested, the more likely they are to build trust with their users. Moreover, it’s hard for a decentralized network to gauge how well it’ll stand up against massive data volumes when most of these networks have zero control over their tens of thousands of miners’ internet bandwidth or performance. Just as it will take time to overcome regulatory obstacles, decentralized networks will gain reliability and trustworthiness as they mature.
While regulatory and scale-related challenges will likely lessen over time, I see technical obstacles as an important site for innovation. To that end, our aim is to provide users with decentralized storage that costs less and retrieves data with lower latency. Specifically, we are building an edge network that will help us support hot data, deploy a CDN, and aid database indexing. If data is “hot,” we’ll send it to the edge of the network, where it’s closer to users. In turn, users can access this data from the network’s edge faster and more reliably. Our edge network will help ensure that users’ connections are always stable, and will thus allow our platform to host dApps - like a web3 video streaming service - that rely on low-latency content delivery.
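The routing idea above can be sketched in a few lines: hot objects are replicated to an edge cache near users, and reads fall back to the slower storage layer only on a miss. All names here (`promote_to_edge`, `fetch`, the in-memory dictionaries standing in for real nodes) are illustrative assumptions, not our actual implementation:

```python
# Hypothetical sketch of hot-data edge routing. In-memory dicts stand
# in for an edge node's cache and the decentralized storage layer.
EDGE_CACHE: dict = {}                       # edge node's local cache
STORAGE_LAYER: dict = {                     # stand-in for the storage network
    "video-123": b"<video bytes>",
}

def promote_to_edge(key: str) -> None:
    """Replicate a hot object from the storage layer to the edge cache."""
    EDGE_CACHE[key] = STORAGE_LAYER[key]

def fetch(key: str):
    """Serve from the edge if cached; otherwise fall back to storage."""
    if key in EDGE_CACHE:
        return EDGE_CACHE[key], "edge"      # low-latency path
    return STORAGE_LAYER[key], "storage"    # slower fallback path

promote_to_edge("video-123")
data, source = fetch("video-123")
print(source)  # edge
```

In a real network the edge cache would be a geographically distributed set of nodes and promotion would be driven by access statistics, but the read path - check the edge first, fall back to storage - is the core of the design.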
We’re aiming our decentralized storage services at web3-native dApps, as well as startups and individuals in the web2 space. Web3-native dApps, such as NFT marketplaces, have to use decentralized networks. Web2 individuals and small to medium-sized companies are looking for lower cloud storage costs and don’t require industrial-level performance. Moreover, a startup can migrate cheaply and quickly, sometimes in just a few days.
Decentralized cloud storage networks still have a long way to go before they’ll be able to support the same kinds of data-intensive services, and boast the same reliability and trustworthiness, as the centralized cloud old guard. Today, these nascent networks face a host of technical, regulatory, and scale-related obstacles. But while the latter two obstacles will take time to overcome, we are leading the way toward breaking down the technical barriers that have limited the potential of decentralized storage.