
The Failed Promises of Extract, Transform, and Load—and What Comes Next

by Vinay Samuel, March 27th, 2023

Too Long; Didn't Read

Telecommunications companies possess vast stores of theoretically usable data. Yet this data is often difficult, if not outright impossible, to use meaningfully. A networked data platform uses metadata to identify different data sources and sort out how those various sources can be most effectively brought together.


Telecommunications companies find themselves in a curious position. On the one hand, they possess vast stores of theoretically usable data, perhaps more than any other industry. On the other, this data is often difficult, if not outright impossible, to use meaningfully. So much water, so little to drink.


The prime culprit is the scattered nature of all this information. Telecommunications firms are, by nature, vast operations, and their data is more often than not split across a large number of systems. Think billing systems, product management platforms, and data warehouses, or the many critical resources behind functions like customer care and network availability. By necessity and design, these operations sprawl.


Accordingly, synthesizing all of this data (mining it for insights that might sharpen customer engagement, for instance) is a herculean task, full of potential pitfalls. Telecommunications companies want a single view of their customers, but they struggle daily to get one. Their insight remains partial and clouded; customer satisfaction suffers, and revenue is lost.


The failed promises of "extract, transform, and load" (ETL)

For many years, one of the guiding beliefs of the telecommunications industry, and of other data-heavy industries, has been that insights are only possible if all data sits in a single central location. This was the promise of "extract, transform, and load": funnel data from many disparate sources into a single "data lake," and a business could cost-effectively generate the insights it needs about its customers.
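
To make the pattern concrete, here is a minimal sketch of a traditional ETL job in Python. The source systems, field names, and schema are hypothetical stand-ins; a real telco pipeline would involve far more sources and far heavier transformation.

```python
# A minimal sketch of a traditional ETL job. The source systems,
# field names, and schema below are hypothetical stand-ins.
import sqlite3

# Extract: pull raw records from disparate systems (stubbed as literals).
def extract_billing():
    return [{"customer_id": 1, "amount_due": "42.50", "currency": "USD"}]

def extract_network_usage():
    return [{"cust": "1", "gb_used": 87.3}]

# Transform: reconcile mismatched keys and types into one shared schema.
def transform(billing, usage):
    rows = []
    for b in billing:
        match = next(
            (u for u in usage if int(u["cust"]) == b["customer_id"]), {}
        )
        rows.append(
            (b["customer_id"], float(b["amount_due"]), match.get("gb_used", 0.0))
        )
    return rows

# Load: copy everything into one central store, the "data lake".
def load(rows):
    db = sqlite3.connect("lake.db")
    db.execute(
        "CREATE TABLE IF NOT EXISTS customer_360 "
        "(customer_id INTEGER, amount_due REAL, gb_used REAL)"
    )
    db.executemany("INSERT INTO customer_360 VALUES (?, ?, ?)", rows)
    db.commit()

if __name__ == "__main__":
    load(transform(extract_billing(), extract_network_usage()))
```

Even at this toy scale, the transform step has to reconcile mismatched keys and types across sources; that reconciliation work is exactly what balloons as the number of systems grows.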


Of course, in practice, things have been messier. Scooping up and pooling data from a half-dozen or more sources is hugely laborious and expensive. Data from each source must usually be transformed to fit a single cohesive schema, and the extraction itself is costly, not to mention risky: every transfer opens a window of cybersecurity exposure while the data is in transit.


All of which is to say that extract, transform, and load has definitively failed to deliver on its promise. Its high cost and complexity have proved an insurmountable barrier to fast, efficient insights. Datasets get omitted for various reasons, and onboarding new data turns into endless (and costly) projects that take so long to come to fruition that, by the time they do, the data in question is often out of date or no longer needed.


Why networked data platforms are the answer

Luckily, over the last few years, many in the industry have discovered an alternative to the "extract, transform, and load" approach that saves companies significant amounts of money while delivering a much higher yield of usable insights.


That alternative? It's called a networked data platform. Put simply, a networked data platform uses metadata to identify different data sources and sort out how those various sources can be most effectively brought together.
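
As an illustration, here is a hedged sketch of that metadata idea: a catalog records where each source lives and what it can answer, so the platform can route a query to the right systems without moving any data. All source names, locations, and fields below are hypothetical.

```python
# A hedged sketch of the metadata idea behind a networked data platform:
# a catalog describes where each field lives, and queries are routed to
# the systems that hold the answer. The data itself never moves. All
# source names, locations, and fields here are hypothetical.
from dataclasses import dataclass

@dataclass
class SourceMetadata:
    name: str            # e.g. "billing"
    location: str        # where the system lives; its data stays put
    key_field: str       # the field identifying a customer in this source
    fields: list[str]    # questions this source can answer

CATALOG = [
    SourceMetadata("billing", "oracle://billing-db", "customer_id",
                   ["amount_due", "plan", "billing_history"]),
    SourceMetadata("network", "kafka://usage-topic", "cust",
                   ["gb_used", "latency", "availability"]),
]

def sources_for(wanted: set[str]) -> list[SourceMetadata]:
    """Work out which systems must be consulted to answer a query."""
    return [s for s in CATALOG if wanted & set(s.fields)]

# A request touching billing and usage resolves to both sources.
print([s.name for s in sources_for({"amount_due", "gb_used"})])
```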


Picture a service representative on the phone with a customer. To resolve this particular customer's query, the representative needs specialized data pulled from, say, half a dozen of the company's data streams: information on the customer's network usage, their billing history, or a machine-learning-driven read on whether they are on the right plan for their needs.


Under the old ETL regime, this would have presented any number of difficulties. Now, though, by virtualizing the data where it sits in its silos, the relevant information can be accessed instantly. Through machine learning, this data is seamlessly stitched together in real time to produce precisely the insights the service representative needs. Accordingly, the customer comes away more satisfied. At scale, through thousands of similarly optimized interactions per day, the company increases its efficiency and, thus, its revenue.
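
A rough sketch of how that request-time stitching might look, assuming simple hypothetical connectors to each silo: the silos are queried in place, in parallel, and the results are merged into a single customer view rather than being copied into a central store first.

```python
# A rough sketch of request-time stitching across silos. Each connector
# below is a hypothetical stand-in for a live system; in this model the
# silos are queried in place, in parallel, instead of being copied first.
from concurrent.futures import ThreadPoolExecutor

def query_billing(customer_id):
    return {"plan": "5GB Basic", "amount_due": 42.50}

def query_usage(customer_id):
    return {"gb_used": 87.3}

def recommend_plan(usage):
    # Stand-in for a deployed ML model scoring plan fit from usage.
    return {"recommended_plan": "Unlimited" if usage["gb_used"] > 50
            else "5GB Basic"}

def customer_view(customer_id):
    """Federate the silos in parallel and merge into one live view."""
    with ThreadPoolExecutor() as pool:
        billing = pool.submit(query_billing, customer_id)
        usage = pool.submit(query_usage, customer_id)
        b, u = billing.result(), usage.result()
    return {"customer_id": customer_id, **b, **u, **recommend_plan(u)}

print(customer_view(1))
```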


Here, all the time, effort, and resources once sucked up by the ETL model are vastly reduced, as are data replication and security concerns. The data can be analyzed where it is without the expense of hauling it all into one central location.


Telecommunications companies end up with faster customer insights, a stronger competitive advantage, and, as a direct consequence, greater innovation at lower cost. Every aspect of the business improves as a result, from network planning to customer service.


Faster access to information inevitably enables better and quicker decision-making—the engine of any company's growth.



Lead image generated with Stable Diffusion.