These three terms are closely related, and they sit at the foundation of how software is written today: they enable software and system modularity, which allows different individuals, teams, and organisations to work on common projects and systems separately and in parallel.
Those familiar with the premise of the book Sapiens: A Brief History of Humankind know that the ability of Homo sapiens to rally, in great numbers, around common abstractions and concepts is what made the species so incredibly successful, and what differentiated it from the other primates and from the other species of the Homo genus.
An API (Application Programming Interface) is exactly such a common abstraction. It is a convention, agreed upon by independent parties, on how to work and collaborate around shared abstractions. It is the lingua franca between different software entities.
Each party in this dialogue is seen by the other side as a simplified abstraction; the intrinsic complexity of each party is hidden from the other. This is what is meant by an abstraction layer.
A good and well-known example from the Unix world is "everything is a file". Applications pretend they are talking to files: no matter what complex hardware or concept is hidden behind them, they are all accessed with the same simple, well-known API functions: open, read, write, and close.
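To make this concrete, here is a minimal C sketch, assuming a Linux-like system, that reads random bytes from the kernel's /dev/urandom device with exactly the same calls it would use on an ordinary file:

```c
/* "Everything is a file": a kernel device (/dev/urandom, a source of
 * random bytes) is consumed with the same calls as a regular file. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(void) {
    unsigned char buf[16];

    int fd = open("/dev/urandom", O_RDONLY);  /* same open() as for any file */
    if (fd < 0) { perror("open"); return 1; }

    ssize_t n = read(fd, buf, sizeof buf);    /* same read() as for any file */
    if (n < 0) { perror("read"); close(fd); return 1; }

    for (ssize_t i = 0; i < n; i++)           /* print the bytes as hex */
        printf("%02x", buf[i]);
    printf("\n");

    close(fd);                                /* same close() as for any file */
    return 0;
}
```

The application never needs to know whether a disk, a pipe, a terminal, or a random-number generator sits behind the file descriptor.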
The further the underlying architecture and concepts evolve underneath those APIs and abstraction layers, the more inefficient the layers become and the more difficult it is to make full use of the newer capabilities.
It is as if you tried to describe the modern world with the same Latin the Romans used 2000 years ago. You would not be able to express most of today's objects and concepts in an efficient and clear manner.
Nowadays, it is very difficult to write high-performance networked applications with the simple read, write, and socket abstractions. To do it, you need to be creative: find loopholes or bypass the abstraction altogether.
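One well-known loophole on Linux is sendfile(2), which asks the kernel to move file data straight into a socket instead of bouncing every chunk through user space with a read/write pair. A minimal sketch, assuming an already-connected TCP socket and treating short writes as errors for brevity:

```c
/* Serving a file over a socket: the classic portable loop versus the
 * sendfile(2) "loophole" that skips the user-space copies. */
#include <sys/sendfile.h>
#include <sys/stat.h>
#include <unistd.h>

/* Classic loop: every chunk is copied kernel -> user -> kernel. */
ssize_t serve_classic(int sock_fd, int file_fd) {
    char buf[65536];
    ssize_t n, total = 0;
    while ((n = read(file_fd, buf, sizeof buf)) > 0) { /* copy into user space */
        if (write(sock_fd, buf, n) != n)               /* copy back into the kernel */
            return -1;                                 /* short write treated as error */
        total += n;
    }
    return n < 0 ? -1 : total;
}

/* Loophole: the kernel moves the data itself, no user-space copy. */
ssize_t serve_sendfile(int sock_fd, int file_fd) {
    struct stat st;
    if (fstat(file_fd, &st) < 0) return -1;
    off_t offset = 0;
    return sendfile(sock_fd, file_fd, &offset, st.st_size);
}
```

Mechanisms like epoll, io_uring, and kernel-bypass networking stacks exist for the same reason: the plain read/write/socket abstraction no longer exposes what the hardware can actually do.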
To fully exploit new features and new concepts while keeping the benefits of separation of concerns, the APIs and the abstractions behind them need to evolve too.
Modern cloud-native applications are typically scalable, distributed systems built on a microservice architecture.
The technology stack typically used to implement microservices is something like this: a physical server, a host operating system and hypervisor, a virtual machine running its own guest OS, a container runtime, and finally the application code inside its container.

Let's say a microservice wants to share some data stored in its container with one of its peer services. The request has to travel down through all of those layers on its side and back up through the same layers on the peer's side, as the sketch below illustrates.
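Here is a rough C sketch of one service pushing a file from its container to a peer over TCP. The function name send_data and its parameters are made up for illustration, and the layer annotations in the comments assume a typical container-on-VM deployment; they are illustrative, not exhaustive:

```c
/* Illustrative sketch: a service sends a file from its container to a
 * peer service. The comments trace the abstraction layers each call
 * crosses in a typical container-on-VM deployment. */
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <fcntl.h>
#include <unistd.h>

int send_data(const char *peer_ip, int port, const char *path) {
    /* open(): container overlay filesystem -> guest-OS VFS -> virtual
     * block device -> hypervisor -> host OS -> physical disk. */
    int file_fd = open(path, O_RDONLY);
    if (file_fd < 0) return -1;

    int sock_fd = socket(AF_INET, SOCK_STREAM, 0);
    if (sock_fd < 0) { close(file_fd); return -1; }

    struct sockaddr_in peer = {0};
    peer.sin_family = AF_INET;
    peer.sin_port = htons(port);
    if (inet_pton(AF_INET, peer_ip, &peer.sin_addr) != 1) {
        close(file_fd); close(sock_fd); return -1;
    }

    /* connect(): guest TCP/IP stack -> virtual NIC -> hypervisor's
     * virtual switch -> host network stack -> wire -> and the whole
     * stack again, mirrored, on the peer's side. */
    if (connect(sock_fd, (struct sockaddr *)&peer, sizeof peer) < 0) {
        close(file_fd); close(sock_fd); return -1;
    }

    /* Every read/write pair copies the data across the user/kernel
     * boundary twice, paying each of those layers' costs again. */
    char buf[4096];
    ssize_t n;
    while ((n = read(file_fd, buf, sizeof buf)) > 0)
        if (write(sock_fd, buf, n) != n) { n = -1; break; }

    close(file_fd);
    close(sock_fd);
    return n < 0 ? -1 : 0;
}
```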
There is a large overhead due to the various abstraction layers we have to go through to get the task done, and a lot of CPU cycles, money, and performance are lost because of it.
One of the reasons for this is that the APIs and the abstraction layers we are using have not evolved.
There is no real reason anymore to use a virtual machine that abstracts a real physical machine. The concept of a server as we used to think of it is simply no longer necessary.
We are trying to use yesterday's data-center architectures, based on servers, operating systems, and virtual machines, to support today's workloads and applications.
As mentioned above, it is as if you used old Latin to describe today's world. It would be incredibly inefficient.
Note that I am not pretending today's microservice-based applications are better than the old monolithic, server-based ones. It is just that developing new applications with old concepts holds you back, and makes everything complicated and expensive.
It is time to rethink the data center from the bottom up and rejuvenate the whole stack of abstraction layers and APIs we are using.
Previously published at https://computersandt.blogspot.com/2020/04/remaining-relevant-abstraction-layers.html