
What is Fluid Computing: Apps and The Future Ahead

by Divyang Metaliya, November 30th, 2019

Too Long; Didn't Read

The current cloud-centric approach requires centralisation of 'decision making', which creates a tonne of hurdles at the deployment level in large-scale working conditions. Current CPU utilisation at data center servers (less than 60%) will prove to be irrelevant to large-scale implementation, as the processing power is more extensive than information delivery rates. Fluid computing is an approach that intends to bypass the economic burden of telcos 'prioritising' traffic over the internet and to focus on the operational aspects.


Today's buzzword 'fluid computing' is one of the most underrated terms in the architecture world. Fluid computing is the superset of fog computing, edge computing, mist computing, and cloud computing, and one of the most versatile architectures that can radically change the way businesses function for good. In essence, fluid computing is the flow of computing resources, including CPUs, storage, and memory, into functions such as routers and servers through virtual devices.

This term is not recognised by the tech community to the extent other models are. Hence, this article is intended to explore the radical changes yet to be experienced, since current CPU utilisation at data center servers (less than 60%) and memory utilisation (less than 50%) will prove to be irrelevant to large-scale implementation, as the processing power is more extensive than information delivery rates. (Statistics source: IEEE research paper)

A Famous Analogy for General Understanding

One can understand the fluid computing model by visualising the water cycle. At ground level (the edge), fog is the first layer of water (fluid) near the surface, followed by mist, and ultimately the clouds high above. In terms of computer architecture, the edge represents ground level; fog is a mediator between the physical system and the cyber entities. Mist provides localised processing and forwards the data to the cloud data centers to complete the chain. Clouds respond to the information and complete the cycle that sustains applications.
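
To make the layering concrete, here is a minimal sketch of how a workload might be routed through these tiers based on how much latency it can tolerate. The tier latencies, capacities, and the placement rule are illustrative assumptions, not taken from any specific fluid-computing framework.

```python
# Hypothetical sketch: placing a workload on the edge, fog, mist, or cloud
# tier based on its latency budget. All numbers are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    typical_latency_ms: float   # assumed round-trip latency to this tier
    capacity: float             # assumed relative compute capacity

# Ordered from closest to the physical system to farthest away.
TIERS = [
    Tier("edge", 5, 1),
    Tier("fog", 20, 10),
    Tier("mist", 60, 100),
    Tier("cloud", 200, 1000),
]

def place_workload(latency_budget_ms: float, compute_needed: float) -> Tier:
    """Pick the most capable tier that still meets the latency budget."""
    candidates = [t for t in TIERS
                  if t.typical_latency_ms <= latency_budget_ms
                  and t.capacity >= compute_needed]
    if not candidates:
        raise ValueError("no tier satisfies both latency and compute needs")
    return max(candidates, key=lambda t: t.capacity)

if __name__ == "__main__":
    print(place_workload(latency_budget_ms=25, compute_needed=5).name)     # fog
    print(place_workload(latency_budget_ms=500, compute_needed=500).name)  # cloud
```

The design choice to prefer the farthest tier that still meets the deadline reflects the idea above: use the cloud's abundant capacity when the task tolerates it, and fall back towards the edge only when latency demands it.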

What Led to the Development of Fluid Computing

The current cloud-centric approach requires centralisation of 'decision making', which creates a tonne of hurdles at the deployment level in large-scale working conditions. For instance, if you are trying to run a full-fledged CIoT or IIoT (Consumer/Industrial Internet of Things) system, you will encounter a broad range of problems, including latency, bandwidth utilisation, and cost.

Collectively, these depict the need to localise intelligence to the 'edge,' i.e., to unite the collection of data with the location of decision making. The sole use of cloud architecture doesn't optimally support the deployment environment's needs. As an example, an IIoT-based factory placed in a remote location will have to rely on central servers located thousands of miles away. This raises concerns about the utilisation of processing capacity and the availability of data centers, which we will examine in later portions of the article.

Hence the situation is addressed by introducing a distributed computing model known as 'fog computing', which uses intermediary data processing centers to curb latency and optimise the process. The information processed by the fog servers is shared with the cloud servers, which connect them with the entire system.

'Mist computing' then acts as an additional layer between the fog and the cloud, serving as a hybrid of the centralised and distributed models. Its servers are located relatively close to the data centers within a defined boundary, such as a metropolitan area.

Market Statistics: By 2022, mobile data traffic may reach one zettabyte per annum globally. (Cisco)

An organisation has to weigh the cost of running data centers and the rate of CPU utilisation against the use of 'softwarization' and 'digital twins'. The practical implementation of CIoT and IIoT systems poses challenges in these respects, since the communication channels are what will affect the quality of processes.
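
A back-of-the-envelope calculation shows why low utilisation matters for this trade-off. The hourly server cost below is purely an assumed figure; only the sub-60% CPU utilisation comes from the statistics cited earlier.

```python
# Illustrative arithmetic only: the hourly rate is an assumption, while the
# sub-60% CPU utilisation figure is the one cited earlier in the article.

server_cost_per_hour = 1.00   # assumed cost of running one server for an hour
cpu_utilisation = 0.60        # data-center CPU utilisation below 60%

# Cost attributable to each hour of CPU time that actually does useful work.
effective_cost_per_useful_cpu_hour = server_cost_per_hour / cpu_utilisation
print(f"${effective_cost_per_useful_cpu_hour:.2f} per useful CPU-hour")  # $1.67
```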

The seriousness of this problem can be understood from the fact that telcos have exploited the masses all around the world through a monopoly over bandwidth. You would literally have to pay to be served better than others, as in the case of using cell phones in the past.

Fluid computing is an approach that intends to bypass the economic burden of telcos 'prioritising' traffic over the internet and to focus on the operational aspects. It should also be noted that the hardware and the network are not part of the improvement, as they are in classical approaches.

A Hypothetical Case to See the Complexities

Consider the case of an IoT-based intelligent fleet management system for a city. The number of cars on the roads is a function of people willing to go from one place to another, the availability of all the roads in the city, the number of vehicles from other areas, and events held during particular time periods at different locations.

Digging deeper into the situation, consider breakdowns taking place in different localities and deteriorating weather conditions. Under such circumstances, preventing accidents could become tricky, since the connection bandwidth would struggle to facilitate decision making owing to conflicts in prioritisation. As a result, accidents would become a threat to safety. Relying on the cloud alone can prove to be the wrong decision when it comes to real-life applications at large scales.
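
A hedged sketch of how an edge node in such a fleet system might triage events when bandwidth is scarce: safety-critical decisions are taken locally, while routine telemetry is queued for the cloud. The event names, the 64 kbps threshold, and the helper stubs are invented for illustration and are not a real fleet-management API.

```python
# Hypothetical triage logic for a roadside/in-vehicle edge node.
# All names and thresholds are illustrative assumptions.

CRITICAL_EVENTS = {"collision_warning", "brake_failure", "obstacle_detected"}
telemetry_queue: list = []

def apply_local_actuation(payload: dict) -> None:
    """Stub: trigger braking or driver alerts directly at the vehicle."""

def send_to_cloud(event_type: str, payload: dict) -> None:
    """Stub: forward non-urgent telemetry to the central servers."""

def handle_event(event_type: str, payload: dict,
                 available_bandwidth_kbps: float) -> str:
    if event_type in CRITICAL_EVENTS:
        # Safety-critical: decide at the edge, no round trip to a distant cloud.
        apply_local_actuation(payload)
        return "handled_at_edge"
    if available_bandwidth_kbps < 64:
        # Congested link: hold routine telemetry locally instead of letting it
        # compete with critical traffic for the limited bandwidth.
        telemetry_queue.append((event_type, payload))
        return "queued_locally"
    send_to_cloud(event_type, payload)
    return "sent_to_cloud"

print(handle_event("brake_failure", {"vehicle": "bus-17"}, 32))  # handled_at_edge
print(handle_event("fuel_level", {"vehicle": "bus-17"}, 32))     # queued_locally
```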

Insights: Customer-centric designs will be the future of next-generation operating models. (McKinsey)

What Does it Do to Computing?

A large share of its applications are meant for enterprise use, where processes are executed, monitored, managed, and controlled through computers. In conventional models, the devices that actuate processes and sense feedback were connected to the data application by a network.

This has drawbacks for security, speed, scalability, and cost owing to technological shortcomings. Fluid computing acts as an elastic medium embedded into the whole cyber-physical system.

Traditional platforms will struggle to provide adequate bandwidth for communication at large scales, which will give rise to latency problems at the application level. On the other hand, virtual peripherals and pushing intelligence towards the edge will eliminate the issues arising from limited bandwidth for transferring data between application centers and data centers.

Edge architecture enhances internal communication among the peripherals and smartly distributes data between local storage and the cloud. The fog nodes help localise intelligence down to the LAN level, which allows prompt real-time processing of information.
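
One plausible way a fog node could localise intelligence at the LAN level is by aggregating raw sensor readings and forwarding only compact summaries upstream. This is a sketch under assumed reading sizes and field names, not a prescribed design.

```python
# Sketch of local aggregation at a fog node: raw readings stay on the LAN,
# only a small summary travels to the cloud. Field names are assumptions.

from statistics import mean

def summarise(readings: list[float]) -> dict:
    """Reduce a window of raw sensor readings to a compact summary record."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(mean(readings), 3),
    }

# 1,000 raw readings (roughly 8 kB as 64-bit floats) collapse into a
# 4-field summary, so only a tiny fraction of the data crosses the WAN link.
window = [20.0 + 0.01 * i for i in range(1000)]
print(summarise(window))
```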

Relative to the computational capacity gained, the cost of network connectivity and storage is lowered, which benefits the scalability of operations. Mist servers connect the fog servers with the cloud servers to complete the link between the distributed model and the centralised cloud system. This layer facilitates communication between the two while adding to the localisation of decision making.

When implementing this architecture, one needs to consider the sensitivity requirements and the edge functionalities actually needed, since it is not vital to use the cloud in all operations.

To Understand its Robustness, Consider the Following Situations

Which would face the higher threat under continuous external attack: a human being or a hydra? A human being in a war can be quickly neutralised with a headshot. A hydra, however, has a regionalised nervous system, which makes it impossible to eliminate by targeting any single high-significance portion of its body.

Likewise, your payroll software uses a blend of centralisation and regionalisation, sourcing inputs from a multitude of activities and generating a single output in the form of a compensation figure. Similarly, fluid computing smartly juggles the need for localising decision making and data storage with the requirements of centralisation in order to optimise the operational efficiency of the system.

Other Architecture Forms That May Make Their Way in the Future

Quantum Computing

This is truly a blast from the past, appearing in research papers from Einstein and his colleagues. It banks on quantum entanglement: preparing particles together and harnessing the fact that their states remain correlated despite separation, so a measurement on one particle is reflected in the state of the others, however far away they are located.

Quantum entanglement is gaining ground in practical implementation, which will probably transform networking theory as we know it. The superposition of two quantum states governed by the Schrödinger equation (a linear differential equation) is also a probable computing principle.
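
For reference, the superposition alluded to here can be written out explicitly: because the Schrödinger equation is linear, any normalised combination of two valid states is itself a valid state.

```latex
% Time-dependent Schrödinger equation (linear in \psi):
i\hbar \frac{\partial}{\partial t}\,\lvert\psi(t)\rangle = \hat{H}\,\lvert\psi(t)\rangle

% Linearity implies superposition: if |\psi_1> and |\psi_2> are solutions,
% so is any normalised combination of them:
\lvert\psi\rangle = \alpha\,\lvert\psi_1\rangle + \beta\,\lvert\psi_2\rangle,
\qquad \lvert\alpha\rvert^{2} + \lvert\beta\rvert^{2} = 1
```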

Terminator/Hard to be a God: Biological computers

These popular sci-fi concepts may materialise in the future with rapidly improving genetic engineering. Scientists have recently engineered bacteria that eat plastic to help tackle pollution. It is very much possible that in the future we will create organisms that process data as a core life-sustaining activity, much like the photosynthesis exhibited by plants. Interfacing them with software entities will open doors to broader possibilities.

Summing Up

The coming days will witness dramatic changes in the computing world with the advent of industrial-scale applications of fluid architecture. Advantages such as scalability, increased efficiency through virtual appliances, higher security, and cost optimisation are some of the principal reasons to choose a dynamic model over current ones. However, a lot depends on the standardisation of the products available in the market.

We would require open source technologies and extensive public libraries in order to contain the volatility in equipment life cycles, updates, and the integration of newer devices into old systems. Every component of the cyber-physical system, including actuators, sensors, data storage centers, servers, and networking structures, must be managed with standardisation.

Hence, both manufacturing and software development would need to take interoperability into account. The ability to flow, i.e., to change as per the functional requirements, will help improve the quality of life globally. The economic benefits would be another jewel in the crown, as increased convenience going hand in hand with deflation is a rare phenomenon.