For companies in every industry, making computing as distributed as possible has become essential: they must be able to process data as close as possible to where it is produced, in order to make faster and smarter decisions about it.

According to a study by the consulting firm Grand View Research, the global edge computing market will reach US$43.4 billion by 2027[1]. In addition, Gartner projects that by 2025, 75% of business data will be processed outside conventional data centers or the cloud[2].

Both figures confirm a clear trend: the consolidation of edge-based solutions and platforms.

For years we have seen growing investment in connectivity in Latin America. What really matters, however, is not just reaching the main hubs but making the infrastructure being developed as ubiquitous as possible, so that it reaches the last locations where businesses and people need connectivity.

Therefore, in addition to improving connectivity, it is essential to make computing as distributed as possible: companies must be able to process data as close as possible to the points where it is captured, in order to act on it quickly. This implies building smarter networks with processing capacity and, from a customer perspective, offering not only multiple ways of connecting to these centers but also the ability to run applications in the most appropriate form (physical servers, virtual machines, containers).

Thus, for any application, whether business, entertainment, or personal, latency becomes a critical element in defining response times: the lower it is, the greater the ability to offer better services and a better user experience.
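A back-of-the-envelope sketch illustrates why proximity drives latency. The 200,000 km/s figure approximates the speed of light in optical fiber (about two thirds of c), and the distances are purely hypothetical examples, not figures from the article:

```python
# Rough estimate of best-case round-trip propagation delay in optical fiber.
# Signals travel at roughly 200,000 km/s in fiber (about 2/3 the speed of light).
FIBER_SPEED_KM_PER_S = 200_000

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip propagation delay in milliseconds."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

# Hypothetical scenarios: a distant cloud region vs. a nearby edge node.
print(round_trip_ms(4000))  # distant data center: 40.0 ms before any processing
print(round_trip_ms(50))    # nearby edge node: 0.5 ms
```

Real round trips are longer still (routing, queuing, processing), which is why moving compute from a distant region to a nearby edge node can cut response times by an order of magnitude.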

The design of applications to be deployed on edge solutions is also key, since from the very beginning it must consider both the point where data will be collected and where it will be processed. This goes beyond basic hardware technology; it implies a synergistic integration of the architectures involved. We must also not forget that these systems require highly trained people, so having human capital with the skills this type of industry calls for is critical.

Finally, it is essential to have all the internal processes needed to ensure the highest possible availability. High availability is a critical element for this type of user experience, and we must have the necessary systems in place to prevent any impact on the end customer's service.
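To make "highest possible availability" concrete, a small sketch shows how redundancy compounds. The availability figures are illustrative assumptions for the example, not operator targets, and the model assumes independent node failures:

```python
# Combined availability of N redundant nodes with independent failures:
# the service is down only if every node fails at the same time.
def combined_availability(node_availability: float, n_nodes: int) -> float:
    return 1 - (1 - node_availability) ** n_nodes

# Illustrative: two nodes, each available 99.9% of the time,
# yield roughly 99.9999% combined availability ("six nines").
print(combined_availability(0.999, 2))
```

This is why high availability is achieved through redundant, well-operated systems rather than by perfecting any single component.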

Considering the above, an edge-based solution must enable the optimal operation of any application and transaction, especially because our customers' data is involved. From the operator's standpoint, this technology must be implemented under the premise of "ever closer, ever faster, and ever safer."

As players in this industry, we are witnessing strong and rapid adoption of edge-based solutions: supply keeps expanding, vendors are strongly driving development, and customer demand is growing as well.

The result is a perfect combination: growing adoption, together with specific use cases that will turn solutions almost unimaginable today into reality and ultimately change the way we live and work.

 

[1] https://www.grandviewresearch.com/industry-analysis/edge-computing-market 

[2] https://www.gartner.com/en/research/methodologies/magic-quadrants-research 

Jon Paul "JP" McLeary

Author:
Gabriel del Campo
Regional VP of Data Center, Cloud & Security
Lumen, LATAM
