Colocation – a bridge between the past and the future for legacy systems

Digital transformation is moving forward at full speed, yet it often collides with a formidable obstacle, particularly in highly regulated industries or those with long track records: legacy systems that, despite verging on inefficiency and technological obsolescence, still provide robust and reliable service.

Companies are not always ready to upgrade or replace these applications: sometimes because of the sheer size of the projects, sometimes because of the associated costs or risks, and often because heavy technology investments made in previous years still need to be amortized.

Cultural barriers also persist: despite the massive adoption of the cloud, many companies prefer to keep certain critical systems under their direct control.

In these cases, colocation services can be used to gain agility while building a forward-looking perspective. Banking and financial services, oil and gas, mining, healthcare, manufacturing, and utilities are among the industries that fit this profile.

The intermediate step

Let’s remember that colocation allows an organization to host its mission-critical equipment in a high-availability data center. Taking that single step brings multiple benefits: lower maintenance costs, less need for dedicated resources to manage internal infrastructure, lower risk of failing to comply with security and energy-consumption regulations, and guaranteed business continuity in the event of a natural disaster or other unexpected situation.

For companies, however, colocation can also be a bridge between the past and future of their IT strategies: a turning point between owned infrastructure and hybrid cloud environments, an intermediate step on the journey to transformation.

In many cases, the colocation service represents the first step of the transformation and modernization project.

A smoother migration

When migration becomes unavoidable, whether due to obsolescence or because the market brings new demands the old system can no longer meet, the direct connection to cloud providers will already be established, making the transfer smoother and reducing downtime.

The global colocation market, adjusted to companies’ new needs and understood as a necessary stepping stone to the cloud, is gaining ground: according to the statistics portal Statista, the business moved just over US$50 billion in 2021 and is projected to generate just over US$136 billion by 2028.

Digital transformation is advancing at full speed, and having tools like colocation in place to remove obstacles is essential for companies to remain competitive in the years ahead.

Author:
Gabriel del Campo
VP Data Center, Cloud & Security
Cirion Technologies

Solving the unknown – the best solution for the “hybrid cloud” equation

Hybrid cloud continues to prevail in the market: according to IDC estimates, by the end of this year it will be present in at least 90% of organizations worldwide. The trend is driven by the cloud’s significant role in digital transformation strategies, as well as by the fluidity and flexibility this model offers.

What we call “hybrid cloud” is a technological scenario that combines some local presence or owned infrastructure (private cloud) with a set of applications, data, and processes migrated to a public cloud. A standout feature is that workloads move fluidly, transparently, and automatically between the different environments as needed.

It is not a turnkey solution that all companies can replicate to obtain the same results, but a true equation that poses new unknowns in each particular case. Solving those unknowns is the key to achieving all the benefits this type of architecture can provide.

Among the variables at stake are network speed, processing capacity, architecture optimization, achievable latency, cost control, and overall governance.

User experience

Finding the right path starts with a focus on user experience. The experience must meet users’ expectations whether they are buying a product online, making a financial transaction, completing work for the office, studying, or simply playing or chatting with friends. The user’s geographical location or access device should not matter either: the experience must be equivalent regardless of these factors.

Therefore, analyzing where information is processed and how immediate the response must be is key. For workloads that require an immediate response, reducing the time it takes to transport data to the processing point is crucial. Other workloads, because of their dynamics or functionality, are better located in a core data center. The assembly of the architecture and the decision about where each workload is processed are therefore essential to achieving greater performance.
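
As a purely illustrative sketch of this placement decision (the workload names and latency threshold below are hypothetical assumptions, not drawn from any real deployment), the logic can be thought of as a simple classification of workloads by their latency budget:

```python
# Hypothetical sketch: routing workloads to edge or core sites by latency budget.
# Workload names and the threshold are illustrative assumptions, not real guidance.

EDGE_LATENCY_BUDGET_MS = 20  # assumed cutoff for "immediate response" workloads

workloads = {
    "payment-authorization": 10,  # needs a near-real-time response
    "video-analytics": 15,
    "nightly-reporting": 500,     # tolerant of delay
    "data-warehouse-etl": 1000,
}

def place(latency_budget_ms: int) -> str:
    """Send latency-sensitive workloads to an edge site, the rest to a core data center."""
    if latency_budget_ms <= EDGE_LATENCY_BUDGET_MS:
        return "edge site close to where the data is produced"
    return "core data center"

for name, budget_ms in workloads.items():
    print(f"{name}: {place(budget_ms)}")
```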

Connectivity among these points is fundamental as well. In a balancing act, multiple alternatives must be explored to identify internet providers that offer good interconnection with their peers, guaranteeing appropriate end-to-end performance by optimally concatenating all the hybrid cloud links while keeping costs under control.

With an eye on latency

Throughout this path, attention stays on latency: delays in data transfer must be minimal and performance must reach its maximum potential, even for demanding applications such as video, VR, or gaming.
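
As a minimal sketch of how such observation might begin (the hostnames are placeholders, and a real deployment would rely on dedicated measurement tooling and repeated samples), round-trip latency to candidate processing points can be approximated by timing TCP connection setup:

```python
# Minimal latency-probe sketch: times TCP connection setup to candidate endpoints.
# Hostnames are placeholders; real monitoring would take many samples over time.
import socket
import time

CANDIDATE_ENDPOINTS = [
    ("edge.example.com", 443),  # hypothetical edge site
    ("core.example.com", 443),  # hypothetical core data center
]

def connect_time_ms(host: str, port: int, timeout: float = 2.0) -> float:
    """Return the time in milliseconds taken to open a TCP connection."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000

for host, port in CANDIDATE_ENDPOINTS:
    try:
        print(f"{host}:{port} -> {connect_time_ms(host, port):.1f} ms")
    except OSError as exc:
        print(f"{host}:{port} unreachable: {exc}")
```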

Governance, in turn, gives the project a direction aligned with business purpose: ensuring compliance with regulations and organizational policies, analyzing and monitoring performance, allocating resources appropriately, and establishing continuous-improvement criteria, not only from a technology perspective but also from that of people and processes.

Ultimately, hybrid cloud enables the best use of services and the greatest cost savings. But these benefits are not magical: solving the equation requires a sound understanding of the model, a correct governance implementation, and, of course, getting the math right.

Author:
Gabriel del Campo
VP Data Center, Cloud & Security
Cirion Technologies

Edge Computing: Building faster and smarter networks

For companies in any industry, having computing as distributed as possible has become essential: they must be able to process data as close as possible to where it is produced in order to make faster and smarter decisions.

According to a study by the consulting firm Grand View Research, the global edge computing market will reach US$43.4 billion by 2027[1]. Gartner projections, in turn, indicate that by 2025, 75% of business data will be processed outside conventional data centers or the cloud[2].

Both figures confirm a clear trend: the consolidation of edge-based solutions and platforms.

We have seen growing investment in connectivity in Latin America for years. What really matters, though, is not just reaching the main hubs but making the infrastructure being developed as ubiquitous as possible, so that it reaches the last-mile locations where businesses and people need that connectivity.

Therefore, in addition to improving connectivity, it is essential to make computing as distributed as possible: companies must be able to process data as close as possible to the points where it is captured in order to act on it quickly. This implies building smarter networks with processing capacity and, from a customer perspective, offering not only different ways to connect to these centers but also the ability to run applications in the most appropriate form (physical servers, virtual machines, or containers).

Thus, for any application, whether business, entertainment, or personal, latency becomes a critical element in defining response times: the lower it is, the greater the ability to offer better services and a better user experience.
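
A back-of-the-envelope calculation illustrates the point (the round-trip times below are assumptions chosen for illustration, not measurements from any particular network):

```python
# Illustrative arithmetic only: assumed round-trip times, not real measurements.
# Total response time = network round trip + server-side processing.

PROCESSING_MS = 10   # assumed server-side processing time, identical in both cases

edge_rtt_ms = 5      # hypothetical round trip to a nearby edge node
central_rtt_ms = 80  # hypothetical round trip to a distant central data center

edge_total = edge_rtt_ms + PROCESSING_MS        # 15 ms
central_total = central_rtt_ms + PROCESSING_MS  # 90 ms

print(f"Edge:    {edge_total} ms per request")
print(f"Central: {central_total} ms per request")
print(f"Edge responds {central_total / edge_total:.0f}x faster end to end")
```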

The design of applications to be deployed in edge solutions is also key: from the very beginning, it must consider both the point where data will be collected and where it will be processed. This goes beyond basic hardware technology, as it implies a synergistic integration of the architectures involved. Nor should we forget that these systems require highly trained people, so having human capital with the skills this type of industry demands is critical.

Finally, it is essential to have all the internal processes needed to ensure the highest possible availability. High availability is critical to this kind of user experience, and the necessary systems must be in place to prevent any impact on the end customer’s service.

Considering the above, an edge-based solution must support the optimal operation of any application or transaction, especially because our customers’ data is involved. From the operator’s standpoint, this technology must be implemented under the premise of “ever closer, ever faster, and ever safer.”

As players in this industry, we are witnessing strong and rapid adoption of edge-based solutions: supply keeps increasing, vendors are strongly driving development, and customer demand continues to grow.

The result is a perfect combination: growing adoption plus specific use cases that will turn these solutions into a reality almost unimaginable today, and that will ultimately change the way we live and work.

 

[1] https://www.grandviewresearch.com/industry-analysis/edge-computing-market 

[2] https://www.gartner.com/en/research/methodologies/magic-quadrants-research 

Author:
Gabriel del Campo
Regional VP of Data Center, Cloud & Security
Lumen, LATAM

What the cloud-smart model is and what companies must do to achieve it

After a year of exponential growth in cloud adoption, businesses that aim to keep taking advantage of this technology to maintain their competitive edge need to broaden their perspective and turn to a cloud-smart model: a natural and necessary outcome of the massive migration to the cloud accelerated by the pandemic.

In fact, COVID-19 allowed many companies to confirm the benefits the cloud had been promising for years: cost control; unlimited scalability and flexibility; the possibility of accessing data and applications anytime, anywhere, from any device; ever-lower latency thanks to the evolution and growing deployment of connectivity technologies; and, most importantly, the agility to guarantee business continuity under any circumstances (including, as it turned out, the most complex ones).

Although many companies initially moved to the cloud out of necessity or obligation, consistently good outcomes were key to their decision to develop a strategic vision. It is no surprise, then, that the market consultancy IDC found that in the post-pandemic world, 90% of organizations will work under a cloud-first scheme: companies no longer weigh what to upload to the cloud and what to keep out; the business itself is built on this technology. The same firm determined that in Latin America alone, cloud infrastructure will grow across all segments this year: 29.4% in Infrastructure as a Service, 36.9% in Platform as a Service, and 19.9% in Software as a Service.

As this maturity grows, the multi-cloud idea becomes ever stronger. Why limit yourself to keeping everything in a single private cloud, or to hiring one public cloud, when you can select multiple clouds, each according to its area of greatest expertise or best performance? If one cloud provider excels at storage and another at application performance, you can get the best of each. A multi-cloud strategy provides several benefits, from total flexibility to the optimization of every workload, including a better cost-benefit balance.
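
As a purely illustrative sketch (the provider names and their declared “strengths” below are hypothetical placeholders, not an assessment of any real vendor), the core idea can be expressed as a simple mapping from a workload’s main requirement to the cloud that serves it best:

```python
# Hypothetical multi-cloud routing sketch. Providers and strengths are
# illustrative placeholders, not recommendations for real vendors.

PROVIDER_STRENGTHS = {
    "cloud-a": {"storage"},               # assumed storage specialist
    "cloud-b": {"app-performance"},       # assumed application-performance specialist
    "private-cloud": {"regulated-data"},  # assumed home for data that must stay in-house
}

def choose_provider(requirement: str) -> str:
    """Pick the provider whose declared strength matches the workload's main need."""
    for provider, strengths in PROVIDER_STRENGTHS.items():
        if requirement in strengths:
            return provider
    return "private-cloud"  # conservative default when no strength matches

print(choose_provider("storage"))          # cloud-a
print(choose_provider("app-performance"))  # cloud-b
```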

One consideration applicable to any cloud data migration is making sure long-term business needs are taken into account: it makes no sense to limit growth through stagnation caused by a poor choice of cloud provider. Organizations can bring specific workloads to a platform, or move them wherever they can leverage the best pricing and terms for certain requirements. A simultaneous cloud-access approach also enables a smooth transfer and an almost instant shift between clouds, should you need to switch providers for any reason.

Companies now face the possibility of scaling to the next level and leveraging the cloud to differentiate themselves in the marketplace. Along this path, it is essential to have the support of an expert who knows how to analyze opportunities, segment needs, assess return on investment (ROI), and choose the best implementation approach so that services run optimally, without unnecessary expenses or hires. A new milestone will thus be reached: becoming a cloud-smart company.

Author:
Marcela Cueli
Product Director
Lumen LATAM