Public, private or hybrid clouds? What are they and how do they support your business?

The cloud is a key tool in companies' digital transformation processes, as it allows them to be adaptable, scalable, and resilient.

According to the IDC study FutureScape: Global IT Industry 2022 – Forecasts and Implications in Latin America, it is estimated that by 2023, 80% of companies will use cloud-related services. This growth is driven by companies' growing interest in using the cloud to handle large volumes of data and to manage business-critical applications.

Given the growing volume of data being generated, the rise of cloud applications, and companies' current needs, Felipe Gómez, Director of Data Center, Security and Cloud at Cirion Technologies for the Andean region, shares some basic concepts about the cloud, its features, and the advantages it offers companies according to their needs.

The cloud is an enormous global network of connected remote servers working as a single ecosystem. It enables data storage and management, the execution of applications, and the delivery of content and services (email, social networks, streaming, etc.). This means users can access their data, information, and services from any device with internet access.

“Deciding to use the cloud gives companies greater agility, security, and dynamism in their operations. There are three options for implementing cloud usage in corporate management: the public cloud, the private cloud, and the hybrid cloud,” adds Gómez.

Public cloud, the fastest 

The choice between a public, private, or hybrid cloud depends on the company's needs and forecasts.

Public clouds are the most common type of cloud computing implementation: the computing resources (such as servers or storage) belong to a service provider, which manages and offers them over the internet. AWS (Amazon Web Services), Microsoft Azure, and Google Cloud are a few examples of public cloud providers.

One of the main advantages of the public cloud is its ability to scale quickly, enabling faster deployment of environments. Costs depend on the type of servers or peripherals required.
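To make that speed concrete, here is a minimal Python sketch, assuming an AWS account with credentials already configured, that uses the boto3 SDK to launch a virtual server in minutes; the region, image ID, and instance type are illustrative placeholders, not recommendations.

```python
# Minimal sketch: provisioning a server in a public cloud (AWS via boto3).
# Assumes credentials are already configured; the image ID below is a
# hypothetical placeholder and must be replaced with a real AMI.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed region

response = ec2.run_instances(
    ImageId="ami-00000000000000000",  # hypothetical placeholder AMI
    InstanceType="t3.micro",          # small general-purpose instance
    MinCount=1,
    MaxCount=1,
)

print("Launched:", response["Instances"][0]["InstanceId"])
```

A few lines like these, or the equivalent clicks in the provider's console, are all it takes to add capacity, which is precisely the agility the public cloud is valued for.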

Private cloud: greater security and availability, and lower latency 

This type of cloud offers a greater level of security and control than the public cloud, because the services offered aren't shared and remain available to the user through a private internal network hosted in a local environment. The private cloud is closer to the end customers' applications, giving them lower latency and better performance. It also remains an attractive alternative for its stable costs over the contract period.

It is ideal for hosting application services such as SAP, application virtualization, call center platforms, or unified communications.

Many companies also feel more comfortable with a private cloud because it comes with local support in Spanish, an important differentiator for many IT directors.

For most, the future is hybrid 

The adoption of cloud solutions brings organizations closer to the latest digital technologies and facilitates the development of innovative projects to achieve business objectives. 

Keeping a purely proprietary infrastructure or a private cloud alone, or migrating the entire business to the public cloud, are options companies contemplate less and less. Operating with a mix of both clouds (public and private), known as the hybrid cloud, will be the alternative adopted in the coming years, clarifies Gómez.

Many organizations choose to keep their production environments in private clouds and their development environments in public clouds, since this scheme offers greater security, higher speed, and lower latency for critical applications.
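As a purely hypothetical sketch of that split, the snippet below routes production traffic to a private-cloud endpoint and development traffic to a public-cloud endpoint; both hostnames are invented for illustration.

```python
# Sketch: routing workloads by environment in a hybrid scheme.
# Both endpoints are hypothetical placeholders.
import os

ENDPOINTS = {
    "production": "https://api.private.example.com",   # private cloud: critical workloads
    "development": "https://api.public.example.com",   # public cloud: dev/test workloads
}

env = os.environ.get("APP_ENV", "development")
print(f"{env} traffic goes to {ENDPOINTS[env]}")
```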

IT infrastructure is becoming increasingly complex, and companies must look for scalable data management solutions to stay afloat. Cloud-based data management has become the solution of choice: companies are broadly adopting the cloud (public, private, or hybrid) because it offers cost savings, data availability, flexibility, scalability, and many other advantages.

According to the study “Cloud Computing Survey 2022,” almost three-quarters (72%) of IT decision-makers say their organization turns to cloud-based services when acquiring new technical capabilities.

Information security is a priority for companies 

Any cloud migration process requires a management decision, but also employee training and the support of experts to achieve appropriate security governance.

Whatever companies choose, when this task is entrusted to specialists they can obtain a correct sizing of their needs and an efficient migration. Cirion Technologies focuses its efforts on offering companies a solution that adjusts to their expectations and guarantees greater security when managing business-critical information.

What are Edge Data Centers, and how do they benefit us?

As if taken from a sci-fi movie, corporate technology has begun implementing what is known as a new generation of mobile computing centers: tools that will continue to change societies and that represent one of the keys to the development of digital cities.

Let’s start from the beginning. What are these new Edge Data Centers, and what are they for?

These new tools provide the same computing resources found in traditional data centers housed in large facilities. However, this new type of data center is usually called “edge” because it is installed in the field, where data processing, analytics, and real-time action and decision making are needed. This avoids sending digital data to the large traditional data centers, which are usually quite far from where end-user devices generate the information.

With these systems, edge data centers can deliver cached content and cloud computing resources to any mobile device in real time. The concept departs from traditional computing: we are now looking at a distributed IT architecture where client data is processed as close to the originating source as possible.

Since edge data centers are smaller and positioned closer to end users, their advantage lies in providing faster services with minimal latency: data is processed at the point of capture, sparing the “data journey” to large data centers and translating computing capacity into immediate action, e.g., on consumers in a given geographical location.
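A rough way to observe that difference is to time a network round trip to a nearby edge node versus a distant core data center, as in the sketch below; both hostnames are hypothetical, and a real measurement would repeat and average many samples.

```python
# Sketch: approximating round-trip latency with a TCP handshake.
# Hostnames are hypothetical; real tests should repeat and average.
import socket
import time

def round_trip_ms(host: str, port: int = 443) -> float:
    """Time a TCP connection setup as a rough latency proxy."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass
    return (time.perf_counter() - start) * 1000

for label, host in [("edge node", "edge.example.com"),
                    ("core data center", "core.example.com")]:
    print(f"{label}: {round_trip_ms(host):.1f} ms")
```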

Another feature is that less time-sensitive data can be sent to the large traditional data centers (fixed infrastructure and facilities) for historical analysis, big data, and long-term storage.

The so-called edge data centers must be easy to deploy and remove, designed to respond to temporary needs and actions, and much smaller than a shipping container.

The million-dollar question: will edge data centers replace large, facility-based, traditional data centers? The answer is no, because they perform different functions.

The former (perimeter or edge) tackles extremely specific computing needs, in a limited time and place, while the latter are a kind of “mother ship” and will continue to be the basic infrastructure with the greatest storage and computing capacity for digital data, and with the highest physical and digital security standards.

The future is around the corner

Beyond the pandemic's impact on the region, today we can state as a fact that digital information consumption has increased exponentially in Latin America, and that more data is required, something that impacts the edge.

According to a report by Americas Market Intelligence (AMI)[1], digital information consumption has done nothing but grow. For example, with an average internet penetration of 73%, Latin America is heading toward a digital zenith, with Argentina leading internet penetration in the region. But, without a doubt, Brazil is the market of greatest interest due to its population and geographic size.

Then, if we consider that investments in edge infrastructure go hand in hand with appropriate communications networks, and that 5G deployment requires more distributed points of presence to distribute content, edge data centers will soon play a significant role.

Edge data centers can help implement a series of changes that will shape several transformations in our societies. For example:

1- 5G: A decentralized cell network made up of edge data centers can help provide the low latency that 5G needs in use cases with high device density.

2- Telecommunications companies: With cell-tower edge data centers, telecom companies can get closer to end users by connecting mobile phones and wireless sensors.

3- Internet of Things: Edge data centers can handle the data generated by IoT devices, and come into play when that data needs further processing (see the sketch after this list).

4- Healthcare: Some medical equipment, such as that used for robotic surgeries, requires the extremely low latency and network consistency that edge data centers can provide.

5- Autonomous vehicles: Edge data centers can help collect, process, and share data between vehicles and other networks, something that also relies on low latency.

6- Smart factories: Useful for predictive machine maintenance and predictive quality management; they can also make the robotics used in inventory management more efficient.
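Returning to the IoT item above, here is a minimal Python sketch of that edge pattern: time-critical readings trigger immediate local action, while only a compact aggregate travels on to the core data center; the threshold and data shapes are assumptions for illustration.

```python
# Sketch: filter and aggregate IoT readings at the edge, forward a summary.
# The alert threshold and reading format are illustrative assumptions.
from statistics import mean

def process_at_edge(readings, alert_threshold=90.0):
    """Act locally on urgent data; return an aggregate for the core DC."""
    alerts = [r for r in readings if r > alert_threshold]
    if alerts:
        # Time-critical path: handled at the edge, no trip to the core.
        print(f"EDGE ALERT: {len(alerts)} readings above {alert_threshold}")
    # Less time-sensitive path: only this summary is sent to the core
    # data center for historical analysis and long-term storage.
    return {"count": len(readings), "mean": mean(readings), "max": max(readings)}

print("Forwarding to core:", process_at_edge([71.2, 88.9, 93.4, 76.0]))
```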

Gartner[2] predicts that 75% of corporate data will be created and processed at the edge by 2025. We are thus facing a scenario where this technological trend will drive exponential growth of edge data centers in the region and worldwide. Will we be ready?

[1] https://americasmi.com/insights/consumo-de-medios-en-america-latina-cambios-marketing-publicidad-digital/

[2] https://www.gartner.com/smarterwithgartner/what-edge-computing-means-for-infrastructure-and-operations-leaders

Author:
Pablo García
Senior Sales Support Manager – Data Center & Security
Cirion Technologies Chile

Colocation – a bridge between the past and the future for legacy systems

Digital transformation is moving forward at full speed, though on several occasions, particularly in highly regulated industries or those with long track records, it collides with an immensely powerful obstacle: legacy systems, which, despite being on the verge of inefficiency and technological obsolescence, still provide robust and reliable service.

Companies are not always ready to upgrade or replace these applications: many times because of the sheer size of such projects, often because of the associated costs, sometimes because of the risks involved, not to mention that technology investments were often heavy in previous years and those costs still need to be amortized.

Similarly, cultural barriers persist that lead companies to keep some critical systems under their own control, despite the massive use of the cloud.

In these particular cases, colocation services can be used to gain agility and to plan ahead. Banking and financial services, oil and gas, mining, healthcare, manufacturing, and utilities are among the industries that fit these characteristics.

The intermediate step

Let’s remember that colocation allows an organization to host its mission-critical equipment in a high-availability data center. That single step brings multiple benefits: lower maintenance costs, reduced need for dedicated resources for internal infrastructure management, lower risk of failing to comply with security and energy consumption regulations, and business continuity guaranteed even in the event of a natural disaster or unexpected situation.

However, for companies this can also be a bridge between the past and the future of their IT strategies: a turning point between owned infrastructure and hybrid cloud environments, an intermediate step on their journey to transformation.

Indeed, colocation often represents the first step in a transformation and modernization project.

More fluent migration

Therefore, when migration becomes unavoidable, whether due to obsolescence or because the market brings new demands and needs the old system can no longer meet, the direct connection with cloud providers will already be established, making the transfer smoother and decreasing downtime.

The global colocation market, adjusted to companies' new needs and understood as a necessary stepping stone to the cloud, is gaining ground: according to the statistics portal Statista, the business moved just over US$50 billion in 2021 and will generate just over US$136 billion by 2028.

Digital transformation is advancing at full speed, and having tools like colocation in place to remove obstacles is essential for companies to remain competitive going forward.

Author:
Gabriel del Campo
VP Data Center, Cloud & Security
Cirion Technologies

Solving the unknown: the best solution for the “hybrid cloud” equation

Hybrid cloud continues to prevail in the market: according to IDC estimates, by the end of this year it will be present in at least 90% of organizations worldwide. The rationale for this trend lies in the significant role of the cloud in digital transformation strategies, as well as in the fluidity and flexibility this model proposes.

The “hybrid cloud,” as we call it, is a technological scenario that includes some local presence or owned infrastructure (private cloud) and a set of applications, data, and processes migrated to a public cloud. Standing out among its features is the fact that workloads move fluidly, transparently, and automatically between the different environments as necessary.

It is not a turnkey solution that all companies can replicate to obtain the same results, but a true equation that poses new unknowns in each particular case. Solving them is the key to achieving all the benefits this type of architecture can provide.

Network speed, processing capacity, architecture optimization, latency achieved, cost control, and overall governance stand out among the variables at stake.

User experience

The focus should be on user experience to find the right path. In every case, the experience must meet users' expectations, whether they are buying a product online, making a financial transaction, completing work for the office, studying, or even playing or chatting with friends. Geographical location and access device are not relevant either: the experience must be analogous regardless of these factors.

Therefore, analyzing where information is processed and how immediate the response must be is key. For workloads that require an immediate response, reducing data transportation time to the processing point is crucial. On the other hand, some workloads, because of their dynamics or functionality, are better located in a core data center. The assembly of the architecture, and the decision about where each load is processed, are therefore essential to achieve greater performance.
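As a toy illustration of that placement decision, the sketch below assigns each workload to an environment based on an assumed latency budget; the 20 ms threshold and the workload names are hypothetical.

```python
# Sketch: placing hybrid-cloud workloads by latency budget.
# The threshold and workloads are illustrative assumptions.
def place(workload: str, latency_budget_ms: float) -> str:
    # Latency-critical loads run close to users; tolerant ones in the core.
    target = ("edge / private cloud" if latency_budget_ms <= 20
              else "core data center / public cloud")
    return f"{workload} (budget {latency_budget_ms} ms) -> {target}"

for workload, budget in [("interactive video", 15.0),
                         ("payments API", 80.0),
                         ("nightly analytics", 5000.0)]:
    print(place(workload, budget))
```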

Connectivity among these points is fundamental as well. In a balancing act, we will have to explore multiple alternatives to identify the internet providers that offer good interconnection with their peers, guaranteeing appropriate end-to-end performance of the architecture and optimally concatenating all hybrid cloud links, while remaining cost effective.

With an eye on latency

Throughout this path, the eye will be on latency: delays in data transfer must be minimal and performance must reach its maximum potential, even with demanding applications such as video, VR, or games.

Governance, in turn, gives the project direction aligned with business purpose: ensuring compliance with regulations and organizational policies, analyzing and monitoring performance, assigning resources appropriately, and establishing continuous improvement criteria, not only from a technology perspective but also from that of people and processes.

Ultimately, the hybrid cloud enables the best use of services and the greatest cost savings. But these benefits are not magical: to untangle this equation, a correct understanding of the model is key, as well as proper governance implementation and, of course, accuracy when doing the math.

Author:
Gabriel del Campo
VP Data Center, Cloud & Security
Cirion Technologies

Edge Computing: Building faster and smarter networks

For companies in any industry, having computing as distributed as possible has become essential; that is to say, they must be able to process data as close as possible to where it is produced, in order to make faster and smarter decisions based on it.

According to a study by the consulting firm Grand View Research, the global Edge Computing market will hit US$43.4 billion by 2027[1]. Along with this, Gartner projections indicate that by 2025, 75% of business data will be processed outside of conventional data centers or the cloud[2].

Both figures confirm a clear trend: the consolidation of edge-based solutions and platforms.

We have seen greater investment in connectivity in Latin America for years, though what really matters is not just reaching the main hubs, but making all the infrastructure being developed as ubiquitous as possible, in order to reach the last locations where businesses and people need that connectivity.

Therefore, in addition to improving connectivity, it is essential to make computing as distributed as possible; that is, companies must be able to process data as close as possible to the points where it is captured, in order to act on it quickly. This implies building smarter networks with processing capacity and, from a customer perspective, offering not only different ways of connecting to these centers, but also the ability to run applications in the most appropriate way (physical servers, virtual machines, containers).

Thus, for any application, be it business, entertainment, or personal, latency becomes a critical element in defining response times: the lower it is, the greater the ability to offer better services and a better user experience.

Now, the design of the applications to be deployed in edge solutions is key, since from the very beginning it must consider the point where data will be sought and where it will be processed. This goes beyond basic hardware technology, as it implies a synergistic integration of the architectures involved. We must also not forget that these systems require highly trained human resources, so having human capital with the skills this type of industry calls for is critical.

Finally, it is essential to have all the internal processes needed to ensure the highest possible availability. High availability is a critical element when we think about this type of user experience, and we must have the systems in place to prevent any impact on the final customer's service.

Considering the above, an edge-based solution must enable the optimal operation of any application and transaction, especially because our customers' data is involved; from the operator's standpoint, this technology must be implemented under the premise of “ever closer, ever faster, and also ever safer.”

As players in this industry, we are witnessing strong and rapid adoption of edge-based solutions, both because supply keeps increasing and vendors are strongly driving development, and because customer demand is also growing.

This sets up a perfect combination of growing adoption and specific use cases that will help transform these solutions into a reality almost unimaginable today, one that will ultimately change the way we live and work in the future.

 

[1] https://www.grandviewresearch.com/industry-analysis/edge-computing-market 

[2] https://www.gartner.com/en/research/methodologies/magic-quadrants-research 

Author:
Gabriel del Campo
Regional VP of Data Center, Cloud & Security
Lumen, LATAM