The future of cloud computing, from hybrid to edge to AI-powered

Mar 20, 2023 | Hi-network.com

The World Meteorological Organization (WMO) recognizes ten basic cloud types. From cirrus to nimbostratus to our favorite here in gray and rainy Oregon, plain ol' stratus, these clouds come in quite a few varieties. So do the clouds we use in technology: the ones that live in giant data centers.

These days, tech clouds also differ from each other, because they don't all live in data centers anymore. Some behave the same way but live entirely on premises in private installations. Others span everything from "the edge" (the point of contact with reality) to the racks of servers we generally associate with the cloud, as well as traditional data centers.

In 1997, Professor Ramnath K. Chellappa, now of Emory University, used the term cloud computing for what appears to be the first time in a talk named "Intermediaries in Cloud-Computing." Since then, we've seen substantial changes in what cloud computing means to IT operations, as well as how it's applied to solving societal-level challenges.

Also: How edge-to-cloud is driving the next stage of digital transformation

What does the future of cloud computing look like over the next 3-5 years? It's gotten even more interesting since the pandemic. Since 2020, cloud computing has taken a tremendous leap forward, with many businesses fulfilling their ten-year road maps in ten months -- or, for some, even in ten weeks -- because of the massive growth in demand for digital information and e-commerce.

The meaning of cloud computing is changing

The idea of scalable, meterable, centrally manageable, on-demand computing infrastructure works. It fits the business models of a wide range of organizations, from startups to vast enterprises. Today, it's better to define cloud computing by these characteristics than to keep defining it as computing infrastructure managed by someone else in an outsourced data center.

Cloud computing is now much more about managing and scaling resources and distributing workloads than about the location and contents of hardware racks.

In a big way, that's why we've seen the rise of hybrid cloud and multi-cloud. Certain applications work great in a public cloud infrastructure. Other applications, usually because of latency, governance, or security issues, are better suited for on-premises operations. And, because it's never 100% safe to rely on a single vendor for a solution, multi-cloud has taken off as a wise bet-hedging strategy.

Multi-cloud meets a number of core business needs. It provides a way to prevent single-vendor lock-in and to avoid the enormous costs that come with switching away from an entrenched vendor. It also provides a fail-over option in the event that a vendor has serious technical difficulties or, worse, shuts down and ceases operations.

Multi-cloud is also appealing because it allows IT operators to choose the vendor that is best for each workload. But it's important to be aware that this practice could lead to an unexpected level of lock-in: vendors with unique or superior offerings are harder to switch away from when the need arises.

While multi-cloud is usually defined as clouds across vendors, it's also possible to think of multi-cloud in much the same way as hybrid cloud. You might have cloud services running in public cloud operations as well as cloud services running on premises. This mix of multi-hybrid cloud may well become something of the new normal.

Also: Unlock your trapped data: Driving insights from edge-to-cloud

But, no matter how they're distributed, these new fully hybrid cloud operations all benefit from scalability, meterability, manageability, and on-demand delivery. Overall, this allows you to adapt to constantly changing business requirements while maintaining control over your data.

Key to this will be an increasing use of containers and orchestration tools. Containers are like virtual machines, but they are lighter, more portable, and more scalable, because a VM must also carry an entire virtualized machine, operating system included. Managing containers consistently across environments reduces complexity, and any time complexity is reduced, errors and system failures are reduced with it.

Container usage also lends itself to automation for tasks like deployment, scaling, security, and monitoring, which reduces the chance of errors, increases reliability, and saves time.

Beyond these benefits, containers provide a level of portability that also helps prevent lock-in, along with a way to automate logistics and fail-over in response to business needs, crises, and opportunities.
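To make the container idea a little more concrete, here is a minimal sketch (not from the original article) that uses the Docker SDK for Python to start, list, and tear down a container programmatically; the image name, port mapping, and container name are illustrative assumptions.

```python
# Minimal sketch: a programmatic container lifecycle with the Docker SDK for Python.
# Assumes a local Docker daemon and the "nginx:alpine" image (illustrative choices).
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Start a lightweight web server container, mapping container port 80 to host port 8080.
container = client.containers.run(
    "nginx:alpine",
    detach=True,
    ports={"80/tcp": 8080},
    name="demo-web",
)

print("running containers:", [c.name for c in client.containers.list()])

# An orchestrator would add health checks, scaling, and rescheduling on top of these steps.
container.stop()
container.remove()
```

An orchestration platform such as Kubernetes automates exactly this kind of lifecycle work, plus health checks, scaling, and rescheduling, across many hosts and environments.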

We can also expect growth in serverless computing, where the unit of measure is not a complete server but the workload, module, or application. Serverless reduces infrastructure overhead because modules execute only when needed, without the operator having to spin up a full VM or even manage a container. If an application sees a wide range of traffic loads, serverless can provide both cost-saving and load-management benefits.
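As a rough illustration of the serverless model, the sketch below follows the shape of an AWS Lambda Python handler: a single function the platform invokes only when a request arrives. The event field used here is hypothetical, and other serverless platforms use different handler conventions.

```python
# Minimal sketch of a serverless function, in the shape of an AWS Lambda Python handler.
# The platform runs this only when triggered; there is no server for the operator to manage.
# The "order_id" event field is a hypothetical example.
import json

def handler(event, context):
    # Pull a value out of the incoming event (its shape depends on the trigger:
    # API gateway, queue, scheduled job, and so on).
    order_id = event.get("order_id", "unknown")

    # Do a small unit of work, then return; billing covers only this execution time.
    result = {"order_id": order_id, "status": "processed"}
    return {
        "statusCode": 200,
        "body": json.dumps(result),
    }
```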

As we move forward over the next few years, edge computing will become even more prevalent. Edge devices will become more powerful, but also far more demanding, requiring large amounts of data to be processed in real time, yet also shared organization-wide for analysis and insight.

Expect edge devices to be more connected, and to be left unattended more often. As they increase in power and intelligence (with AI and ML playing a big part here), they'll be able to perform more functions in areas of reduced or intermittent connectivity, and in areas of extreme weather or environmental conditions.

This is where 5G (and eventually 6G) comes into play. 5G handles intermittent connections better, and it can switch frequency and "beamform" to cover areas that traditional cellular connectivity hasn't been able to reach. 5G is also putting far more intelligence into the field, which will substantially reduce latency, allowing edge devices to communicate with "the mother ship" far more quickly and with far greater responsiveness.

The growth of AI in cloud management

Increased reliance on complex cloud environments will multiply the management challenges. Here, AI can help in at least four ways:

  • Automate routine tasks: There are a huge number of basic IT management tasks, like resource allocation and scaling, that could be handled by an AI. Any place where there could be a script, there could be an AI, and expect the AI to eventually take less work to set up than the script (see the sketch after this list).

  • Analyze data: Whether it's customer usage patterns, sentiment analysis, workload impacts on resources, or many other areas of observation, AI can provide insights into network operations and customer behavior that improve reliability and highlight opportunities.

  • Improve user experience: Expect AI to provide front-line and even second-tier customer support via chat and email. On one hand, this puts users into the hands of machines; on the other, it frees up human technicians to help with the more challenging customer problems.

  • New services and business models: As we've seen with ChatGPT, more and more cloud services are finding ways to add an AI component as a value-add. Expect this trend to accelerate as AI finds new ways to streamline and assist the humans doing the work and using the services.
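To ground the first point above, here is a sketch of the kind of routine scaling rule that lives in a script today. The helper callables get_cpu_utilization and set_replica_count are hypothetical stand-ins for whatever monitoring and provisioning APIs a real environment exposes; an AI-driven controller would replace the fixed thresholds with learned behavior.

```python
# Minimal sketch of a scripted scaling rule, the sort of routine task the article
# suggests an AI could eventually take over. get_cpu_utilization() and
# set_replica_count() are hypothetical stand-ins for real monitoring/provisioning APIs.

SCALE_UP_THRESHOLD = 0.80    # add capacity above 80% average CPU
SCALE_DOWN_THRESHOLD = 0.30  # remove capacity below 30% average CPU
MIN_REPLICAS, MAX_REPLICAS = 2, 20

def autoscale(current_replicas, get_cpu_utilization, set_replica_count):
    cpu = get_cpu_utilization()  # e.g., average utilization over the last five minutes
    if cpu > SCALE_UP_THRESHOLD and current_replicas < MAX_REPLICAS:
        current_replicas += 1
    elif cpu < SCALE_DOWN_THRESHOLD and current_replicas > MIN_REPLICAS:
        current_replicas -= 1
    set_replica_count(current_replicas)
    return current_replicas
```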

Then, of course, there's security. With a security skills shortage and ever-increasing cybersecurity threats, preventing and mitigating attacks needs to be a top corporate priority, but it's also an ever-greater challenge. Attackers are becoming more sophisticated, with far more experience in launching successful attacks. As we've discussed, increasing complexity creates more points of failure, and more points attackers can exploit.

Even small companies now manage a mind-boggling amount of information, both in motion and at rest. We're talking about terabytes, petabytes, and exabytes of both flow and storage. The only way data of that volume, moving at that velocity, can be managed is through software. But with the rapid growth in bad-actor sophistication, conventional programming and static pattern-matching protections simply will not keep up.

This is one of those cases where AI isn't just optional or nice to have. AI will become the bulwark required to protect businesses where nothing else can move fast enough.
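As a toy illustration of why static rules fall behind, the sketch below contrasts a hand-tuned threshold with a simple baseline learned from recent traffic. The request rates are invented for illustration, and real systems would use far richer models than a three-sigma rule.

```python
# Toy sketch: a fixed rule vs. a baseline learned from recent traffic.
# The request-rate numbers are invented for illustration only.
from statistics import mean, stdev

recent_rates = [120, 130, 118, 125, 140, 122, 135]  # requests/sec over recent windows

def static_rule(rate):
    # Hand-tuned threshold: must be re-tuned every time traffic patterns shift.
    return rate > 200

def learned_baseline(rate, history):
    # Flag anything more than three standard deviations above the recent mean.
    mu, sigma = mean(history), stdev(history)
    return rate > mu + 3 * sigma

# The static rule misses a burst to 180 req/sec; the learned baseline flags it.
print(static_rule(180), learned_baseline(180, recent_rates))
```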

Cloud vendors will have to change, too

All of this means cloud vendors will also have to transform to keep up. So we'll return to the example we've been using throughout our series on the evolution of the cloud: HPE GreenLake.

Also: Digital transformation powered by edge-to-cloud comes to life in this scenario of a big-box retailer

HPE GreenLake is a service platform that allows organizations to consume and manage resources on-demand, whether they are in the cloud or on premises. 

HPE GreenLake provides scalable on-premises support, so you can treat physical hardware, locked behind doors you control, as if it were a fluid cloud-based resource. Because it combines all of these resource delivery services into one offering and provides centralized management and deployment tools, it's possible to manage a widely distributed, multi-vendor IT operation from one centralized resource.

Moving forward, expect GreenLake to offer more tools and resources, as well as be able to handle greater challenges -- allowing technology leaders to scale the organization (or reduce it) in response to environmental and market conditions.

As we look forward 3-5 years, the watchword is "more." More connections, more power, more AI, more security challenges, more nodes, more devices, more locations, more options, more money, more efficiency, more bang-for-the-buck, more vendors -- more, more, more. And, of course, more complex challenges in managing it all.
