Ivo Ivanov, CEO of DE-CIX, 27 November 2023
In today’s business world, speed is of the essence. Modern business demands the shortest time to market, streamlined production processes and logistics, and fast response times for 24/7 customer service, not to mention innovation and unbeatable ideas for creating digital products and data-driven business models. Increasingly, all of this is being achieved with the support of AI models, which are rapidly growing in number and potential. Commercially available AI-as-a-service solutions, which enable companies to train AI for their own specific use cases, live in the cloud. In fact, almost all AI set-ups benefit from being in the cloud, gaining cloud-native agility, scalability, and the convenience of being available from anywhere, 24/7. As a result, companies need to update their cloud strategies to keep pace with the rapidly changing technology landscape, explains Ivo Ivanov, CEO of DE-CIX. Not only does the increasing use of AI demand an enabling multi-cloud scenario; the way companies connect to the cloud is also becoming an essential component of business success.
The combination of cloud and AI is massively accelerating business processes. Cloud adoption has been shown to lead to faster revenue growth, lower operating expenses, and greater resilience in a crisis. A cloud and digital transformation process can cut a company’s time to market by as much as 50%. Meanwhile, according to research by Accenture, AI adoption has the potential to boost profitability rates by an average of 38% by 2035.
AI is helping companies to achieve their essential KPIs
Many business KPIs – ranging from first-call resolution (FCR) to product innovations based on customer behavior – are becoming dependent on AI support. From product design to production, and on to use by customers, AI supports efficiency, time to market, customer support, innovation, and personalization. Take one example: AI is now being embedded into the infotainment systems of cars, enabling not only personalization according to the driver’s or passenger’s preferences, but also detection of the driver’s state (such as tiredness) and appropriate mitigating action.
To do this, the AI model needs data from the sensors within the car in near real time. The connectivity from the digital product to the cloud must therefore be resilient, reliable, and very fast. The same, critically, goes for e-health applications for telemedical diagnostics. It also goes for the inclusion of AI in payment and e-commerce apps to increase sales through more intelligent recommendations, and for chatbots providing after-sales support. With the latest releases of OpenAI’s ChatGPT and Google’s PaLM – among others – coming onto the market in 2023, AI is booming, and with it the business potential.
AI from the cloud has many advantages over a locally hosted model: it is cloud-native, faster (as long as you get the connectivity right), more convenient, and vastly easier in terms of infrastructure planning. With the advent of AI-supported applications and services, the time has come for all enterprises to embrace multi-cloud. Although a lot of AI training and processing will happen in one particular cloud, many other applications that feed data to or receive data from the AI model may well be housed in other clouds or in cloud-based applications. Companies that shy away from the assumed complexity of managing multi-cloud scenarios do so at the risk of missing out on innovative potential.
Poor connectivity design slows everything down
In our fast-paced business world, the agility and flexibility that are essential for keeping ahead of the competition in the race to the customer depend on short decision pathways and direct communication. The same goes for data and applications in the cloud: short data pathways lead to faster reaction times. The way you connect to your clouds and AI applications therefore impacts how well you can do everything else.
Within the cloud, data can be processed at lightning speed. But when the cloud depends on receiving data from external sources – for example, from the company infrastructure or from other clouds – this can slow things down. It’s not only the data, services, and applications in the cloud that matter, but also the speed at which data can travel to and from the cloud.
The problem is that data cannot travel faster than the speed of light. While this may seem fast, long and unpredictable detours through the public Internet slow down the transfer of data to the cloud, and potentially expose that data to risk along the way. Direct connections from the company infrastructure to cloud services optimize transmission by shortening and securing the data pathways. Connecting the company’s IT infrastructure to a Cloud Exchange on an interconnection platform creates a direct pathway to your clouds at the highest possible speeds. As a result, AI workloads are not subject to the capriciousness of the public Internet, and can offer close to real-time analytical insights, timely recommendations or warnings, and all manner of ingenious support.
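To put the physics into perspective, here is a minimal back-of-the-envelope sketch in Python. The path lengths are illustrative assumptions, not measurements, and the calculation considers propagation delay only, using the rule of thumb that light in optical fiber travels at roughly two-thirds of its speed in a vacuum:

```python
# Back-of-the-envelope propagation delay: direct interconnection vs. a
# public-Internet detour. Distances are illustrative assumptions only.

SPEED_OF_LIGHT_KM_S = 300_000          # speed of light in a vacuum
FIBER_FACTOR = 2 / 3                   # light in fiber travels at roughly 2/3 c
FIBER_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * FIBER_FACTOR  # ~200,000 km/s

def round_trip_ms(path_km: float) -> float:
    """Pure propagation delay for one round trip over the given fiber path."""
    return 2 * path_km / FIBER_SPEED_KM_S * 1000

direct_km = 200    # assumed direct path to a cloud on-ramp via an interconnection platform
detour_km = 2500   # assumed meandering route across the public Internet

print(f"Direct path: {round_trip_ms(direct_km):.1f} ms round trip")   # ~2 ms
print(f"Detour path: {round_trip_ms(detour_km):.1f} ms round trip")   # ~25 ms
# And that is before congestion and queuing, which the public Internet
# adds unpredictably on top.
```

The exact figures will differ from company to company; the point is that the length of the path sets a hard lower bound on latency that no amount of processing power in the cloud can buy back.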
But optimizing the connectivity to the clouds is only the first step. The second is to shorten the pathway between clouds, which can be reduced dramatically by using a cloud routing service implemented directly on the interconnection platform. A cloud routing service lets clouds talk directly to one another (cloud-to-cloud communication), so that data does not need to travel back to the company infrastructure first. Such a service also ensures interoperability between clouds, making an AI-enabling multi-cloud scenario more manageable and increasing the performance of applications across all systems.
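The effect of removing that hairpin through the company network can be illustrated with a simple latency model. The per-leg figures below are assumed, illustrative values, not properties of any particular platform:

```python
# Minimal latency model for cloud-to-cloud traffic: hairpinning through the
# company's own infrastructure vs. direct routing between clouds.
# All per-leg latencies are assumed, illustrative values.

LEG_LATENCY_MS = {
    ("cloud_a", "company"): 12.0,
    ("company", "cloud_b"): 15.0,
    ("cloud_a", "cloud_b"): 3.0,   # direct leg via a cloud routing service
}

def path_latency(hops: list[str]) -> float:
    """Sum the one-way latency across consecutive hops of a path."""
    return sum(LEG_LATENCY_MS[(a, b)] for a, b in zip(hops, hops[1:]))

hairpin = ["cloud_a", "company", "cloud_b"]   # data detours via the company network
direct = ["cloud_a", "cloud_b"]               # clouds talk to each other directly

print(f"Hairpin path: {path_latency(hairpin):.1f} ms one way")   # 27.0 ms
print(f"Direct path:  {path_latency(direct):.1f} ms one way")    # 3.0 ms
```

Every round trip an AI workload makes between clouds pays this toll, so shaving it off once shortens every subsequent exchange as well.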
Finally, by also connecting directly to end-user access networks over an interconnection platform (also known as “peering”), customer-facing apps, customer service chatbots, and personalization and customization systems, for example, can be given a further performance boost. In this way, you can make your data work for you – fast, flawlessly, and securely.
It isn’t just THE cloud – data needs a safe home in many clouds
Taking a step back from AI for a moment, there is one final KPI that cannot be ignored: business continuity. Many companies have come to realize that the cloud offers greater resilience than keeping data in on-premises infrastructure. But you shouldn’t need AI to understand that using just one cloud exposes the company to cloud concentration risk. In an exclusive relationship with one cloud provider, that provider can become a single point of failure in the event of an outage. A multi-cloud strategy is therefore more than just a nice-to-have; it is an essential part of every business continuity and disaster recovery strategy. Set up correctly, a multi-cloud environment provides continuous access to company data and applications regardless of any localized issues. One important ingredient here is redundant access to cloud on-ramps from geographically separated data center locations via a distributed interconnection platform. Another is a cloud routing service, which can be used to set up continuous backups and a seamless, automated transition to the backup cloud in the event of an outage. Together, these safeguard business continuity.
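In its simplest form, that automated transition boils down to a health check and a switch-over rule. The sketch below is a minimal illustration of the idea; the endpoints are hypothetical placeholders, and a production setup would rely on the monitoring and routing tooling of the interconnection platform rather than an ad-hoc script:

```python
# Sketch of an automated failover decision between a primary and a backup cloud.
# The health-check URLs are hypothetical placeholders.
import urllib.request

PRIMARY_HEALTH_URL = "https://primary.example-cloud.com/health"   # hypothetical
BACKUP_HEALTH_URL = "https://backup.example-cloud.com/health"     # hypothetical

def is_healthy(url: str, timeout_s: float = 2.0) -> bool:
    """Treat any HTTP 200 response within the timeout as 'healthy'."""
    try:
        with urllib.request.urlopen(url, timeout=timeout_s) as resp:
            return resp.status == 200
    except OSError:
        return False

def choose_active_cloud() -> str:
    """Prefer the primary cloud; fall back to the backup if it is unreachable."""
    if is_healthy(PRIMARY_HEALTH_URL):
        return "primary"
    if is_healthy(BACKUP_HEALTH_URL):
        return "backup"
    return "none"  # both unreachable: escalate to disaster-recovery procedures

if __name__ == "__main__":
    print(f"Active cloud: {choose_active_cloud()}")
```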
The race is on: getting ahead of the competition with AI from the cloud
So, the race is on. Already today, 35% of businesses are using AI support. The market for AI as a service is expected to grow by close to 40% per year until 2030, by which time the AI market as a whole is forecast to reach close to two trillion US dollars. To be a part of this exciting trend, you need to be in the cloud. But the trick is not simply to be in the cloud, or even in many clouds. It’s about getting the most out of your clouds by making sure that your cloud connectivity is robust, resilient, and lightning fast.