There is no doubt that the adoption of AI and other high-performance computing has accelerated at an extraordinary pace. The AI market is estimated to be expanding at a CAGR of 35.9%, and with 83% of companies claiming that AI is a top priority in their business plans, this trend shows no signs of slowing down.
This rapid growth has fueled a thriving data center development industry, but it has also led many data-intensive businesses to scale their capacity reactively rather than considering the nature of the workloads involved.
At the same time, given concerns around the significant environmental and financial impact of powering and cooling such energy-hungry workloads, the connectivity and latency demands of AI computing, and an increasingly fragile global political and economic landscape, it is not surprising that many businesses are challenging their managed service provider (MSP) to ensure their data is in the most appropriate place.
Maximizing Efficiency
Every digital activity triggers a process in a data center, from internet banking to streaming TV shows, and from simulation modeling to interacting with your favorite conversational AI tool. All these pursuits require the processing of data - but that is where the similarities end.
Digital infrastructure used for generative AI or financial trading calculations, for example, must process enormous amounts of data almost instantaneously, which necessitates incredibly fast networks.
Delays in response time caused by data travelling long distances over slower networks would be detrimental to the success of the product, so these types of workloads must be located within a short distance of the data source or end users, such as in an edge or localized data center. These smaller facilities are often in metro locations, in closer proximity to fiber routes, internet exchanges and submarine cables, enabling access to ultra-low-latency networks and redundant internet connectivity.
Metro sites are located in areas where land is at a premium, and they often have a denser infrastructure architecture - both of which can drive higher operational costs. Yet it is often completely unnecessary to host all business data in these types of data centers, and doing so adds avoidable expense. By segmenting data by latency requirements, latency-agnostic workloads such as AI model training can be located in more rural areas where land is at less of a premium and there is space for larger campuses offering capacity at a lower cost.
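To make this kind of latency-based segmentation concrete, here is a minimal sketch in Python. The workload names, latency thresholds and tier labels are hypothetical assumptions for illustration only, not a reflection of any particular provider's placement rules.

```python
from dataclasses import dataclass


@dataclass
class Workload:
    name: str
    max_latency_ms: float  # tightest round-trip latency the workload can tolerate


def place_by_latency(workload: Workload) -> str:
    """Suggest a hosting tier from latency tolerance alone.

    Thresholds are illustrative, not prescriptive: real placement decisions
    would also weigh cost, sustainability and regulatory constraints.
    """
    if workload.max_latency_ms < 10:
        return "edge / metro data center (close to users, fiber routes and exchanges)"
    if workload.max_latency_ms < 50:
        return "regional colocation facility"
    return "large rural campus (lower-cost capacity for latency-agnostic work)"


if __name__ == "__main__":
    workloads = [
        Workload("real-time trading engine", max_latency_ms=2),
        Workload("conversational AI inference", max_latency_ms=40),
        Workload("AI model training", max_latency_ms=5000),
    ]
    for w in workloads:
        print(f"{w.name}: {place_by_latency(w)}")
```

Even a rough classification like this makes it obvious that only a subset of workloads actually needs premium metro capacity.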
Enhancing Sustainability
Another significant consideration in data location is sustainability. Data centers built to suit AI workloads require an incredible amount of energy to power and cool the associated digital infrastructure. With the current electricity usage of the global data center market estimated at 55 GW, coupled with the introduction of ESG-related legislation such as the Corporate Sustainability Reporting Directive (CSRD) in the EU, many company leaders are struggling to balance the technological needs of their business with the carbon footprint of their operations. One way to mitigate this issue is to move digital infrastructure to a more sustainable data center location.
The Nordic region, for example, boasts a cooler natural climate and an abundance of renewable energy, which together enable highly energy-efficient cooling technologies that reduce operational costs. Moreover, the Nordics are forward-thinking in their circular economy principles, and data center businesses are actively encouraged to recycle waste heat from the data center cooling process for the benefit of the local community. All atNorth data centers can accommodate the latest in energy-efficient cooling, such as Direct Liquid Cooling and Direct-to-Chip Cooling. The business has also formed a number of heat reuse partnerships, including a collaboration with Wa3rm - a leader in the development of circular and bio-based operations for waste streams - to reuse excess heat to grow vegetables close to its DEN02 data center in Denmark, and an agreement with Kesko Corporation in Finland that will utilize waste heat from atNorth's new FIN02 campus to heat a neighboring branch of one of its stores.
These heat reuse partnerships reduce the carbon footprint of both the data center and the receiving organization, and they also enable clients to effectively decarbonize their IT workloads.
Sovereignty & Compliance
Within some data-intensive businesses there are workloads so confidential they must be hosted on premises, while others are bound by country-specific legislation and must remain within national borders or within the EU, for example. A rise in political uncertainty has led many companies to consider more carefully where their data is hosted, because hosting data outside of a specific jurisdiction may leave workloads exposed to regulatory or geopolitical risks - a report by CIVO found that 84 percent of UK IT leaders are now concerned that geopolitical developments could threaten their ability to access and control their data.
At the same time, conflicting legislation has further fueled uncertainty in the industry: the EU's GDPR requires strict controls on the use of personal data even when it is processed outside the EU, whilst the US CLOUD Act demands access to data held by US providers even when it is hosted outside the US. This tension has undoubtedly given rise to an increasing number of sovereign cloud providers offering a hybrid model, enabling businesses to keep sensitive data on premises whilst utilizing private or public cloud networks where appropriate. In this instance, segmentation mitigates the risks of non-compliance and data breaches - an essential consideration for a business' long-term reputation.
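As a purely illustrative companion to the latency sketch above, the following Python snippet shows how simple residency and sensitivity rules might be encoded when deciding between on-premises, sovereign or private cloud, and public cloud hosting. The rule set, field names and region choices are assumptions made for the sake of the example and are not legal or compliance guidance.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class DataSet:
    name: str
    contains_personal_data: bool          # e.g. in scope for GDPR
    confidential: bool                    # must never leave company premises
    required_jurisdiction: Optional[str] = None  # e.g. "EU", "UK", or None


def choose_hosting(ds: DataSet) -> str:
    """Apply illustrative residency rules in order of strictness."""
    if ds.confidential:
        return "on-premises"
    if ds.required_jurisdiction:
        # Keep the data with a provider operating inside the required jurisdiction.
        return f"sovereign or private cloud within {ds.required_jurisdiction}"
    if ds.contains_personal_data:
        # Personal data: prefer a provider and region that simplify GDPR compliance.
        return "private cloud in an EU region"
    return "public cloud (lowest-cost suitable region)"


if __name__ == "__main__":
    examples = [
        DataSet("trade secrets archive", contains_personal_data=False, confidential=True),
        DataSet("customer records", contains_personal_data=True, confidential=False,
                required_jurisdiction="EU"),
        DataSet("anonymized training corpus", contains_personal_data=False, confidential=False),
    ]
    for ds in examples:
        print(f"{ds.name}: {choose_hosting(ds)}")
```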
We are worlds away from a time when companies kept all their data on premises or within their local data center facility. The modern demands on data are too great, and at a time when business efficiency, sustainability and data security are competing for the attention of company leaders, it is no longer viable to take a 'one-size-fits-all' approach to the hosting of digital infrastructure. By segmenting business data by use case, processing speed, sensitivity and regulatory requirements, it is possible to host each workload in the most suitable location. Such a robust segmentation strategy can ensure the resilience and security of your data whilst simultaneously reducing costs and environmental impact - for long-term sustainable growth.