Moore’s Law, sustainability and data centres

Issue 7 2022 Infrastructure, Information Security

Natalya Makarochkina.

An important principle in the development of IT over the decades has been Moore’s Law. Simply put, it predicted that transistor density in processors would double every two years as development progressed. Despite many predictions of its demise, it has more or less remained a guiding principle. However, what is perhaps less well known is a similarly persistent trend in the data centre space.

Despite a sixfold increase in the data being processed since 2010, data centre energy consumption increased by only 6% up to 2018 (www.securitysa.com/*se2). How has that been possible, and how does it inform sustainability developments into the future?

Where does the data come from?

To contextualise this development, we must first understand where the data processing increase has come from. The Apple iPad debuted in 2010, which also saw the introduction of Instagram and Microsoft’s Azure cloud service. 2011 introduced us to Minecraft, Snapchat and Uber, with 2013 bringing Amazon’s Alexa, accompanied by the Xbox One and PlayStation 4. 2017 brought Fortnite and TikTok.

Social media engagement over the period increased many times over, while global data production went from an estimated 2 zettabytes in 2010 to 41 zettabytes in 2019. IDC estimates the global data load will rise to a staggering 175 zettabytes by 2025.
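A quick back-of-envelope check puts those figures in perspective: they imply data volumes compounding at roughly 40% a year over 2010–2019, and around 27% a year through 2025. A minimal sketch of the arithmetic (the zettabyte values are the estimates cited above):

```python
# Implied compound annual growth rates for the data-volume figures
# quoted in the article (values in zettabytes).

def cagr(start, end, years):
    """Compound annual growth rate, as a fraction."""
    return (end / start) ** (1 / years) - 1

growth_2010_2019 = cagr(2, 41, 9)     # 2 ZB (2010) -> 41 ZB (2019)
growth_2019_2025 = cagr(41, 175, 6)   # 41 ZB (2019) -> 175 ZB (2025, IDC forecast)

print(f"2010-2019: {growth_2010_2019:.1%} per year")
print(f"2019-2025: {growth_2019_2025:.1%} per year")
```

Set against the roughly flat energy-consumption line above, growth of this order is what makes the sector’s efficiency record so striking.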

The pandemic’s effect has been substantial, with the MENA region seeing a big increase in messaging and social media usage: social media users in MEA and Latin America spend the most time on social networks, averaging over 3,5 hours a day.

More than half of users in MEA (57%) reported (in May 2020) spending even more time on social media as a result of the pandemic. Similarly, in a separate study, 71% of Middle East respondents reported WhatsApp and other messaging apps usage increased since the outbreak of the pandemic.

What impact does all that data have?

To understand the impact of this data explosion, a concept has been developed called data gravity (www.securitysa.com/*se3). Coined by engineer Dave McCrory, the term refers to the tendency of an accumulation of data to attract applications and services toward it, precipitating further accumulation, which can lead to immobilisation of the data as well as underutilisation. Data that grows too big, too fast can become immobile, reducing its value and increasing its opacity. Only low-latency, high-bandwidth services, combined with new data architectures, can combat this growing and largely undocumented phenomenon.

What tech developments have made this possible?

Multiple technological developments can account for this data explosion being handled with only minimal increases in energy consumption, from improvements in processor design and manufacture, through power supply units and storage, but also the migration of workloads from on-premises infrastructure to the cloud.

Schneider Electric has been committed to sustainable business for decades. That has meant a renewed focus on efficiency in all aspects of design and operation. Gains have been made in the efficiency of power and cooling, with UPS systems and modular power supplies improving significantly with each generation, culminating in the likes of the current Galaxy VL line. Its use of lithium-ion batteries has not only increased efficiency; it has also extended operational life, reduced environmental impact through lower raw-material usage, and enabled ‘energised swapping’, in which power modules can be added or replaced with zero downtime and greater protection for operators and service personnel.

Advances in cooling, such as flow control through rack, row and pod containment systems, liquid cooling, and intelligent software control, ensure that cooling efficiency keeps pace with the gains in raw data processing.

By ensuring that every link in the chain of power from energy grid to rack is as efficient, intelligent and instrumented as possible, we provide the right basis for the rapid development in computing, networking and storage.

Where do software and apps fit in?

Another key element of the technological development that has allowed such relentless efficiency has been better instrumentation, data gathering and analysis, which allow better control and orchestration. This was illustrated by Google’s DeepMind AI, which in 2016 reduced the energy used for cooling at one of Google’s data centres by some 40%, representing a 15% overall reduction in power usage. This was accomplished by using historical data from data centre sensors (temperature, power, pump speeds, setpoints, etc.) to improve energy efficiency. The AI system predicted the data centre’s temperature and pressure over the coming hour and made recommendations to control consumption accordingly.
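The underlying pattern is simple to sketch: fit a model to historical sensor readings, predict the coming load, and turn the prediction into a control recommendation. The following is a toy stand-in for that idea only, not DeepMind’s actual model; all sensor values and the control rule are invented for illustration:

```python
# Illustrative sketch of predict-then-recommend control. The data and the
# threshold rule are hypothetical; real systems use far richer models.

def fit_linear(xs, ys):
    """Ordinary least-squares fit y = a*x + b for a single feature."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Historical readings: (intake temperature in deg C, cooling power in kW).
history = [(18, 110), (20, 125), (22, 140), (24, 158), (26, 171)]
a, b = fit_linear([t for t, _ in history], [p for _, p in history])

# Predict next-hour cooling power for a forecast intake temperature,
# then recommend a setpoint action based on the prediction.
forecast_temp = 23
predicted_kw = a * forecast_temp + b
recommendation = "raise setpoint" if predicted_kw < 150 else "hold setpoint"
print(f"predicted cooling load: {predicted_kw:.0f} kW -> {recommendation}")
```

The value of the approach lies less in any single prediction than in closing the loop continuously, so setpoints track conditions instead of being fixed conservatively.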

The development of data centre infrastructure management (DCIM) systems has continued apace too, allowing the integration of AI to take advantage of all of these hardware and infrastructure developments. These experiments are now features, allowing unprecedented visibility and control. For those designing for new developments, software such as ETAP allows power efficiency to be built into the design from the outset, while also accommodating microgrid architectures.

What new data sources will contribute to this?

The data explosion is expected to continue with developments such as industrial IoT and 5G, driven by growing automation and autonomous vehicles. Data generated far from centralised infrastructure must be handled, processed and turned into intelligence quickly, where it is needed.

New data architectures are expected to improve efficiency in how all of that is handled. Edge computing is seen as an important approach to manage more data being generated at the edge.

In one example, genomic research generates terabytes of data, often daily. Sending all that data to a centralised data centre would be slow, demand high bandwidth and be inefficient. The Wellcome Sanger Institute created an edge computing approach (www.securitysa.com/*se4) that allowed it to process data close to where it is produced – the genomic sequencers – with only what is necessary centralised. This saves on storage and bandwidth, and speeds the journey from data to intelligence. “That is where the edge paradigm has come to us,” said Simon Binley, data centre manager, Sanger Institute.
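The pattern described above can be sketched in a few lines: reduce each raw batch at the edge and ship only a compact summary upstream. This is a hypothetical illustration of the idea, not the Sanger Institute’s pipeline; the quality threshold and record format are invented for the example:

```python
# Hypothetical edge-processing sketch: raw sequencer output stays local,
# and only an aggregate summary is forwarded to the central data centre.

def summarise_run(reads):
    """Reduce a batch of raw reads to the aggregate the central site needs."""
    total = len(reads)
    passing = [r for r in reads if r["quality"] >= 30]  # illustrative cutoff
    return {
        "reads_total": total,
        "reads_passing": len(passing),
        "pass_rate": len(passing) / total,
    }

# Raw data never leaves the edge; only the summary dict is shipped centrally.
raw_reads = [{"id": i, "quality": q} for i, q in enumerate([35, 28, 40, 31, 22, 38])]
summary = summarise_run(raw_reads)
print(summary)
```

The saving comes from the asymmetry: the summary is a few hundred bytes, while the raw reads it replaces can run to terabytes per day.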

Modular data centres, micro data centres and better storage management will all contribute to handling this developing wave efficiently, keeping that data centre energy consumption line flat into the future. In the MENA region, 5G and centralisation with edge architectures will be balanced by more hyperscale facilities linking major demand centres.

What effects will this have on the whole data ecosystem?

Efficiency must extend not just through the supply chain, but throughout lifecycles. Vendors, suppliers and partners must all be engaged to ensure that no part of the ecosystem lags in applying the tools that ensure efficiency. This applies as much to the design of new equipment and applications as it does to working life and decommissioning. Understanding how an entire business ecosystem impacts the environment will be vital to truly achieving net-zero goals.

Agreed standards (www.globalreporting.org/standards/), transparency and measurability are all vital factors to ensure results.

These considerations are taking hold across the region and great efforts are being made to do better. Greater transparency is now accepted and embraced, with more and more organisations reporting their progress.

Tools and processes shared

The data centre sector has much to offer organisations and industries on the journey towards sustainability and increasing circularity. With its expertise and experience in efficiency, the tools and intelligence gained from operations, and deep commitments to tight net-zero targets, the sector can not only handle the data explosion and digital demands of the world, but do so sustainably, while providing others with the tools and insights to do the same for their respective sectors.




