Optimizing your infrastructure for energy efficiency is a must for both your wallet and the environment. Simple choices in how you architect your platforms can make a huge difference in how much energy is wasted – and, in turn, how much money goes down the drain.
In our previous blog, we explained why moving to the cloud can be greener for your business, but we also noted that moving to the cloud is not enough on its own. In this blog, we outline three main areas of your workload that you need to pay attention to when it comes to optimizing energy expenditure for green computing.
Optimizing Data Storage
Data types can be thought of as three layers in a “pyramid”: long-term (‘cold’) data, mid-term (‘warm’) data and short-term (‘hot’) data.
Long-term data, at the bottom of the “pyramid”, could be archived for years and does not need to be accessed regularly. This is where the largest quantity of data falls. Mid-term data might need to be accessed every now and then, but it is not actively or continuously being used. Short-term data, by contrast, is used for immediate needs and therefore must be quickly accessible.
The way in which each of these types of data is stored is critical, as storing it in the wrong way wastes both energy and money. Long-term data can be stored on spinning drives that spin down when not in use so that they only consume energy when you need it. You could even go as far as switching the equipment off when the data is not needed. This saves lots of energy, at the cost of having to wait a little longer when the data is actively needed.
The higher you go up the pyramid, the more energy – and money – are required to store that data in an accessible way. ‘Warm’ data can be stored on either traditional spinning drives or commodity SSD drives, to make sure that it is available relatively quickly, at an acceptable cost in both energy and money.
‘Hot’ data is often stored on more expensive, faster drives such as fast SSDs or even NVMe to ensure it is always immediately available for use. Of course, it wouldn’t make sense to store short-term data on long-term infrastructure, but the trick is figuring out whether data really needs to be in the short-term category. Minimizing the data stored at the top of the “pyramid” and placing it instead with the mid-term or long-term data will save both energy and money for your business.
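To make the idea concrete, here is a minimal Python sketch of a tiering decision based purely on how recently data was last accessed. The tier names, thresholds and access-age rule are hypothetical assumptions for illustration; a real policy would also weigh compliance, retrieval cost and the specific storage products available to you.

```python
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical tier names and age thresholds; real values depend on your
# platform and on how quickly each tier of data must be readable.
TIER_THRESHOLDS = [
    ("hot", timedelta(days=7)),    # fast SSDs / NVMe, immediately accessible
    ("warm", timedelta(days=90)),  # commodity SSDs or spinning drives
    ("cold", None),                # archive drives that can spin down or power off
]

def choose_tier(last_accessed: datetime, now: Optional[datetime] = None) -> str:
    """Pick the cheapest tier whose age threshold still covers the data."""
    now = now or datetime.utcnow()
    age = now - last_accessed
    for tier, max_age in TIER_THRESHOLDS:
        if max_age is not None and age <= max_age:
            return tier
    return "cold"  # anything older than every threshold goes to the archive

# A file last touched six months ago belongs with the long-term ("cold") data.
print(choose_tier(datetime.utcnow() - timedelta(days=180)))  # -> cold
```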
Optimizing Data Processing
Batch Processing
There are a number of ways in which you can process data – one of which is known as batch processing. If, for example, you are an administrative company, you probably process a lot of data around the end of the month – like paying salaries and sending invoices. When large amounts of data like these need to be processed in very similar, repetitive ways, the process is usually automated. Typically, you can time it according to when you want the data to be processed. This is where some smart decisions can be made.
During the day, there are likely activities taking place which won’t take place at night – for example, video calls. In order to save energy and use processing power as efficiently as possible, the compute capacity used for video calls during the day can be used for scheduled batch processing at night. This means that equipment is used continuously, preventing waste.
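As a rough illustration of that timing decision, the sketch below defers a batch job until an assumed off-peak window at night. The window hours and the run_monthly_batch placeholder are assumptions, and in practice a scheduler such as cron would usually do the waiting for you.

```python
import time
from datetime import datetime, timedelta

# Assumed quiet window: 01:00-05:00 local time. Pick hours that match your
# own usage pattern, e.g. when video calls and interactive work have stopped.
OFF_PEAK_START_HOUR = 1
OFF_PEAK_END_HOUR = 5

def seconds_until_off_peak(now: datetime) -> float:
    """Return how long to wait before the next off-peak window opens."""
    if OFF_PEAK_START_HOUR <= now.hour < OFF_PEAK_END_HOUR:
        return 0.0  # already inside the window
    start = now.replace(hour=OFF_PEAK_START_HOUR, minute=0, second=0, microsecond=0)
    if now.hour >= OFF_PEAK_END_HOUR:
        start += timedelta(days=1)  # today's window has passed, wait for tomorrow's
    return (start - now).total_seconds()

def run_monthly_batch() -> None:
    # Hypothetical placeholder for the real work: payroll runs, invoicing, reports.
    print("processing batch...")

if __name__ == "__main__":
    time.sleep(seconds_until_off_peak(datetime.now()))
    run_monthly_batch()
```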
Alternatively, in a hybrid cloud environment, batch processing capacity can be scaled up in the cloud and scaled down again when it is no longer needed.
Continuous Processing
Not all data can be processed in batches; some must be processed continuously. Some applications, for example, are always in use but see heavier load at certain times. The baseline load can run on dedicated servers in a data center, because you know they will be in use 24/7, while the peaks are best handled by scaling up in the cloud when necessary.
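A minimal sketch of that split might look as follows, assuming hypothetical capacity figures for the dedicated baseline and for each cloud instance; the actual scaling action would come from your cloud provider’s API or autoscaler.

```python
import math

# Hypothetical capacities: what the dedicated servers absorb as a baseline,
# and how much load one extra cloud instance can take on.
BASELINE_CAPACITY = 100   # requests per second handled by dedicated servers
INSTANCE_CAPACITY = 25    # requests per second added per cloud instance

def cloud_instances_needed(current_load: float) -> int:
    """How many cloud instances to run on top of the always-on baseline."""
    overflow = max(0.0, current_load - BASELINE_CAPACITY)
    return math.ceil(overflow / INSTANCE_CAPACITY)  # round up so peaks stay covered

# Quiet periods stay entirely on the dedicated servers...
assert cloud_instances_needed(80) == 0
# ...while a peak of 160 req/s needs 3 extra instances for the 60 req/s overflow.
assert cloud_instances_needed(160) == 3
```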
Caching
Of course, processing data once is cheaper and more sustainable than processing the same data multiple times. Systems often generate results or calculations in real time, which makes a lot of sense in certain scenarios such as social media. However, for data that doesn’t frequently change or need updating, the result can be reused – for example, when you need to extract a certain PDF file. The trick, therefore, is to look for opportunities where caching can prevent the same computation from being done twice. This will save energy that can be better used elsewhere.
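As a small illustration, the sketch below memoizes an expensive function in-process using Python’s standard lru_cache. The render_report function and its arguments are hypothetical, and in a real system the cache might instead live in a shared layer such as a CDN or a key-value store.

```python
from functools import lru_cache

@lru_cache(maxsize=1024)
def render_report(customer_id: int, month: str) -> bytes:
    """Stand-in for an expensive computation, e.g. producing a PDF report."""
    print(f"computing report for customer {customer_id}, month {month}")  # cache miss only
    return f"report for customer {customer_id} in {month}".encode()

render_report(42, "2024-01")       # computed once
render_report(42, "2024-01")       # served from the cache, no recomputation
print(render_report.cache_info())  # CacheInfo(hits=1, misses=1, ...)
```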
Transferring Data
In short, the less data you transfer, the more money and energy you save. This is because the transmission networks that data is sent through are typically quite energy intensive. There are a few ways to reduce the amount of data being transferred.
Firstly, caching mechanisms can be used to prevent double processing – as explained above. These mechanisms can easily be built into applications and have the added benefit of improving the experience for end users, as well as reducing storage and transmission costs.
Secondly, compression algorithms can be used to reduce the amount of data being transferred. The key here is choosing an algorithm that shrinks the data significantly without the compression itself costing more energy than the transfer it saves.
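For example, Python’s standard gzip module can shrink a repetitive payload considerably before it is sent over the network. The payload below is made up for illustration, and other codecs trade compression ratio against CPU (and therefore energy) cost differently.

```python
import gzip
import json

# A made-up, repetitive payload; structured data like this compresses very well.
payload = json.dumps(
    [{"invoice_id": i, "status": "paid"} for i in range(1000)]
).encode()

compressed = gzip.compress(payload)
print(f"original:   {len(payload)} bytes")
print(f"compressed: {len(compressed)} bytes")

# The receiving side simply decompresses before use; the content is unchanged.
assert gzip.decompress(compressed) == payload
```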
Conclusion
Optimizing your workloads for energy efficiency can be done in a wide range of ways. But data storage, data processing and the transferring of data are certainly three large energy consumers that should be looked at in order to help make your business greener.
Learn more about Leaseweb Hybrid Cloud or read about our commitments to sustainability.