Espace Presse – What future for the cloud in the face of energy-saving challenges?

How would you summarize the energy-saving challenge facing the cloud today?

Today, the development of the cloud has reached a stage comparable to that of electricity in the time of Nikola Tesla (between the end of the 19th century and the beginning of the 20th century): the main principle has been identified and the first infrastructure is being built (to continue the analogy with electricity, this is the equivalent of the first power lines and the first electrified homes). The cloud also reaches homes through, for example, photo sharing, or the public clouds offered by web giants such as Google, Apple, Facebook (Meta), Amazon and Microsoft, or the Chinese company Baidu. Now, more than a century later, electricity is widely deployed and available anywhere, at any time, on demand.
It will be the same for the cloud: it is becoming an on-demand computing and storage tool, a “commodity” to use the Anglo-Saxon term. The question that then arises is: how can we avoid over-consuming and wasting this resource, especially given what we are beginning to observe about the cloud's energy consumption?


In what areas can we act to move towards a more economical cloud?

The first lever is the design of hardware and software components. Thanks to the expertise gained in building processors and memories for embedded computing (small size and low power consumption), we are able to produce components that are both economical and very efficient. One example is the ARM processor, which has a simpler architecture and therefore consumes less energy than conventional processor families; developed since the 1980s, it was first used in mobile phones to give them long battery life. It is now increasingly integrated into cloud computing and storage servers, paired with hardware accelerators that optimize certain compute-intensive workloads (such as the circuits from Kalray, a CEA spin-off). CEA is very active in this field, proposing new architectures and accelerators aimed at greatly reducing the energy required to process very large volumes of data, in particular through in-memory computing solutions that reduce data transfers and therefore their energy cost.

CEA is an expert in software engineering


In addition to optimizing processors and memory, it is also important to optimize the application, i.e. the software. At CEA, we have long-standing expertise in software engineering. Working close to industrial applications, our teams have, for example, designed tools for programming critical systems, which must be particularly responsive and optimized for their environments. They are now working to adapt these tools to the constraints of computing and storage in the cloud, for example to make the best use of the interplay between algorithms and the electronic components on which they run.
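To illustrate how software-level optimization can cut the amount of computation, and therefore the energy, needed for a given result, here is a deliberately simple sketch (a textbook example, not one of CEA's tools): memoizing a recursive function eliminates thousands of redundant calls.

```python
from functools import lru_cache

calls = {"naive": 0, "memo": 0}

def fib_naive(n):
    """Recomputes the same subproblems over and over."""
    calls["naive"] += 1
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    """Caches each subproblem so it is computed only once."""
    calls["memo"] += 1
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

print(fib_naive(20), calls["naive"])  # 6765 21891 calls
print(fib_memo(20), calls["memo"])    # 6765 21 calls
```

The same answer is obtained with roughly a thousand times fewer operations; fewer operations mean less processor activity and less energy, whatever the hardware underneath.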

All of this work is absolutely essential, but we can also act on the system as a whole.

Software Optimization


Which means?

You have to remember that a cloud is actually several clouds. If a cloud is a set of machines connected in a network within a data center, the Cloud as a whole is in fact a set of data centers “talking to each other.” But these data centers are often located far from the applications they serve: for two people holding a videoconference a few kilometers apart, the data will sometimes pass through data centers located several hundred kilometers away. This is why we are seeing the emergence of the
“edge cloud”: smaller clouds closer to the applications, i.e. to the place where data is produced or processed. This is a revolution happening now. If we step back, what is emerging is a kind of galaxy of clouds.

Is it really more economical to have a galaxy of clouds rather than a few large data centers?

Decentralization and edge clouds provide capacities better suited to local needs, and much research seeks to optimize this local use. In addition, they process nearby data, which therefore does not have to travel thousands of kilometers, reducing energy consumption. Finally, whereas global clouds must remain general-purpose, local clouds can specialize based on knowledge of the type of data to be processed, and thus increase their efficiency while avoiding the consumption associated with long-distance transport. The scale of these savings still needs to be rigorously assessed scientifically, depending on the use case. But not everything can be done locally, especially when data must be shared between users, as in a videoconference, or when the volume of data requires very high computing power, as in the digital simulation of an aircraft.

The question therefore arises of distributing computations between those that can be done at these cloud edges and those that require large data centers with different computing capacities. Organizing this distribution of computations is a real energy-saving issue: if we get it wrong, we may over-consume the edge clouds by overloading them with computations beyond their capacity or, conversely, build data centers that run idle because they are not allocated the appropriate tasks. Orchestration algorithms are therefore a major issue for decentralized clouds.
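As a toy illustration of what such an orchestration algorithm must weigh (all site names, capacities and energy figures below are invented for the example; real orchestrators are far more sophisticated), here is a greedy placement rule that sends each task to the feasible site with the lowest estimated energy cost, counting both data transfer and computation:

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    compute_demand: float  # compute units required
    data_size_gb: float    # data that must travel to the chosen site

@dataclass
class Site:
    name: str
    capacity: float               # remaining compute units
    transfer_cost_per_gb: float   # energy to move 1 GB there (Wh, invented)
    compute_cost_per_unit: float  # energy per compute unit (Wh, invented)

def place(task, sites):
    """Greedy placement: among sites with enough spare capacity,
    pick the one minimizing transfer + compute energy."""
    feasible = [s for s in sites if s.capacity >= task.compute_demand]
    if not feasible:
        return None  # no site can currently take the task
    best = min(feasible,
               key=lambda s: task.data_size_gb * s.transfer_cost_per_gb
                             + task.compute_demand * s.compute_cost_per_unit)
    best.capacity -= task.compute_demand
    return best.name

edge = Site("edge", capacity=10, transfer_cost_per_gb=0.1, compute_cost_per_unit=2.0)
core = Site("core", capacity=1000, transfer_cost_per_gb=5.0, compute_cost_per_unit=1.0)

# A small, data-heavy task stays at the edge; a huge simulation goes to the core.
print(place(Task("video-call", compute_demand=2, data_size_gb=1), [edge, core]))   # edge
print(place(Task("simulation", compute_demand=500, data_size_gb=10), [edge, core]))  # core
```

Even this crude rule reproduces the trade-off described above: data-heavy local tasks avoid long transfers, while compute-heavy tasks are shipped to the large data center that can absorb them.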

Finally, this question of placing computations raises that of the guarantees needed for a computation to run correctly: if there are errors, computations are restarted, effort is duplicated, and more energy is consumed.

Levers for more economical clouds


Are there other possible levers for moving towards a more economical cloud?

The last lever for action is human awareness. That is, how can we ensure that the operators, as well as the users, of these systems know how to save resources? This can come through visualization of our cloud usage, allowing us to see where the hot and cold spots of our digital territory are. From there, an energy balance can be displayed. We can then offer users solutions to reduce their consumption, and thus their carbon footprint, as is already done for electricity. At the same time, the more we visualize our cloud consumption, the more we will need tools to monitor and adapt this consumption “on the fly” to improve energy efficiency. These are true “digital twins” of cloud infrastructures that need to be built, and their construction will be the subject of extensive research in modeling, simulation and optimization.
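As a minimal sketch of what such on-the-fly monitoring could look like (the class, site names and threshold below are invented for illustration and are far simpler than a real digital twin), here is a sliding-window view that flags the “hot spots” among a set of sites:

```python
from collections import defaultdict, deque

class EnergyMonitor:
    """Toy monitoring view: keep the last few power readings per site
    and flag sites whose recent average exceeds a threshold."""

    def __init__(self, window=5, hot_threshold_kw=100.0):
        self.hot_threshold_kw = hot_threshold_kw
        # Each site keeps only its `window` most recent readings.
        self.readings = defaultdict(lambda: deque(maxlen=window))

    def record(self, site, power_kw):
        self.readings[site].append(power_kw)

    def average(self, site):
        r = self.readings[site]
        return sum(r) / len(r) if r else 0.0

    def hot_spots(self):
        """Sites whose recent average power is above the threshold."""
        return [s for s in self.readings
                if self.average(s) > self.hot_threshold_kw]

mon = EnergyMonitor(window=3, hot_threshold_kw=100.0)
for p in (90, 120, 140):
    mon.record("datacenter-A", p)
for p in (40, 50, 45):
    mon.record("edge-B", p)
print(mon.hot_spots())  # ['datacenter-A']
```

A real digital twin would go much further, coupling such measurements with models of the infrastructure to simulate and optimize its behavior, but the principle of continuously comparing observed consumption against expectations is the same.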

The energy consumption of the cloud and data centers in a few figures

  • Worldwide, data centers now use 3% of the electricity produced, and their share is expected to increase significantly in the coming years. For 2020, their consumption is estimated at 650 terawatt-hours, i.e. more than that of France.
  • Each year, an average-sized data center uses 600,000 cubic meters of water for cooling.
  • According to a European Commission report published at the end of 2020, the consumption of European data centers rose from 53.9 TWh/year in 2010 to 76.8 TWh/year in 2018.
