Bitcoin is not normally associated with energy efficiency, but the recent explosion of interest in the cryptocurrency has raised fears that new technologies can place undue pressure on the world’s power supply.
Thousands of specialized computers worldwide are now employed to create or “mine” bitcoins and, in the process, authenticate transactions and protect the system. These mining enterprises have become huge endeavors, consuming megawatts of power. In fact, bitcoin mining is now so energy-intensive that its estimated power consumption matches that of the Czech Republic, according to Digiconomist, a cryptocurrency tracking website.
Although mining for the cryptocurrency has cooled a bit over the past six months, Bitcoin’s massive power draw neatly illustrates the challenge of technological advancement: While new technologies create opportunities to address some of the world’s fundamental problems and open up new economic possibilities, they also require a tremendous amount of power to operate.
And the expected growth in these emerging data-driven technologies, such as artificial intelligence, autonomous vehicles, or in this case, blockchain technology, means they can’t be powered sustainably using our current IT infrastructure. The amount of energy needed is simply too great, and given that most of our energy still comes from fossil fuels, this presents a potentially serious ecological problem. When it comes to powering the IT infrastructure of the future, the transition to both a clean power system and more efficient IT architectures is critical.
Needed: New tech that consumes less energy
Incremental improvements in the energy efficiency of computing infrastructure can help mitigate this problem in the short and medium term, but in the longer term, we need to develop completely new technologies that consume far less energy.
In recent decades, we’ve taken for granted the notion that as microprocessors become smaller, making computers cheaper and more powerful, they would also become more energy-efficient. In 1965, Intel co-founder Gordon Moore observed that the number of transistors per square inch on an integrated circuit had doubled every year since its invention. He predicted that this trend would continue for the foreseeable future. However, processor-centric, high-power computing architecture is reaching the limits of physics and the certainty of Moore’s Law is fading. More efficient computing is no longer a guarantee. And this is happening at a time when we are experiencing a dramatic increase in the demand for data processing power from a growing number of networked devices, the 5G cellular network, and emerging technologies such as artificial intelligence.
The world’s IT infrastructure is built on iterations of a technology first developed in the 1960s, and that model cannot be applied to a world where literally everything computes, including watches, home appliances, and cars. The number of connected IoT devices is forecast to grow from 8.4 billion in 2017 to more than 20 billion by 2020, according to the International Energy Agency (IEA). As these new devices become connected over the coming years, they will also drive exponential growth in demand for energy, most notably in data centers and network services.
Energy use forecasts
Data centers worldwide currently consume around 1 percent of the world’s electricity, and while data center workload will triple by 2020, related energy demand is expected to grow by only 3 percent, thanks to continued efficiency gains, according to the IEA. Data networks, on the other hand, are less predictable. The mainstay of the digital world, these networks currently consume around 1 percent of total electricity demand, with mobile networks forming around two-thirds of the total, the IEA notes. By 2021, electricity consumption from data networks could increase by as much as 70 percent, or fall by up to 15 percent, depending on the efficiency gains achieved.
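Those network scenarios can be put in rough numbers. The sketch below uses the IEA figures quoted above (a ~1 percent share of electricity today, and the +70 percent / -15 percent bounds for 2021); the global electricity total is an illustrative assumption, not an IEA figure:

```python
# Rough bounds on data-network electricity demand by 2021,
# based on the IEA scenario figures cited above.
# Assumption (illustrative): global electricity demand ~ 21,000 TWh/year.
GLOBAL_TWH = 21_000
NETWORK_SHARE = 0.01                  # networks use ~1% of electricity today

baseline_twh = GLOBAL_TWH * NETWORK_SHARE
high_2021 = baseline_twh * 1.70       # +70% if efficiency gains stall
low_2021 = baseline_twh * 0.85        # -15% with strong efficiency gains

print(f"baseline today: {baseline_twh:.0f} TWh/year")
print(f"2021 range: {low_2021:.1f} to {high_2021:.1f} TWh/year")
```

The striking point is the spread: under these assumptions, the best and worst cases differ by roughly a factor of two, driven entirely by efficiency gains rather than by demand.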
Beyond the next three to five years, predictions about energy use are more difficult to make, but given the expected exponential growth in data, it seems clear that linear efficiency improvements to our current IT architecture alone will not be enough. The onus is on the leading technology companies to come up with potential solutions. We need to completely change the paradigm.
In the short term, big technology companies such as Hewlett Packard Enterprise (HPE) are focused on developing efficiency improvements for existing IT architectures while transitioning power supplies to more sustainable sources. We have committed to increase the energy performance of our product portfolio 30 times by 2025, from a 2015 baseline, and will source 50 percent of our total electricity consumption from renewable sources by 2025. Other big tech companies have announced similar initiatives.
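A 30-fold improvement over a decade sounds abstract, but it translates into a steep compound annual rate. The arithmetic below is a simple illustration, not an HPE figure:

```python
# What "30x energy performance by 2025, from a 2015 baseline" implies
# as a compound annual improvement rate (simple arithmetic only).
target_multiple = 30
years = 2025 - 2015

annual_rate = target_multiple ** (1 / years) - 1
print(f"required improvement: ~{annual_rate:.0%} per year, every year")
```

That works out to roughly 40 percent year over year, which is why such targets demand architectural change rather than incremental tuning.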
Consumption IT, edge computing increase efficiency
Technology companies are also offering services designed to improve the efficiency of their customers’ IT operations. Many companies have embraced hybrid IT and now run extensive infrastructures that mix legacy IT, cloud-based services, and on-premises systems. Running those hybrid operations, while also launching new digital initiatives, can be challenging. Embracing a consumption-based approach can help. It allows those companies to pay for only the IT services and resources they consume, hand off mundane day-to-day tasks such as backup to an IT service provider, and eliminate expensive, time-consuming capital investments in IT equipment.
This approach can address the energy, equipment, and resource efficiency challenges that our customers face. For example, most servers run at less than 20 percent utilization, according to a Natural Resources Defense Council report, yet still consume considerable energy while doing little or no work. A trusted IT partner can help an IT department optimize infrastructure configurations, saving organizations an average of $75,800, according to IDC research.
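The energy cost of that underutilization can be estimated with a simple linear power model. The idle-power fraction, peak wattage, and utilization figure below are illustrative assumptions, not values from the NRDC report:

```python
# Illustrative estimate of energy wasted by an underutilized server.
# Assumptions (not from the NRDC report): the server draws ~60% of its
# peak power even when idle, and runs at ~15% average utilization.
PEAK_WATTS = 500
IDLE_FRACTION = 0.60
UTILIZATION = 0.15

# Linear power model: draw scales from idle floor up to peak with load.
actual_watts = PEAK_WATTS * (IDLE_FRACTION + (1 - IDLE_FRACTION) * UTILIZATION)
useful_watts = PEAK_WATTS * UTILIZATION   # naive "power earned by real work"

kwh_per_year = actual_watts * 24 * 365 / 1000
print(f"average draw: {actual_watts:.0f} W (~{kwh_per_year:.0f} kWh/year)")
print(f"share attributable to idle overhead: {1 - useful_watts / actual_watts:.0%}")
```

Under these assumptions, well over half the server’s annual energy goes to simply being powered on, which is the waste that consolidation and right-sizing attack.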
Similarly, edge computing is reducing energy expenditure and cooling needs. Working with IoT devices, edge computing helps accelerate digital transformation by turning huge amounts of machine-generated data into actionable intelligence close to the source. It eliminates the need to send that data across a relatively slow connection to be processed in a distant data center and then sent back out to be acted on.
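The bandwidth (and hence energy) saving from processing at the edge comes from sending summaries instead of raw streams. The sketch below uses made-up sensor numbers purely to illustrate the scale of the reduction:

```python
# Sketch of the edge-computing saving described above: summarize raw
# sensor readings locally and transmit only the summary.
# All figures are illustrative assumptions.
READINGS_PER_SEC = 1_000
BYTES_PER_READING = 16
SUMMARY_BYTES = 64            # e.g. min/max/mean/alert flag per interval
INTERVAL_SEC = 60

raw_bytes = READINGS_PER_SEC * BYTES_PER_READING * INTERVAL_SEC
reduction = raw_bytes / SUMMARY_BYTES
print(f"raw: {raw_bytes} bytes/min, summarized: {SUMMARY_BYTES} bytes/min")
print(f"about {reduction:.0f}x less data crossing the network")
```

Even allowing for generous summaries, pushing computation toward the sensor cuts network traffic by orders of magnitude.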
These new IoT technologies also offer opportunities to drive new levels of efficiencies across entire industries. For example, many manufacturing-intensive companies now use big data analytics to crunch a constant stream of data coming from sensors placed around their costly machines and equipment. They can then accurately predict maintenance needs to prevent downtime and reduce operational costs. Cities, too, are modernizing their electric grids and buildings for enhanced efficiency and reliability through automated controls and metering technologies.
But as these data-intensive technologies become commonplace, we’ll need more sophisticated technologies to manage and gain insights from that data. Extracting data’s true potential will require exascale performance, the ability to process a quintillion (10^18) calculations per second, which means machines 100 times faster and more energy efficient than today’s fastest supercomputers.
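The scale of that number is easy to lose. A quick arithmetic check, assuming a roughly 10-petaflop machine as the comparison point implied by the "100 times faster" claim:

```python
# Scale of exascale computing: 10**18 calculations per second.
EXA = 10**18

# Illustrative comparison point (assumption): a 10-petaflop supercomputer,
# the class of machine the "100 times faster" claim implies.
petaflop_machine = 10 * 10**15

speedup = EXA / petaflop_machine
print(f"exascale is {speedup:.0f}x a 10-petaflop system")

# One second of exascale work, divided among ~8 billion people,
# is 125 million calculations per person.
per_person = EXA / 8_000_000_000
print(f"{per_person:.0f} calculations per person per second")
```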
Teams in the United States, China, the European Union, and Japan are working on creating exascale technologies. Innovations such as HPE’s Memory-Driven Computing, which puts memory, not processing, at the center of the computing platform, are enabling a new level of performance, efficiency, and flexibility. HPE’s research project The Machine, for example, will use on the order of 1 percent of the energy per calculation achievable today, with the ability to perform complex processing for such high-compute workloads as climate science, cancer research, and artificial intelligence.
R&D to watch
Many companies are rethinking computing architectures and processes, and innovating radically different technologies that consume energy in proportion to the work being done. Areas of R&D currently underway to unlock this include:
Non-volatile memory (NVM): New technologies such as magnetoresistive RAM (MRAM) consume no energy when idle and are far more energy efficient than dynamic RAM (DRAM), currently the dominant memory in most computing applications. DRAM is volatile: it needs continuous power and periodic refresh to retain the data stored in it, and it loses that data during power interruptions. It therefore consumes a great deal of energy even when no useful work is being done. MRAM, by contrast, is nonvolatile, so power interruptions are not a concern.
Photonics: Using photons rather than electrons (optical rather than electrical links), this technology uses microscopic lasers to send hundreds of times more data down an optical fiber, eliminating copper wires and saving huge amounts of the energy needed to power and cool those systems. The fibers are also very small, saving space and making physical installation much easier.
Gen-Z: This new open ecosystem in computing allows memory and compute to scale independently of one another, according to the needs of the application, without computing architecture constraints. This means energy consumption is not burdened by unnecessary components such as CPUs, which are typically added to a system just to enable more memory capacity.
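The volatile-versus-nonvolatile difference above has a direct energy interpretation: DRAM spends power retaining idle data, while MRAM does not. A back-of-envelope comparison, using illustrative standby figures rather than vendor specifications:

```python
# Back-of-envelope: standby energy of volatile DRAM vs nonvolatile MRAM.
# Assumptions (illustrative, not vendor figures): one DRAM module burns
# ~0.3 W on refresh/retention; MRAM retains data at 0 W.
DRAM_STANDBY_W = 0.3
MODULES = 16                  # modules in one hypothetical server
HOURS_PER_YEAR = 24 * 365

dram_kwh = DRAM_STANDBY_W * MODULES * HOURS_PER_YEAR / 1000
mram_kwh = 0.0                # no energy needed to retain idle data

print(f"DRAM retention energy: {dram_kwh:.1f} kWh/year per server")
print(f"MRAM retention energy: {mram_kwh:.1f} kWh/year per server")
```

Multiplied across the millions of servers in today’s data centers, eliminating that retention overhead is the kind of proportional-to-work energy behavior this section describes.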
By bringing together technologists and problem-solvers from diverse backgrounds, we are reinventing computing architecture to achieve performance and efficiency gains not possible today. Workloads will be completed much more quickly, allowing for such innovations as AI or big data analytics to become commonplace. This improved computing power will, in turn, reshape our industries and let us tackle more complicated challenges such as disease diagnosis or mitigating climate change. It’s why we are working hard to make this technology a reality.
Source > hpe.com