Quantum computing is inherently hybrid, which demands constant orchestration

The modern computing revolution was driven by the development of central processing units (CPUs), which became smaller and more complex over time. This development eventually led to the microprocessor, the main form of CPU we use today. Along the way, more specialized chips emerged - graphics processing units (GPUs), field-programmable gate arrays (FPGAs), and application-specific integrated circuits (ASICs). Each specialized chip accelerates and improves processing performance in a different dimension and unlocks new computing capabilities.

The emergence of quantum computing sets the stage for the next revolution in computing power

Each new computing option has led to a hybridization of computing power. Instead of simply sending instructions to a CPU, we can now spread computing across a range of devices, each capable of solving a specific set of problems.

The growing range of computing options has also complicated the computing environment. This complexity brings two challenges: first, designing a stable, scalable architecture that can execute multi-device computing jobs; second, ensuring those jobs actually run in an efficient, optimized, and repeatable manner. In other words, we need not only to design multi-device architectures, but also to orchestrate computation across them.

This context helps us understand the quantum stack: the layered set of hardware and software through which hybrid computation runs. The structure of the stack necessarily involves both classical processors and quantum devices. Even within a single quantum algorithm today, computation is shared between classical and quantum processors.

The structure of a quantum stack reflects complexity

The complexity of the architecture is further compounded by the reality that, just like access to high-performance GPUs and HPC resources in other architectures, access to quantum devices is and will be remote.

At the same time, organizations building quantum capabilities will also rely heavily on their own on-premises and private cloud assets in order to protect their evolving IP.

Quantum hardware and software continue to advance

Because both quantum hardware and software are constantly evolving, the architecture of the quantum stack and the orchestration of its components must allow for a degree of “switchability.” That is, the quantum architecture must have a degree of flexibility that enables organizations to try new technologies and new ways of coordinating without being tied to any one solution. The emphasis on interoperability in the design of quantum-related technologies foreshadows a continued need for adaptability.

The hybrid nature of quantum stacks

These observations highlight some unique features of hybrid quantum architectures:

First, the hybrid nature of the quantum stack reflects a broader trend toward hybridity in the architecture of computing devices.

Second, the intrinsic differences between quantum devices and various classical devices mean that they will not replace each other.

Third, the inherent complexity of hybrid architectures requires orchestration tools to simplify and optimize their performance.

The relative advantages of classical and quantum

There are relative advantages between classical and quantum devices that, at least in part, reflect their relative maturity. The earliest mechanical computing devices date back to the mid-1800s, and the first programmable computers appeared in the late 1930s. Since then, classical computers have continued to advance, roughly in line with Moore's Law. Today, they perform an incredible range of functions, including simulations of quantum devices.

Quantum computing in the 20th century

Quantum computing is a product of the 20th century. The theories of quantum physics did not coalesce until the 1920s, and Richard Feynman did not propose the basic concept of a quantum computing device until 1982. That said, quantum processing technology is approaching a tipping point where, in some cases, it will outperform classical devices.

Quantum devices – getting more powerful

As quantum devices continue to improve, they will become more powerful than classical devices for some functions. While classical devices rely on binary bits, which can have a value of either 1 or 0, quantum devices rely on qubits, which can exist in a linear combination (superposition) of two states at the same time.

The state of a qubit can also be entangled with the states of other qubits, meaning that the behavior of one qubit can affect the behavior of many. Because of these unique properties, each added qubit doubles the size of the state a quantum device can represent, quickly giving quantum devices more computing power than classical alternatives for certain tasks.
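To make the qubit properties above concrete, here is a minimal NumPy sketch (a classical simulation, not a quantum SDK): a qubit is a unit vector of two complex amplitudes, and the joint state of n qubits is a tensor product whose length doubles with every qubit added.

```python
import numpy as np

# A single qubit is a unit vector of two complex amplitudes:
# |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)  # equal superposition

# Measurement probabilities follow the Born rule.
probs = np.abs(plus) ** 2
print(probs)  # [0.5 0.5]

# The joint state of n qubits is the tensor product of the individual
# states: its length doubles with every qubit added.
state = plus
for _ in range(9):          # build a 10-qubit register
    state = np.kron(state, plus)
print(len(state))           # 2**10 = 1024 amplitudes
```

This doubling is why classical simulation of quantum devices becomes intractable: a 50-qubit register already requires 2**50 amplitudes.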

Given these differences, how should we think about the relative strengths of classical and quantum computing devices?

Now and in the future, classical computing will be best for everything from data preparation and parameter selection to post-processing, plotting, and certain types of data analysis. Currently, high-performance computers and supercomputers are also the best tools for analyzing massive data sets.

Of course, the advantages that classical devices have in certain situations are not just due to the inherent characteristics of these devices. They also stem from the fact that there is an established ecosystem of best practices, optimizations, and tools for these use cases.

The power of quantum

One of quantum computing's relative strengths is its ability to extract information from small data sets by analyzing the data from many directions at once. These capabilities will have a significant impact on machine learning and on modeling the evolution of complex but rare phenomena, such as financial crises and global pandemics, where data is difficult to obtain.

Quantum computing allows for an enhanced ability to sample from probability distributions that are difficult to sample using classical techniques. This has many applications in solving optimization and machine learning problems, such as generative models.
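To make "difficult to sample classically" concrete, here is a small illustrative sketch (the function name and weights are hypothetical, chosen only for illustration): exact classical sampling from a distribution over n-bit strings requires touching all 2**n weights, which grows out of reach quickly, whereas a quantum device can natively prepare and measure some such distributions.

```python
import random

def sample_exact(weights):
    """Classical exact sampling: requires all 2**n weights in memory."""
    total = sum(weights)
    r = random.uniform(0, total)
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if r <= acc:
            return i
    return len(weights) - 1

n = 16
# Even a toy distribution over 16-bit strings needs 65,536 entries;
# at 50 bits the table no longer fits in any classical memory.
weights = [(i % 7) + 1 for i in range(2 ** n)]
s = sample_exact(weights)
assert 0 <= s < 2 ** n
```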

Finally, as first proposed by Richard Feynman, quantum devices can be used to simulate quantum systems, such as the interactions between molecules, in ways that classical devices could never achieve.

Quantum devices will not replace classical devices

Quantum devices can be used to solve specific problems, especially those that are difficult to solve on classical computers. The inherent capabilities of quantum technology will enable it to accelerate advances in biology, chemistry, logistics and materials science.

The computing field as a whole has long been moving toward hybrid development. Quantum computing follows this trend, mainly because the hybrid model combines complementary forms of computing power.

Since few companies were willing to invest in (or could afford) quantum hardware in the early days, they built classical architectures that could access quantum devices as needed.

Organizations where quantum disruption is predicted—chemical and material sciences, pharmaceuticals, financial services, logistics, security, and others—should focus specifically on developing these architectures and cultivating the other necessary resources to ensure quantum readiness.

In addition to classical computing power, these resources include the talent and in-house expertise needed for quantum.

Quantum Future

Looking ahead, quantum computing will probably always be a "hybrid" technology. First, it's not necessary to use quantum computing to do something that classical computers already do well. Second, cost remains an issue. Quantum devices are and will always be expensive and specialized. It's not cost-effective to use them to do something that advanced computing systems can already do.

Finally, to return to a point made above: since quantum computing can and should be applied to problems different from those that can be solved by classical computers, the real business challenge is to identify exactly those problems or aspects of problems for which quantum devices are best suited in specific industries.

Orchestration and hybrid approaches

When it comes to the need for orchestration, there are lessons to learn from hybrid cloud infrastructure. 69% of enterprises have already adopted a hybrid cloud approach, and the complexity involved has led many of them to adopt cloud management platforms. This management, like the management of cloud-native architectures, takes the form of orchestration.

Hybrid quantum stacks, especially those that rely on both cloud and on-premises/private cloud resources, similarly require management and coordination to ensure programs, experiments, and processes run smoothly.

This orchestration requires workflow management tools that are abstracted from the underlying hardware. The abstraction is necessary in part because of the proliferation of quantum devices and related tools.

To effectively experiment with this expanding toolset, organizations need the flexibility to move from one hybrid configuration to the next without having to rewrite all configurations based on the underlying hardware. An effective workflow management system must facilitate this interoperability.

Quantum backend

When new quantum backends become available, the orchestration layer should be able to switch from one to another with a one-line change. Similarly, orchestration should support changing the optimizer used in a variational quantum algorithm so that performance can be compared without writing additional code.
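The one-line swap described above can be sketched as follows. This is a hypothetical orchestration pattern, not a real SDK: the backend and optimizer names, the registries, and `run_variational` are all illustrative stand-ins, with plain callables playing the role of quantum execution and classical optimization.

```python
from typing import Callable, Dict, List

# Registry of available quantum backends (stand-in callables that
# "evaluate" a parameterized circuit and return a cost value).
BACKENDS: Dict[str, Callable[[List[float]], float]] = {
    "simulator": lambda params: sum(p * p for p in params),
    "hardware_v2": lambda params: sum(abs(p) for p in params),
}

# Registry of classical optimizers (one step of a toy update rule).
OPTIMIZERS: Dict[str, Callable[[List[float]], List[float]]] = {
    "shrink": lambda params: [p * 0.9 for p in params],
    "grow": lambda params: [p * 1.1 for p in params],
}

def run_variational(backend: str, optimizer: str,
                    params: List[float], steps: int = 5) -> float:
    """Alternate classical parameter updates with quantum evaluation."""
    evaluate = BACKENDS[backend]
    step = OPTIMIZERS[optimizer]
    for _ in range(steps):
        params = step(params)
    return evaluate(params)

# Because backend and optimizer are plain parameters, comparing a new
# backend (or optimizer) is a one-line change at the call site:
cost_sim = run_variational("simulator", "shrink", [1.0, 2.0])
cost_hw = run_variational("hardware_v2", "shrink", [1.0, 2.0])
```

The design point is that the experiment logic never names a device directly; the registry indirection is what makes components "switchable" without rewriting code.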

Orchestration should be able to combine source code from multiple frameworks and libraries, eliminating the tedious work of setting up new environments and freeing up time to focus on running actual experiments.

Improve work efficiency

To scale up work, a degree of hardware abstraction is necessary when building and using hybrid quantum architectures. Orchestration tools must be adaptable, not only to account for the diversity of existing hardware, but also for other problems that may arise.

These workflow management and orchestration tools must keep pace with the accelerating development of quantum technology. In fact, the adaptability these tools provide will itself drive wider adoption of quantum technology.

Today's microprocessors bear little resemblance to the tube-based CPUs of the past.

In fact, a current iPhone has a million times more RAM, seven million times more ROM, and processes information 100,000 times faster than the guidance computer that took Apollo 11 to the moon and back.

As quantum processors mature, they will eventually pull a comparable distance ahead of today's classical computing devices, allowing them to solve problems that are beyond the reach of even the best of those devices.

Such comparisons point to the dramatic changes that quantum computing will bring, and harnessing this power now and in the future will require quantum and classical devices to work together in a hybrid model.

In this way, companies are able to solve a wide range of business problems. As these hybrid machines transform security and machine learning, they will impact every aspect of our daily lives.

Conclusion

From a purely practical perspective, a hybrid approach is the most efficient, cost-effective, and productive way to approach quantum computing. Relying on classical devices to perform the tasks for which they are best suited, and for which they have been optimized over the past 50 years, is not only the right path, it is the best path.

The reason is that quantum devices and classical devices not only solve problems differently, they solve different problems. That's why saying "quantum computing will do this or that" is a bit of a misnomer. The fact is that the real revolution will be driven by the combined power of classical and quantum in increasingly powerful hybrid solutions.
