While quantum computing technology is relatively nascent, it is inspiring a new generation of simulations that already run on classical computers and are now accelerated with NVIDIA's quantum SDK.
Twenty-seven years before Steve Jobs introduced a pocket-sized computer, physicist Paul Benioff published a paper showing that, in theory, it was possible to build a far more powerful system small enough to hide in a thimble: a quantum computer.
The concept takes its name from the subatomic physics it aims to exploit. The idea Benioff described in 1980 still drives research today, including efforts to build the next big thing in computing: a system that could make a PC look as quaint as an abacus.
Richard Feynman, a Nobel Prize winner whose witty lectures brought physics to a broad audience, helped establish the field, outlining how such systems could simulate wacky quantum phenomena more efficiently than traditional computers.
So What Is Quantum Computing?
Quantum computing uses the physics that governs subatomic particles to perform sophisticated parallel calculations, replacing the more simplistic transistors in today’s computers.
Quantum computers calculate using qubits, computing units that can be on, off, or any value in between, rather than the bits in traditional computers, which are strictly on or off, one or zero. The qubit's ability to occupy these in-between states, called superposition, adds a powerful capability to the computing equation, making quantum computers superior for some kinds of math.
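The idea of superposition can be sketched in a few lines of plain NumPy. This is an illustrative toy, not how a real quantum computer (or the cuQuantum SDK) is programmed: a qubit's state is just a pair of complex amplitudes whose squared magnitudes give the measurement probabilities.

```python
import numpy as np

# A qubit's state is a vector of two complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1. Measuring it yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
zero = np.array([1, 0], dtype=complex)   # the classical "off" state
one = np.array([0, 1], dtype=complex)    # the classical "on" state

# An equal superposition: the qubit is genuinely "in between" 0 and 1
# until it is measured.
plus = (zero + one) / np.sqrt(2)

probabilities = np.abs(plus) ** 2
print(probabilities)   # [0.5 0.5]: a 50/50 chance of measuring 0 or 1
```

A classical bit could only ever be `zero` or `one`; the vector `plus` holds both possibilities at once, which is what the next sections build on.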
What Does a Quantum Computer Do?
Quantum computers can perform calculations that would take a long time for classical computers if they could finish them at all.
For example, today’s computers use eight bits to represent any single number between 0 and 255. Thanks to superposition, eight qubits in a quantum computer can represent every number between 0 and 255 at the same time.
It’s a feature like parallelism in computing: all possibilities are explored at once rather than one at a time, offering the potential for tremendous speedups.
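The 8-qubit claim above can be checked with a small state-vector simulation, the same technique the cuQuantum discussion below relies on. This is a simplified sketch using NumPy, not production simulator code: applying a Hadamard gate to each of eight qubits produces a state with 256 amplitudes, one for every number from 0 to 255, all equally weighted.

```python
import numpy as np

N_QUBITS = 8
# The Hadamard gate maps |0> to an equal superposition of |0> and |1>.
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Start each qubit in |0>, put it into superposition, and combine the
# qubits with the tensor (Kronecker) product to build the full state.
state = np.array([1.0])
for _ in range(N_QUBITS):
    state = np.kron(state, hadamard @ np.array([1.0, 0.0]))

print(state.size)  # 256 amplitudes: one per number between 0 and 255
# Every outcome is equally likely, with probability 1/256:
print(np.allclose(np.abs(state) ** 2, 1 / 256))  # True
```

Note the catch this also exposes: the state vector doubles in size with every qubit added, a point that matters again in the simulation section at the end.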
So, while a classical computer plods through long division one calculation at a time to factor a huge number, a quantum computer could arrive at the answer in a single step. Boom!
That means quantum computers could reshape entire fields, like cryptography, that rely on factoring huge numbers today.
A Big Role for Small Simulations
That could be just the beginning. Some experts believe quantum computers will push past the limits that now hamper simulations in chemistry, materials science, and anything else involving worlds built from the nano-sized bricks of quantum mechanics.
Quantum computers could even extend the life of semiconductors by helping engineers create more sophisticated simulations of the quantum effects they are beginning to find in today’s smaller transistors.
Experts say that quantum computers will ultimately complement classical computers rather than replace them. Some predict quantum computers will be used as accelerators, much as GPUs accelerate today’s computers.
How Does Quantum Computing Work?
Don’t expect to build your own quantum computer like a DIY PC with parts pulled from discount bins at your local electronics store.
The few systems in operation today typically require cooling to just above absolute zero. They need that arctic environment to handle the fragile quantum states that power these systems.
One example shows how difficult building a quantum computer can be: to create a single qubit, one prototype suspends an atom between two lasers. Try that in your home workshop!
Quantum computing also needs muscle to create something called entanglement, a condition in which two or more qubits exist in a single quantum state, sometimes measured with electromagnetic waves just a millimeter wide.
Add too much energy to that wave and you lose the entanglement, the superposition, or both.
What’s the State of Quantum Computers?
A few companies, such as Alibaba, Google, Honeywell, IBM, IonQ, and Xanadu, operate the first versions of quantum computers today.
Today these systems provide dozens of qubits. But qubits can be noisy, which makes them unreliable. To reliably tackle real-world problems, systems will need tens or hundreds of thousands of qubits.
Experts believe it could be a couple of decades before we reach a fault-tolerant era in which quantum computers are broadly useful.
Predictions of when we will reach so-called quantum supremacy, the moment when quantum computers perform tasks that classical computers cannot, are the subject of heated debate in the industry.
Accelerating Quantum Circuit Simulations Today
The good news is that the world of AI and machine learning has put a spotlight on GPUs, which can perform many of the kinds of operations quantum computers would compute with qubits.
So classical computers are already finding ways to host quantum simulations with GPUs today. For example, NVIDIA ran a state-of-the-art quantum simulation on Selene, our in-house AI supercomputer.
At the GTC keynote, NVIDIA introduced the cuQuantum SDK to accelerate quantum circuit simulations running on GPUs. Early work suggests cuQuantum will be able to deliver speedups of orders of magnitude.
The SDK takes an agnostic approach to provide a variety of tools that users can choose from to best suit their strategy. For example, the state vector method provides high-fidelity results, but its memory requirements grow exponentially with the number of qubits.
That creates a practical limit of about 50 qubits on today’s largest classical supercomputers. Even so, we have seen great results using cuQuantum to accelerate quantum circuit simulations with this method.
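The roughly 50-qubit ceiling follows directly from arithmetic: the state-vector method stores 2^n complex amplitudes for n qubits, so memory doubles with every qubit added. A small back-of-the-envelope calculation (assuming 16-byte double-precision complex amplitudes, a common choice in simulators) makes the wall visible:

```python
# State-vector memory: 2**n amplitudes, 16 bytes each (complex128).
BYTES_PER_AMPLITUDE = 16

for n_qubits in (30, 40, 50):
    gib = (2 ** n_qubits * BYTES_PER_AMPLITUDE) / 2 ** 30
    print(f"{n_qubits} qubits -> {gib:,.0f} GiB")

# 30 qubits -> 16 GiB          (fits on a single large GPU)
# 40 qubits -> 16,384 GiB      (16 TiB: a multi-node cluster)
# 50 qubits -> 16,777,216 GiB  (16 PiB: beyond today's machines)
```

Each extra qubit doubles the bill, which is why exact state-vector simulation stalls near 50 qubits and why approaches like tensor-network methods trade some fidelity for reach.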