
Quantum Supremacy: Building a Quantum Computer

20 February 2018 | Maryana Kartashevska | About a 5 minute read
Tags: Bits, computing, quantum computing, qubits


Intel has recently announced its 49-qubit superconducting quantum chip. Aptly named Tangle Lake – after a chain of lakes in Alaska, and alluding to the very cold temperatures required for superconducting qubits to operate – it builds on the company's 17-qubit superconducting chip, released only a few months earlier.

 

At such a pace and with significant investment into the sector, quantum computers are quickly becoming a reality. Every major player in the tech space is now part of the race to build the first practical quantum computer.

 

So, what are the most popular approaches to building physical qubits?

 

Focus on quantum computing

It is widely recognised that quantum computing is the future of information technology, and its prominence is becoming more apparent every year. In 2016, the European Commission announced a flagship initiative to invest one billion euros in quantum technologies from 2018 – nearly double its cumulative investment in the sector over the past 20 years.

 

There have also been a number of notable recent investments from the private sector. Intel has invested $50 million into QuTech – a Netherlands-based advanced research facility for quantum computing and quantum internet. Google has partnered with NASA and the Universities Space Research Association to research the possible applications of quantum computing.

 

The industry is evolving fast. These and other tech behemoths, like IBM, Microsoft and Bell Labs, are now running quantum research labs in-house. A wealth of smaller companies has also sprung up, specialising in different areas of the quantum stack – from physical qubits and hardware to software architecture and quantum applications.

 

Near-term objectives

The near-term goal is to reach what has been dubbed quantum supremacy – a state where a quantum computer is able to perform well-defined computational tasks beyond the capabilities of the best modern classical computers. This threshold is pegged at around 50 fully functional qubits (based on the work of John Preskill, Professor of Theoretical Physics at Caltech).

 

In other words, qubits that are entangled, in superposition and interacting with each other in a carefully choreographed manner. I have already covered the basics of quantum computing theory in my previous blogs (see Cat, Bits and Quantum Computers, Detangling Quantum Entanglement, and Getting Physical: Bits and Quantum Error Correction) but here’s a quick refresher.

 

Qubits operate through the principle of quantum interference, which treats all elementary particles as waves and allows us to measure the interaction between them. The amplitude resulting from this interaction determines the probability of one state or another occurring. But unlike probabilities, amplitudes can be positive, negative or even complex numbers, and can interfere destructively to cancel each other out.
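To make the difference between amplitudes and probabilities concrete, here is a small NumPy sketch (my own illustration, not from the original article). It builds two single-qubit superpositions that have identical 50/50 measurement probabilities but opposite-sign amplitudes, and shows that combining them cancels the |1⟩ component entirely:

```python
import numpy as np

# A qubit state is a pair of amplitudes (alpha, beta); the squared
# magnitude of each amplitude gives the probability of measuring
# that basis state.
plus = np.array([1, 1]) / np.sqrt(2)    # equal superposition of |0> and |1>
minus = np.array([1, -1]) / np.sqrt(2)  # same probabilities, opposite sign

# Both states measure 50/50 on their own...
print(np.abs(plus) ** 2)    # ≈ [0.5 0.5]
print(np.abs(minus) ** 2)   # ≈ [0.5 0.5]

# ...but because their |1> amplitudes have opposite signs, adding the
# two waves makes those components interfere destructively and cancel.
combined = (plus + minus) / np.sqrt(2)
print(np.abs(combined) ** 2)  # ≈ [1. 0.] – only |0> survives
```

Classical probabilities can only ever add up; it is the sign (or complex phase) of amplitudes that makes this cancellation possible.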

 

To successfully operate a quantum computer, we need to be able to manipulate the patterns of this interference through a quantum algorithm, maintaining the fragile quantum state of the physical qubits and correcting for any errors without prematurely observing the answer. Only under these conditions can a meaningful solution be derived and the benefits of a quantum over a classical computer become apparent, as explained in the Implications of the Quantum Revolution.
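This "choreographed" manipulation of interference can be sketched with the simplest possible quantum circuit (again my own toy illustration, not from the article): applying the Hadamard gate once puts a qubit into an equal superposition, and applying it a second time makes the amplitudes interfere so that the qubit returns to |0⟩ with certainty.

```python
import numpy as np

# Hadamard gate: the standard single-qubit gate that creates and
# uncreates equal superpositions.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

state = np.array([1.0, 0.0])  # start in |0>
state = H @ state             # now a 50/50 superposition
state = H @ state             # amplitudes recombine via interference
print(np.abs(state) ** 2)     # ≈ [1. 0.] – deterministically back in |0>
```

A real quantum algorithm does the same thing at scale: it steers the amplitudes of many entangled qubits so that wrong answers cancel destructively and the right answer's probability is amplified before measurement.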

 

Popular approaches

The initial hurdle is to get 50 or so stable qubits to cooperate, and different companies are approaching the challenge in different ways. Below is a quick overview of the three main ones:

 

Approach 1 – Superconducting circuits – aka ‘the popular one’

- How: By sending a resistance-free current around a circuit loop made of metallic material and using a microwave signal to control superposition. This is the most popular approach and the one that has so far yielded the highest number of entangled qubits

 

- Pros: Based on existing semiconductor chip technology, scalable, fast and relatively cost-effective

 

- Cons: Susceptible to environmental noise, which causes decoherence (see Getting Physical: Bits and Quantum Error Correction)

 

- Who: Google, Intel, IBM

 

Approach 2 – Trapped ions – aka ‘the stable one’

- How: Using electrodes to trap and confine ions (atoms with a positive or negative electrical charge) in a vacuum, which are then manipulated into superposition by finely tuned lasers. This creates a highly stable and well-isolated quantum system

 

- Pros: Longevity of qubits in superposition, high fidelity (an important measure of the closeness between quantum states) and low error rates

 

- Cons: The highly isolated environment inhibits qubit entanglement, makes the system slow and presents scalability issues

 

- Who: IonQ

 

Approach 3 – Diamond vacancies – aka ‘the solid-state one’

- How: By exploiting tiny atomic defects in the diamond crystal lattice, where a nitrogen atom sits next to a missing carbon atom, creating a vacancy with free electrons that can be used as qubits (so-called nitrogen-vacancy centres)

 

- Pros: Can operate at room temperature, with no need for extreme freezing or vacuum; a natural light emitter, which helps to preserve superposition and read information off

 

- Cons: Qubit entanglement and scalability remain difficult

 

- Who: Quantum Diamond Technologies

 

A number of other approaches are also being investigated. For example, Intel is looking at silicon quantum dots, an approach similar to superconducting circuits in that it builds on existing chip-fabrication technology. D-Wave has developed a 2,000-qubit quantum annealing system, which provides a measurable increase in processing speed over classical computers but only addresses a limited class of optimisation problems.

 

It’s hard to say which approach will reach quantum supremacy first. One thing is clear though – quantum is the future of computing.

 
