IBM Q, 2017. CC BY 2.0
Technology | Explainer

The era of quantum computing? Not quite yet

  1. Quantum tech could advance machine learning and chemistry
  2. Today's quantum computers are slow, bulky, and error-prone
  3. Future may hold great possibilities and major security threats

Quantum computers could be the next great technological leap, revolutionizing the way we solve vastly complex computations and affecting every aspect of life. Companies are now racing for “quantum supremacy” – the point at which quantum computers surpass conventional ones. Yet while scientists long to create large-scale quantum computers, security experts fear they could pose a grave threat to security and cryptography. So what is the current state of this technology, and where is it heading?

Quantum computing draws on quantum mechanics, the branch of physics that describes the world at the smallest scales. In everyday life we experience the world through classical mechanics, which describes macroscopic objects that can be perceived through the senses. The classical world is orderly, intuitive, and predictable.

In the world of the very small, however, all conventional understanding of reality goes out the window.

“Nature isn’t classical, it’s quantum,” said Martin Albrecht, a lecturer in information security at Royal Holloway University, London. “So, emulating nature can be done a lot better if you have a quantum computer than not.

“We want to simulate bits of nature to understand them, for example, in chemistry. If this bit of nature is subject to quantum effects then we’ll have trouble emulating it on a classical computer, [meaning] that it is quite computationally expensive or we have to abstract some aspects away.”

Albrecht added that emulating quantum effects – which would allow greater chemical understanding and could lead to better medicine, to give one example – would be much easier on a quantum computer.

What is quantum computing?

The idea of quantum computing was proposed by Nobel Prize-winning physicist Richard Feynman in 1982 and pioneered by Oxford physicist David Deutsch three years later.

Feynman wondered: Could we simulate physics on a computer?

A quantum computer follows the logic of quantum mechanics, allowing it to process information in fundamentally different ways and, potentially, to solve certain problems far faster than any classical machine.

In a conventional, or classical, computer, the smallest units of information exist in bits, consisting of either ones or zeroes. A quantum computer uses quantum bits – or qubits – which can exist in both states at once. This is a quantum effect known as superposition, famously represented by Schrödinger’s cat, which is both dead and alive at the same time.

In classical computing, the ones and zeroes in bits solve problems step-by-step. Because qubits can represent ones and zeroes simultaneously, a quantum computer does not have to work through a problem step-by-step, but can, in effect, evaluate many possibilities at once. As a result, quantum computers can, in theory, solve in a matter of seconds problems that classical computers would take thousands of years to work out.
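The idea of superposition can be sketched numerically. The short simulation below is plain Python running on a classical machine, not quantum hardware: it represents a single qubit as a pair of complex "amplitudes" and applies a Hadamard gate, the standard operation that puts a definite 0 or 1 into an equal superposition. The function names are illustrative, not from any real quantum library.

```python
import math

# A qubit's state is a pair of complex amplitudes (a, b) for the states
# |0> and |1>. The probability of measuring 0 is |a|^2, of measuring 1 is |b|^2.

def hadamard(state):
    """Apply a Hadamard gate: sends a definite |0> or |1> into superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Measurement probabilities for outcomes 0 and 1."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

zero = (1 + 0j, 0 + 0j)       # a qubit definitely in state 0
superposed = hadamard(zero)    # now "both at once"
p0, p1 = probabilities(superposed)
print(round(p0, 3), round(p1, 3))  # 0.5 0.5: equal chance of reading 0 or 1
```

Simulating n qubits this way needs 2^n amplitudes, which is exactly why classical computers struggle to emulate quantum systems and why real quantum hardware is attractive.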

To put speed into perspective, in 2015 Google published a paper claiming that its D-Wave quantum annealer had solved one carefully chosen benchmark problem 100 million times faster than a conventional single-core processor running a comparable algorithm – a striking result, though one that applied only to that specific benchmark.

Another quantum effect that these computers use is entanglement, in which particles become linked so that the state of one affects the other even when they are separated by great distances. Einstein famously dismissed entanglement as “spooky action at a distance” because he did not believe it could be real. Recent experiments, however, have proved him wrong. Spooky or not, entanglement is a verified phenomenon (Economist).

By utilizing phenomena such as superposition and entanglement, a quantum computer could factor large numbers and process vast calculations simultaneously.
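Entanglement, too, can be sketched in a few lines of classical Python. The snippet below builds the simplest entangled two-qubit state (a Bell state) and samples measurements from it: each qubit alone reads 0 or 1 with equal probability, yet the two readings always agree. This is a toy simulation for illustration only; the names are made up for this sketch.

```python
import math
import random

# A two-qubit state is four amplitudes, for the outcomes |00>, |01>, |10>, |11>.
# The Bell state (|00> + |11>) / sqrt(2) is the simplest entangled state.
s = 1 / math.sqrt(2)
bell = [s, 0.0, 0.0, s]

def measure(state):
    """Sample a joint measurement of both qubits, weighted by amplitude squared."""
    probs = [abs(a) ** 2 for a in state]
    return random.choices([(0, 0), (0, 1), (1, 0), (1, 1)], weights=probs)[0]

# Each qubit alone is a 50/50 coin flip, but the two outcomes always match:
# the "spooky" correlation that entanglement describes.
for _ in range(10):
    q0, q1 = measure(bell)
    assert q0 == q1
```

No signal passes between the qubits in this model; the correlation lives in the shared state itself, which is why entanglement cannot be used to send messages faster than light.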

Quantum computers today

Today’s quantum computers are slow, bulky, and error-prone. They’re difficult to build because scientists still struggle to control the superposition of qubits, which tends to collapse. Current machines are also “noisy” – their qubits are easily disturbed by outside interference – which introduces errors into calculations.

Classical computers rely on built-in fans to stay cool, but qubits heat up in ways no fan can handle. Quantum computers are large because they must cool their qubits to temperatures near absolute zero and insulate them from outside interference. The idea is to isolate the qubits well enough to maintain “quantum coherence,” the delicate state in which qubits work together.

Quantum computers took a recent step forward when an international team of scientists developed (Nature) a programmable quantum processor made with silicon. The team used microwave energy to control two electrons trapped in silicon, which were then used to perform a set of calculations. With this development, scientists hope quantum computers will become easier to control and manufacture.

IBM scientist Stefan Filipp with the cryogenic refrigerator that will keep qubits at temperatures colder than the deepest parts of outer space. Inside the canister it reaches 10 millikelvin, 10 thousandths of a degree above absolute zero. 2016. Credit: IBM, via Flickr. CC BY 2.0

Canadian company D-Wave Systems is the first in the world to produce and sell commercial quantum computers, with customers including Google and NASA. Others, such as IBM, sell time usage. The public can now register online and access IBM’s five- and 16-qubit quantum computers in the IBM Q Experience via the cloud.

IBM quantum computer. Staff member Katie Pooley, at the Thomas J. Watson Research Center, examining a cryostat with the new prototype of a commercial quantum processor inside. Credit: Andy Aaron, IBM, 2017. CC BY 2.0

According to Vivien Kendon, a physicist at Durham University, IBM is leading the way in quantum progress. “[IBM] is training, bringing up a generation of people who are not afraid to go, ‘Yeah, I can program a quantum computer and I’ve done it,’” she said. “It helps to just try it, just do the experiment and see what happens.”

More than a year after its launch, IBM Q has a community of more than 60,000 users who have run 1.7 million quantum experiments. Many use it as a platform for education and research. To encourage more students, developers, and teachers to try out its quantum platform, IBM announced in January 2018 that it’s offering prizes to those who use IBM Q.

Potential real-world applications

Quantum computers are yet to solve any real-world problems, but there are various theoretical claims (IBM) for what they can do. These include machine learning, pattern recognition, evaluating and predicting financial markets, and a better grasp of chemistry, which could lead to better drugs.

Right now, however, experts agree that the first application for quantum computing is simply to simulate quantum systems.

“The most realistic way to think of it, at this stage of what we know about computing, is to think that quantum computers will come along as coprocessors,” said Kendon. Coprocessors are chips that assist the main processor, the CPU, by performing specialized functions more quickly.

A section of IBM’s 16-qubit processor, 2016. Credit: IBM, via Flickr. CC BY 2.0

“In some cloud computing there will be the possibility that a piece of my calculation is really hard, but a quantum processor can calculate that piece of it, return the answer up, and I’ll finish the computation classically,” Kendon said.

In such a case, a quantum coprocessor would be dedicated to one type of specialized calculation.

“Hybrid algorithms where only parts of it are quantum, and coprocessors to run those bits, are going to be the practical way to do things, at least to start with,” she said.
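The hybrid pattern Kendon describes can be sketched in code: a classical outer loop that repeatedly delegates one hard subroutine to a "coprocessor" and uses the returned values to steer the computation. Everything below is illustrative; the coprocessor here is a classical stand-in (a simple cost function), where real hybrid algorithms would run a quantum circuit instead.

```python
# Illustrative hybrid loop: a classical optimizer repeatedly queries a
# coprocessor for the one expensive quantity it cannot compute cheaply itself.
def coprocessor_energy(params):
    """Stand-in cost function; a real hybrid scheme would evaluate this on
    quantum hardware. Its minimum sits at params = (3, -1)."""
    return (params[0] - 3.0) ** 2 + (params[1] + 1.0) ** 2

def hybrid_minimize(step=0.1, iters=200):
    """Classical gradient descent, estimating gradients from coprocessor calls."""
    params = [0.0, 0.0]
    for _ in range(iters):
        grad = []
        for i in range(len(params)):
            shifted = params[:]
            shifted[i] += 1e-5  # finite-difference probe
            grad.append((coprocessor_energy(shifted) - coprocessor_energy(params)) / 1e-5)
        params = [p - step * g for p, g in zip(params, grad)]
    return params

best = hybrid_minimize()  # converges near (3, -1), the stand-in's minimum
```

The division of labor is the point: the classical side handles bookkeeping and optimization, while the coprocessor is called only for the piece that is genuinely hard.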

Post-quantum cryptography

A much feared potential application of quantum computing is its threat to cryptography and encryption. Because quantum computers could solve certain calculations far faster, they could, in principle, break the encryption systems in use today.

But how real is the threat?

“That depends on how much you care about keeping things secret,” said Kendon. “It’s always relative to how important the information is you’re protecting, how long you need to protect it for.”

In other words, though quantum computers could potentially crack encryption, the seriousness of the threat depends on how sensitive the information is and how long it must stay secret. A quick bank transaction, for example, matters less than top-secret government data. Unsurprisingly, the organizations investing heavily in post-quantum cryptography are security agencies (Technology Review) such as the U.S. National Security Agency (NSA) and the UK’s Government Communications Headquarters (GCHQ).

“Quantum just changes what stuff you can compute, it doesn’t change the issues for what you might need to do to keep stuff secret,” said Kendon.

According to Albrecht, it will be at least 20 years before quantum computers can break, say, a system like RSA – named after its inventors Ron Rivest, Adi Shamir, and Leonard Adleman – one of the most widely used public-key cryptosystems. But he added that now is the time to worry, because the standardization and adoption of new cryptography takes time.
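The threat to RSA can be made concrete with a deliberately tiny key. Real RSA moduli are hundreds of digits long, which makes the factoring step below hopeless for classical computers; Shor's 1994 quantum algorithm would make it fast on a sufficiently large quantum computer. The numbers here are textbook toy values chosen for illustration.

```python
# Toy RSA with a tiny modulus (real keys use numbers hundreds of digits long).
p, q = 61, 53                  # secret primes
n = p * q                      # public modulus: 3233
e = 17                         # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)            # private exponent (modular inverse, Python 3.8+)

message = 42
cipher = pow(message, e, n)            # anyone can encrypt with (n, e)
assert pow(cipher, d, n) == message    # only the private-key holder can decrypt

def factor(n):
    """Trial division: trivial at toy scale, infeasible at real key sizes."""
    f = 2
    while n % f:
        f += 1
    return f, n // f

# An attacker who can factor n rebuilds the private key and reads the message.
p2, q2 = factor(n)
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
assert pow(cipher, d2, n) == message   # encryption broken
```

RSA's security rests entirely on the gap between multiplying two primes (easy) and recovering them from their product (hard); a large quantum computer running Shor's algorithm would close that gap.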

“At this point [security experts] are standardizing post-quantum cryptography. And once that has happened, in five to seven years time, there is really no point in using pre-quantum cryptography,” he said.

“So even if you’re skeptical about the timeline for quantum computing, the timeline for post-quantum cryptography is set.”

Organizations are already preparing for this, including the U.S.-based National Institute of Standards and Technology (NIST), which says government agencies should be ready to switch to quantum-resistant cryptography by 2025. In 2016, in an effort to develop cryptographic systems that are secure against both quantum and classical computers, NIST put out a call asking the public to submit proposals for post-quantum cryptography.

The race for ‘quantum supremacy’

Though quantum computers exist, technologists do not yet consider them “true” quantum computers. They will earn that status only after reaching the benchmark of “quantum supremacy,” defined as the point at which a quantum computer can outperform classical computers in solving complex calculations.

The term “quantum supremacy” was coined by theoretical physicist John Preskill and promoted by Google’s AI Quantum Laboratory in a 2017 article, which claimed it would reach the benchmark within five years.

“Quantum supremacy is supposed to mark this turnover point where a quantum computer is actually better than a classic computer,” Daniel Gottesman, a physicist at Perimeter Institute, told WikiTribune.

“But, of course, the first problems that you can see this improvement on are going to be problems that are not interesting for themselves, they’re just interesting because they mark this transition,” Gottesman said. 

The benchmark doesn’t necessarily mean they’ll solve real-world problems.

Even so, the race for quantum supremacy is on between the tech giants.

IBM rolled out the world’s first 50-qubit quantum computer in November 2017, giving it a small head start (The Next Web). In January, Intel announced it had created a 49-qubit computing chip (Fast Company).

Alibaba has teamed up with the Chinese Academy of Science (Nikkei) to build a quantum computing lab. Microsoft, despite entering the space 12 years ago, said in January that it’s “imminently close” to building a working qubit (Financial Times, may be behind paywall).

A D-Wave 2000Q Systems commercial quantum computer. Credit: D-Wave.

Governments around the world are also heavily investing in quantum technologies. The UK has pledged £270m over five years, Canada is putting in $50m, and Australia $25m over the same period (Guardian). In 2016, the European Commission said it will launch a €1 billion initiative in the field, a project that’s gradually taking shape (Nature).

Changing ideas of what computers can do

Fears of the “imminent threat” (Washington Post) of quantum computing may be real, but its harmful impact is perhaps overstated.

After all, “it’s the same kind of dichotomy that you have for most technologies,” said Gottesman. “Maybe the best analogy is to just regular computers, we use them for lots of interesting, useful stuff, but also for lots of horrible things. It’s just a question of the people that are operating them.”

Quantum computing is still in its infancy and won’t power up your laptop anytime soon. There is, nonetheless, real progress.

“We are reaching this point where things seem to be accelerating … we’re not there yet, but we’re getting close,” said Gottesman.

Perhaps, according to Kendon, the fear lies in a “fundamental change” on the “theoretical level” of what a computer can do. 

“What changed for people was, it changed our notion of what is hard to compute and what is easy to compute,” Kendon said.

In other words, it changed how we perceive the limits of human technology.

Timeline of key developments

  • 1982 – Nobel Prize-winning physicist Richard Feynman proposes the idea of quantum computing
  • 1994 – Mathematician Peter Shor comes up with a quantum algorithm that can crack encryption if it runs on a computer with a large number of qubits
  • 1998 – IBM announces its first working qubit
  • 2011 – D-Wave launches its first commercially available quantum computer
  • 2016 – IBM lets users play around with its cloud-based 5-qubit quantum computer, the IBM Q
  • 2017 – Google reveals plans to reach quantum supremacy in five years
  • 2017 – IBM announces its first 50-qubit quantum computer
  • 2018 – Scientists develop (Nature) a programmable quantum processor made with silicon

Started by Linh Nguyen (United Kingdom). Linh is a staff journalist at WikiTribune with a background in the humanities. She covers the Middle East, Asia, conflict and technology. Though based in London, she has freelanced across Asia, the UK and U.S.
