Glossary Term

Quantum computing

The next generation of computing is cold, quantum, and will upend current cybersecurity standards.

By IT Brew Staff

Definition:

Instead of using the classical digital units known as “bits,” a quantum computer relies on qubits to store and process data and to perform highly complex calculations. Qubits, created by manipulating particles such as photons and electrons, can exist in a superposition of states, allowing a quantum machine to work through many possible inputs at once; today’s processors only achieve this at temperatures approaching absolute zero.
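
To make the superposition idea a bit more concrete, here is a rough Python sketch (an illustration added here, not part of the original definition): it models a single qubit as a two-dimensional complex vector and applies a Hadamard gate, a standard quantum operation, to put it into an equal superposition of 0 and 1. The variable names are invented for the example.

```python
import numpy as np

# A qubit is a unit vector in a two-dimensional complex space;
# |0> and |1> are the computational basis states.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate turns a basis state into an equal superposition.
H = (1 / np.sqrt(2)) * np.array([[1, 1],
                                 [1, -1]], dtype=complex)

superposed = H @ ket0  # (|0> + |1>) / sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes:
# roughly [0.5, 0.5], so 0 and 1 are equally likely outcomes.
print(np.abs(superposed) ** 2)
```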

Why, though?

Some proposed uses for quantum computers include improved sensors and better simulations for technologies like advanced batteries and processes like agricultural fertilization.

The leaps to quantum

In 1994, computer scientist Peter Shor presented an algorithm that, run on a quantum computer, could factor large numbers far faster than the classical machines of the day. “Shor’s algorithm” demonstrated a potential risk to modern encryption, which, in many cases, relies on the difficulty of factoring the product of two large prime numbers.
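
As a toy illustration of why fast factoring matters (tiny numbers invented for this sketch, not drawn from the article), the Python below builds an RSA-style key pair from two small primes and shows that anyone who can factor the public modulus can recompute the private key. Real keys use primes hundreds of digits long, which is exactly the factoring task Shor’s algorithm speeds up.

```python
from math import gcd

# Textbook-sized RSA key: two small primes, public modulus, public exponent.
p, q = 61, 53
n = p * q                  # public modulus (real moduli are thousands of bits)
phi = (p - 1) * (q - 1)
e = 17                     # public exponent, coprime with phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)        # private exponent (modular inverse, Python 3.8+)

message = 42
ciphertext = pow(message, e, n)

# An attacker who can factor n (the job Shor's algorithm would do
# efficiently on a large quantum computer) recovers p and q and,
# from them, the private exponent.
attacker_d = pow(e, -1, (p - 1) * (q - 1))
print(pow(ciphertext, attacker_d, n))  # prints 42, the original message
```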

By 2001, IBM and Stanford implemented Shor’s algorithm, factoring the number 15 on a 7-qubit processor.

D-Wave Systems sold its first quantum computer in 2011, and companies like IBM and Google followed with their own quantum machines.

A race to encrypt

The high processing capabilities of quantum computers put many public-key encryption standards at risk of being cracked. To protect the encryption supporting online sessions, transaction verification, and user-identity assurance from quantum-computer attacks, the National Institute of Standards and Technology has urged organizations to begin upgrading their encryption to the quantum-resistant standards it released in August 2024.
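
One early step in that switch (a hypothetical starting point, not guidance taken from NIST or the article) is simply inventorying where quantum-vulnerable algorithms such as RSA and elliptic-curve keys are still in use. The sketch below uses the open-source Python cryptography package to flag such public keys in a directory of PEM files; the directory name and function name are made up for this example.

```python
from pathlib import Path

from cryptography.hazmat.primitives.serialization import load_pem_public_key
from cryptography.hazmat.primitives.asymmetric import ec, rsa

# Key types whose security rests on factoring or discrete logarithms,
# the problems Shor's algorithm solves efficiently on a quantum computer.
QUANTUM_VULNERABLE = (rsa.RSAPublicKey, ec.EllipticCurvePublicKey)


def flag_vulnerable_keys(directory: str):
    """Yield (path, key type) for public keys that are not quantum-resistant."""
    for pem_path in Path(directory).glob("*.pem"):
        try:
            key = load_pem_public_key(pem_path.read_bytes())
        except ValueError:
            continue  # not a parseable public key; skip it
        if isinstance(key, QUANTUM_VULNERABLE):
            yield pem_path, type(key).__name__


# Hypothetical usage: list the at-risk keys found under ./keys
for path, kind in flag_vulnerable_keys("keys"):
    print(f"{path}: {kind} is not quantum-resistant")
```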

“The process of switching over will not be flipping a switch. That’s why you have to start playing with this now. You have to start implementing it,” Johannes Ullrich, dean of research for the SANS Technology Institute, told IT Brew in August 2024.