Definition: classical vs. quantum computing


The difference between classical computing, which has processed business records since its inception in the 1950s, and quantum computing, which emerged decades later. Classical computing deals with an input-process-output loop; for example, a batch run reads a set of transactions and updates the appropriate records. For the interactive applications so prevalent today, the input is a selected menu function, and the processing is the action that satisfies that function.
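
Below is a minimal sketch of that batch-style input-process-output loop. The record layout, account IDs, and update rule are hypothetical, chosen only to illustrate the pattern:

```python
# Minimal sketch of a classical input-process-output batch run.
# The record layout and update rule are illustrative, not from the source.

# Master records: account ID -> balance (the files a batch run updates).
records = {"A-100": 500.0, "A-101": 1200.0}

# Input: a batch of transactions read in one pass.
transactions = [
    ("A-100", -75.0),   # withdrawal
    ("A-101", 250.0),   # deposit
]

# Process: apply each transaction to the matching record.
for account_id, amount in transactions:
    if account_id in records:
        records[account_id] += amount

# Output: the updated records.
for account_id, balance in records.items():
    print(f"{account_id}: {balance:.2f}")
```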

A Huge Contrast
Quantum computing, on the other hand, deals with finding or creating the appropriate algorithms for a solution. From the programming point of view, quantum computing is mostly math. The two areas of computing are dramatically different in concept as well as in the experience that program designers require. If a problem cannot be solved with a mathematical formula, a quantum computer is of little value. In fact, if a quantum computer were applied to routine commercial data tasks, it could take even longer to process them than a classical computer would. See quantum computing.

[Image: classical input-process-output contrasted with a quantum math sample]

These Could Not Be Further Apart
Classical computing is input-process-output. Quantum computing solves a problem with a mathematical formula. The math sample was excerpted from Nielsen and Chuang's Quantum Computation and Quantum Information, which is sometimes called the "bible of the quantum information field."
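
For a flavor of the mathematics involved, here is the standard single-qubit state found in introductory treatments such as Nielsen and Chuang's; it is a textbook formula, not the excerpted sample itself:

```latex
% Standard qubit state in Dirac notation: a superposition of the basis
% states |0> and |1>, with complex amplitudes whose squares sum to one.
% A textbook formula, not the sample excerpted in the image above.
\[
  |\psi\rangle = \alpha|0\rangle + \beta|1\rangle,
  \qquad \alpha, \beta \in \mathbb{C},
  \qquad |\alpha|^2 + |\beta|^2 = 1
\]
```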