Quantum Computing
Norman Littlejohn
COSC480
Quantum Computing
History
How it works
Usage
The number of transistors in a processor
doubles every ~18 months (Moore's Law)
What happens in the years 2020-2030?
The next step: Quantum Computing
A device for computation that makes direct use
of quantum mechanical phenomena, such as
superposition and entanglement, to perform
operations on data
Harness the power of atoms and molecules to
perform memory and processing tasks
Argonne National Laboratory around 30 years
ago
Paul Benioff – credited with first applying
quantum theory to computers in 1981
Created a quantum Turing machine
Turing machine – a theoretical device consisting
of a tape of unlimited length divided into
squares, each holding a 0 or 1. Instructions are
read off the tape, one calculation at a time
Quantum Turing machine – the tape and
read/write head exist in a quantum state. Each
position can be 0 or 1, or a superposition of 0
and 1, allowing many calculations at once
Superposition - a fundamental principle of
quantum mechanics. It holds that a physical
system (say, an electron) exists partly in all its
particular, theoretically possible states (or,
configuration of its properties) simultaneously;
but, when measured, it gives a result
corresponding to only one of the possible
configurations
Modern computers work with bits in one of
two states (0 or 1).
Quantum computers encode information as
quantum bits, or qubits, which can exist in
superposition and therefore represent many
states at once.
Qubits represent atoms, ions, photons, or
electrons and their respective control device
that work together to act as computer memory
and a processor.
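A minimal sketch of the idea above, using only assumed, hypothetical helper names (`measure`): a qubit can be modeled as two complex amplitudes whose squared magnitudes give the probabilities of reading 0 or 1. An equal superposition yields each outcome about half the time.

```python
import random

def measure(alpha: complex, beta: complex) -> int:
    """Collapse the superposition: return 0 with probability |alpha|^2, else 1."""
    p0 = abs(alpha) ** 2
    return 0 if random.random() < p0 else 1

# Equal superposition (|0> + |1>)/sqrt(2): each outcome is ~50% likely.
alpha = beta = 2 ** -0.5
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1
print(counts)  # roughly {0: 5000, 1: 5000}
```

Note that each call to `measure` returns only a single classical bit, mirroring the point above: the superposition exists until measurement, but measurement gives just one result.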
30-qubit processor = modern computer
running at 10 teraflops (10 trillion floating-point operations per second).
Typical desktops are measured in gigaflops
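The power behind comparisons like this comes from the state space: n qubits hold 2^n complex amplitudes simultaneously, where n classical bits hold only n values. The arithmetic below just checks that growth; the teraflops figure itself is the slide's claim, not derived here.

```python
# n qubits span a state space of 2**n amplitudes; n classical bits hold n values.
for n in (1, 2, 10, 30):
    print(f"{n:>2} qubits -> {2 ** n:>13,} amplitudes")
# 30 qubits already correspond to over a billion simultaneous amplitudes.
billion_plus = 2 ** 30
print(billion_plus)
```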
Ion traps use optical or magnetic fields (or a
combination of both) to trap ions.
Optical traps use light waves to trap and control
particles.
Quantum dots are made of semiconductor material
and are used to contain and manipulate electrons.
Semiconductor impurities contain electrons by using
"unwanted" atoms found in semiconductor material.
Superconducting circuits allow electrons to flow with
almost no resistance at very low temperatures.
Entanglement – attempting to look directly at
subatomic particles could bump them and
change their value.
Measuring a qubit in superposition to
determine its value forces it to assume 0 or 1,
but not both, which is the same behavior as a
digital computer.
Measure indirectly to preserve integrity
Applying an outside force to two atoms can
make them entangled. When disturbed, one
atom will choose a spin (or value), and the
second atom will choose the opposite spin.
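The anti-correlated pair described above can be sketched classically (a real entangled pair cannot be fully reproduced this way; this only illustrates the correlation of outcomes). The function name `measure_entangled_pair` is hypothetical.

```python
import random

def measure_entangled_pair() -> tuple[str, str]:
    """Disturbing the pair: the first atom picks a random spin,
    its entangled partner always shows the opposite spin."""
    first = random.choice(["up", "down"])
    second = "down" if first == "up" else "up"
    return first, second

for _ in range(5):
    a, b = measure_entangled_pair()
    print(a, b)
    assert a != b  # the two outcomes are always opposite
```

This anti-correlation is what lets researchers measure one particle and infer the other's state indirectly, preserving the integrity of the unmeasured qubit.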
1998 – Los Alamos and MIT researchers
Spread a single qubit across three nuclear
spins. Spreading it made it harder to corrupt
and allowed researchers to use entanglement to
study reactions indirectly
2000 – Los Alamos Lab
7-qubit quantum computer within a single drop of
liquid.
Used nuclear magnetic resonance (NMR) to
manipulate particles in the atomic nuclei of
molecules of trans-crotonic acid. Electromagnetic
pulses forced the particles to line up. Particles
aligned parallel or counter to the magnetic field
let the quantum computer mimic information
encoding
Trans-crotonic acid – a fluid whose molecules
consist of six hydrogen and four carbon atoms
2001 – IBM and Stanford University
Demonstrated Shor's algorithm (finding the
prime factors of a number) by factoring 15
7 qubits
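Shor's algorithm splits into a quantum part (order finding) and classical post-processing. The sketch below, with assumed helper names `order` and `shor_factor`, brute-forces the order classically, which is exactly the step a quantum computer accelerates, then recovers the factors of 15 with gcd arithmetic.

```python
from math import gcd

def order(a: int, n: int) -> int:
    """Smallest r > 0 with a**r % n == 1 (the step a quantum computer speeds up)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n: int, a: int) -> tuple[int, int]:
    """Given a base a coprime to n with an even order, recover two factors of n."""
    r = order(a, n)
    assert r % 2 == 0, "odd order: pick a different base"
    y = pow(a, r // 2, n)
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_factor(15, 7))  # -> (3, 5)
```

For n = 15 and base a = 7 the order is 4, so 7² = 49 ≡ 4 (mod 15), and gcd(3, 15) = 3 and gcd(5, 15) = 5 give the two prime factors.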
2005 – Institute of Quantum Optics and
Quantum Information (Innsbruck University)
First qubyte created (8 qubits)
Made through use of ion traps
2006 – Waterloo and Massachusetts
Quantum control on a 12-qubit system
2007 – Canadian company D-Wave
Demonstrated a 16-qubit quantum computer
The computer solved a Sudoku puzzle and other
pattern-matching problems
D-Wave promised a practical system by 2008,
but many believed it to be impossible
D-Wave One
D-Wave Homepage
The ability to factor large numbers quickly
allows for decoding and encoding of secret
information
Modern encryption methods, which rely on
factoring being hard, would be simple for a
quantum computer to break
Search large databases in a fraction of the time
it would take modern computers
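The database-search speedup referred to here is Grover's algorithm: unstructured search over N items needs about N/2 classical queries on average but only about (π/4)·√N quantum queries. The numbers below are just that standard formula evaluated; no specific hardware is assumed.

```python
from math import pi, sqrt

# Classical unstructured search: ~N/2 queries on average.
# Grover's algorithm: ~(pi/4) * sqrt(N) quantum queries.
for n in (1_000_000, 1_000_000_000):
    classical = n // 2
    quantum = round(pi / 4 * sqrt(n))
    print(f"N={n:>13,}  classical~{classical:>13,}  quantum~{quantum:>7,}")
```

For a million items the quantum query count drops from ~500,000 to under 800, which is the "fraction of the time" the slide refers to.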
Think Big
Quantum Computing:
History
Development
How it works
Usage