# Quantum Information Theory Introduction

Abida Haque
[email protected]

## Outline

- Motivation
- Mixed States
- Classical Information Theory
- Quantum Information Theory

## Motivation

- How many bits are in a qubit?
- If we have a random variable X distributed according to some probability distribution P, how much information do you learn from seeing X?

## Sending Information

Alice wants to send Bob a string x.

- Classically: Alice sends the bits of x directly.
- Quantumly: Alice does a computation to create a quantum state encoding x.

Bob can only reliably discriminate quantum states if they are orthogonal. To handle the more general case, we need mixed states.

## Mixed States

- Using state vectors alone to describe a quantum system is not enough.
- Mixed states model quantum noise that arises when you implement a quantum system: e.g., a device that outputs the pure state $|\psi_i\rangle$ with probability $p_i$.
- Represent such an ensemble by its density matrix: $\rho = \sum_i p_i |\psi_i\rangle\langle\psi_i|$.
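The density-matrix construction above can be sketched in a few lines; this is an illustrative example (numpy assumed, and the ensemble of $|0\rangle$ and $|+\rangle$ is hypothetical, not from the slides):

```python
import numpy as np

def density_matrix(probs, states):
    """Build rho = sum_i p_i |psi_i><psi_i| from an ensemble of pure states."""
    d = len(states[0])
    rho = np.zeros((d, d), dtype=complex)
    for p, psi in zip(probs, states):
        v = np.asarray(psi, dtype=complex).reshape(d, 1)
        rho += p * (v @ v.conj().T)  # p_i |psi_i><psi_i|
    return rho

# Illustrative device: outputs |0> or |+> with probability 1/2 each
ket0 = np.array([1.0, 0.0])
ket_plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = density_matrix([0.5, 0.5], [ket0, ket_plus])
# rho is Hermitian, positive semidefinite, and has trace 1
```

Any valid density matrix is positive semidefinite with unit trace, which is easy to check numerically on the result.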

## More General Scenario

Alice samples $x$ with probability $p_x$ and sends the state $\rho_x$.

Bob picks a POVM $\{E_y\}$, measures, and receives outcome $y$ with probability $\mathrm{tr}(E_y \rho_x)$. Bob tries to infer X from Y.
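Bob's measurement step can be sketched as follows; a minimal illustration assuming numpy, with a computational-basis measurement (a special case of a POVM) chosen for the example:

```python
import numpy as np

def outcome_probs(rho, povm):
    """Pr[Y = y] = tr(E_y rho) for each POVM element E_y."""
    return np.array([np.trace(E @ rho).real for E in povm])

# Computational-basis measurement: E_0 = |0><0|, E_1 = |1><1|
E0 = np.array([[1.0, 0.0], [0.0, 0.0]])
E1 = np.array([[0.0, 0.0], [0.0, 1.0]])

# Suppose Alice sends rho_x = |+><+|
rho_plus = 0.5 * np.array([[1.0, 1.0], [1.0, 1.0]])
probs = outcome_probs(rho_plus, [E0, E1])  # [0.5, 0.5]
```

Since the POVM elements sum to the identity, the outcome probabilities always sum to 1.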

## Perspectives

- Bob sees the average state $\rho = \sum_x p_x \rho_x$.
- Alice sees which $x$ she sampled.
- Joint mixed system: together, Alice and Bob are in the state $|x\rangle\langle x| \otimes \rho_x$ with probability $p_x$.

## Classical Information Theory

Alice samples a random message X with probability P(X). How much information can Bob learn about X?

## Examples

- If P is the uniform distribution on n-bit strings, then Bob gets n bits of information from seeing X.
- If P has all its probability on a single string, then Bob gets 0 bits of information from seeing X.

## Shannon Entropy

$$H(X) = \sum_x p(x) \log \frac{1}{p(x)}$$

Properties: H is concave.

(Image: Claude Shannon)

### Examples

X is uniformly distributed on $\{0,1\}^n$: $H(X) = n$ bits.
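The entropy formula and the two example distributions can be checked numerically; a minimal sketch assuming numpy, with a hypothetical helper `shannon_entropy`:

```python
import numpy as np

def shannon_entropy(p):
    """H(X) = sum_x p(x) log2(1/p(x)); the 0 * log(1/0) term is taken as 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(np.sum(p * np.log2(1.0 / p)))

n = 4
uniform = np.full(2**n, 1.0 / 2**n)   # uniform on n-bit strings: H = n bits
point_mass = np.zeros(2**n)
point_mass[0] = 1.0                   # all mass on one string: H = 0 bits

shannon_entropy(uniform)     # -> 4.0
shannon_entropy(point_mass)  # -> 0.0
```

Using `log2` gives entropy in bits, matching the "n bits of information" phrasing above.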

### Examples II

X has all its probability mass on a single string: $H(X) = 0$.

## Classical Information Theory

How much information does Bob learn from seeing X? At most $H(X)$. How much does Bob actually learn?

Consider two correlated random variables X and Y. How much does knowing Y tell us about X?

## Joint Distribution, Mutual Information

$$I(X;Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}$$

The more the joint distribution $p(x,y)$ differs on average from the product of the marginals $p(x)p(y)$, the greater the information gain.

### Examples

- If X and Y are independent, then $I(X;Y) = 0$.
- If X and Y are perfectly correlated, then $I(X;Y) = H(X)$.
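Both examples follow directly from the formula; a small numerical sketch (numpy assumed, joint distributions given as 2x2 tables):

```python
import numpy as np

def mutual_information(pxy):
    """I(X;Y) = sum_{x,y} p(x,y) log2( p(x,y) / (p(x) p(y)) )."""
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1, keepdims=True)   # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)   # marginal p(y)
    prod = px @ py                        # product distribution p(x) p(y)
    mask = pxy > 0                        # skip zero-probability cells
    return float(np.sum(pxy[mask] * np.log2(pxy[mask] / prod[mask])))

independent = np.full((2, 2), 0.25)               # X, Y independent fair bits
correlated = np.array([[0.5, 0.0], [0.0, 0.5]])   # Y = X, each bit fair

mutual_information(independent)  # -> 0.0
mutual_information(correlated)   # -> 1.0  (= H(X) for a fair bit)
```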

## Indistinguishable States

What if you see two different ensembles that give the same density matrix $\rho$? Then no measurement can tell them apart: the density matrix captures everything that is physically observable.

So we want an analog of Shannon's entropy defined on density matrices.

## Von Neumann Entropy

Let $\lambda_1, \dots, \lambda_d$ be the eigenvalues of $\rho$. Then

$$S(\rho) = \sum_{i=1}^{d} \lambda_i \log \frac{1}{\lambda_i} = H(\lambda)$$

(Image: John von Neumann)

Equivalently:

$$S(\rho) = \mathrm{tr}\!\left(\rho \log \frac{1}{\rho}\right) = -\mathrm{tr}(\rho \log \rho)$$
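The eigenvalue form of $S(\rho)$ translates directly to code; an illustrative sketch assuming numpy, with two standard single-qubit states chosen as test cases:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -tr(rho log2 rho), computed via the eigenvalues of rho."""
    eigs = np.linalg.eigvalsh(rho)   # rho is Hermitian
    eigs = eigs[eigs > 1e-12]        # 0 log 0 = 0 by convention
    return float(-np.sum(eigs * np.log2(eigs)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure state |0><0|: S = 0
mixed = np.eye(2) / 2                       # maximally mixed qubit: S = 1 bit

von_neumann_entropy(pure)   # -> 0.0
von_neumann_entropy(mixed)  # -> 1.0
```

For a pure state the eigenvalues are (1, 0, ..., 0), so $S = 0$, matching the classical point-mass case; the maximally mixed state matches the classical uniform case.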

### Example

Maximally mixed state on $n$ qubits: $\rho = I/2^n$, so $S(\rho) = n$.

## Quantum Mutual Information

If $\rho_{AB}$ is the joint state of two quantum systems A and B, then

$$I(A;B) = S(\rho_A) + S(\rho_B) - S(\rho_{AB})$$

## Holevo Information

The amount of quantum information Bob gets from seeing Alice's state:

$$\chi = S(\rho) - \sum_x p_x S(\rho_x)$$

where $\rho = \sum_x p_x \rho_x$, given Alice's choices of $p_x$ and $\rho_x$.

## Recall

- Alice samples $x$ with probability $p_x$ and sends $\rho_x$.
- Bob picks a POVM $\{E_y\}$, measures, and receives $y$ with probability $\mathrm{tr}(E_y \rho_x)$.
- Bob tries to infer X from Y.
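The Holevo quantity $\chi$ for this setup can be computed directly; a sketch assuming numpy, using a hypothetical ensemble of the non-orthogonal states $|0\rangle\langle 0|$ and $|+\rangle\langle +|$ (not from the slides):

```python
import numpy as np

def S(rho):
    """Von Neumann entropy in bits, via eigenvalues."""
    eigs = np.linalg.eigvalsh(rho)
    eigs = eigs[eigs > 1e-12]
    return float(-np.sum(eigs * np.log2(eigs)))

def holevo(probs, states):
    """chi = S(rho) - sum_x p_x S(rho_x), where rho = sum_x p_x rho_x."""
    rho = sum(p * r for p, r in zip(probs, states))
    return S(rho) - sum(p * S(r) for p, r in zip(probs, states))

# Illustrative ensemble: |0><0| and |+><+|, each with probability 1/2
rho0 = np.array([[1.0, 0.0], [0.0, 0.0]])
rho_plus = 0.5 * np.array([[1.0, 1.0], [1.0, 1.0]])

chi = holevo([0.5, 0.5], [rho0, rho_plus])
# chi is about 0.6 bits: non-orthogonal states carry less than
# one retrievable classical bit per qubit sent
```

Here each $\rho_x$ is pure, so $\chi = S(\rho)$, and the overlap between the two states keeps $\chi$ strictly below 1 bit.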

## Holevo's Bound

- $n$ qubits can represent no more than $n$ classical bits: Holevo's bound proves that Bob can retrieve at most $n$ classical bits from an $n$-qubit state.
- This is odd, because it seems like quantum computing should be more powerful than classical.
- It assumes Alice and Bob do not share entangled qubits (with shared entanglement, superdense coding lets $n$ qubits carry $2n$ classical bits).

(Image: Alexander Holevo)

## Thank You!

Abida Haque [email protected]