Quantum Information Theory Introduction
Abida Haque
[email protected]

Outline

- Motivation
- Mixed States
- Classical Information Theory
- Quantum Information Theory

Motivation
How many bits are in a qubit? If we have a random variable X distributed according to some probability distribution P, how much information do we learn from seeing X?

Sending Information
Alice wants to send Bob a string x. Classically, she simply transmits the bits of x.

Quantumly: Alice does a computation to create a quantum state encoding x. Bob can only perfectly discriminate quantum states if they are orthogonal. But more generally we need mixed states.

Mixed States

Using state vectors alone is not enough to describe a quantum system: we also need to model quantum noise when implementing one. E.g., a noisy device outputs the pure state $|\psi_i\rangle$ with probability $p_i$. Writing each $|\psi_i\rangle$ in a fixed basis, we represent the ensemble by the density matrix
$$\rho = \sum_i p_i\,|\psi_i\rangle\langle\psi_i|$$
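As a concrete sketch of this representation, here is a small NumPy example that builds the density matrix for a device that outputs |0⟩ or |+⟩ with equal probability. The specific ensemble is an assumed illustration, not taken from the slides:

```python
import numpy as np

# Assumed example ensemble: the device outputs |0> with prob 1/2
# and |+> = (|0> + |1>)/sqrt(2) with prob 1/2.
p = [0.5, 0.5]
psi = [np.array([1.0, 0.0]), np.array([1.0, 1.0]) / np.sqrt(2)]

# Density matrix: rho = sum_i p_i |psi_i><psi_i|
rho = sum(pi * np.outer(v, v.conj()) for pi, v in zip(p, psi))

print(rho)            # [[0.75 0.25], [0.25 0.25]]
print(np.trace(rho))  # 1.0 -- density matrices have unit trace
```

Note that ρ is Hermitian, positive semidefinite, and has trace 1; these three properties characterize density matrices.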

More General Scenario
Alice samples $x$ with probability $p_x$ and sends the state $\rho_x$. Bob picks a POVM $\{E_y\}_y$, measures, and receives outcome $y$ with probability $\mathrm{tr}(E_y \rho_x)$ given $x$. Bob tries to infer X from Y.
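The measurement step can be sketched numerically. The POVM here (computational-basis projectors) and the input state |+⟩ are assumed examples:

```python
import numpy as np

# A simple POVM: projective measurement in the computational basis,
# E_0 = |0><0|, E_1 = |1><1| (the elements sum to the identity).
E = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]

# Suppose Alice sent rho_x = |+><+| (assumed example state).
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_x = np.outer(plus, plus)

# Born rule: Pr[Y = y | X = x] = tr(E_y rho_x)
probs = [np.trace(Ey @ rho_x).real for Ey in E]
print(probs)  # [0.5, 0.5]
```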

Perspectives
Bob sees the average state $\rho = \sum_x p_x \rho_x$; Alice sees her classical sample $x$.

Joint Mixed System
Note that Alice and Bob jointly see $(x, \rho_x)$ with probability $p_x$, i.e., the classical-quantum state $\sum_x p_x\,|x\rangle\langle x| \otimes \rho_x$.

Classical Information Theory
Alice samples a random message X with probability P(X). How much information can Bob learn about X?

Examples
- If P is the uniform distribution over n-bit strings, then Bob gets n bits of information from seeing X.
- If P has all its probability on a single string, Bob gets 0 bits of information from seeing X.

Shannon Entropy

$$H(P) = \sum_x P(x)\,\log\frac{1}{P(x)}$$

Properties: H is concave.

(Claude Shannon)

Examples
X is uniformly distributed over n-bit strings: $H(P) = \sum_x 2^{-n}\log 2^{n} = n$.

Examples II
X has all its probability mass on a single string: $H(P) = 1 \cdot \log 1 = 0$.

Classical Information Theory
How much information does Bob learn from seeing X? The maximum is H(X); how much does he actually learn? Consider two correlated random variables X and Y: how much does knowing Y tell us about X?
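The two entropy examples above can be checked with a minimal sketch, using base-2 logs so entropy is measured in bits:

```python
import numpy as np

def shannon_entropy(p):
    """H(P) = sum_x P(x) log2(1/P(x)); zero-probability outcomes contribute 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

n = 3
uniform = np.full(2**n, 1 / 2**n)       # uniform over n-bit strings
point = np.zeros(2**n); point[0] = 1.0  # all mass on one string

print(shannon_entropy(uniform))  # 3.0 -> Bob learns n bits
print(shannon_entropy(point))    # 0.0 -> Bob learns nothing
```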

Joint Distribution, Mutual Information
$$I(X;Y) = \sum_{x,y} P(x,y)\,\log\frac{P(x,y)}{P(x)\,P(y)}$$

The more the joint distribution P(x, y) differs on average from the product P(x)P(y), the greater the information gain.

Examples
- If X and Y are independent, then $I(X;Y) = 0$.
- If X and Y are perfectly correlated, then $I(X;Y) = H(X)$.
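Both examples can be verified numerically; the 2×2 joint distributions below are assumed illustrative choices:

```python
import numpy as np

def mutual_information(pxy):
    """I(X;Y) = sum_{x,y} P(x,y) log2( P(x,y) / (P(x) P(y)) )."""
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1, keepdims=True)  # marginal P(x)
    py = pxy.sum(axis=0, keepdims=True)  # marginal P(y)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])))

independent = np.array([[0.25, 0.25],
                        [0.25, 0.25]])  # P(x,y) = P(x)P(y)
correlated = np.array([[0.5, 0.0],
                       [0.0, 0.5]])     # Y = X, each value with prob 1/2

print(mutual_information(independent))  # 0.0
print(mutual_information(correlated))   # 1.0 = H(X) for a fair bit
```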

Analog of Shannon's Entropy

Indistinguishable States
What if two different ensembles give rise to the same density matrix $\rho$? Then no measurement can tell them apart: $\rho$ determines all measurement statistics.

Von Neumann Entropy
If $\lambda_1, \dots, \lambda_d$ are the eigenvalues of $\rho$, then
$$S(\rho) = \sum_{i=1}^{d} \lambda_i \log\frac{1}{\lambda_i} = H(\lambda)$$

(John von Neumann)

Von Neumann Entropy
Equivalently:
$$S(\rho) = \mathrm{tr}\!\left(\rho \log\frac{1}{\rho}\right) = -\mathrm{tr}(\rho\log\rho)$$
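Since S(ρ) is just the Shannon entropy of ρ's eigenvalues, the eigenvalue form is the easiest to compute; a small sketch with two assumed example states:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = H(eigenvalues of rho), in bits; 0*log(1/0) = 0 by convention."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]  # drop (numerically) zero eigenvalues
    return float(-np.sum(lam * np.log2(lam)))

# A pure state has entropy 0 ...
pure = np.array([[1.0, 0.0], [0.0, 0.0]])  # |0><0|
print(von_neumann_entropy(pure))   # 0.0

# ... while the maximally mixed qubit I/2 has entropy 1.
mixed = np.eye(2) / 2
print(von_neumann_entropy(mixed))  # 1.0
```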

Example
The maximally mixed state on n qubits is $\rho = I/2^n$, with entropy $S(\rho) = n$.

Quantum Mutual Information
If $\rho_{AB}$ is the joint state of two quantum systems A and B,

then $I(A;B) = S(\rho_A) + S(\rho_B) - S(\rho_{AB})$.

Holevo Information
The amount of classical information Bob gets from seeing Alice's state:
$$\chi = S(\rho) - \sum_x p_x\,S(\rho_x)$$

given Alice's choices of $p_x$ and $\rho_x$.

Recall

Alice samples $x$ with probability $p_x$ and sends $\rho_x$. Bob picks a POVM $\{E_y\}_y$, measures, and receives $y$ with probability $\mathrm{tr}(E_y\rho_x)$ given $x$. Bob tries to infer X from Y.
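Putting the pieces together, the following sketch computes both the quantum mutual information of a Bell pair and the Holevo quantity χ for a two-state ensemble. The Bell state, the ensemble, and the two-qubit-only partial-trace helper are all assumed examples:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = H(eigenvalues), in bits; zero eigenvalues contribute 0."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return float(-np.sum(lam * np.log2(lam)))

def partial_trace(rho_ab, keep):
    """Partial trace of a two-qubit state: keep=0 -> rho_A, keep=1 -> rho_B."""
    r = rho_ab.reshape(2, 2, 2, 2)  # indices (a, b, a', b')
    return np.trace(r, axis1=1, axis2=3) if keep == 0 else np.trace(r, axis1=0, axis2=2)

# Quantum mutual information of the Bell state |phi+> = (|00> + |11>)/sqrt(2):
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_ab = np.outer(phi, phi)
I_ab = (von_neumann_entropy(partial_trace(rho_ab, 0))
        + von_neumann_entropy(partial_trace(rho_ab, 1))
        - von_neumann_entropy(rho_ab))
print(I_ab)  # 2.0, the maximum for two qubits

# Holevo quantity chi = S(rho) - sum_x p_x S(rho_x) for an assumed ensemble:
# Alice sends |0> or |+> with probability 1/2 each.
zero = np.array([1.0, 0.0])
plus = np.array([1.0, 1.0]) / np.sqrt(2)
states = [np.outer(zero, zero), np.outer(plus, plus)]
probs = [0.5, 0.5]
rho_avg = sum(p * r for p, r in zip(probs, states))
chi = von_neumann_entropy(rho_avg) - sum(
    p * von_neumann_entropy(r) for p, r in zip(probs, states))
print(chi)  # about 0.60 bits <= 1, consistent with Holevo's bound
```

Because the two ensemble states are non-orthogonal, χ falls strictly below the 1 bit that a single qubit could otherwise carry.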

Holevo's Bound
n qubits can represent no more than n classical bits: Holevo's bound proves that Bob can retrieve at most n classical bits from n qubits. This is odd, because quantum computing seems more powerful than classical computing. (The bound assumes Alice and Bob do not share entangled qubits.)

(Alexander Holevo)

Thank You!

Abida Haque [email protected]
