6.004 L13: Introduction to the Physics of Communication

6.004 Fall ‘98 L13 (10/27) Page 1

The Digital Abstraction Part 1: The Static Discipline

[Diagram: Tx → Noise → Rx, with output thresholds Vol, Voh and input thresholds Vil, Vih]

What is Information?

Information Resolves Uncertainty


How do we measure information?

Error-free data resolving 1 of 2 equally likely possibilities = 1 bit of information.

How much information now?

3 independent coins yield 3 bits of information.
# of possibilities = 8


How about N coins?
N independent coins yield N bits.
# of possibilities = 2^N
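The counting rule above — bits = log₂(# of possibilities) — is easy to check numerically. A minimal sketch in Python (the function name is mine, not from the lecture):

```python
import math

def bits_of_information(n_possibilities: int) -> float:
    """Error-free information in resolving 1 of n equally likely outcomes."""
    return math.log2(n_possibilities)

# 3 independent fair coins: 2**3 = 8 possibilities -> 3 bits
assert bits_of_information(2**3) == 3.0
```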


What about Crooked Coins? Ptail = .25

Phead = .75

# Bits = -Σᵢ pᵢ log₂ pᵢ (about 0.81 bits for this example)
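The weighted-average formula above can be checked directly for the crooked coin. A small sketch (the function name `entropy_bits` is mine):

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Crooked coin from the slide: P(head) = 0.75, P(tail) = 0.25
h = entropy_bits([0.75, 0.25])   # ≈ 0.81 bits, less than a fair coin's 1 bit
```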

How Much Information ?

. . . 00000000000000000000000000000 . . .

None (on average)


How Much Information Now ?

...0101010 1010101010101...

Predictor: None (on average)

How About English?
• 6.JQ4 ij a vondurfhl co8rse wibh sjart sthdenjs.
• If every English letter had maximum uncertainty, average information / letter would be log₂(26).
• Actually, English has only 2 bits of information per letter if the last 8 characters are used as a predictor.
• English actually has 1 bit / character if even more info is used for prediction.


Why Do These Work?

Answer: They Lower Redundancy

Data Compression

Lot's O'Redundant Bits → Encoder → Fewer Redundant Bits → Decoder → Lot's O'Redundant Bits

An Interesting Consequence:
• A data stream containing the most information possible (i.e., the least redundancy) has the statistics of Random Noise!


Digital Error Correction

Original Message → Encoder → Original Message + Redundant Bits → Corrector → Original Message
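The encoder/corrector pipeline can be illustrated with the simplest possible redundancy scheme, a 3× repetition code (my example, not the lecture's; real systems use far more efficient codes such as Hamming codes):

```python
def encode(bits):
    """3x repetition code: send three redundant copies of every bit."""
    return [b for b in bits for _ in range(3)]

def correct(coded):
    """Majority vote over each triple corrects any single flipped bit."""
    return [int(sum(coded[i:i + 3]) >= 2) for i in range(0, len(coded), 3)]

msg = [0, 1, 1, 0]
coded = encode(msg)
coded[4] ^= 1                 # the channel flips one bit
assert correct(coded) == msg  # the corrector recovers the original message
```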

How do we encode digital information in an analog world? Once upon a time, there were these aliens interested in bringing back to their planet the entire Library of Congress ...


The Effect of “Analog” Noise

01101110

Max. Channel Capacity for Uniform, Bounded Amplitude Noise

[Diagram: Tx → Noise → Rx, with signal range P and noise width N]

Max # Error-Free Symbols = P/N
Max # Bits / Symbol = log₂(P/N)


Max. Channel Capacity for Uniform, Bounded Amplitude Noise (cont)

P = Range of Transmitter's Signal Space
N = Peak-Peak Width of Noise
W = Bandwidth in # Symbols / Sec
C = Channel Capacity = Max. # of Error-Free Bits/Sec

C = W log₂(P/N)

Note: This formula is slightly different for Gaussian noise.
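A quick numeric check of the capacity formula (the example numbers are mine, chosen so P/N is a power of two):

```python
import math

def channel_capacity(W, P, N):
    """C = W * log2(P/N): max error-free bits/sec for uniform, bounded noise.

    W: symbols per second; P: range of the transmitter's signal space;
    N: peak-to-peak noise width, in the same units as P.
    """
    return W * math.log2(P / N)

# 1e6 symbols/s, 1.0 V signal range, 0.125 V noise width:
# P/N = 8 distinguishable levels -> 3 bits/symbol -> 3 Mbit/s
assert channel_capacity(1e6, 1.0, 0.125) == 3e6
```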

Further Reading on Information Theory

The Mathematical Theory of Communication, Claude E. Shannon and Warren Weaver, 1949 (reprinted 1972).
Coding and Information Theory, Richard Hamming, 2nd ed., 1986 (1st ed. 1980).


The mythical equipotential wire

[Diagram: a single wire tapped at three points V1, V2, V3, all assumed to be at the same potential]

But every wire has parasitics:

[Diagram: series inductance and shunt capacitance distributed along the wire]

V = L dI/dt
I = C dV/dt

Why do wires act like transmission lines?

Signals take time to propagate.
Propagating signals must have energy.
Inductance and capacitance store energy.

Without termination, energy reaching the end of a transmission line has nowhere to go - so it Echoes.

Fundamental Equations of Lossless Transmission Lines

V = V(x, t),  I = I(x, t)
l = dL/dx (inductance per unit length)
c = dC/dx (capacitance per unit length)

∂V/∂x = l ∂I/∂t
∂I/∂x = c ∂V/∂t

Transmission Line Math

Let's try a sinusoidal solution for V and I:

V = V₀ e^{j(ω_t t + ω_x x)} = V₀ e^{jω_t t} e^{jω_x x}
I = I₀ e^{j(ω_t t + ω_x x)} = I₀ e^{jω_t t} e^{jω_x x}

Substituting into ∂V/∂x = l ∂I/∂t and ∂I/∂x = c ∂V/∂t:

jω_x V₀ = l jω_t I₀
jω_x I₀ = c jω_t V₀

Transmission Line Algebra

jω_x V₀ = l jω_t I₀
jω_x I₀ = c jω_t V₀

Multiplying the two equations together, and dividing one by the other:

ω_t / ω_x = 1 / √(l c)   (the propagation velocity)

V₀ / I₀ = √(l / c)   (the characteristic impedance)

The Open Transmission Line


The Shorted Transmission Line


Parallel Termination


Series Termination


Series or Parallel ?

• Series:
  – No Static Power Dissipation
  – Only One Output Point
  – Slower Slew Rate if Output is Capacitively Loaded

• Parallel:
  – Static Power Dissipation
  – Many Output Points
  – Faster Slew Rate if Output is Capacitively Loaded

• Fancier Parallel Methods:
  – AC Coupled - Parallel w/o static dissipation
  – Diode Termination - “Automatic” impedance matching

When is a wire a transmission line?

t_fl = l / v   (time of flight: wire length l over propagation velocity v)

Rule of Thumb:
t_r < 2.5 t_fl → treat the wire as a Transmission Line
t_r > 5 t_fl → treat the wire as an Equipotential Line
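The rule of thumb is simple enough to code up. A sketch (the function name and default velocity are mine; 14 cm/ns is the typical board-trace velocity from the last slide of this lecture):

```python
def wire_model(t_rise_ns, length_cm, v_cm_per_ns=14.0):
    """Classify a wire using the slide's rule of thumb.

    Compares the driver's rise time t_r to the wire's time of flight
    t_fl = length / velocity.
    """
    t_fl = length_cm / v_cm_per_ns           # one-way time of flight, ns
    if t_rise_ns < 2.5 * t_fl:
        return "transmission line"
    if t_rise_ns > 5 * t_fl:
        return "equipotential line"
    return "gray area"

# A 10 cm trace has t_fl ≈ 0.71 ns: a 1 ns edge needs termination,
# while a 5 ns edge can treat the wire as equipotential.
```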


Making Transmission Lines On Circuit Boards

[Cross-section: copper trace of width w and thickness t, at height h above a voltage plane, surrounded by insulating dielectric ε_r]

c ∝ ε_r w/h
l ∝ h/w
Z₀ ∝ h / (w √ε_r)
v ∝ 1/√ε_r

Actual Formulas


A Typical Circuit Board

1 Ounce Copper trace: w = 0.15 cm, t = 0.0038 cm
G-10 Fiberglass-Epoxy dielectric: h = 0.038 cm

c = 1.9 pF / cm
l = 2.75 nH / cm

Z₀ = 38 Ω
v = 1.4 × 10¹⁰ cm / sec (14 cm / ns)
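The board's Z₀ and v follow directly from its per-unit-length l and c, using Z₀ = √(l/c) and v = 1/√(lc) from the transmission-line algebra earlier in the lecture. A quick check with the slide's numbers:

```python
import math

# Per-unit-length values for the typical G-10 board above
c = 1.9e-12    # F/cm
l = 2.75e-9    # H/cm

Z0 = math.sqrt(l / c)      # characteristic impedance ≈ 38 ohms
v = 1 / math.sqrt(l * c)   # propagation velocity ≈ 1.4e10 cm/s (~14 cm/ns)
```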