How About English?
• 6.JQ4 ij a vondurfhl co8rse wibh sjart sthdenjs.
  (Even badly corrupted, that sentence is still readable, because English is redundant.)
• If every English letter had maximum uncertainty, the average information / letter would be log2(26) ≈ 4.7 bits.
• Actually, English has only ~2 bits of information per letter if the last 8 characters are used as a predictor.
• English actually has ~1 bit / character if even more info is used for prediction.
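The numbers above can be checked with a quick sketch (the ~2 and ~1 bit figures are the slide's estimates, simply restated here, not computed):

```python
import math

# Maximum entropy: 26 equally likely letters
max_bits = math.log2(26)           # ~4.70 bits/letter

# Estimates quoted on the slide (assumed values, not derived here):
bits_with_8_char_predictor = 2     # ~2 bits/letter
bits_with_more_context = 1         # ~1 bit/letter

# Fraction of each letter that carries no new information
redundancy = 1 - bits_with_more_context / max_bits

print(f"max entropy: {max_bits:.2f} bits/letter")
print(f"redundancy:  {redundancy:.0%}")
```

Roughly four out of every five characters of English text are predictable from context, which is exactly why the corrupted sentence above is still readable.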
6.004 Fall ‘98 L13 (10/27) Page 10
Why Do These Work?
Answer: They Lower Redundancy
Data Compression
Lots o' Redundant Bits → Encoder → Fewer Redundant Bits → Decoder → Lots o' Redundant Bits
An Interesting Consequence:
• A data stream containing the most information possible (i.e. the least redundancy) has the statistics of Random Noise!
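A quick way to see this: a good compressor squeezes the redundancy out, and its output, like random noise, barely compresses at all. A minimal sketch using Python's zlib:

```python
import os
import zlib

# Highly redundant input vs. pure random noise of the same length
english = b"the quick brown fox jumps over the lazy dog " * 200
noise = os.urandom(len(english))

ratio_text = len(zlib.compress(english)) / len(english)
ratio_noise = len(zlib.compress(noise)) / len(noise)

print(f"redundant text compresses to {ratio_text:.1%} of original size")
print(f"random noise compresses to  {ratio_noise:.1%} of original size")
```

The redundant text shrinks dramatically; the noise does not (zlib's small header can even make it slightly larger), matching the slide's claim.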
Digital Error Correction
Original Message → Encoder → Original Message + Redundant Bits → Corrector → Original Message
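A minimal sketch of this encoder/corrector pipeline, using a simple 3x repetition code (an illustrative choice, not the slide's scheme):

```python
def encode(bits):
    # Encoder: add redundancy by repeating each bit three times
    return [b for b in bits for _ in range(3)]

def correct(coded):
    # Corrector: a majority vote over each group of three recovers the
    # original message despite any single flipped bit per group
    return [int(sum(coded[i:i + 3]) >= 2) for i in range(0, len(coded), 3)]

message = [0, 1, 1, 0]
sent = encode(message)
sent[4] ^= 1                       # noise flips one of the redundant bits
print(correct(sent) == message)    # the corrector still recovers the message
```

Real codes (Hamming, Reed-Solomon) buy far more correction per redundant bit, but the structure is the same: add redundancy, then use it to vote the errors away.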
How do we encode digital information in an analog world?
Once upon a time, there were these aliens interested in bringing back to their planet the entire Library of Congress ...
The Effect of “Analog” Noise
[Figure: the bit pattern 01101110 as transmitted, and again after analog noise is added]
Max. Channel Capacity for Uniform, Bounded Amplitude Noise

[Figure: Tx → (+ Noise) → Rx; signal range P, noise amplitude N]

Max # Error-Free Symbols = P/N
Max # Bits / Symbol = log2(P/N)
Max. Channel Capacity for Uniform, Bounded Amplitude Noise (cont)
P = Range of Transmitter's Signal Space
N = Peak-Peak Width of Noise
W = Bandwidth in # Symbols / Sec
C = Channel Capacity = Max. # of Error-Free Bits/Sec
C = W log2(P/N)

Note: This formula is slightly different for Gaussian noise.
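A quick check of the capacity formula C = W log2(P/N) with hypothetical channel parameters:

```python
import math

def capacity(W, P, N):
    # C = W * log2(P/N): max error-free bits/sec for uniform, bounded noise
    return W * math.log2(P / N)

# Hypothetical channel: 1e6 symbols/sec, 1 V signal range, 62.5 mV noise,
# so each symbol can safely take one of P/N = 16 distinct levels
C = capacity(W=1e6, P=1.0, N=0.0625)
print(f"C = {C:.0f} bits/sec")     # 4 bits/symbol * 1e6 symbols/sec
```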
Further Reading on Information Theory
The Mathematical Theory of Communication, Claude E. Shannon and Warren Weaver, 1949 (reprinted 1972).
Coding and Information Theory, Richard Hamming, Second Edition, 1986 (First Edition 1980).
The mythical equipotential wire

[Figure: one wire driving three points V1, V2, V3, ideally all at the same voltage]
But every wire has parasitics:

Inductance:  V = L dI/dt
Capacitance: I = C dV/dt
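A quick worked example of why the equipotential wire is mythical, with assumed (hypothetical) parasitic values:

```python
# V = L dI/dt across a hypothetical 5 nH of wire inductance
# when a driver switches 20 mA in 1 ns
L_wire = 5e-9        # H, assumed parasitic inductance
dI = 20e-3           # A, current swing
dt = 1e-9            # s, switching time

V_bounce = L_wire * dI / dt
print(f"V = {V_bounce * 1e3:.0f} mV")   # a 100 mV glitch on a "0 V" wire
```

Even a short wire develops a noticeable voltage across it while currents are changing, so its two ends are not at the same potential.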
Why do wires act like transmission lines?

[Figure: a wire modeled as an endless ladder of series inductors and shunt capacitors]

• Signals take time to propagate
• Propagating signals must have energy
• Inductance and capacitance store energy
Without termination, energy reaching the end of a transmission line has nowhere to go - so it Echoes
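The size and sign of the echo can be sketched with the standard voltage reflection coefficient, Γ = (R_load − Z0) / (R_load + Z0). This formula is not derived on these slides, but it is consistent with the open, shorted, and terminated cases that follow:

```python
def reflection(R_load, Z0=50.0):
    # Fraction of the incident voltage wave that echoes back
    # from the end of a line with characteristic impedance Z0
    if R_load == float("inf"):
        return 1.0                       # open end: full echo, same sign
    return (R_load - Z0) / (R_load + Z0)

print(reflection(float("inf")))          # open:    +1.0
print(reflection(0.0))                   # shorted: -1.0
print(reflection(50.0))                  # matched:  0.0, no echo
```

A matched termination (R_load = Z0) absorbs all the arriving energy, which is exactly the point of the termination schemes on the next slides.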
Fundamental Equations of Lossless Transmission Lines

[Figure: a ladder of series inductors and shunt capacitors along the x axis]

V = V(x, t)
I = I(x, t)
l = dL/dx (inductance per unit length)
c = dC/dx (capacitance per unit length)

∂V/∂x = l ∂I/∂t
∂I/∂x = c ∂V/∂t
Transmission Line Math
Let's try a sinusoidal solution for V and I:

V = V0 e^j(ωt·t + ωx·x) = V0 e^(jωt·t) e^(jωx·x)
I = I0 e^j(ωt·t + ωx·x) = I0 e^(jωt·t) e^(jωx·x)

Substituting into ∂V/∂x = l ∂I/∂t and ∂I/∂x = c ∂V/∂t:

jωx V0 = l jωt I0
jωx I0 = c jωt V0
Transmission Line Algebra

jωx V0 = l jωt I0
jωx I0 = c jωt V0

Cancel the common factor of j:

ωx V0 = l ωt I0
ωx I0 = c ωt V0

Multiply the two equations and cancel V0 I0:  ωt = ωx · 1/√(l c)
Divide the two equations:                     V0 / I0 = √(l / c)
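These two results give the wave speed (ωt/ωx = 1/√(lc)) and the characteristic impedance (V0/I0 = √(l/c)) directly. A quick sketch with hypothetical per-unit-length values, roughly those of an RG-58-like coax:

```python
import math

# Assumed per-unit-length parasitics (hypothetical, RG-58-like cable)
l = 250e-9    # H/m  (dL/dx)
c = 100e-12   # F/m  (dC/dx)

Z0 = math.sqrt(l / c)       # V0/I0 = sqrt(l/c): characteristic impedance
v = 1 / math.sqrt(l * c)    # wave propagation speed along the line

print(f"Z0 = {Z0:.0f} ohms")
print(f"v  = {v:.2e} m/s")  # about 2/3 the speed of light
```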
The Open Transmission Line
The Shorted Transmission Line
Parallel Termination
Series Termination
Series or Parallel?

• Series:
  – No Static Power Dissipation
  – Only One Output Point
  – Slower Slew Rate if Output is Capacitively Loaded

• Parallel:
  – Static Power Dissipation
  – Many Output Points
  – Faster Slew Rate if Output is Capacitively Loaded
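The slew-rate claims can be illustrated with a rough RC model (an assumption for illustration, not taken from the slides): with series termination the load capacitance charges through the full line impedance Z0, while a parallel-terminated line presents roughly Z0 in parallel with the Z0 terminator, i.e. Z0/2, at the output.

```python
Z0 = 50.0         # ohms, line characteristic impedance
C_load = 10e-12   # farads, hypothetical capacitive load at the output

# Rough effective charging resistances under the model above
tau_series = Z0 * C_load            # series termination
tau_parallel = (Z0 / 2) * C_load    # parallel termination: Z0 || Z0

print(f"series   tau = {tau_series * 1e12:.0f} ps")
print(f"parallel tau = {tau_parallel * 1e12:.0f} ps")  # half the time constant
```

Under this model the parallel case slews about twice as fast, at the cost of the static power burned in the terminating resistor.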