CS154. More on Communication Complexity, Start up Turing Machines

Author: Gabriel York

Communication Complexity

A theoretical model of distributed computing
• Function f : {0,1}* × {0,1}* → {0,1}
 – Two inputs, x ∈ {0,1}* and y ∈ {0,1}*
 – We assume |x| = |y| = n. Think of n as HUGE

• Two computers: Alice and Bob
 – Alice only knows x, Bob only knows y

• Goal: Compute f(x, y) by communicating as few bits as possible between Alice and Bob

We do not count computation cost. We only care about the number of bits communicated.

[Figure: Alice holds x, Bob holds y, and they exchange bits. Example run: A(x, ε) = 0, B(y, 0) = 1, A(x, 01) = 1, B(y, 011) = 0; here f(x, y) = 0.]

Def. A protocol for a function f is a pair of functions A, B : {0,1}* × {0,1}* → {0, 1, STOP} with the semantics:

On input (x, y), let r := 0, b_0 = ε.
While (b_r ≠ STOP):
 r++
 If r is odd, Alice sends b_r = A(x, b_1 ⋯ b_{r−1})
 else Bob sends b_r = B(y, b_1 ⋯ b_{r−1})
Output b_{r−1}.

Number of rounds = r − 1
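The semantics above can be run mechanically. Below is a hedged Python sketch (the names `run_protocol`, `A`, `B` are my own, not from the slides): A and B map (own input, transcript so far) to a message in {0, 1, STOP}, and the runner alternates them until STOP is sent.

```python
STOP = "STOP"

def run_protocol(A, B, x, y):
    """Run the alternating protocol; return (output bit, number of rounds)."""
    transcript = []            # b_1, ..., b_r
    r, msg = 0, None
    while msg != STOP:
        r += 1
        if r % 2 == 1:         # odd round: Alice speaks
            msg = A(x, tuple(transcript))
        else:                  # even round: Bob speaks
            msg = B(y, tuple(transcript))
        transcript.append(msg)
    return transcript[-2], r - 1   # output b_{r-1}; number of rounds = r - 1

# A toy 2-round protocol: Alice sends x[0], Bob replies x[0] XOR y[0],
# then Alice sends STOP.
A = lambda x, t: int(x[0]) if not t else STOP
B = lambda y, t: t[0] ^ int(y[0])
```

For example, on inputs "10" and "00" this toy protocol outputs 1 after 2 rounds.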


Def. The cost of a protocol P for f on n-bit strings is
 max over x, y ∈ {0,1}^n of [number of rounds taken by P to compute f(x, y)]

The communication complexity of f on n-bit strings is the minimum cost over all protocols for f on n-bit strings
= the minimum number of rounds used by any protocol that computes f(x, y), over all n-bit x, y.

Example. Let f : {0,1}* × {0,1}* → {0,1} be arbitrary.

There is always a "trivial" protocol:
 Alice sends the bits of her x in the odd rounds
 Bob sends the bits of his y in the even rounds
After 2n rounds, they both know each other's input!

The communication complexity of every f is at most 2n.
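The trivial protocol is easy to simulate. A sketch in Python (function names are illustrative, not from the slides): interleave the bits of x and y, then let Bob evaluate f on the input he has reconstructed.

```python
def trivial_protocol(x, y, f):
    """Simulate the trivial protocol; return (f's value, bits exchanged)."""
    n = len(x)
    assert len(y) == n
    transcript = []
    for i in range(n):
        transcript.append(x[i])   # odd round: Alice sends the next bit of x
        transcript.append(y[i])   # even round: Bob sends the next bit of y
    # Bob reconstructs x from the odd-round bits and evaluates f locally:
    x_seen_by_bob = "".join(transcript[0::2])
    return f(x_seen_by_bob, y), len(transcript)

EQUALS = lambda x, y: 1 if x == y else 0
```

Note the cost is always exactly 2n bits, independent of f.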

Example. EQUALS(x, y) = 1 ⇔ x = y

What's a good protocol for computing EQUALS? ????

The communication complexity of EQUALS is at most 2n.

A Connection to Streaming Algorithms

Let L ⊆ {0,1}*.

Def. Define f_L : {0,1}* × {0,1}* → {0,1} for x, y with |x| = |y| as:
 f_L(x, y) = 1 ⇔ xy ∈ L

Examples:
 L = { z | z has an odd number of 1s }
 ⇒ f_L(x, y) = PARITY(x, y) = (∑_i x_i + ∑_i y_i) mod 2

 L = { z | z has more 1s than 0s }
 ⇒ f_L(x, y) = MAJORITY(x, y)

 L = { zz | z ∈ {0,1}* }
 ⇒ f_L(x, y) = EQUALS(x, y)


Theorem: If L has a streaming algorithm using ≤ s space, then the communication complexity of f_L is at most 4s + 5.

Proof: Alice runs the streaming algorithm A on x, then sends the memory content of A: this is s bits.
Bob starts up A with that memory content and runs A on y. He gets an output bit, and sends it to Alice. (…why 4s + 5 rounds?)
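The proof's simulation is easy to phrase in code. A sketch under assumed names (`parity_step`, `protocol_via_streaming` are mine): PARITY's 1-bit memory stands in for the s-bit state of a general streaming algorithm.

```python
def parity_step(mem, bit):
    """One step of a 1-bit streaming algorithm for PARITY."""
    return mem ^ int(bit)

def protocol_via_streaming(x, y, step, init_mem):
    """Alice streams x, ships the memory to Bob, Bob resumes on y."""
    mem = init_mem
    for b in x:            # Alice runs the streaming algorithm on x
        mem = step(mem, b)
    # --- Alice sends `mem` to Bob: s bits of communication ---
    for b in y:            # Bob starts up the algorithm with Alice's memory
        mem = step(mem, b)
    # --- Bob sends the output bit back to Alice ---
    return mem
```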

Corollary: For every regular language L, the communication complexity of f_L is O(1).
(A DFA for L is a streaming algorithm using O(1) space: it only stores its current state.)

Example: The communication complexity of PARITY is O(1).

Corollary: The communication complexity of MAJORITY is O(log n), because there's a streaming algorithm for { x | x has more 1s than 0s } with O(log n) space.
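The MAJORITY corollary rests on that counter-based streaming algorithm; a minimal sketch (my own phrasing of it):

```python
def majority_stream(bits):
    """Streaming check for 'more 1s than 0s' using one O(log n)-bit counter."""
    balance = 0                     # (#1s seen) minus (#0s seen)
    for b in bits:
        balance += 1 if b == "1" else -1
    return 1 if balance > 0 else 0
```

A counter bounded in absolute value by n fits in O(log n) bits, which is all the algorithm stores.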

What about the Comm. Complexity of EQUALS?

Communication Complexity of EQUALS

Theorem: The communication complexity of EQUALS is Θ(n). In particular, every protocol for EQUALS needs ≥ n bits of communication.

No communication protocol can do much better than "send your whole input"!

Corollary: L = { ww | w ∈ {0,1}* } is not regular.

Moreover, every streaming algorithm for L needs cn bits of memory, for some constant c > 0!

Idea: Consider all the possible ways A & B can communicate.

Definition: The communication pattern of a protocol on inputs (x, y) is the sequence of bits that Alice & Bob send.

Example: if Alice and Bob send the bits 0, 1, 1, 0, the pattern is 0110.

Proof: By contradiction. Suppose the communication complexity of EQUALS is ≤ n − 1. Then there are ≤ 2^(n−1) possible communication patterns of that protocol, over all pairs of inputs (x, y) with n bits each.

Claim: There are x ≠ y such that on both (x, x) and (y, y), the protocol of Alice and Bob uses the same pattern P.
(This is pigeonhole: there are 2^n inputs of the form (x, x) but at most 2^(n−1) patterns.)

Now, what is the communication pattern on (x, y)?

This pattern is also P! (WHY?) So Alice & Bob output the same bit on (x, x) and (x, y). But EQUALS(x, y) = 0 and EQUALS(x, x) = 1. Contradiction!
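The pigeonhole step can be demonstrated concretely: any assignment of at most 2^(n−1) patterns to the 2^n diagonal inputs (x, x) must repeat one. The `fake_pattern` map below is an arbitrary stand-in for a protocol's pattern function, chosen only to have few possible values.

```python
from itertools import product

n = 4
inputs = ["".join(bits) for bits in product("01", repeat=n)]  # all 2^n strings

fake_pattern = lambda x: x[:-1]   # stand-in: at most 2^(n-1) distinct values
seen, collision = {}, None
for x in inputs:
    p = fake_pattern(x)
    if p in seen:                 # two distinct inputs share a pattern
        collision = (seen[p], x)
        break
    seen[p] = x
```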

Randomized Protocols Help!

EQUALS needs ≥ n bits of communication, but…

Theorem: For all x, y of n bits each, there is a randomized protocol for EQUALS(x, y) using only O(log n) bits of communication, which works with probability 99.9%!
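The slide doesn't spell the protocol out, but the classic construction is fingerprinting modulo a random prime; here is a hedged sketch (the names and the prime bound n² are my choices). If x ≠ y, the protocol can only be fooled when the random prime divides x − y, which happens rarely.

```python
import random

def is_prime(m):
    return m >= 2 and all(m % d for d in range(2, int(m ** 0.5) + 1))

def randomized_equals(x, y, rng=random):
    """O(log n)-bit randomized protocol for EQUALS via fingerprints."""
    n = max(len(x), len(y))
    bound = max(n * n, 16)        # primes up to ~n^2 fit in O(log n) bits
    p = rng.choice([m for m in range(2, bound + 1) if is_prime(m)])
    fingerprint = int(x, 2) % p   # Alice sends (p, fingerprint): O(log n) bits
    return 1 if int(y, 2) % p == fingerprint else 0
```

To push the error probability down to 0.1%, one enlarges the prime bound or repeats with independently chosen primes.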

Turing Machines


Turing Machine (1936)

[Figure: a FINITE STATE CONTROL (currently in state q1) with a head over an INFINITE REWRITABLE TAPE; the tape initially holds the INPUT followed by blanks.]


Turing Machines versus DFAs

• A TM can both write to and read from the tape
• The head can move left and right
• The input doesn't have to be read entirely, and the computation can continue (even forever) after all of the input has been read
• Accept and Reject take immediate effect

[State diagram, with transitions labeled read → write, move and Σ = {0}: from the start state, reading 0 writes 0, moves right, and goes to a second state, while reading the blank ␣ goes to qreject; from the second state, reading ␣ goes to qaccept, while reading another 0 goes to qreject.]

This Turing machine decides the language {0}.
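The decider for {0} can be checked with a small simulator. The diagram is garbled in this copy, so the state names and transition table below are my best reconstruction of the intended machine:

```python
# States: q0 = start, q1 = "one 0 seen"; "_" is the blank symbol.
DELTA = {
    ("q0", "0"): ("q1", "0", 1),        # first 0: remember it, move right
    ("q0", "_"): ("qreject", "_", 1),   # empty input: reject
    ("q1", "0"): ("qreject", "0", 1),   # a second 0: reject
    ("q1", "_"): ("qaccept", "_", 1),   # exactly one 0: accept
}

def run_tm(delta, w, start="q0"):
    """Run a (right-moving) TM until it halts; True iff it accepts."""
    tape, head, state = list(w) + ["_"], 0, start
    while state not in ("qaccept", "qreject"):
        state, out, move = delta[(state, tape[head])]
        tape[head] = out
        head += move
        if head == len(tape):
            tape.append("_")            # extend the infinite tape lazily
    return state == "qaccept"
```

For example, `run_tm(DELTA, "0")` accepts, while `run_tm(DELTA, "00")` and `run_tm(DELTA, "")` reject.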

[State diagram: the same machine, except that instead of moving to qreject, inputs outside {0} send the machine into a loop (0 → 0, R; ␣ → ␣, L) that runs forever.]

This Turing machine recognizes the language {0}.

Deciding the language L = { w#w | w ∈ {0,1}* }

[Figure: successive tape snapshots, with matched bits on both sides of the # replaced by X, and so on…]

1. If there's no # on the tape (or more than one #), reject.
2. While there is a bit to the left of the #:
 Replace the first bit with X, and check whether the first bit b to the right of the # is identical. (If not, reject.)
 Replace that bit b with an X too.
3. If there's a bit to the right of the #, reject; else accept.
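The crossing-off pseudocode translates almost line for line into a direct simulation (an illustrative sketch over Python strings, not TM-level code):

```python
def decide_ww(tape_str):
    """Decide { w#w : w in {0,1}* } by crossing off matched bits with X."""
    if tape_str.count("#") != 1:
        return False                          # step 1: exactly one '#'
    tape = list(tape_str)
    sep = tape.index("#")
    left = list(range(sep))                   # positions left of '#'
    right = list(range(sep + 1, len(tape)))   # positions right of '#'
    for i, j in zip(left, right):             # step 2: match bit by bit
        if tape[i] != tape[j]:
            return False
        tape[i] = tape[j] = "X"
    return len(left) == len(right)            # step 3: no unmatched bits left
```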

Definition: A Turing Machine is a 7-tuple T = (Q, Σ, Γ, δ, q0, qaccept, qreject), where:

Q is a finite set of states
Σ is the input alphabet, where ␣ ∉ Σ
Γ is the tape alphabet, where ␣ ∈ Γ and Σ ⊆ Γ
δ : Q × Γ → Q × Γ × {L, R}
q0 ∈ Q is the start state
qaccept ∈ Q is the accept state
qreject ∈ Q is the reject state, and qreject ≠ qaccept
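As a sanity check of the definition, here is the earlier {0}-decider written out as an explicit 7-tuple (the concrete state names and transition table are my reconstruction of that machine):

```python
Q = {"q0", "q1", "qaccept", "qreject"}
SIGMA = {"0"}               # input alphabet; the blank "_" is not in SIGMA
GAMMA = {"0", "_"}          # tape alphabet; SIGMA ⊆ GAMMA and "_" ∈ GAMMA
delta = {                   # δ : Q × Γ → Q × Γ × {L, R}
    ("q0", "0"): ("q1", "0", "R"),
    ("q0", "_"): ("qreject", "_", "R"),
    ("q1", "0"): ("qreject", "0", "R"),
    ("q1", "_"): ("qaccept", "_", "R"),
}
M = (Q, SIGMA, GAMMA, delta, "q0", "qaccept", "qreject")

# δ must be defined on every (state, tape symbol) pair for non-halting states:
for q in Q - {"qaccept", "qreject"}:
    for a in GAMMA:
        assert (q, a) in delta
```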

Turing Machine Configurations

[Figure: a machine in state q7, tape contents 1 1 0 1 0 0 0 1 1 0, head on the sixth cell.]

This corresponds to the configuration:

11010 q7 00110 ∈ (Q ∪ Γ)*

Defining Acceptance and Rejection for TMs

Let C1 and C2 be configurations of a TM M.
Definition. C1 yields C2 if, when M is run from configuration C1 for one step, it ends up in configuration C2.

Suppose δ(q1, b) = (q2, c, L). Then aaq1bb yields aq2acb.
Suppose δ(q1, a) = (q2, c, R). Then cabq1a yields cabcq2.

Let w ∈ Σ* and M be a Turing machine.
M accepts w if there are configurations C0, C1, …, Ck such that:
• C0 = q0w
• Ci yields Ci+1 for i = 0, …, k − 1, and
• Ck contains the accept state qaccept
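The two "yields" examples can be checked mechanically. In the sketch below (my own encoding, not from the slides), a configuration is a triple (text left of the head, state, text from the head onward):

```python
def yields(config, delta):
    """Apply one TM step to a configuration (left, state, right)."""
    left, q, right = config
    b = right[0] if right else "_"     # blank when the head is past the end
    q2, c, move = delta[(q, b)]
    rest = right[1:] if right else ""
    if move == "R":
        return (left + c, q2, rest)
    # move == "L": the head steps back onto the last cell of `left`
    return (left[:-1], q2, (left[-1] if left else "_") + c + rest)

delta = {("q1", "b"): ("q2", "c", "L"),
         ("q1", "a"): ("q2", "c", "R")}
```

Running it reproduces both examples: aaq1bb yields aq2acb, and cabq1a yields cabcq2.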

A TM M recognizes a language L if M accepts exactly those strings in L.
A language L is recognizable (a.k.a. recursively enumerable) if some TM recognizes L.

A TM M decides a language L if M accepts all strings in L and rejects all strings not in L.
A language L is decidable (a.k.a. recursive) if some TM decides L.

[Figure: a decider receives w ∈ Σ* and answers "w ∈ L?" — yes by accepting, no by rejecting. L is decidable (recursive).
A recognizer answers yes by accepting, but on no it may reject or loop forever. L is recognizable (recursively enumerable).]

A Turing machine for deciding { 0^(2^n) | n ≥ 0 }

Turing Machine PSEUDOCODE:
1. Sweep from left to right, crossing out every other 0.
2. If in step 1 the tape had only one 0, accept.
3. If in step 1 the tape had an odd number of 0s (more than one), reject.
4. Move the head back to the first input symbol.
5. Go to step 1.

Why does this work? Idea: every time we return to step 1, the number of 0s on the tape has been halved.
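The halving argument can be phrased directly over the count of live 0s (an illustrative sketch, not TM-level code):

```python
def decide_power_of_two_zeros(w):
    """True iff w is in { 0^(2^n) : n >= 0 }, i.e. 0^m with m a power of two."""
    zeros = w.count("0")
    if zeros == 0 or zeros != len(w):   # input must be all 0s, at least one
        return False
    while zeros > 1:
        if zeros % 2 == 1:              # step 3: odd count bigger than one
            return False
        zeros //= 2                     # step 1: cross out every other 0
    return True                         # step 2: a single 0 remains
```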

[State diagram implementing the pseudocode, with states q0 (start), q1, q2, q3, q4, qaccept, qreject over Σ = {0}. Transitions of the form 0 → x, R cross out every other 0 during the rightward sweep (steps 1–3); transitions of the form 0 → 0, L and x → x, L return the head to the left end of the tape (step 4).]