Commit
clarification of CT lemma in practice 6
cschaffner committed Dec 4, 2017
1 parent 3b25026 commit a8b98b0
Showing 2 changed files with 1 addition and 1 deletion.
Binary file modified Homework/06-practice-problems.pdf
Binary file not shown.
2 changes: 1 addition & 1 deletion Homework/06-practice-problems.tex
@@ -88,7 +88,7 @@
Prove the lemma below stating that the capacity per transmission is not increased if we use a discrete memoryless channel many times. For inspiration, look again at the proof of the converse of Shannon's noisy-channel coding theorem.

\medskip
- \noindent\textbf{Lemma 7.9.2 in [CT]}\ Let $Y^n$ be the result of passing $X^n$ through a discrete memoryless channel of capacity $C$. Prove that for all $P_{X^n}$, it holds that $I(X^n; Y^n) \leq nC$.
+ \noindent\textbf{Lemma 7.9.2 in [CT]}\ Let $X_1, X_2, \ldots, X_n = X^n$ be $n$ random variables with arbitrary joint distribution $P_{X^n}$. Let $Y^n$ be the result of passing $X^n$ through a discrete memoryless channel of capacity $C$. Prove that for all $P_{X^n}$, it holds that $I(X^n; Y^n) \leq nC$.
\medskip

Does your proof also work in case of coding with feedback (i.e.\ $X_{i+1}$ is allowed to depend on $X^i$ and $Y^i$)? If not, point out the steps in your proof where you use that there is no feedback.
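As a structural hint (not part of the committed file), the proof the problem asks for typically follows the chain of (in)equalities used in the converse of Shannon's noisy-channel coding theorem in [CT]; a hedged sketch, with each step still to be justified by the reader:

```latex
\begin{align*}
I(X^n; Y^n) &= H(Y^n) - H(Y^n \mid X^n) \\
            &= H(Y^n) - \sum_{i=1}^{n} H(Y_i \mid Y^{i-1}, X^n)
               && \text{(chain rule for entropy)} \\
            &= H(Y^n) - \sum_{i=1}^{n} H(Y_i \mid X_i)
               && \text{(channel is memoryless)} \\
            &\leq \sum_{i=1}^{n} H(Y_i) - \sum_{i=1}^{n} H(Y_i \mid X_i)
               && \text{(subadditivity of entropy)} \\
            &= \sum_{i=1}^{n} I(X_i; Y_i) \;\leq\; nC
               && \text{(definition of capacity $C$).}
\end{align*}
```

When answering the feedback question, it is worth checking which of the labelled steps silently assume that $X_{i+1}$ does not depend on $Y^i$.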
