More GitHub + LaTeX cleanup
geky committed Oct 22, 2024
1 parent 57f6a8b commit 2e9d0e6
Showing 1 changed file with 43 additions and 40 deletions.
README.md: 43 additions & 40 deletions
@@ -241,7 +241,7 @@ This polynomial has some rather useful properties:

$`
\begin{aligned}
-1 - X_j x &= 1 - X_j X_j^-1 \\
+1 - X_j x &= 1 - X_j X_j^{-1} \\
&= 1 - 1 \\
&= 0
\end{aligned}
@@ -736,16 +736,16 @@ Coming back to Reed-Solomon. Thanks to Berlekamp-Massey, we can solve the
following recurrence for the terms $\Lambda_k$ given at least $n \ge 2e$
syndromes $s_i$:

-$$
+``` math
\Lambda(i) = s_i = \sum_{k=1}^e \Lambda_k s_{i-k}
-$$
+```
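
To make the recurrence concrete, here's a minimal Python sketch that
checks whether a candidate set of $\Lambda_k$ terms actually generates
a given list of syndromes. The helper names and the GF(256) polynomial
0x11d are illustrative assumptions, not necessarily what the library
uses:

``` python
def gf_mul(a, b):
    # multiplication in GF(256), assuming the field is built from the
    # irreducible polynomial 0x11d (a common choice, but an assumption)
    p = 0
    while b:
        if b & 1:
            p ^= a
        b >>= 1
        a <<= 1
        if a & 0x100:
            a ^= 0x11d
    return p

def lfsr_check(lam, s):
    # does lam = [Λ_1, ..., Λ_e], viewed as LFSR taps, generate each
    # syndrome s_i from the e syndromes before it?
    e = len(lam)
    for i in range(e, len(s)):
        expected = 0
        for k in range(1, e + 1):
            expected ^= gf_mul(lam[k - 1], s[i - k])
        if expected != s[i]:
            return False
    return True
```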

These terms define our error-locator polynomial, which we can use to
find the locations of errors:

-$$
+``` math
\Lambda(x) = 1 + \sum_{k=1}^e \Lambda_k x^k
-$$
+```

All we have left to do is figure out where $\Lambda(X_j^{-1})=0$, since
these will be the locations of our errors.
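
Since the field is small, a brute-force search over every candidate
location works. A sketch, assuming error-locators take the form
$X_j = 2^j$ (whether 2 generates the field depends on the field
polynomial, so treat this as an assumption), reusing gf_mul from the
sketch above:

``` python
def gf_pow(a, e):
    # exponentiation by repeated multiplication, fine for a sketch
    r = 1
    for _ in range(e):
        r = gf_mul(r, a)
    return r

def find_error_locations(lam, n):
    # lam = [Λ_1, ..., Λ_e]; keep every codeword position j where
    # Λ(X_j^{-1}) = 0
    locations = []
    for j in range(n):
        x_inv = gf_pow(2, 255 - j)  # X_j^{-1} = 2^{-j} = 2^{255-j}
        acc, x_k = 1, 1
        for l in lam:
            x_k = gf_mul(x_k, x_inv)  # x^k
            acc ^= gf_mul(l, x_k)     # accumulate 1 + sum of Λ_k x^k
        if acc == 0:
            locations.append(j)
    return locations
```
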
@@ -766,34 +766,34 @@ magnitudes $Y_j$ is relatively straightforward. Kind of.

Recall the definition of our syndromes $S_i$:

-$$
+``` math
S_i = \sum_{j \in e} Y_j X_j^i
-$$
+```
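
Of course the decoder can't evaluate this sum directly, since the $Y_j$
and $X_j$ are exactly the unknowns. In a typical Reed-Solomon setup the
same values come from evaluating the received codeword at successive
powers of a generator (assumed to be 2 here): a valid codeword
evaluates to 0 at those points, so only the error terms survive. A
sketch, reusing the gf_* helpers from the earlier sketches:

``` python
def syndromes(r, n):
    # r = received codeword as polynomial coefficients, r[k] on x^k;
    # evaluate it at 2^0, 2^1, ..., 2^{n-1} with Horner's method.
    # a valid codeword evaluates to 0 at all of these points, so only
    # the error terms Y_j X_j^i survive, matching the sum above
    s = []
    for i in range(n):
        x = gf_pow(2, i)
        acc = 0
        for coeff in reversed(r):
            acc = gf_mul(acc, x) ^ coeff
        s.append(acc)
    return s
```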

With $e$ syndromes, this can be rewritten as a system of equations with
$e$ equations and $e$ unknowns, our error magnitudes $Y_j$, which we can
solve for:

-$$
-\begin{bmatrix}
-S_0 \\
-S_1 \\
-\vdots \\
-S_{e-1} \\
-\end{bmatrix} =
-\begin{bmatrix}
-1 & 1 & \dots & 1\\
-X_{j_0} & X_{j_1} & \dots & X_{j_{e-1}}\\
-\vdots & \vdots & \ddots & \vdots \\
-X_{j_0}^{e-1} & X_{j_1}^{e-1} & \dots & X_{j_{e-1}}^{e-1}\\
-\end{bmatrix}
-\begin{bmatrix}
-Y_{j_0}\\
-Y_{j_1}\\
-\vdots \\
-Y_{j_{e-1}}\\
-\end{bmatrix}
-$$
+``` math
+\begin{bmatrix}
+S_0 \\
+S_1 \\
+\vdots \\
+S_{e-1}
+\end{bmatrix} =
+\begin{bmatrix}
+1 & 1 & \dots & 1 \\
+X_{j_0} & X_{j_1} & \dots & X_{j_{e-1}} \\
+\vdots & \vdots & \ddots & \vdots \\
+X_{j_0}^{e-1} & X_{j_1}^{e-1} & \dots & X_{j_{e-1}}^{e-1}
+\end{bmatrix}
+\begin{bmatrix}
+Y_{j_0} \\
+Y_{j_1} \\
+\vdots \\
+Y_{j_{e-1}}
+\end{bmatrix}
+```
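
Solving such a system is perfectly doable with Gaussian elimination,
noting that in GF(256) addition and subtraction are both xor. A sketch,
reusing the gf_* helpers from the earlier sketches (the $X_j$ here are
assumed to be already known from the error-locator step):

``` python
def gf_inv(a):
    # a^254 = a^{-1}, since a^255 = 1 for any nonzero a in GF(256)
    return gf_pow(a, 254)

def solve_magnitudes(xs, s):
    # xs = [X_{j_0}, ..., X_{j_{e-1}}], s = [S_0, ..., S_{e-1}];
    # build the augmented rows [X_j^i ... | S_i] and run Gauss-Jordan
    e = len(xs)
    A = [[gf_pow(x, i) for x in xs] + [s[i]] for i in range(e)]
    for c in range(e):
        # the matrix is Vandermonde with distinct X_j, so a nonzero
        # pivot always exists; swap it into place
        p = next(r for r in range(c, e) if A[r][c])
        A[c], A[p] = A[p], A[c]
        # scale the pivot row to 1, then eliminate the other rows
        inv = gf_inv(A[c][c])
        A[c] = [gf_mul(inv, v) for v in A[c]]
        for r in range(e):
            if r != c and A[r][c]:
                f = A[r][c]
                A[r] = [v ^ gf_mul(f, w) for v, w in zip(A[r], A[c])]
    return [A[i][e] for i in range(e)]  # the error magnitudes Y_j
```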

#### Forney's algorithm

@@ -805,23 +805,23 @@ for $Y_j$ directly, called [Forney's algorithm][forneys-algorithm].
Assuming we know an error-locator $X_j$, plug it into the following
formula to find an error-magnitude $Y_j$:

-$$
-Y_j = \frac{X_j \Omega(X_j^{-1})}{\Lambda'(X_j^-1)}
-$$
+``` math
+Y_j = \frac{X_j \Omega(X_j^{-1})}{\Lambda'(X_j^{-1})}
+```

Where $\Omega(x)$, called the error-evaluator polynomial, is defined like
so:

-$$
+``` math
\Omega(x) = S(x) \Lambda(x) \bmod x^n
-$$
+```
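
The mod $x^n$ makes $\Omega(x)$ cheap to compute: multiply out
$S(x)\Lambda(x)$ with schoolbook polynomial multiplication and simply
discard any terms of degree $n$ or higher. A sketch, reusing gf_mul
from the earlier sketches:

``` python
def omega(s, lam, n):
    # S(x)Λ(x) mod x^n; lam = [Λ_1, ..., Λ_e], and Λ(x) itself has a
    # constant term of 1
    o = [0] * n
    for i, s_i in enumerate(s[:n]):
        for k, l_k in enumerate([1] + lam):
            if i + k < n:  # the mod just drops high-order products
                o[i + k] ^= gf_mul(s_i, l_k)
    return o  # [Ω_0, Ω_1, ..., Ω_{n-1}]
```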

And $\Lambda'(x)$, the [formal derivative][formal-derivative] of the
error-locator, can be calculated by terms like so:

-$$
+``` math
\Lambda'(x) = \sum_{i=1}^e i \cdot \Lambda_i x^{i-1}
-$$
+```
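
Here's a sketch of the formal derivative and the full Forney
computation, reusing omega and the gf_* helpers from the earlier
sketches. Note the odd/even handling of $i$, which is explained below:

``` python
def lambda_prime(lam):
    # i·Λ_i is Λ_i added (xored) to itself i times, so even terms
    # cancel to zero and odd terms pass through unchanged
    return [lam[i - 1] if i % 2 else 0 for i in range(1, len(lam) + 1)]

def poly_eval(p, x):
    # Horner's method, with p[k] the coefficient on x^k
    acc = 0
    for coeff in reversed(p):
        acc = gf_mul(acc, x) ^ coeff
    return acc

def forney(s, lam, x_j, n):
    # Y_j = X_j Ω(X_j^{-1}) / Λ'(X_j^{-1})
    x_inv = gf_inv(x_j)
    num = gf_mul(x_j, poly_eval(omega(s, lam, n), x_inv))
    return gf_mul(num, gf_inv(poly_eval(lambda_prime(lam), x_inv)))
```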

Though note $i$ is not a field element, so multiplication by $i$
represents normal repeated addition. And since addition is xor in our
@@ -835,23 +835,26 @@ Haha, I know right? Where did this equation come from? How does it work?
How did Forney even come up with this?

To be honest I don't know the answer to most of these questions, there's
-very little documentation online about this formula comes from. But at
-the very least we can prove that it works.
+very little documentation online about where this formula comes from.
+
+But at the very least we can prove that it works.

#### The error-evaluator polynomial

Let us start with the syndrome polynomial $S(x)$:

-$$
+``` math
S(x) = \sum_{i=0}^n S_i x^i
-$$
+```

Substituting the definition of $S_i$:

-$$
-S(x) = \sum_{i=0}^n \sum_{j \in e} Y_j X_j^i x^i
-     = \sum_{j \in e} \left(Y_j \sum_{i=0}^n X_j^i x^i\right)
-$$
+``` math
+\begin{aligned}
+S(x) &= \sum_{i=0}^n \sum_{j \in e} Y_j X_j^i x^i \\
+     &= \sum_{j \in e} \left(Y_j \sum_{i=0}^n X_j^i x^i\right)
+\end{aligned}
+```

The sum on the right side turns out to be a geometric series that we can
substitute in:
