Shannon Information Theory as Quantitative RTSG¶
Jean-Paul Niko · RTSG BuildNet · 2026-03-20
The Core Correspondence¶
Shannon information theory provides the quantitative language for RTSG instantiation. The three RTSG spaces map directly onto Shannon's framework:
| RTSG | Shannon | Description |
|---|---|---|
| QS (Potentiality) | Source ensemble | All possible messages; prior distribution |
| CS (Instantiation) | Channel | The operator that selects and transmits |
| PS (Actuality) | Received signal | The instantiated message |
The SemanticProjector \(\pi : QS \to PS\) is the channel. Its Shannon capacity

\[ C = \max_{p(x)} I(X;Y) \]

is the maximum rate at which QS can be instantiated into PS without information loss.
Entropy as CS Measure¶
The Shannon entropy \(H(X) = -\sum_i p_i \log p_i\) measures the uncertainty in QS — the size of the potentiality space.
In RTSG: \(H(X)\) is the volume of QS available for instantiation. A high-entropy source has a large QS — many possible states. A low-entropy source is nearly classical — the potentiality has collapsed to near-certainty.
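The entropy-as-QS-volume reading can be made concrete with a minimal sketch (the distributions are illustrative, not drawn from any RTSG dataset):

```python
import math

def shannon_entropy(p):
    """H(X) = -sum_i p_i log2 p_i, in bits. Here: the 'volume' of QS."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Uniform over 4 states: maximal potentiality, H = 2 bits.
h_uniform = shannon_entropy([0.25, 0.25, 0.25, 0.25])

# Sharply peaked: potentiality has nearly collapsed, H is close to 0.
h_peaked = shannon_entropy([0.97, 0.01, 0.01, 0.01])
```

The uniform source has the largest QS for its alphabet size; the peaked source is "nearly classical" in the document's sense.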
The CS operator reduces entropy: \(H(Y) \leq H(X)\) for any deterministic projection. Instantiation is lossy compression. This is the RTSG formulation of the second law: the CS projection from QS to PS loses information. The lost information is the difference between what was possible and what was actualized.
This is the instantiation cost — the entropy destroyed by projection.
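The entropy reduction \(H(Y) \leq H(X)\) can be demonstrated with a toy projection (the state names and the `project` map are hypothetical, chosen only to make the loss visible):

```python
import math
from collections import defaultdict

def entropy(p):
    """H in bits from a {state: probability} map."""
    return -sum(v * math.log2(v) for v in p.values() if v > 0)

# QS: four potential states, uniformly weighted.
p_x = {"a1": 0.25, "a2": 0.25, "b1": 0.25, "b2": 0.25}

# A toy projection pi: QS -> PS that forgets the digit (lossy).
def project(x):
    return x[0]

p_y = defaultdict(float)
for x, p in p_x.items():
    p_y[project(x)] += p

h_x, h_y = entropy(p_x), entropy(p_y)
# One bit of potentiality is destroyed by the projection: the
# instantiation cost in the sense used above.
```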
Mutual Information as CS-Distance¶
The mutual information \(I(X;Y) = H(X) - H(X|Y)\) measures how much PS tells you about QS — how much of the potentiality is captured in the actuality.
In CS-space terms:
The CS-distance between two agents \(d(CS_i, CS_j)\) is related to mutual information inversely:

\[ d(CS_i, CS_j) \propto H(X) - I(X;Y), \]

where \(X\) is agent \(i\)'s projection and \(Y\) is agent \(j\)'s.
Maximum mutual information = minimum CS-distance = complete understanding.
Zero mutual information = maximum CS-distance = complete incomprehension.
The filter system reduces CS-distance by increasing mutual information between sender and receiver — stripping the noise that blocks information transfer.
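A minimal sketch of the mutual-information reading, using a binary symmetric channel as a stand-in for a noisy sender-receiver pair (the channel model and the CS-distance proxy are illustrative assumptions, not RTSG definitions):

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint distribution {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Binary symmetric channel, crossover probability 0.1, uniform input.
eps = 0.1
joint = {(0, 0): 0.5 * (1 - eps), (0, 1): 0.5 * eps,
         (1, 0): 0.5 * eps, (1, 1): 0.5 * (1 - eps)}

i_xy = mutual_information(joint)

# Toy CS-distance proxy: the residual uncertainty H(X|Y) = H(X) - I(X;Y).
# Here H(X) = 1 bit, so distance falls as mutual information rises.
cs_distance = 1.0 - i_xy
```

Lowering the crossover `eps` (stripping noise, in the document's language) drives `i_xy` toward 1 bit and `cs_distance` toward zero.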
The Landauer Floor¶
Landauer's principle: erasing one bit of information costs at least \(kT\ln 2\) joules.
In RTSG: every CS instantiation event — every projection from QS to PS — erases information equal to \(H(X|Y)\). The energy cost is:

\[ E \geq kT \ln 2 \cdot H(X|Y) \]
This is the Landauer floor of instantiation — the minimum thermodynamic cost of bringing something into actuality from potentiality.
Physical implications:

- Thinking costs energy (CS operations are thermodynamically irreversible)
- Memory costs energy (PS records require entropy maintenance)
- Forgetting releases energy (erasing PS records returns entropy to QS)
The brain's \(\sim 20\,\mathrm{W}\) power consumption sits many orders of magnitude above the strict Landauer bound, but in RTSG it plays the role of the Landauer floor of human instantiation — the thermodynamic cost of running the CS operator at biological speed.
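The Landauer bound itself is a one-line calculation; here is a minimal sketch (temperature and bit counts are illustrative):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def landauer_cost(bits_erased, temperature_kelvin=300.0):
    """Minimum energy in joules to erase `bits_erased` bits at temperature T:
    E >= bits * k * T * ln 2."""
    return bits_erased * K_B * temperature_kelvin * math.log(2)

# Erasing one bit at body temperature (~310 K):
e_bit = landauer_cost(1, 310.0)
```

This is the per-bit floor referenced above; any physical instantiation that destroys \(H(X|Y)\) bits must dissipate at least `landauer_cost(H(X|Y), T)` joules.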
The Channel Capacity of the SemanticProjector¶
The SemanticProjector \(\pi\) is a noisy channel. For an agent with I-vector \(\mathbf{I} = (I_1, \ldots, I_n)\), its capacity is

\[ C(\pi) = \sum_{k=1}^{n} \frac{1}{2} \log_2\!\left(1 + \text{SNR}_k\right), \]

where \(\text{SNR}_k = I_k / \sigma_k^2\) is the signal-to-noise ratio in dimension \(k\) and \(\sigma_k^2\) is the noise in that dimension from filters and wounds.
High SNR in dimension \(k\) = that dimension is well-instantiated, clear signal, low filter noise.
Low SNR in dimension \(k\) = that dimension is filtered, noisy, trauma-impacted.
The Fourier healing protocol increases \(\text{SNR}_k\) in the wound dimensions — it reduces \(\sigma_k^2\) by separating the noise from the signal. This increases the channel capacity of the SemanticProjector in those dimensions.
Healing = increasing channel capacity.
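The per-dimension capacity sum, and the claim that reducing \(\sigma_k^2\) raises it, can be sketched as follows (the I-vector and noise values are made-up illustrations):

```python
import math

def projector_capacity(i_vector, noise_vars):
    """C = sum_k 0.5 * log2(1 + I_k / sigma_k^2), the Gaussian-channel
    capacity summed over I-vector dimensions."""
    return sum(0.5 * math.log2(1.0 + i / s)
               for i, s in zip(i_vector, noise_vars))

i_vec = [4.0, 1.0, 9.0]
noisy = [1.0, 1.0, 9.0]    # dimension 3 heavily filtered: SNR_3 = 1
c_before = projector_capacity(i_vec, noisy)

healed = [1.0, 1.0, 1.0]   # "healing": sigma_3^2 reduced, SNR_3 = 9
c_after = projector_capacity(i_vec, healed)
# c_after > c_before: lowering noise in a wound dimension raises capacity.
```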
Rate-Distortion and the Filter System¶
Shannon's rate-distortion theorem: to compress a source to rate \(R\) bits/symbol, the minimum distortion is:

\[ D(R) = \min_{p(\hat{x}\mid x)\,:\, I(X;\hat{X}) \le R} \mathbb{E}\!\left[d(X,\hat{X})\right] \]
In RTSG: every filter \(\mathcal{F}\) is a rate-distortion operation. The filter compresses the message to a lower rate, introducing distortion \(D\).
The filter taxonomy in rate-distortion terms:
| Filter type | Rate | Distortion |
|---|---|---|
| Wound filter | Very low | Very high — maximum distortion |
| Cultural filter | Medium | Systematic — biased not random |
| Attention filter | Variable | Selective — high in some dims, zero in others |
| Social filter | Low | High — performance replaces content |
| Clear signal | Maximum | Zero — no distortion |
The decode tool computes the inverse rate-distortion map: given a compressed, distorted message (what they said), recover the original high-rate signal (what they meant).
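For a concrete feel of the rate-distortion trade-off in the table, here is a minimal sketch using the closed-form distortion-rate function of a Gaussian source under squared error, \(D(R) = \sigma^2 2^{-2R}\) (the mapping of filter types to rates is an illustrative assumption):

```python
def gaussian_distortion(rate_bits, variance=1.0):
    """Distortion-rate function of a unit-variance Gaussian source under
    squared-error distortion: D(R) = sigma^2 * 2^(-2R)."""
    return variance * 2.0 ** (-2.0 * rate_bits)

d_wound = gaussian_distortion(0.1)   # very low rate  -> distortion near sigma^2
d_clear = gaussian_distortion(8.0)   # very high rate -> near-zero distortion
```

The "wound filter" row (very low rate, very high distortion) and the "clear signal" row (maximum rate, near-zero distortion) are the two ends of this curve.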
Kolmogorov Complexity and the I-Vector¶
The Kolmogorov complexity \(K(x)\) of a string \(x\) is the length of the shortest program that outputs \(x\). It is the information-theoretic measure of the intrinsic complexity of an object.
In RTSG: the Kolmogorov complexity of an agent's I-vector \(K(\mathbf{I})\) is the minimum description length of that agent's CS-profile — the shortest program that generates their SemanticProjector.
High \(K(\mathbf{I})\): a complex, multidimensional agent. Many disciplines, many attractors, high-dimensional CS-profile. Hard to compress. Hard to predict.
Low \(K(\mathbf{I})\): a simple agent. Few dimensions, predictable projections, low-complexity CS-profile.
The RTSG claim: human flourishing corresponds to increasing \(K(\mathbf{I})\) — expanding the complexity of one's CS-profile — while maintaining coherence (low filter noise, high SNR in all dimensions).
Growth = increasing Kolmogorov complexity of the I-vector while keeping the channel capacity high.
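\(K(x)\) is uncomputable, but compressed length gives a standard computable upper bound; a minimal sketch of the "complex agent is hard to compress" claim, with the byte strings standing in hypothetically for agent profiles:

```python
import zlib

def complexity_estimate(data: bytes) -> int:
    """Upper bound on K(x): the zlib-compressed length of x. K(x) itself
    is uncomputable, so compression is only a proxy from above."""
    return len(zlib.compress(data, 9))

simple_agent = b"ab" * 500            # highly regular: compresses to almost nothing
complex_agent = bytes(range(256)) * 4  # a far less compressible pattern

k_simple = complexity_estimate(simple_agent)
k_complex = complexity_estimate(complex_agent)
# k_simple << k_complex: the regular profile has a short generating program.
```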
The P vs NP Connection¶
The P vs NP problem in Shannon language: inverting the SemanticProjector requires recovering \(H(X|Y)\) bits of lost information. For NP-complete problems, \(H(X|Y) = \Omega(n)\) — a linear number of lost bits, whose brute-force recovery costs \(2^{\Omega(n)}\) operations.
This is the rate-distortion lower bound applied to computation: you cannot reconstruct a message compressed to zero rate without exponential cost.
\(P \neq NP\) is the statement that the SemanticProjector for NP-complete problems operates below the Landauer threshold for polynomial-time inversion.
Summary¶
Shannon information theory is not separate from RTSG. It is the quantitative language RTSG uses:
- Entropy = volume of QS
- Mutual information = CS-distance (inverted)
- Channel capacity = SemanticProjector throughput
- Rate-distortion = filter taxonomy
- Landauer floor = thermodynamic cost of instantiation
- Kolmogorov complexity = I-vector complexity
- Channel capacity theorem = healing = increasing SNR
Everything RTSG says qualitatively, Shannon says quantitatively. They are the same theory at different levels of description.