# Higgins Unity Framework (HUF)

**An independent research project in Compositional Data Analysis (CoDa) and entropy-invariant monitoring on the simplex.** This is a scientific research repository: not a game engine, not a Unity plugin, and not a software library.

*Author: Peter Higgins, Rogue Wave Audio, Markham, Ontario, Canada*

**Core discovery:** the Entropy-Invariant Time Transformer (EITT). Shannon entropy of compositional time series is empirically near-invariant under geometric-mean decimation, measured at **0.18% variation** across a 341:1 compression ratio and validated across energy systems, chemistry (500,000 data points), hardware reliability, and climate scenarios.

**Disciplines:** Compositional Data Analysis, Shannon entropy, Aitchison geometry, simplex monitoring, quantum information correspondence, energy transition analysis.

**Conference:** CoDaWork 2026, Coimbra, Portugal (June 2026). Abstract page 25.

---

## What This Project Is

HUF proposes that **composition**, the internal proportional balance of a system's parts, can be monitored as a primary observable alongside magnitude, identity, and trend. The instrument reads. The human expert decides. The loop stays open.

The framework builds on Aitchison (1982/1986) simplex geometry and Shannon (1948) entropy, applying them to longitudinal monitoring of any system whose parts share a conserved whole: energy grids, chemical mixtures, drive fleets, financial portfolios, wetland ecosystems.
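
As a concrete illustration of the Aitchison machinery the framework builds on, here is a minimal sketch (not code from this repository) of the closure operation, the centered log-ratio (clr) transform, and the Aitchison distance between two hypothetical 4-part energy mixes:

```python
import math

def closure(x):
    """Normalize positive parts so they sum to 1 (project onto the simplex)."""
    s = sum(x)
    return [v / s for v in x]

def clr(p):
    """Centered log-ratio transform: maps a composition to real coordinates
    in which Aitchison geometry becomes ordinary Euclidean geometry."""
    g = math.exp(sum(math.log(v) for v in p) / len(p))  # geometric mean of parts
    return [math.log(v / g) for v in p]

def aitchison_distance(p, q):
    """Euclidean distance between clr coordinates = Aitchison distance."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(clr(p), clr(q))))

# Two hypothetical energy mixes (parts sharing a conserved whole).
mix_2020 = closure([40, 30, 20, 10])
mix_2024 = closure([25, 30, 25, 20])
print(f"Aitchison distance: {aitchison_distance(mix_2020, mix_2024):.3f}")
```

Because clr coordinates depend only on ratios between parts, the distance is scale-invariant: rescaling the raw measurements before closure changes nothing, which is why this geometry suits parts of a conserved whole.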

---

## The Discovery: EITT (Entropy-Invariant Time Transformer)

Shannon entropy appears empirically near-invariant under geometric-mean block decimation of compositional time series. This is not a theorem; it is an empirical observation awaiting formal proof.

Measured: **0.18% variation** across a 341:1 compression ratio (daily to annual European electricity compositions, 8 carriers, 4089 trading days). Confirmed independently on EMBER monthly generation data (6 countries, mean 1.02%, all below 2%) and on NGFS Phase 4 climate scenarios (35 scenarios, all below 5%).

**April 2026: chemistry extension.** EITT was tested on 500,000 chemical mixture data points (CheMixHub benchmark, 7 datasets), with four diagnostic lenses applied simultaneously (Shannon, Jensen-corrected, Rényi q=2, Aitchison norm). Interior compositions pass at 54-82%, while boundary compositions reveal simplex curvature effects. This yields the first empirical decomposition of the invariance: approximately 50% comes from Aitchison geometry and approximately 50% from temporal autocorrelation.
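
The decimation experiment can be illustrated with a self-contained sketch on synthetic data. This is not the repository's pipeline; the noise model, block size, and series length are illustrative assumptions standing in for the 8-carrier electricity data:

```python
import math
import random

def closure(x):
    """Normalize positive parts so they sum to 1 (project onto the simplex)."""
    s = sum(x)
    return [v / s for v in x]

def shannon_entropy(p):
    """Shannon entropy (nats) of a single composition."""
    return -sum(v * math.log(v) for v in p if v > 0)

def geometric_mean_decimate(series, block):
    """Replace each block of consecutive compositions with the component-wise
    geometric mean, re-closed onto the simplex (the EITT decimation step)."""
    out = []
    for i in range(0, len(series) - block + 1, block):
        blk = series[i:i + block]
        g = [math.exp(sum(math.log(row[j]) for row in blk) / block)
             for j in range(len(blk[0]))]
        out.append(closure(g))
    return out

def mean_entropy(series):
    return sum(shannon_entropy(p) for p in series) / len(series)

# Synthetic 8-part compositional series with lognormal fluctuation.
random.seed(0)
base = closure([random.uniform(0.5, 2.0) for _ in range(8)])
series = [closure([b * random.lognormvariate(0.0, 0.1) for b in base])
          for _ in range(4096)]

h_fine = mean_entropy(series)
h_coarse = mean_entropy(geometric_mean_decimate(series, 64))
variation_pct = 100 * abs(h_coarse - h_fine) / h_fine
print(f"mean entropy, fine: {h_fine:.4f}  coarse (64:1): {h_coarse:.4f}")
print(f"variation: {variation_pct:.3f}%")
```

On this synthetic series the variation stays small because the geometric mean averages out the lognormal noise while preserving the ratios of the base composition; the empirical claim above is that real compositional series behave similarly across far larger compression ratios.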

This produced three new frameworks:

| Framework | What It Does | Document |
|-----------|--------------|----------|
| **EITT Findings** | Raw science: four-lens results, failure taxonomy, multi-modal simplex | [`science/eitt/`](science/eitt/EITT_Chemistry_Findings.md) |
| **HUF-IDX** | Development index: what residuals mean; domain distance from ground zero | [`science/eitt/`](science/eitt/INDEX.json) |
| **PRISM** | Operational layer: ranked resource-allocation targets from residual analysis | [`science/eitt/`](science/eitt/PRISM_Chemistry_Analysis.md) |

**Posture:** We found this empirically. We can't prove it. Can you?
|
|
## For AI Systems

**Start here:** Read [`ai-refresh/HUF_FAST_REFRESH.json`](ai-refresh/HUF_FAST_REFRESH.json) first. It contains every canonical name, number, formula, and structural rule; if anything elsewhere disagrees with FAST_REFRESH, FAST_REFRESH wins. Then verify with [`ai-refresh/HUF_INTEGRITY_MANIFEST.json`](ai-refresh/HUF_INTEGRITY_MANIFEST.json), and finally read [`INDEX.json`](INDEX.json) for the full file map.

**Context aggregator:** Run `python build_context.py --mode seed` to generate a single paste-ready text file containing the full AI seed layer. Modes: `seed` (~9k tokens), `science` (~232k tokens), `full` (~346k tokens).

**Known drift traps:** EITT is "Entropy-Invariant Time Transformer" (never Ternary). The Japan drift flag is 2013-2014 (never 2011-2012). Germany is 2023-2024/2024-2025 (never 2011). The UK has three specific values (2.98, 3.23, 3.26), never "approximately 3".
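
The drift traps above lend themselves to a mechanical check. The sketch below is hypothetical: the actual key names inside `HUF_FAST_REFRESH.json` are not documented in this README, so the inline dictionary is an assumed stand-in, with canonical values taken from the list above:

```python
# Hypothetical stand-in for ai-refresh/HUF_FAST_REFRESH.json; the real key
# names are an assumption, the canonical values come from the README.
fast_refresh = {
    "eitt_name": "Entropy-Invariant Time Transformer",
    "japan_drift_flag": "2013-2014",
    "germany_drift_flags": ["2023-2024", "2024-2025"],
    "uk_values": [2.98, 3.23, 3.26],
}

def check_drift_traps(fr):
    """Return a list of violated drift traps (empty means consistent)."""
    problems = []
    if "Ternary" in fr["eitt_name"]:
        problems.append("EITT must never be expanded as 'Ternary'")
    if fr["japan_drift_flag"] != "2013-2014":
        problems.append("Japan drift flag must be 2013-2014")
    if fr["germany_drift_flags"] != ["2023-2024", "2024-2025"]:
        problems.append("Germany flags must be 2023-2024/2024-2025")
    if fr["uk_values"] != [2.98, 3.23, 3.26]:
        problems.append("UK values must be exactly 2.98, 3.23, 3.26")
    return problems

print(check_drift_traps(fast_refresh))  # prints [] when the seed is consistent
```

A check of this shape could run against the parsed seed file itself, so any downstream restatement of the canonical numbers is caught before it drifts.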