chirality-framework

Name: chirality-framework
Version: 19.4.0
Summary: A semantic calculator for executing a canonical, multi-stage pipeline for structured problem-solving and knowledge generation.
Upload time: 2025-09-11 00:35:27
Author: Chirality Framework Team
Maintainer: Chirality Framework Team
Requires Python: >=3.9
License: None
Keywords: llm, ontology, semantic computing, pipeline orchestration, static site generator, knowledge generation
Project URL: https://github.com/sgttomas/chirality-framework
Requirements: neo4j>=5.0.0, python-dotenv>=1.0.0, click>=8.0.0, PyYAML>=6.0, openai>=1.50.0, pytest>=7.0.0

# Chirality Framework: A Semantic Calculator

[![PyPI](https://img.shields.io/pypi/v/chirality-framework.svg)](https://pypi.org/project/chirality-framework/)
[![Python versions](https://img.shields.io/pypi/pyversions/chirality-framework.svg)](https://pypi.org/project/chirality-framework/)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](LICENSE)
[![Publish](https://github.com/sgttomas/chirality-framework/actions/workflows/python-publish.yml/badge.svg)](https://github.com/sgttomas/chirality-framework/actions/workflows/python-publish.yml)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![GitHub release (latest SemVer)](https://img.shields.io/github/v/tag/sgttomas/chirality-framework?sort=semver&label=release)](https://github.com/sgttomas/chirality-framework/tags)

**Version: 19.4.0** | **Status: DDD-Compliant Architecture**

The Chirality Framework is a meta-ontological, system-agnostic methodology for mapping the solution space to a problem statement in the context of knowledge work. It creates structured semantic relationships that serve as "semantic anchors" to guide LLMs through problem-solving stages across a "semantic valley."

The framework employs two distinct phases with fundamentally different prompting strategies:
- **Phase 1** (Matrices A-E): Uses conversational prompting to build semantic understanding
- **Phase 2** (Tensors M-N): Uses the Phase 1 implementation as the system prompt for modular, cell-by-cell construction

## Core Architecture: Two‑Phase Semantic Computation

### Phase 1: Conversational Semantic Pipeline (Matrices A-E)
Phase 1 uses a conversational dialogue history as the system prompt to create semantic understanding. The dialogue builds the concept of semantic multiplication through examples, develops key concepts organically, and establishes modal ontologies through iterative refinement. This creates a "semantic state" in the LLM that enables proper interpretation.

Semantic‑first transcript with out‑of‑band normalization:
1. **Stage A — Mechanical/Interpreted (Markdown):** Prompts produce human‑readable, markdown‑formatted matrices; the transcript stays clean and creative.
2. **Stage B — Normalization (Strict JSON):** A reusable normalizer prompt converts Stage‑A text into schema‑accurate JSON. Local validation + one diff‑driven retry ensures strict compliance. A small deterministic parser skips the LLM when markdown tables match canonical shapes.
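
In practice, Stage B's control flow is "normalize, validate locally, retry once with the diff." A minimal sketch of that loop is shown below; the real normalizer prompt and schema validation live in the package's prompt assets, so they are injected here as hypothetical callables:

```python
from typing import Any, Callable, Dict, List, Tuple

# Hypothetical stand-ins for the framework's actual Stage-B components.
Normalizer = Callable[[str], Dict[str, Any]]                    # LLM pass in json_object mode
Validator = Callable[[Dict[str, Any]], Tuple[bool, List[str]]]  # local schema check


def stage_b_normalize(stage_a_markdown: str, normalize: Normalizer, validate: Validator) -> Dict[str, Any]:
    """Convert Stage-A markdown into schema-accurate JSON, with one diff-driven retry."""
    candidate = normalize(stage_a_markdown)
    ok, errors = validate(candidate)
    if ok:
        return candidate
    # Single retry: feed the validation errors back so the model can repair the diff.
    retry_input = stage_a_markdown + "\n\nValidation errors to fix:\n" + "\n".join(errors)
    candidate = normalize(retry_input)
    ok, errors = validate(candidate)
    if not ok:
        raise ValueError(f"Stage-B normalization failed after one retry: {errors}")
    return candidate
```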

### Phase 2: Modular Tensor Construction (Tensors M-N)
Phase 2 uses the complete Phase 1 implementation (through Matrix E) as the system prompt, then constructs tensors cell-by-cell WITHOUT rolling context. The modular design of the tensors facilitates this approach, with each cell computed independently using semantic cross products to create hierarchical structures.
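
As a rough sketch of what "cell-by-cell without rolling context" means in practice, each call below reuses the same fixed system prompt (the Phase 1 transcript) and nothing else, so cells can be computed in any order or in parallel. The per-cell prompt wording and addressing are illustrative assumptions, not the framework's actual assets:

```python
from itertools import product
from typing import Callable, Dict, List, Tuple

LLMCall = Callable[[str, str], str]  # (system_prompt, user_prompt) -> cell text


def build_tensor(phase1_transcript: str,
                 row_terms: List[str],
                 col_terms: List[str],
                 llm: LLMCall) -> Dict[Tuple[int, int], str]:
    """Compute each tensor cell independently; no rolling context between calls."""
    cells: Dict[Tuple[int, int], str] = {}
    for (i, r), (j, c) in product(enumerate(row_terms), enumerate(col_terms)):
        # Hypothetical per-cell instruction; the fixed Phase 1 transcript is the only shared state.
        user_prompt = f"Compute the semantic cross product of '{r}' and '{c}'."
        cells[(i, j)] = llm(phase1_transcript, user_prompt)
    return cells
```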

For a complete technical description, see the **[Canonical Algorithm Documentation](docs/ALGORITHM.md)**.

## The Ontological Modality Path

The sequence of stations in the semantic valley is not arbitrary; it follows a deep, underlying pattern of cognitive modalities. This path describes the *type* of work being done at each stage, revealing a structured cycle of systematic processing, epistemic (knowledge-based) evaluation, and alethic (truth-based) assessment.

| Modality | Station | Operation |
| :--- | :--- | :--- |
| `Problem` | 1. Problem Statement | `[A], [B]` |
| `Systematic` | 2. Requirements | `[C] = [A] · [B]` |
| `Process` | 3. Objectives | `[D] = [A] + [F]` |
| `Epistemic` | 4. Verification | `[K] = [D]^T, [X] = [K] · [J]` |
| `Epistemic` | 5. Validation | `[Z] = shift([X])` |
| `Process` | 6. Evaluation | `[G], [P], [T], [E] = [G] · [T]` |
| `Alethic` | 7. Assessment | `[M] = [R] × [E]` |
| `Epistemic` | 8. Implementation | `[W] = [M] × [X]` |
| `Alethic` | 9. Reflection | `[U] = [W] × [P]` |
| `Alethic` | 10. Resolution | `[N] = [U] × [H]` |

**Clarifications:**
- **X/Z Modality:** Both Verification (S4) and Validation (S5) are `Epistemic`. S4 strictly precedes S5.
- **E Modality:** Evaluation (S6) is a `Process` modality, not Epistemic.

For a detailed explanation of this conceptual architecture, see the **[Project Philosophy Documentation](docs/PHILOSOPHY.md)**.
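
As a rough illustration only, an operation such as `[C] = [A] · [B]` can be pictured as a "semantic dot product": element (i, j) pairs the terms of row i of `[A]` with column j of `[B]` and asks an LLM to resolve those pairs into one coherent concept. The sketch assumes a generic `resolve` callable; the actual operators are defined by the framework's prompt assets:

```python
from typing import Callable, List

Resolve = Callable[[List[str]], str]  # term pairs -> one resolved concept (e.g., via an LLM)


def semantic_matmul(A: List[List[str]], B: List[List[str]], resolve: Resolve) -> List[List[str]]:
    """Illustrative 'semantic dot product' over matrices of terms."""
    rows, inner, cols = len(A), len(B), len(B[0])
    C: List[List[str]] = []
    for i in range(rows):
        row = []
        for j in range(cols):
            # Pair A's row terms with B's column terms, then resolve the pairs into one concept.
            pairs = [f"{A[i][k]} * {B[k][j]}" for k in range(inner)]
            row.append(resolve(pairs))
        C.append(row)
    return C
```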

## Quick Start

The recommended way to use the framework is to compute the entire Phase 1 pipeline and view the results.

### Prerequisites
- Python 3.9+
- An OpenAI API key set as the `OPENAI_API_KEY` environment variable in a `.env` file in the project root.
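
To confirm the key in `.env` is actually visible to Python (python-dotenv is a declared dependency), a quick check looks like this:

```python
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory
print("OPENAI_API_KEY set:", bool(os.getenv("OPENAI_API_KEY")))
```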

### Step 1: Install and Set Up

```bash
# Install with all dependencies
pip install -e ".[dev,openai]"

# Ensure the lens catalog is generated (only needs to be done once)
python3 -m chirality.interfaces.cli lenses ensure
```

### Step 2: Run Phase‑1 (semantic‑first) and extract structured JSON

Recommended end‑to‑end (semantic transcript + strict JSON):

```bash
# Use a stronger model if desired
export CHIRALITY_MODEL=gpt-5
export CHIRALITY_TEMPERATURE=1.0

# Run Phase‑1 in relaxed (markdown) mode and extract strict JSON
python -m chirality.interfaces.cli \
  phase1-dialogue-run \
  --lens-mode auto \
  --relaxed-json \
  --extract-structured \
  --reasoning-effort low \
  --out runs/latest_run

# Artifacts
# - runs/latest_run/phase1_dialogue.jsonl                (clean transcript)
# - runs/latest_run/phase1_relaxed_output.json           (Stage‑A content)
# - runs/latest_run/phase1_structured.json               (JSON + validation report)
# - runs/latest_run/phase1_structured_matrices.json      (matrices‑only for DB ingest)
```
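
A quick way to confirm that a run produced all four artifacts listed above:

```python
from pathlib import Path

run = Path("runs/latest_run")
expected = [
    "phase1_dialogue.jsonl",
    "phase1_relaxed_output.json",
    "phase1_structured.json",
    "phase1_structured_matrices.json",
]
missing = [name for name in expected if not (run / name).exists()]
print("all artifacts present" if not missing else f"missing: {missing}")
```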

Stage‑A only through Matrix C (quick test):

```bash
python -m chirality.interfaces.cli \
  phase1-dialogue-run \
  --lens-mode auto \
  --relaxed-json \
  --stop-at C_interpreted \
  --extract-structured \
  --out runs/c_stageA
```

Run extraction later (CI/CD):

```bash
python -m chirality.interfaces.cli \
  phase1-extract \
  --from runs/latest_run/phase1_relaxed_output.json \
  --out  runs/latest_run/phase1_structured.json \
  --matrices-only  # optional: write only matrices
```

Print matrices (quick view):

```bash
python - <<'PY'
import json
import pathlib

# Load the matrices-only artifact written by --extract-structured.
data = json.loads(pathlib.Path('runs/latest_run/phase1_structured_matrices.json').read_text())

for m in ['C', 'F', 'D', 'K', 'X', 'Z', 'G', 'T', 'E']:
    if m not in data['matrices']:
        continue
    print(f'== Matrix {m} ==')
    # Each matrix holds one or more stages; print any stage that carries an 'elements' grid.
    for stage, payload in data['matrices'][m].items():
        if isinstance(payload, dict) and 'elements' in payload:
            print(f'[{stage}]')
            for row in payload['elements']:
                print(' | '.join(map(str, row)))
            print()
PY
```
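
To pull one matrix into a spreadsheet instead, the same assumed layout (per-matrix stage dicts carrying an `elements` grid, as in the snippet above) can be exported to CSV:

```python
import csv
import json
import pathlib

data = json.loads(pathlib.Path("runs/latest_run/phase1_structured_matrices.json").read_text())
# Write one CSV per stage of Matrix C that carries an 'elements' grid.
for stage, payload in data["matrices"].get("C", {}).items():
    if isinstance(payload, dict) and "elements" in payload:
        out = pathlib.Path(f"matrix_C_{stage}.csv")
        with out.open("w", newline="") as fh:
            csv.writer(fh).writerows(payload["elements"])
        print(f"wrote {out}")
```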


## Development

To set up the development environment and run tests, please refer to the instructions in `CONTRIBUTING.md`.

**Key Development Notes:**
- **DDD Architecture**: Clean separation of domain/application/infrastructure/interfaces layers
- **Prompt Assets**: Located in `chirality/infrastructure/prompts/assets/` following DDD principles
- **Single CLI Entry Point**: Use `chirality` command (via `chirality.interfaces.cli:main`)
- **Output Channels**: Logs go to stderr, data goes to stdout (for CI/CD integration; see the example after this list)
- **Guard Scripts**: Run before commits to prevent legacy code drift
- **Structured outputs & transport**: Adapter uses typed input parts; Stage‑B enforces JSON via `json_object` and validates locally (plus one retry). For reasoning models (e.g., GPT‑5), unsupported sampling params (like `top_p`) are omitted automatically.
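
The stderr/stdout split noted under **Output Channels** makes the CLI easy to drive from another tool. For example, using the `phase1-extract` command from the Quick Start:

```python
import subprocess

# Run the extraction step and keep logs (stderr) separate from data/status (stdout).
proc = subprocess.run(
    [
        "python", "-m", "chirality.interfaces.cli",
        "phase1-extract",
        "--from", "runs/latest_run/phase1_relaxed_output.json",
        "--out", "runs/latest_run/phase1_structured.json",
    ],
    capture_output=True,
    text=True,
)
print("logs (stderr):", proc.stderr[:200])
print("status (stdout):", proc.stdout[:200])
```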

Additional docs:
- `docs/INTERFACE.md`: Producer mirror of the chirality-app contract (app mode).
- `GEMINI.md`: Guidance for using Gemini/AI assistants with this repo.
- `CLAUDE.md`: Guidance for using Claude Code with this repo.
- `AGENTS.md`: Notes for agentic coding assistants working on this project.

            
