diff --git a/ARCHITECTURE.md b/ARCHITECTURE.md index 3742f9d..8a9709b 100644 --- a/ARCHITECTURE.md +++ b/ARCHITECTURE.md @@ -140,6 +140,101 @@ These seams are documented as comments in the relevant `.ttl` files. --- +## Interoperability: Flexo MMS and OpenMBEE + +Because knowledgecomplex stores all data as RDF and enforces constraints via standard W3C technologies (OWL, SHACL, SPARQL), it is natively compatible with [Flexo MMS](https://github.com/Open-MBEE/flexo-mms-deployment) — the Model Management System developed by the [OpenMBEE](https://www.openmbee.org/) community. + +### Why the fit is natural + +Flexo MMS is a version-controlled model repository that speaks RDF natively. A KC instance graph is already a valid RDF dataset, so the integration path is direct: + +| KC concept | MMS equivalent | Notes | +|---|---|---| +| `kc:Complex` (instance graph) | MMS model/branch | A KC export is a self-contained RDF graph that can be committed as an MMS model revision | +| `kc:boundedBy`, `kc:hasElement` | MMS element relationships | Topological structure is expressed as standard RDF triples | +| SHACL shapes (`kc_core_shapes.ttl` + user shapes) | MMS validation profiles | Shapes can be registered in MMS to enforce KC constraints on committed models | +| `kc:uri` | MMS element cross-references | Provides traceability from KC elements to external artifacts (files, documents, URIs) | +| JSON-LD export (`dump_graph(format="json-ld")`) | MMS ingest format | JSON-LD is the primary API format for Flexo MMS | + +### Integration patterns + +**Push to MMS:** Export a KC instance via `kc.export()` or `dump_graph(format="json-ld")`, then commit to a Flexo MMS repository via its REST API. The OWL ontology and SHACL shapes can be committed alongside the instance data, enabling MMS-side validation. + +**Pull from MMS:** Retrieve a model revision as JSON-LD from Flexo MMS, then load it into a KC instance via `load_graph(kc, "model.jsonld")`. 
The KC's SHACL verification (`kc.verify()`) ensures the imported data satisfies all topological and ontological constraints. + +**Version control:** MMS provides branching, diffing, and merge capabilities at the RDF triple level. KC's `ComplexDiff` and `ComplexSequence` classes complement this by providing simplicial-complex-aware diffing (element-level adds/removes rather than triple-level changes). + +### What KC adds beyond MMS + +Flexo MMS manages RDF models generically — it stores, versions, and queries them but does not enforce simplicial complex structure. KC adds the topological layer: boundary-closure, closed-triangle constraints, typed simplicial hierarchy, and algebraic topology computations (Betti numbers, Hodge decomposition). Together, MMS provides the model management infrastructure and KC provides the mathematical structure. + +### Reference + +OpenMBEE (Open Model-Based Engineering Environment) is an open-source community developing tools for model-based systems engineering. Flexo MMS is its core model management system. See [openmbee.org](https://www.openmbee.org/) and [github.com/Open-MBEE](https://github.com/Open-MBEE). + +--- + +## Deployment Architecture + +The internal design described above (2x2 map, component layers, static resources) is the library's foundation. In practice, a knowledge complex is deployed through a stack of five layers, each building on the one below: + +``` +┌─────────────────────────────────────────────────────────────┐ +│ 5. LLM Tool Integration │ +│ Register KC operations as callable tools for a language │ +│ model. The complex serves as a deterministic expert │ +│ system — the LLM navigates, queries, and analyzes via │ +│ tool calls; the KC guarantees topological correctness │ +│ and returns structured, verifiable results. │ +├─────────────────────────────────────────────────────────────┤ +│ 4. MCP Server │ +│ Model Context Protocol server exposing KC as tools for │ +│ AI assistants (Claude, etc.). 
Each KC operation becomes │ +│ a tool: add_vertex, boundary, betti_numbers, audit, etc. │ +├─────────────────────────────────────────────────────────────┤ +│ 3. Microservice (REST API) │ +│ Python-hosted service exposing KC operations over HTTP. │ +│ CRUD for elements, SPARQL query execution, SHACL │ +│ verification, algebraic topology analysis, export/import.│ +├─────────────────────────────────────────────────────────────┤ +│ 2. Concrete Knowledge Complex │ +│ An instance using a specific ontology. Typed vertices, │ +│ edges, and faces with attributes. SHACL-verified on │ +│ every write. Serialized as RDF (Turtle, JSON-LD). │ +│ Versioned via Flexo MMS or git. │ +├─────────────────────────────────────────────────────────────┤ +│ 1. KC-Compatible Ontology │ +│ OWL class hierarchy extending kc:Vertex/Edge/Face. │ +│ SHACL shapes for attribute constraints. Publicly hosted │ +│ at persistent URIs (w3id.org). Dereferenceable — tools │ +│ can fetch the ontology and understand the type system. │ +└─────────────────────────────────────────────────────────────┘ +``` + +### Layer 1: Ontology + +A KC-compatible ontology is an OWL ontology whose classes extend `kc:Vertex`, `kc:Edge`, and `kc:Face`, paired with SHACL shapes for instance-level constraints. Ontologies are authored via `SchemaBuilder` and exported as standard `.ttl` files. For public use, the ontology should be hosted at a persistent URI (e.g. `https://w3id.org/kc/`) so that other systems can dereference the IRI and retrieve the OWL/SHACL definitions. The `knowledgecomplex.ontologies` package ships three reference ontologies (operations, brand, research) as starting points. + +### Layer 2: Concrete Complex + +A concrete knowledge complex is an RDF instance graph conforming to a specific ontology. It contains typed elements (vertices, edges, faces) with attributes, linked by `kc:boundedBy` and collected by `kc:hasElement`. SHACL verification enforces topological and ontological constraints on every write. 
The complex is serializable to Turtle, JSON-LD, or N-Triples and can be versioned via Flexo MMS or committed to a git repository as `.ttl` files. + +### Layer 3: Microservice + +A Python-hosted HTTP service wraps the `KnowledgeComplex` API in a REST interface. Typical endpoints: element CRUD, named SPARQL queries, topological operations (boundary, star, closure), algebraic topology analysis (Betti numbers, Hodge decomposition, edge PageRank), SHACL verification and audit, and schema introspection. The service loads a schema at startup and manages one or more complex instances. + +### Layer 4: MCP Server + +A [Model Context Protocol](https://modelcontextprotocol.io/) server exposes KC operations as tools that AI assistants can call. Each KC method becomes an MCP tool: `add_vertex`, `boundary`, `find_cliques`, `betti_numbers`, `audit`, etc. The MCP server is a thin adapter over the microservice or the library directly, translating between MCP tool calls and KC Python API calls. + +### Layer 5: LLM Tool Integration + +The knowledge complex is registered as a set of callable tools for a language model. The LLM uses the complex as a **deterministic expert system** — it navigates the simplicial structure, retrieves typed elements and their attributes, runs topological queries, and performs algebraic topology analysis via tool calls. The KC guarantees that every result is topologically valid and SHACL-verified. The LLM provides natural language understanding and reasoning; the KC provides structured, auditable, mathematically rigorous retrieval. + +This separation is key: the LLM handles ambiguity, intent, and synthesis; the KC handles structure, correctness, and computation. Neither replaces the other. + +--- + ## Namespace Conventions ```turtle @@ -162,5 +257,11 @@ User namespaces are set via `SchemaBuilder(namespace="aaa")`. 
The URI base `http | `knowledgecomplex/schema.py` | Python API — schema authoring | `SchemaBuilder` DSL: `add_*_type`, `dump_owl`, `dump_shacl`, `export`, `load` | | `knowledgecomplex/graph.py` | Python API — instance I/O | `KnowledgeComplex`: `add_vertex`, `add_edge`, `add_face`, `query`, `dump_graph`, `export`, `load` | | `knowledgecomplex/exceptions.py` | Public exceptions | `ValidationError`, `SchemaError`, `UnknownQueryError` | -| `knowledgecomplex/queries/vertices.sparql` | Framework SPARQL | Return all vertices and their types | -| `knowledgecomplex/queries/coboundary.sparql` | Framework SPARQL | Inverse boundary operator | +| `knowledgecomplex/io.py` | Python API — serialization | `save_graph`, `load_graph`, `dump_graph` — multi-format file I/O (Turtle, JSON-LD, N-Triples) | +| `knowledgecomplex/viz.py` | Python API — visualization | Hasse diagrams (`plot_hasse`), geometric realization (`plot_geometric`), `to_networkx`, `verify_networkx` | +| `knowledgecomplex/analysis.py` | Python API — algebraic topology | `betti_numbers`, `euler_characteristic`, `hodge_laplacian`, `edge_pagerank` (optional: numpy, scipy) | +| `knowledgecomplex/clique.py` | Python API — clique inference | `find_cliques`, `infer_faces`, `fill_cliques` — flagification and typed face inference | +| `knowledgecomplex/filtration.py` | Python API — filtrations | `Filtration` — nested subcomplex sequences, birth tracking, `from_function` | +| `knowledgecomplex/diff.py` | Python API — diffs and sequences | `ComplexDiff`, `ComplexSequence` — time-varying complexes with SPARQL UPDATE export/import | +| `knowledgecomplex/codecs/markdown.py` | Codec — markdown files | `MarkdownCodec` — YAML frontmatter + section-based round-trip; `verify_documents` | +| `knowledgecomplex/queries/*.sparql` | Framework SPARQL | 7 templates: vertices, coboundary, boundary, star, closure, skeleton, degree | diff --git a/README.md b/README.md index 6a2d1b5..ab4f06f 100644 --- a/README.md +++ b/README.md @@ -38,20 +38,27 
@@ sb.add_vertex_type("activity", attributes={"name": text()}) sb.add_vertex_type("resource", attributes={"name": text()}) sb.add_edge_type("performs", attributes={"role": vocab("lead", "support")}) sb.add_edge_type("requires", attributes={"mode": vocab("read", "write")}) +sb.add_edge_type("produces", attributes={"mode": vocab("read", "write")}) +sb.add_edge_type("accesses", attributes={"mode": vocab("read", "write")}) sb.add_edge_type("responsible", attributes={"level": vocab("owner", "steward")}) sb.add_face_type("operation") +sb.add_face_type("production") # 2. Build an instance kc = KnowledgeComplex(schema=sb) -kc.add_vertex("alice", type="actor", name="Alice") -kc.add_vertex("etl-run", type="activity", name="Daily ETL") -kc.add_vertex("dataset", type="resource", name="Sales DB") +kc.add_vertex("alice", type="actor", name="Alice") +kc.add_vertex("etl-run", type="activity", name="Daily ETL") +kc.add_vertex("dataset1", type="resource", name="JSON Records") +kc.add_vertex("dataset2", type="resource", name="Sales DB") -kc.add_edge("e1", type="performs", vertices={"alice", "etl-run"}, role="lead") -kc.add_edge("e2", type="requires", vertices={"etl-run", "dataset"}, mode="write") -kc.add_edge("e3", type="responsible", vertices={"alice", "dataset"}, level="owner") +kc.add_edge("e1", type="performs", vertices={"alice", "etl-run"}, role="lead") +kc.add_edge("e2", type="requires", vertices={"etl-run", "dataset1"}, mode="read") +kc.add_edge("e3", type="produces", vertices={"etl-run", "dataset2"}, mode="write") +kc.add_edge("e4", type="accesses", vertices={"alice", "dataset1"}, mode="read") +kc.add_edge("e5", type="responsible", vertices={"alice", "dataset2"}, level="owner") -kc.add_face("op1", type="operation", edges={"e1", "e2", "e3"}) +kc.add_face("op1", type="operation", boundary=["e1", "e2", "e4"]) +kc.add_face("prod1", type="production", boundary=["e1", "e3", "e5"]) # 3. 
Query df = kc.query("vertices") # built-in SPARQL template @@ -61,17 +68,120 @@ print(df) print(kc.dump_graph()) # Turtle string ``` -## The `kc:uri` attribute +See [`examples/`](examples/) for 10 runnable examples covering all features below. + +## Topological queries + +Every `KnowledgeComplex` has methods for the standard simplicial complex operations. +All return `set[str]` for natural set algebra: + +```python +kc.boundary("face-1") # {e1, e2, e3} — direct boundary +kc.star("alice") # all simplices containing alice +kc.link("alice") # Cl(St) \ St — the horizon around alice +kc.closure({"e1", "e2"}) # smallest subcomplex containing these edges +kc.degree("alice") # number of incident edges + +# Set algebra composes naturally +shared = kc.star("alice") & kc.star("bob") +``` + +All operators accept an optional `type=` filter for OWL-subclass-aware filtering. + +## Clique inference + +Discover higher-order structure from the edge graph: + +```python +from knowledgecomplex import find_cliques, infer_faces + +triangles = find_cliques(kc, k=3) # pure query — what triangles exist? 
+infer_faces(kc, "operation") # fill in all triangles as typed faces +infer_faces(kc, "team", edge_type="collab") # restrict to specific edge types +``` + +## Visualization + +Two complementary views — Hasse diagrams (all elements as nodes, boundary as directed arrows) and geometric realization (vertices as 3D points, edges as lines, faces as filled triangles): + +```python +from knowledgecomplex import plot_hasse, plot_geometric + +fig, ax = plot_hasse(kc) # directed boundary graph, type-colored +fig, ax = plot_geometric(kc) # 3D triangulation with matplotlib +``` -Every element (vertex, edge, or face) can carry an optional `kc:uri` property pointing to its source file: +Export to NetworkX for further analysis: ```python -kc.add_vertex("alice", type="actor", uri="file:///actors/alice.md", name="Alice") -kc.add_edge("e1", type="performs", vertices={"alice", "etl-run"}, - uri="file:///edges/e1.md", role="lead") +from knowledgecomplex import to_networkx, verify_networkx + +G = to_networkx(kc) # nx.DiGraph with exact degree invariants +verify_networkx(G) # validate cardinality + closed-triangle constraints ``` -SHACL enforces at-most-one `kc:uri` per element. This is useful for domain applications where each element corresponds to an actual document or record. +## Algebraic topology + +Betti numbers, Euler characteristic, Hodge Laplacian, and edge PageRank (requires `pip install knowledgecomplex[analysis]`): + +```python +from knowledgecomplex import betti_numbers, euler_characteristic, edge_pagerank + +betti = betti_numbers(kc) # [beta_0, beta_1, beta_2] +chi = euler_characteristic(kc) # V - E + F +pr = edge_pagerank(kc, "e1") # personalized edge PageRank vector +``` + +## Filtrations and time-varying complexes + +Filtrations model strictly growing subcomplexes. 
Diffs model arbitrary add/remove sequences: + +```python +from knowledgecomplex import Filtration, ComplexDiff, ComplexSequence + +filt = Filtration(kc) +filt.append_closure({"v1", "v2", "e12"}) # Q0: founders +filt.append_closure({"v3", "e23", "face1"}) # Q1: first triangle +print(filt.birth("face1")) # 1 + +diff = ComplexDiff().add_vertex("eve", type="Person").remove("old-edge") +diff.apply(kc) # mutate the complex +print(diff.to_sparql(kc)) # export as SPARQL UPDATE +``` + +## I/O and codecs + +Multi-format serialization and round-trip with external files: + +```python +from knowledgecomplex import save_graph, load_graph, MarkdownCodec + +save_graph(kc, "data.jsonld", format="json-ld") +load_graph(kc, "data.ttl") # additive loading + +codec = MarkdownCodec(frontmatter_attrs=["name"], section_attrs=["notes"]) +kc.register_codec("Paper", codec) +kc.element("paper-1").compile() # KC -> markdown file +kc.element("paper-1").decompile() # markdown file -> KC +``` + +## Constraint escalation + +Escalate topological queries to SHACL constraints enforced on every write: + +```python +sb.add_topological_constraint( + "requirement", "coboundary", + target_type="verification", + predicate="min_count", min_count=1, + message="Every requirement must have at least one verification edge", +) +``` + +## The `kc:uri` attribute + +Every element can carry an optional `kc:uri` property pointing to its source file. +SHACL enforces at-most-one `kc:uri` per element. 
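The at-most-one rule is just a `sh:maxCount 1` cardinality check on the `kc:uri` property. As a toy illustration (plain tuples standing in for RDF triples — this is not the library's SHACL engine), the check reduces to counting `kc:uri` statements per subject:

```python
# Toy sketch of the kcs:ElementShape constraint (sh:maxCount 1 on kc:uri).
# Triples are (subject, predicate, object) tuples; identifiers are illustrative.
from collections import Counter

def uri_violations(triples):
    """Return subjects carrying more than one kc:uri value."""
    counts = Counter(s for s, p, _ in triples if p == "kc:uri")
    return {s for s, n in counts.items() if n > 1}

triples = [
    ("ex:alice", "rdf:type", "ex:actor"),
    ("ex:alice", "kc:uri", "file:///actors/alice.md"),
    ("ex:e1", "kc:uri", "file:///edges/e1.md"),
    ("ex:e1", "kc:uri", "file:///edges/e1-v2.md"),  # second uri -> violation
]
print(uri_violations(triples))  # {'ex:e1'}
```

In the real library the same check runs as part of SHACL verification on every write, so a duplicate `uri=` value is rejected at insert time rather than discovered later.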
## Architecture diff --git a/docs/api/analysis.md b/docs/api/analysis.md new file mode 100644 index 0000000..ab19165 --- /dev/null +++ b/docs/api/analysis.md @@ -0,0 +1 @@ +::: knowledgecomplex.analysis diff --git a/docs/api/clique.md b/docs/api/clique.md new file mode 100644 index 0000000..96940b2 --- /dev/null +++ b/docs/api/clique.md @@ -0,0 +1 @@ +::: knowledgecomplex.clique diff --git a/docs/api/codecs.md b/docs/api/codecs.md new file mode 100644 index 0000000..bb0e4ac --- /dev/null +++ b/docs/api/codecs.md @@ -0,0 +1 @@ +::: knowledgecomplex.codecs.markdown diff --git a/docs/api/diff.md b/docs/api/diff.md new file mode 100644 index 0000000..81ecaa7 --- /dev/null +++ b/docs/api/diff.md @@ -0,0 +1 @@ +::: knowledgecomplex.diff diff --git a/docs/api/filtration.md b/docs/api/filtration.md new file mode 100644 index 0000000..0ebaa72 --- /dev/null +++ b/docs/api/filtration.md @@ -0,0 +1 @@ +::: knowledgecomplex.filtration diff --git a/docs/api/io.md b/docs/api/io.md new file mode 100644 index 0000000..7c6ce9d --- /dev/null +++ b/docs/api/io.md @@ -0,0 +1 @@ +::: knowledgecomplex.io diff --git a/docs/api/viz.md b/docs/api/viz.md new file mode 100644 index 0000000..cc23034 --- /dev/null +++ b/docs/api/viz.md @@ -0,0 +1 @@ +::: knowledgecomplex.viz diff --git a/docs/index.md b/docs/index.md index 726d5e3..d55fe18 100644 --- a/docs/index.md +++ b/docs/index.md @@ -38,22 +38,44 @@ sb.add_vertex_type("activity", attributes={"name": text()}) sb.add_vertex_type("resource", attributes={"name": text()}) sb.add_edge_type("performs", attributes={"role": vocab("lead", "support")}) sb.add_edge_type("requires", attributes={"mode": vocab("read", "write")}) +sb.add_edge_type("produces", attributes={"mode": vocab("read", "write")}) +sb.add_edge_type("accesses", attributes={"mode": vocab("read", "write")}) sb.add_edge_type("responsible", attributes={"level": vocab("owner", "steward")}) sb.add_face_type("operation") +sb.add_face_type("production") # 2. 
Build an instance kc = KnowledgeComplex(schema=sb) -kc.add_vertex("alice", type="actor", name="Alice") -kc.add_vertex("etl-run", type="activity", name="Daily ETL") -kc.add_vertex("dataset", type="resource", name="Sales DB") +kc.add_vertex("alice", type="actor", name="Alice") +kc.add_vertex("etl-run", type="activity", name="Daily ETL") +kc.add_vertex("dataset1", type="resource", name="JSON Records") +kc.add_vertex("dataset2", type="resource", name="Sales DB") -kc.add_edge("e1", type="performs", vertices={"alice", "etl-run"}, role="lead") -kc.add_edge("e2", type="requires", vertices={"etl-run", "dataset"}, mode="write") -kc.add_edge("e3", type="responsible", vertices={"alice", "dataset"}, level="owner") +kc.add_edge("e1", type="performs", vertices={"alice", "etl-run"}, role="lead") +kc.add_edge("e2", type="requires", vertices={"etl-run", "dataset1"}, mode="read") +kc.add_edge("e3", type="produces", vertices={"etl-run", "dataset2"}, mode="write") +kc.add_edge("e4", type="accesses", vertices={"alice", "dataset1"}, mode="read") +kc.add_edge("e5", type="responsible", vertices={"alice", "dataset2"}, level="owner") -kc.add_face("op1", type="operation", edges={"e1", "e2", "e3"}) +kc.add_face("op1", type="operation", boundary=["e1", "e2", "e4"]) +kc.add_face("prod1", type="production", boundary=["e1", "e3", "e5"]) # 3. Query df = kc.query("vertices") # built-in SPARQL template print(df) ``` + +See the [examples/](https://github.com/blockscience/knowledgecomplex/tree/main/examples) directory for 10 runnable examples. 
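The invariant that makes this an actual simplicial complex — boundary closure, enforced by SHACL on every write — can be sketched in a few lines of plain Python. This is a toy dict-based model of the quickstart data, not the library's implementation:

```python
# Toy boundary-closure check (mirrors the kcs:ComplexShape SHACL constraint):
# every boundary element of a member simplex must itself be a member.
def is_closed(elements, bounded_by):
    """elements: set of element ids; bounded_by: id -> set of boundary ids."""
    return all(
        b in elements
        for e in elements
        for b in bounded_by.get(e, set())
    )

bounded_by = {
    "e1": {"alice", "etl-run"},
    "op1": {"e1", "e2", "e4"},   # the face's boundary edges
}
complete = {"alice", "etl-run", "dataset1", "e1", "e2", "e4", "op1"}
broken = complete - {"e4"}       # drop a boundary edge but keep the face

print(is_closed(complete, bounded_by))  # True
print(is_closed(broken, bounded_by))    # False — op1's boundary is incomplete
```

Where this sketch returns `False` after the fact, the library rejects the offending write up front, so an instance graph can never reach the broken state.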
+ +## API Reference + +- [Schema authoring](api/schema.md) — `SchemaBuilder`, `vocab`, `text`, type inheritance, constraint escalation +- [Instance management](api/graph.md) — `KnowledgeComplex`, `Element`, topological queries, SPARQL templates +- [Visualization](api/viz.md) — Hasse diagrams, geometric realization, NetworkX export +- [Algebraic topology](api/analysis.md) — Betti numbers, Hodge Laplacian, edge PageRank +- [Clique inference](api/clique.md) — `find_cliques`, `infer_faces`, `fill_cliques` +- [Filtrations](api/filtration.md) — nested subcomplex sequences, birth tracking +- [Diffs and sequences](api/diff.md) — `ComplexDiff`, `ComplexSequence`, SPARQL UPDATE export/import +- [File I/O](api/io.md) — multi-format save/load (Turtle, JSON-LD, N-Triples) +- [Codecs](api/codecs.md) — `MarkdownCodec` for YAML+markdown round-trip +- [Exceptions](api/exceptions.md) — `ValidationError`, `SchemaError`, `UnknownQueryError` diff --git a/docs/ontology.md b/docs/ontology.md index 520faf2..00716be 100644 --- a/docs/ontology.md +++ b/docs/ontology.md @@ -22,6 +22,10 @@ The abstract SHACL shapes graph enforces instance-level constraints that exceed These constraints use `sh:sparql` validators because they require cross-individual reasoning. +## Interoperability + +Because KC stores all data as standard RDF and enforces constraints via W3C SHACL, instance graphs are natively compatible with RDF-based model management systems such as [Flexo MMS](https://github.com/Open-MBEE/flexo-mms-deployment) from the [OpenMBEE](https://www.openmbee.org/) community. JSON-LD export (`dump_graph(format="json-ld")`) provides the primary bridge format. See the [Interoperability section of ARCHITECTURE.md](https://github.com/blockscience/knowledgecomplex/blob/main/ARCHITECTURE.md#interoperability-flexo-mms-and-openmbee) for integration patterns. 
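To make the bridge format concrete, here is a hand-written JSON-LD fragment in the KC vocabulary, round-tripped with only the standard library. The field layout and the `ex:` namespace are illustrative assumptions — the real payload comes from `dump_graph(format="json-ld")` and its exact serializer layout may differ:

```python
import json

# Hypothetical JSON-LD fragment in the KC vocabulary. The "ex:" base URI and
# the element layout are illustrative; only the kc: namespace is the real one.
doc = {
    "@context": {"kc": "https://w3id.org/kc/", "ex": "http://example.org/"},
    "@graph": [
        {"@id": "ex:e1", "@type": "ex:performs",
         "kc:boundedBy": [{"@id": "ex:alice"}, {"@id": "ex:etl-run"}]},
    ],
}
text = json.dumps(doc, indent=2)   # the payload you would commit to MMS
roundtrip = json.loads(text)       # what a consuming system parses back
print(roundtrip == doc)            # True — the exchange is lossless
```

Because both sides speak standard JSON-LD, no translation layer is needed between a KC export and an MMS commit; versioning, diffing, and querying then happen on the RDF triples themselves.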
+ ## Design rationale See [ARCHITECTURE.md](https://github.com/blockscience/knowledgecomplex/blob/main/ARCHITECTURE.md) for the full 2x2 responsibility map and design decisions. diff --git a/examples/01_quickstart/_setup_data.py b/examples/01_quickstart/_setup_data.py new file mode 100644 index 0000000..83e5c3e --- /dev/null +++ b/examples/01_quickstart/_setup_data.py @@ -0,0 +1,34 @@ +""" +Generate the pre-built pipeline data for the quickstart example. + +Run once: python examples/01_quickstart/_setup_data.py + +Creates examples/01_quickstart/data/pipeline/ with ontology.ttl, shapes.ttl, instance.ttl +(a data pipeline with actors, activities, resources, and edges — but no faces). +""" + +from knowledgecomplex import KnowledgeComplex +from knowledgecomplex.ontologies import operations + +# Use the pre-built operations ontology (no faces — those are discovered later) +sb = operations.schema(namespace="ex") + +# Instance: 4 vertices, 5 edges forming 2 triangles (but no faces declared) +kc = KnowledgeComplex(schema=sb) +kc.add_vertex("alice", type="actor", name="Alice") +kc.add_vertex("etl-run", type="activity", name="Daily ETL") +kc.add_vertex("dataset1", type="resource", name="JSON Records") +kc.add_vertex("dataset2", type="resource", name="Sales DB") + +kc.add_edge("e1", type="performs", vertices={"alice", "etl-run"}, role="lead") +kc.add_edge("e2", type="requires", vertices={"etl-run", "dataset1"}, mode="read") +kc.add_edge("e3", type="produces", vertices={"etl-run", "dataset2"}, mode="write") +kc.add_edge("e4", type="accesses", vertices={"alice", "dataset1"}, mode="read") +kc.add_edge("e5", type="responsible", vertices={"alice", "dataset2"}, level="owner") + +# Export +from pathlib import Path +out = Path(__file__).parent / "data" / "pipeline" +kc.export(out) +print(f"Exported to {out}/") +print(f" {len(kc.element_ids())} elements (4 vertices + 5 edges, no faces)") diff --git a/examples/01_quickstart/data/pipeline/instance.ttl 
b/examples/01_quickstart/data/pipeline/instance.ttl new file mode 100644 index 0000000..389c6b6 --- /dev/null +++ b/examples/01_quickstart/data/pipeline/instance.ttl @@ -0,0 +1,168 @@ +@prefix ex: . +@prefix kc: . +@prefix owl: . +@prefix rdfs: . +@prefix xsd: . + +ex:_complex a kc:Complex ; + kc:hasElement ex:alice, + ex:dataset1, + ex:dataset2, + ex:e1, + ex:e2, + ex:e3, + ex:e4, + ex:e5, + ex:etl-run . + +ex:level a owl:DatatypeProperty ; + rdfs:comment "Allowed values: owner, steward" ; + rdfs:domain ex:responsible ; + rdfs:range xsd:string . + +ex:mode a owl:DatatypeProperty ; + rdfs:comment "Allowed values: read, write" ; + rdfs:range xsd:string . + +ex:name a owl:DatatypeProperty ; + rdfs:range xsd:string . + +ex:operation a owl:Class ; + rdfs:subClassOf kc:Face . + +ex:production a owl:Class ; + rdfs:subClassOf kc:Face . + +ex:role a owl:DatatypeProperty ; + rdfs:comment "Allowed values: lead, support" ; + rdfs:domain ex:performs ; + rdfs:range xsd:string . + + a owl:Ontology ; + rdfs:label "Knowledge Complex Core Ontology" ; + rdfs:comment "Abstract topological backbone: Element, Vertex, Edge, Face, Complex." . + +kc:hasElement a owl:ObjectProperty ; + rdfs:label "hasElement" ; + rdfs:comment "Membership relation: an element belongs to this complex." ; + rdfs:domain kc:Complex ; + rdfs:range kc:Element . + +kc:uri a owl:DatatypeProperty ; + rdfs:label "uri" ; + rdfs:comment """Optional URI pointing to the source file for this element. + Accepts any URI scheme (file:///, https://, etc.). + At-most-one per element (sh:maxCount 1 enforced in kc_core_shapes.ttl).""" ; + rdfs:domain kc:Element ; + rdfs:range xsd:anyURI . + +ex:accesses a owl:Class ; + rdfs:subClassOf kc:Edge . + +ex:activity a owl:Class ; + rdfs:subClassOf kc:Vertex . + +ex:actor a owl:Class ; + rdfs:subClassOf kc:Vertex . + +ex:e1 a ex:performs ; + ex:role "lead" ; + kc:boundedBy ex:alice, + ex:etl-run . + +ex:e2 a ex:requires ; + ex:mode "read" ; + kc:boundedBy ex:dataset1, + ex:etl-run . 
+ +ex:e3 a ex:produces ; + ex:mode "write" ; + kc:boundedBy ex:dataset2, + ex:etl-run . + +ex:e4 a ex:accesses ; + ex:mode "read" ; + kc:boundedBy ex:alice, + ex:dataset1 . + +ex:e5 a ex:responsible ; + ex:level "owner" ; + kc:boundedBy ex:alice, + ex:dataset2 . + +ex:produces a owl:Class ; + rdfs:subClassOf kc:Edge . + +ex:requires a owl:Class ; + rdfs:subClassOf kc:Edge . + +ex:performs a owl:Class ; + rdfs:subClassOf kc:Edge . + +ex:resource a owl:Class ; + rdfs:subClassOf kc:Vertex . + +ex:responsible a owl:Class ; + rdfs:subClassOf kc:Edge . + +kc:Complex a owl:Class ; + rdfs:label "Complex" ; + rdfs:comment """A simplicial complex: a collection of elements (simplices) that is + closed under the boundary operator. If a simplex is in the complex, all its + boundary elements must also be in the complex. Topological validity of the + composition is enforced by SHACL (see kcs:ComplexShape).""" . + +kc:Face a owl:Class ; + rdfs:label "Face" ; + rdfs:comment "Abstract 2-simplex. Boundary: exactly 3 edges. Concrete types must subclass this." ; + rdfs:subClassOf [ a owl:Restriction ; + owl:onClass kc:Edge ; + owl:onProperty kc:boundedBy ; + owl:qualifiedCardinality "3"^^xsd:nonNegativeInteger ], + kc:Element . + +kc:boundedBy a owl:ObjectProperty ; + rdfs:label "boundedBy" ; + rdfs:comment """Boundary operator (∂). Maps a k-simplex to its bounding + (k-1)-simplices. Cardinality is enforced per subclass via OWL restrictions. + Vertices (k=0) have empty boundary. For k>=1, cardinality = k+1. + All elements are unoriented (boundary is an unordered set).""" ; + rdfs:domain kc:Element ; + rdfs:range kc:Element . + +ex:dataset1 a ex:resource ; + ex:name "JSON Records" . + +ex:dataset2 a ex:resource ; + ex:name "Sales DB" . + +ex:alice a ex:actor ; + ex:name "Alice" . + +ex:etl-run a ex:activity ; + ex:name "Daily ETL" . + +kc:Vertex a owl:Class ; + rdfs:label "Vertex" ; + rdfs:comment "Abstract 0-simplex. Empty boundary (∂v = ∅). Concrete types must subclass this." 
; + rdfs:subClassOf kc:Element . + +kc:Edge a owl:Class ; + rdfs:label "Edge" ; + rdfs:comment "Abstract 1-simplex. Boundary: exactly 2 vertices. Concrete types must subclass this." ; + rdfs:subClassOf [ a owl:Restriction ; + owl:onClass kc:Vertex ; + owl:onProperty kc:boundedBy ; + owl:qualifiedCardinality "2"^^xsd:nonNegativeInteger ], + kc:Element . + +kc:Element a owl:Class ; + rdfs:label "Element" ; + rdfs:comment """Abstract topological element (k-simplex). All KC types are elements. + The simplex order k determines boundary cardinality: + k=0 (Vertex): empty boundary (cardinality 0) + k=1 (Edge): 2 boundary vertices + k=2 (Face): 3 boundary edges + k>=1 general: (k+1) boundary (k-1)-simplices + Higher orders can be defined by subclassing Element.""" . + diff --git a/examples/01_quickstart/data/pipeline/ontology.ttl b/examples/01_quickstart/data/pipeline/ontology.ttl new file mode 100644 index 0000000..521f965 --- /dev/null +++ b/examples/01_quickstart/data/pipeline/ontology.ttl @@ -0,0 +1,120 @@ +@prefix ex: . +@prefix kc: . +@prefix owl: . +@prefix rdfs: . +@prefix xsd: . + +ex:accesses a owl:Class ; + rdfs:subClassOf kc:Edge . + +ex:activity a owl:Class ; + rdfs:subClassOf kc:Vertex . + +ex:actor a owl:Class ; + rdfs:subClassOf kc:Vertex . + +ex:level a owl:DatatypeProperty ; + rdfs:comment "Allowed values: owner, steward" ; + rdfs:domain ex:responsible ; + rdfs:range xsd:string . + +ex:mode a owl:DatatypeProperty ; + rdfs:comment "Allowed values: read, write" ; + rdfs:range xsd:string . + +ex:name a owl:DatatypeProperty ; + rdfs:range xsd:string . + +ex:operation a owl:Class ; + rdfs:subClassOf kc:Face . + +ex:produces a owl:Class ; + rdfs:subClassOf kc:Edge . + +ex:production a owl:Class ; + rdfs:subClassOf kc:Face . + +ex:requires a owl:Class ; + rdfs:subClassOf kc:Edge . + +ex:resource a owl:Class ; + rdfs:subClassOf kc:Vertex . 
+ +ex:role a owl:DatatypeProperty ; + rdfs:comment "Allowed values: lead, support" ; + rdfs:domain ex:performs ; + rdfs:range xsd:string . + + a owl:Ontology ; + rdfs:label "Knowledge Complex Core Ontology" ; + rdfs:comment "Abstract topological backbone: Element, Vertex, Edge, Face, Complex." . + +kc:hasElement a owl:ObjectProperty ; + rdfs:label "hasElement" ; + rdfs:comment "Membership relation: an element belongs to this complex." ; + rdfs:domain kc:Complex ; + rdfs:range kc:Element . + +kc:uri a owl:DatatypeProperty ; + rdfs:label "uri" ; + rdfs:comment """Optional URI pointing to the source file for this element. + Accepts any URI scheme (file:///, https://, etc.). + At-most-one per element (sh:maxCount 1 enforced in kc_core_shapes.ttl).""" ; + rdfs:domain kc:Element ; + rdfs:range xsd:anyURI . + +ex:performs a owl:Class ; + rdfs:subClassOf kc:Edge . + +ex:responsible a owl:Class ; + rdfs:subClassOf kc:Edge . + +kc:Complex a owl:Class ; + rdfs:label "Complex" ; + rdfs:comment """A simplicial complex: a collection of elements (simplices) that is + closed under the boundary operator. If a simplex is in the complex, all its + boundary elements must also be in the complex. Topological validity of the + composition is enforced by SHACL (see kcs:ComplexShape).""" . + +kc:Face a owl:Class ; + rdfs:label "Face" ; + rdfs:comment "Abstract 2-simplex. Boundary: exactly 3 edges. Concrete types must subclass this." ; + rdfs:subClassOf [ a owl:Restriction ; + owl:onClass kc:Edge ; + owl:onProperty kc:boundedBy ; + owl:qualifiedCardinality "3"^^xsd:nonNegativeInteger ], + kc:Element . + +kc:boundedBy a owl:ObjectProperty ; + rdfs:label "boundedBy" ; + rdfs:comment """Boundary operator (∂). Maps a k-simplex to its bounding + (k-1)-simplices. Cardinality is enforced per subclass via OWL restrictions. + Vertices (k=0) have empty boundary. For k>=1, cardinality = k+1. 
+ All elements are unoriented (boundary is an unordered set).""" ; + rdfs:domain kc:Element ; + rdfs:range kc:Element . + +kc:Vertex a owl:Class ; + rdfs:label "Vertex" ; + rdfs:comment "Abstract 0-simplex. Empty boundary (∂v = ∅). Concrete types must subclass this." ; + rdfs:subClassOf kc:Element . + +kc:Edge a owl:Class ; + rdfs:label "Edge" ; + rdfs:comment "Abstract 1-simplex. Boundary: exactly 2 vertices. Concrete types must subclass this." ; + rdfs:subClassOf [ a owl:Restriction ; + owl:onClass kc:Vertex ; + owl:onProperty kc:boundedBy ; + owl:qualifiedCardinality "2"^^xsd:nonNegativeInteger ], + kc:Element . + +kc:Element a owl:Class ; + rdfs:label "Element" ; + rdfs:comment """Abstract topological element (k-simplex). All KC types are elements. + The simplex order k determines boundary cardinality: + k=0 (Vertex): empty boundary (cardinality 0) + k=1 (Edge): 2 boundary vertices + k=2 (Face): 3 boundary edges + k>=1 general: (k+1) boundary (k-1)-simplices + Higher orders can be defined by subclassing Element.""" . + diff --git a/examples/01_quickstart/data/pipeline/shapes.ttl b/examples/01_quickstart/data/pipeline/shapes.ttl new file mode 100644 index 0000000..ef1e75b --- /dev/null +++ b/examples/01_quickstart/data/pipeline/shapes.ttl @@ -0,0 +1,173 @@ +@prefix ex: . +@prefix exs: . +@prefix kc: . +@prefix kcs: . +@prefix rdf: . +@prefix rdfs: . +@prefix sh: . +@prefix xsd: . + +exs:accessesShape a sh:NodeShape ; + sh:property [ sh:datatype xsd:string ; + sh:in ( "read" "write" ) ; + sh:maxCount 1 ; + sh:minCount 1 ; + sh:path ex:mode ] ; + sh:targetClass ex:accesses . + +exs:activityShape a sh:NodeShape ; + sh:property [ sh:datatype xsd:string ; + sh:maxCount 1 ; + sh:minCount 1 ; + sh:path ex:name ] ; + sh:targetClass ex:activity . + +exs:actorShape a sh:NodeShape ; + sh:property [ sh:datatype xsd:string ; + sh:maxCount 1 ; + sh:minCount 1 ; + sh:path ex:name ] ; + sh:targetClass ex:actor . 
+ +exs:operationShape a sh:NodeShape ; + sh:targetClass ex:operation . + +exs:performsShape a sh:NodeShape ; + sh:property [ sh:datatype xsd:string ; + sh:in ( "lead" "support" ) ; + sh:maxCount 1 ; + sh:minCount 1 ; + sh:path ex:role ] ; + sh:targetClass ex:performs . + +exs:producesShape a sh:NodeShape ; + sh:property [ sh:datatype xsd:string ; + sh:in ( "read" "write" ) ; + sh:maxCount 1 ; + sh:minCount 1 ; + sh:path ex:mode ] ; + sh:targetClass ex:produces . + +exs:productionShape a sh:NodeShape ; + sh:targetClass ex:production . + +exs:requiresShape a sh:NodeShape ; + sh:property [ sh:datatype xsd:string ; + sh:in ( "read" "write" ) ; + sh:maxCount 1 ; + sh:minCount 1 ; + sh:path ex:mode ] ; + sh:targetClass ex:requires . + +exs:resourceShape a sh:NodeShape ; + sh:property [ sh:datatype xsd:string ; + sh:maxCount 1 ; + sh:minCount 1 ; + sh:path ex:name ] ; + sh:targetClass ex:resource . + +exs:responsibleShape a sh:NodeShape ; + sh:property [ sh:datatype xsd:string ; + sh:in ( "owner" "steward" ) ; + sh:maxCount 1 ; + sh:minCount 1 ; + sh:path ex:level ] ; + sh:targetClass ex:responsible . + +kcs:ComplexShape a sh:NodeShape ; + rdfs:label "ComplexShape" ; + rdfs:comment """Boundary-closure constraint: if a simplex is in the complex, + all its boundary elements must also be in the complex.""" ; + sh:sparql [ a sh:SPARQLConstraint ; + rdfs:comment """Boundary-closure: for every element in the complex, + each of its boundedBy targets must also be a hasElement of the complex.""" ; + sh:message "Complex is not closed under the boundary operator: an element's boundary element is missing from the complex." ; + sh:select """ + PREFIX kc: + SELECT $this WHERE { + $this kc:hasElement ?elem . + ?elem kc:boundedBy ?boundary . + FILTER NOT EXISTS { + $this kc:hasElement ?boundary . + } + } + """ ] ; + sh:targetClass kc:Complex . + +kcs:EdgeShape a sh:NodeShape ; + rdfs:label "EdgeShape" ; + rdfs:comment "Topological constraints on KC:Edge instances." 
; + sh:property [ sh:class kc:Vertex ; + sh:maxCount 2 ; + sh:message "Edge must have exactly two boundedBy relations of type KC:Vertex." ; + sh:minCount 2 ; + sh:path kc:boundedBy ] ; + sh:sparql [ a sh:SPARQLConstraint ; + rdfs:comment "Edge boundary vertices must be distinct individuals." ; + sh:message "Edge boundary vertices must not be the same vertex." ; + sh:select """ + PREFIX kc: + SELECT $this WHERE { + $this kc:boundedBy ?v . + } + GROUP BY $this + HAVING (COUNT(DISTINCT ?v) < 2) + """ ] ; + sh:targetClass kc:Edge . + +kcs:ElementShape a sh:NodeShape ; + rdfs:label "ElementShape" ; + rdfs:comment "Superstructure constraint: kc:uri may appear at most once per element." ; + sh:property [ sh:datatype xsd:anyURI ; + sh:maxCount 1 ; + sh:message "An element may have at most one kc:uri value." ; + sh:path kc:uri ] ; + sh:targetClass kc:Element . + +kcs:FaceShape a sh:NodeShape ; + rdfs:label "FaceShape" ; + rdfs:comment "Topological constraints on KC:Face instances." ; + sh:property [ sh:class kc:Edge ; + sh:maxCount 3 ; + sh:message "Face must have exactly three boundedBy relations, each of type KC:Edge." ; + sh:minCount 3 ; + sh:path kc:boundedBy ] ; + sh:sparql [ a sh:SPARQLConstraint ; + rdfs:comment """ + Closed-triangle constraint: the 3 boundary edges of a Face must share + vertices forming a cycle (v1-v2, v2-v3, v3-v1). This cannot be expressed + in OWL because it requires co-referencing property values across 3 edge + individuals. + """ ; + sh:message "Face boundary edges do not form a closed triangle over shared vertices." ; + sh:select """ + PREFIX kc: + SELECT $this WHERE { + $this kc:boundedBy ?e1 ; + kc:boundedBy ?e2 ; + kc:boundedBy ?e3 . + FILTER (?e1 != ?e2 && ?e2 != ?e3 && ?e1 != ?e3) + ?e1 kc:boundedBy ?a1 ; kc:boundedBy ?b1 . + ?e2 kc:boundedBy ?a2 ; kc:boundedBy ?b2 . + ?e3 kc:boundedBy ?a3 ; kc:boundedBy ?b3 . 
+        FILTER (?a1 != ?b1 && ?a2 != ?b2 && ?a3 != ?b3)
+        # For a valid triangle each endpoint of e1 must appear in at least
+        # one other edge's boundary.
+        BIND(
+            EXISTS {
+                { ?e2 kc:boundedBy ?a1 } UNION { ?e3 kc:boundedBy ?a1 }
+            } AS ?a1_shared
+        )
+        BIND(
+            EXISTS {
+                { ?e2 kc:boundedBy ?b1 } UNION { ?e3 kc:boundedBy ?b1 }
+            } AS ?b1_shared
+        )
+        # Valid closed triangle: each endpoint of e1 must be shared with
+        # another edge of the same face. Report a violation when either
+        # endpoint of e1 is unshared.
+        FILTER (!(?a1_shared && ?b1_shared))
+    }
+    """ ] ;
+    sh:targetClass kc:Face .
+
diff --git a/examples/01_quickstart/output/geometric.png b/examples/01_quickstart/output/geometric.png
new file mode 100644
index 0000000..9a5e479
Binary files /dev/null and b/examples/01_quickstart/output/geometric.png differ
diff --git a/examples/01_quickstart/output/hasse.png b/examples/01_quickstart/output/hasse.png
new file mode 100644
index 0000000..d63d482
Binary files /dev/null and b/examples/01_quickstart/output/hasse.png differ
diff --git a/examples/01_quickstart/quickstart.py b/examples/01_quickstart/quickstart.py
new file mode 100644
index 0000000..377bc20
--- /dev/null
+++ b/examples/01_quickstart/quickstart.py
@@ -0,0 +1,129 @@
+"""
+quickstart.py — Load a complex, discover hidden structure, extend it.
+
+This example loads a pre-built data pipeline complex (vertices and edges
+only), discovers triangles via clique detection, declares a face type,
+fills in the faces, and shows how the topology changes.
+
+Run:
+    pip install knowledgecomplex[analysis,viz]
+    python examples/01_quickstart/quickstart.py
+"""
+
+import os
+from pathlib import Path
+
+from knowledgecomplex import (
+    KnowledgeComplex, find_cliques, infer_faces,
+    betti_numbers, euler_characteristic,
+    plot_hasse, plot_geometric,
+)
+
+# ── 1. 
Load a pre-built complex ────────────────────────────────────────── + +data_dir = Path(__file__).parent / "data" / "pipeline" +kc = KnowledgeComplex.load(data_dir) + +print("=== Loaded complex ===") +ids = kc.element_ids() +print(f"{len(ids)} elements: {sorted(ids)}") +print() + +# List vertices +print("=== Vertices ===") +print(kc.query("vertices")) +print() + +# ── 2. Discover hidden structure ───────────────────────────────────────── + +triangles = find_cliques(kc, k=3) +print(f"=== Discovered {len(triangles)} triangles (3-cliques) ===") +for tri in triangles: + print(f" vertices: {sorted(tri)}") + + # Show which edges connect them + for eid in sorted(kc.element_ids()): + elem = kc.element(eid) + kind = kc._schema._types.get(elem.type, {}).get("kind") + if kind == "edge" and kc.boundary(eid) <= tri: + print(f" edge {eid} ({elem.type}): {sorted(kc.boundary(eid))}") +print() + +# ── 3. Topology before faces ───────────────────────────────────────────── + +betti_before = betti_numbers(kc) +print(f"=== Topology (no faces) ===") +print(f" Betti numbers: {betti_before}") +print(f" β₁ = {betti_before[1]} independent cycles") +print(f" Euler characteristic: {euler_characteristic(kc)}") +print() + +# ── 4. Fill in faces using the declared face types ──────────────────────── + +# The schema already declares two face types: +# operation: actor + activity + input resource (requires/accesses) +# production: actor + activity + output resource (produces/responsible) +# Inspect the triangles to decide which type each gets. 
+# The input triangle uses "requires" + "accesses" edges → operation
+# The output triangle uses "produces" + "responsible" edges → production
+for tri in triangles:
+    # Find the 3 edges forming this triangle
+    edges = {}
+    for eid in sorted(kc.element_ids()):
+        elem = kc.element(eid)
+        kind = kc._schema._types.get(elem.type, {}).get("kind")
+        if kind == "edge" and kc.boundary(eid) <= tri:
+            edges[eid] = elem.type
+
+    edge_types = set(edges.values())
+    boundary = sorted(edges.keys())
+
+    if "requires" in edge_types or "accesses" in edge_types:
+        face_type = "operation"
+    else:
+        face_type = "production"
+
+    face_id = f"{face_type}-{'_'.join(sorted(tri))}"
+    kc.add_face(face_id, type=face_type, boundary=boundary)
+    print(f"  Added {face_id} ({face_type}): {boundary}")

+print()
+added = [eid for eid in kc.element_ids()
+         if kc._schema._types.get(kc.element(eid).type, {}).get("kind") == "face"]
+print(f"=== Added {len(added)} faces ===")
+print()
+
+# ── 5. Topology after faces ────────────────────────────────────────────────
+
+betti_after = betti_numbers(kc)
+print(f"=== Topology (with faces) ===")
+print(f"  Betti numbers: {betti_after}")
+print(f"  β₁ = {betti_after[1]} independent cycles (was {betti_before[1]})")
+print(f"  Euler characteristic: {euler_characteristic(kc)}")
+print()
+
+if betti_before[1] > betti_after[1]:
+    print("  Faces filled in cycles — β₁ dropped, so fewer independent holes remain!")
+elif betti_before[1] == betti_after[1]:
+    print("  No change in β₁ — the added faces did not fill any existing cycle.")
+print()
+
+# ── 6. 
Visualize ─────────────────────────────────────────────────────────
+
+import matplotlib
+matplotlib.use("Agg")
+import matplotlib.pyplot as plt
+
+out = Path(__file__).parent / "output"
+out.mkdir(exist_ok=True)
+
+fig, ax = plot_hasse(kc, figsize=(12, 9))
+fig.savefig(out / "hasse.png", dpi=150, bbox_inches="tight")
+print(f"Saved {out / 'hasse.png'}")
+
+fig, ax = plot_geometric(kc, figsize=(12, 9))
+fig.savefig(out / "geometric.png", dpi=150, bbox_inches="tight")
+print(f"Saved {out / 'geometric.png'}")
+
+plt.close("all")
+print("\nDone!")
diff --git a/examples/02_construction/construction.py b/examples/02_construction/construction.py
new file mode 100644
index 0000000..e24e1bb
--- /dev/null
+++ b/examples/02_construction/construction.py
@@ -0,0 +1,114 @@
+"""
+construction.py — Runnable version of the README quick-start example.
+
+Models a data pipeline as a typed simplicial complex:
+  - 4 vertices: an actor, an activity, and two resources
+  - 5 edges: performs, requires, produces, accesses, responsible
+  - 2 faces: an operation triangle and a production triangle
+
+Run:
+    pip install knowledgecomplex[viz,analysis]
+    python examples/02_construction/construction.py
+"""
+
+from knowledgecomplex import KnowledgeComplex
+from knowledgecomplex.ontologies import operations
+
+# 1. Load the pre-built operations ontology
+sb = operations.schema(namespace="ex")
+
+# 2. 
Build an instance +kc = KnowledgeComplex(schema=sb) +kc.add_vertex("alice", type="actor", name="Alice") +kc.add_vertex("etl-run", type="activity", name="Daily ETL") +kc.add_vertex("dataset1", type="resource", name="JSON Records") +kc.add_vertex("dataset2", type="resource", name="Sales DB") + +kc.add_edge("e1", type="performs", vertices={"alice", "etl-run"}, role="lead") +kc.add_edge("e2", type="requires", vertices={"etl-run", "dataset1"}, mode="read") +kc.add_edge("e3", type="produces", vertices={"etl-run", "dataset2"}, mode="write") +kc.add_edge("e4", type="accesses", vertices={"alice", "dataset1"}, mode="read") +kc.add_edge("e5", type="responsible", vertices={"alice", "dataset2"}, level="owner") + +kc.add_face("op1", type="operation", boundary=["e1", "e2", "e4"]) +kc.add_face("prod1", type="production", boundary=["e1", "e3", "e5"]) + +# 3. Query +print("=== Vertices ===") +df = kc.query("vertices") +print(df) +print() + +# 4. Topological queries +print("=== Boundary of face op1 ===") +print(kc.boundary("op1")) +print() + +print("=== Star of alice (all simplices containing alice) ===") +print(kc.star("alice")) +print() + +print("=== Skeleton k=1 (vertices + edges only) ===") +print(kc.skeleton(1)) +print() + +# 5. 
Algebraic topology +from knowledgecomplex import betti_numbers, euler_characteristic, edge_pagerank, edge_influence + +betti = betti_numbers(kc) +chi = euler_characteristic(kc) +print(f"=== Betti numbers: {betti} ===") +print(f" beta_0 = {betti[0]} (connected components)") +print(f" beta_1 = {betti[1]} (independent cycles)") +print(f" beta_2 = {betti[2]} (enclosed voids)") +print(f" Euler characteristic chi = {chi} (V - E + F = {len(kc.skeleton(0))} - {len(kc.skeleton(1) - kc.skeleton(0))} + {len(kc.skeleton(2) - kc.skeleton(1))})") +print() + +# Edge PageRank — measure influence of each edge on the complex +from knowledgecomplex import boundary_matrices +bm = boundary_matrices(kc) +print("=== Edge PageRank (influence ranking) ===") +for eid in sorted(bm.edge_index): + pr = edge_pagerank(kc, eid) + infl = edge_influence(eid, pr) + print(f" {eid:12s} spread={infl.spread:.3f} influence={infl.absolute_influence:.3f}") +print() + +# 6. Inspect the RDF +print("=== Turtle dump ===") +print(kc.dump_graph()) + +# 6. 
Visualize — Hasse diagrams (elements as nodes, boundary as directed arrows) +from knowledgecomplex import ( + to_networkx, verify_networkx, + plot_hasse, plot_hasse_star, plot_geometric, +) +import matplotlib.pyplot as plt + +# Export to directed networkx graph and verify invariants +G = to_networkx(kc) +verify_networkx(G) +print(f"DiGraph: {G.number_of_nodes()} nodes, {G.number_of_edges()} directed edges") +print() + +from pathlib import Path +out = Path(__file__).parent / "output" +out.mkdir(exist_ok=True) + +# Hasse diagram — arrows point from faces→edges→vertices (high→low dim) +fig, ax = plot_hasse(kc, figsize=(12, 9)) +fig.savefig(out / "hasse.png", dpi=150, bbox_inches="tight") +print(f"Saved {out / 'hasse.png'}") +plt.close(fig) + +# Hasse star of alice — her neighborhood highlighted +fig, ax = plot_hasse_star(kc, "alice", figsize=(12, 9)) +fig.savefig(out / "hasse_star_alice.png", dpi=150, bbox_inches="tight") +print(f"Saved {out / 'hasse_star_alice.png'}") +plt.close(fig) + +# 7. 
Geometric realization — vertices as 3D points, edges as lines, faces as triangles +fig, ax = plot_geometric(kc, figsize=(12, 9)) +fig.savefig(out / "geometric.png", dpi=150, bbox_inches="tight") +print(f"Saved {out / 'geometric.png'}") +plt.close(fig) diff --git a/examples/02_construction/output/geometric.png b/examples/02_construction/output/geometric.png new file mode 100644 index 0000000..dc4f721 Binary files /dev/null and b/examples/02_construction/output/geometric.png differ diff --git a/examples/02_construction/output/hasse.png b/examples/02_construction/output/hasse.png new file mode 100644 index 0000000..5727c7d Binary files /dev/null and b/examples/02_construction/output/hasse.png differ diff --git a/examples/02_construction/output/hasse_star_alice.png b/examples/02_construction/output/hasse_star_alice.png new file mode 100644 index 0000000..bc80592 Binary files /dev/null and b/examples/02_construction/output/hasse_star_alice.png differ diff --git a/examples/03_topology/topology_walkthrough.py b/examples/03_topology/topology_walkthrough.py new file mode 100644 index 0000000..202ef50 --- /dev/null +++ b/examples/03_topology/topology_walkthrough.py @@ -0,0 +1,125 @@ +""" +topology_walkthrough.py — All 8 topological operators with set algebra. 
+ +Models a small research collaboration network as a double-triangle complex: + - 4 researchers (vertices): alice, bob, carol, dave + - 5 collaborations (edges): linking pairs of researchers + - 2 papers (faces): each authored by a triangle of collaborators + +Run: + python examples/03_topology/topology_walkthrough.py +""" + +from knowledgecomplex import SchemaBuilder, KnowledgeComplex, vocab + +# ── Schema ────────────────────────────────────────────────────────────────── + +sb = SchemaBuilder(namespace="research") +sb.add_vertex_type("Researcher", attributes={"field": vocab("ML", "Systems", "Theory")}) +sb.add_edge_type("Collaborates") +sb.add_face_type("Paper") + +kc = KnowledgeComplex(schema=sb) + +# ── Build the complex ─────────────────────────────────────────────────────── +# Two triangles sharing the alice-bob edge: +# +# carol --- alice --- dave +# \ / \ / +# \ / \ / +# bob bob (shared) + +kc.add_vertex("alice", type="Researcher", field="ML") +kc.add_vertex("bob", type="Researcher", field="ML") +kc.add_vertex("carol", type="Researcher", field="Systems") +kc.add_vertex("dave", type="Researcher", field="Theory") + +kc.add_edge("ab", type="Collaborates", vertices={"alice", "bob"}) +kc.add_edge("ac", type="Collaborates", vertices={"alice", "carol"}) +kc.add_edge("bc", type="Collaborates", vertices={"bob", "carol"}) +kc.add_edge("ad", type="Collaborates", vertices={"alice", "dave"}) +kc.add_edge("bd", type="Collaborates", vertices={"bob", "dave"}) + +kc.add_face("paper1", type="Paper", boundary=["ab", "ac", "bc"]) +kc.add_face("paper2", type="Paper", boundary=["ab", "ad", "bd"]) + +# ── Boundary & Coboundary ────────────────────────────────────────────────── + +print("=== Boundary (direct faces of a simplex) ===") +print(f" boundary(alice) = {kc.boundary('alice')}") # vertex: empty +print(f" boundary(ab) = {kc.boundary('ab')}") # edge: 2 vertices +print(f" boundary(paper1) = {kc.boundary('paper1')}") # face: 3 edges +print() + +print("=== Coboundary (simplices 
that this element bounds) ===")
+print(f"  coboundary(alice) = {kc.coboundary('alice')}")   # edges incident to alice
+print(f"  coboundary(ab)    = {kc.coboundary('ab')}")      # faces containing edge ab
+print(f"  coboundary(paper1)= {kc.coboundary('paper1')}")  # nothing (top dim)
+print()
+
+# ── Star & Closure ─────────────────────────────────────────────────────────
+
+print("=== Star (all simplices containing this element) ===")
+print(f"  star(alice) = {kc.star('alice')}")
+print(f"  star(ab)    = {kc.star('ab')}")
+print()
+
+print("=== Closure (smallest subcomplex containing these elements) ===")
+print(f"  closure(paper1) = {kc.closure('paper1')}")
+# Set input: closure of multiple elements
+print(f"  closure({{ab, ad}}) = {kc.closure({'ab', 'ad'})}")
+print()
+
+# ── Closed Star & Link ─────────────────────────────────────────────────────
+
+print("=== Closed Star = Cl(St(x)) — always a valid subcomplex ===")
+cs = kc.closed_star("alice")
+print(f"  closed_star(alice) = {cs}")
+print(f"  is_subcomplex? 
= {kc.is_subcomplex(cs)}") +print() + +print("=== Link = Cl(St(x)) \\ St(x) — the 'horizon' around x ===") +print(f" link(alice) = {kc.link('alice')}") +print(f" link(bob) = {kc.link('bob')}") +print(f" link(ab) = {kc.link('ab')}") +print() + +# ── Skeleton & Degree ────────────────────────────────────────────────────── + +print("=== Skeleton (elements up to dimension k) ===") +print(f" skeleton(0) = {kc.skeleton(0)} (vertices)") +print(f" skeleton(1) = {kc.skeleton(1)} (+ edges)") +print(f" skeleton(2) = {kc.skeleton(2)} (+ faces = everything)") +print() + +print("=== Degree (incident edges) ===") +print(f" degree(alice) = {kc.degree('alice')}") # 3 edges +print(f" degree(carol) = {kc.degree('carol')}") # 2 edges +print(f" degree(dave) = {kc.degree('dave')}") # 2 edges +print() + +# ── Type-filtered queries ────────────────────────────────────────────────── + +print("=== Type-filtered queries ===") +print(f" star(alice, type='Paper') = {kc.star('alice', type='Paper')}") +print(f" star(alice, type='Collaborates') = {kc.star('alice', type='Collaborates')}") +print(f" coboundary(alice, type='Collaborates') = {kc.coboundary('alice', type='Collaborates')}") +print() + +# ── Set algebra (composition) ────────────────────────────────────────────── + +print("=== Set algebra ===") +s_alice = kc.star("alice") +s_carol = kc.star("carol") +s_dave = kc.star("dave") + +print(f" star(alice) & star(carol) = {s_alice & s_carol}") # shared +print(f" star(carol) & star(dave) = {s_carol & s_dave}") # disjoint? +print(f" star(alice) | star(dave) = {s_alice | s_dave}") # union +print(f" star(alice) - star(carol) = {s_alice - s_carol}") # alice-only +print() + +# Compose: closure of the star +composed = kc.closure(kc.star("carol")) +direct = kc.closed_star("carol") +print(f" closure(star(carol)) == closed_star(carol)? 
{composed == direct}") diff --git a/examples/04_clique_inference/clique_inference.py b/examples/04_clique_inference/clique_inference.py new file mode 100644 index 0000000..19a2718 --- /dev/null +++ b/examples/04_clique_inference/clique_inference.py @@ -0,0 +1,118 @@ +""" +clique_inference.py — Flagification and typed face inference. + +Demonstrates the explore → inspect → type workflow: + 1. Build a social network with only vertices and edges + 2. Discover what triangles (3-cliques) exist + 3. Fill them generically to explore the structure + 4. Re-build with typed face inference for semantic meaning + +Run: + pip install knowledgecomplex[viz] + python examples/04_clique_inference/clique_inference.py +""" + +from knowledgecomplex import ( + SchemaBuilder, KnowledgeComplex, vocab, + find_cliques, infer_faces, fill_cliques, + plot_hasse, plot_geometric, +) +import matplotlib.pyplot as plt + +# ── Phase 1: Build a social network (vertices + edges only) ──────────────── + +sb = SchemaBuilder(namespace="social") +sb.add_vertex_type("Person") +sb.add_edge_type("Collaborates") +sb.add_edge_type("Mentors") +sb.add_face_type("Team") # declared but not yet populated + +kc = KnowledgeComplex(schema=sb) + +# 5 people +for name in ["alice", "bob", "carol", "dave", "eve"]: + kc.add_vertex(name, type="Person") + +# Collaboration edges (form several triangles) +kc.add_edge("c-ab", type="Collaborates", vertices={"alice", "bob"}) +kc.add_edge("c-ac", type="Collaborates", vertices={"alice", "carol"}) +kc.add_edge("c-bc", type="Collaborates", vertices={"bob", "carol"}) +kc.add_edge("c-bd", type="Collaborates", vertices={"bob", "dave"}) +kc.add_edge("c-cd", type="Collaborates", vertices={"carol", "dave"}) +kc.add_edge("c-de", type="Collaborates", vertices={"dave", "eve"}) + +# A mentoring edge (different type — won't form team triangles) +kc.add_edge("m-ae", type="Mentors", vertices={"alice", "eve"}) + +print("=== Network built ===") +print(f" {len(kc.skeleton(0))} people, 
{len(kc.skeleton(1) - kc.skeleton(0))} relationships")
+print()
+
+# ── Phase 2: Discover cliques ──────────────────────────────────────────────
+
+print("=== find_cliques: what triangles exist? ===")
+all_triangles = find_cliques(kc, k=3)
+print(f"  All 3-cliques (any edge type): {len(all_triangles)}")
+for tri in all_triangles:
+    print(f"    {sorted(tri)}")
+print()
+
+# Filter to collaboration-only triangles
+collab_triangles = find_cliques(kc, k=3, edge_type="Collaborates")
+print(f"  Collaboration-only 3-cliques: {len(collab_triangles)}")
+for tri in collab_triangles:
+    print(f"    {sorted(tri)}")
+print()
+
+# ── Phase 3: Dry run — preview what would be added ─────────────────────────
+
+print("=== infer_faces dry run ===")
+preview = infer_faces(kc, "Team", edge_type="Collaborates", dry_run=True)
+print(f"  Would add {len(preview)} Team faces: {preview}")
+print(f"  Current face count: {len(kc.skeleton(2) - kc.skeleton(1))}")
+print()
+
+# ── Phase 4: Typed inference — fill Team faces ─────────────────────────────
+
+print("=== infer_faces: filling Team faces ===")
+added = infer_faces(kc, "Team", edge_type="Collaborates", id_prefix="team")
+print(f"  Added {len(added)} faces: {added}")
+print()
+
+# Inspect what was created
+for fid in added:
+    edges = kc.boundary(fid)
+    edge_types = {kc.element(e).type for e in edges}
+    vertices = set()
+    for e in edges:
+        vertices |= kc.boundary(e)
+    print(f"  {fid}: members={sorted(vertices)}, edge_types={edge_types}")
+print()
+
+# Idempotent — running again adds nothing
+second_run = infer_faces(kc, "Team", edge_type="Collaborates")
+print(f"  Second run added {len(second_run)} faces (idempotent)")
+print()
+
+# ── Phase 5: Visualize ─────────────────────────────────────────────────────
+
+fig, ax1 = plt.subplots(figsize=(10, 8))
+
+# Hasse diagram — shows all elements as nodes with directed boundary arrows
+plot_hasse(kc, ax=ax1)
+ax1.set_title("Hasse Diagram (after inference)")
+
+from pathlib import Path
+out = 
Path(__file__).parent / "output" +out.mkdir(exist_ok=True) + +plt.tight_layout() +fig.savefig(out / "hasse.png", dpi=150, bbox_inches="tight") +print(f"Saved {out / 'hasse.png'}") +plt.close(fig) + +# Geometric realization — vertices as 3D points, edges as lines, faces as triangles +fig, ax = plot_geometric(kc, figsize=(10, 8)) +fig.savefig(out / "geometric.png", dpi=150, bbox_inches="tight") +print(f"Saved {out / 'geometric.png'}") +plt.close(fig) diff --git a/examples/04_clique_inference/output/geometric.png b/examples/04_clique_inference/output/geometric.png new file mode 100644 index 0000000..a2cd284 Binary files /dev/null and b/examples/04_clique_inference/output/geometric.png differ diff --git a/examples/04_clique_inference/output/hasse.png b/examples/04_clique_inference/output/hasse.png new file mode 100644 index 0000000..6109f95 Binary files /dev/null and b/examples/04_clique_inference/output/hasse.png differ diff --git a/examples/05_constraints/constraints.py b/examples/05_constraints/constraints.py new file mode 100644 index 0000000..93396f7 --- /dev/null +++ b/examples/05_constraints/constraints.py @@ -0,0 +1,105 @@ +""" +constraints.py — Topological constraint escalation and schema-level queries. + +Models a V&V (verification & validation) system where: + - Requirements (vertices) must each have at least one verification edge + - Edges connect requirements to test artifacts + +Demonstrates how topological queries can be escalated to SHACL constraints +that are enforced at write time, and how named queries can be registered +on the schema for reuse. 
+ +Run: + python examples/05_constraints/constraints.py +""" + +from knowledgecomplex import SchemaBuilder, KnowledgeComplex, vocab, ValidationError + +# ── Schema with topological constraints ──────────────────────────────────── + +sb = SchemaBuilder(namespace="vv") + +sb.add_vertex_type("Requirement", attributes={"priority": vocab("high", "medium", "low")}) +sb.add_vertex_type("TestCase", attributes={"status": vocab("pass", "fail", "pending")}) +sb.add_edge_type("Verifies") +sb.add_face_type("Coverage") + +# Register a named query: "for a given requirement, find all verification edges" +sb.add_query("req_verifications", "coboundary", target_type="Verifies") +print("=== Registered named query: req_verifications ===") +print() + +# Export schema to see generated SPARQL +import tempfile +from pathlib import Path + +with tempfile.TemporaryDirectory() as tmpdir: + sb.export(tmpdir) + sparql_file = Path(tmpdir) / "queries" / "req_verifications.sparql" + if sparql_file.exists(): + print("=== Generated SPARQL template ===") + print(sparql_file.read_text()) + +# ── Build a valid complex (no constraints yet) ───────────────────────────── + +print("=== Building complex WITHOUT constraints ===") +kc = KnowledgeComplex(schema=sb) +kc.add_vertex("req-1", type="Requirement", priority="high") +kc.add_vertex("req-2", type="Requirement", priority="medium") +kc.add_vertex("test-a", type="TestCase", status="pass") +kc.add_vertex("test-b", type="TestCase", status="pending") + +kc.add_edge("v1", type="Verifies", vertices={"req-1", "test-a"}) +kc.add_edge("v2", type="Verifies", vertices={"req-1", "test-b"}) +kc.add_edge("v3", type="Verifies", vertices={"req-2", "test-a"}) + +print(f" Added {len(kc.skeleton(0))} vertices, {len(kc.skeleton(1) - kc.skeleton(0))} edges") +print(f" req-1 verifications: {kc.coboundary('req-1', type='Verifies')}") +print(f" req-2 verifications: {kc.coboundary('req-2', type='Verifies')}") +print() + +# ── Now add a max_count constraint 
───────────────────────────────────────── + +print("=== Schema with max_count constraint ===") +sb2 = SchemaBuilder(namespace="vv") +sb2.add_vertex_type("Requirement", attributes={"priority": vocab("high", "medium", "low")}) +sb2.add_vertex_type("TestCase", attributes={"status": vocab("pass", "fail", "pending")}) +sb2.add_edge_type("Verifies") + +# Each TestCase can verify at most 2 requirements +sb2.add_topological_constraint( + "TestCase", "coboundary", + target_type="Verifies", + predicate="max_count", max_count=2, + message="A test case can verify at most 2 requirements", +) + +kc2 = KnowledgeComplex(schema=sb2) +kc2.add_vertex("req-1", type="Requirement", priority="high") +kc2.add_vertex("req-2", type="Requirement", priority="medium") +kc2.add_vertex("req-3", type="Requirement", priority="low") +kc2.add_vertex("test-a", type="TestCase", status="pass") + +kc2.add_edge("v1", type="Verifies", vertices={"req-1", "test-a"}) +kc2.add_edge("v2", type="Verifies", vertices={"req-2", "test-a"}) +print(" Added 2 verification edges to test-a (within limit)") + +# Third edge would exceed max_count=2 +try: + kc2.add_edge("v3", type="Verifies", vertices={"req-3", "test-a"}) + print(" ERROR: should have raised!") +except ValidationError as e: + print(f" Third edge correctly rejected: {e}") +print() + +# ── Inspect generated SHACL ──────────────────────────────────────────────── + +print("=== Generated SHACL (excerpt) ===") +shacl = sb2.dump_shacl() +# Show just the constraint-related lines +for line in shacl.split("\n"): + if "Topological" in line or "max_count" in line or "coboundary" in line: + print(f" {line.strip()}") +print() + +print("Done.") diff --git a/examples/06_io_roundtrip/io_roundtrip.py b/examples/06_io_roundtrip/io_roundtrip.py new file mode 100644 index 0000000..cb0261b --- /dev/null +++ b/examples/06_io_roundtrip/io_roundtrip.py @@ -0,0 +1,109 @@ +""" +io_roundtrip.py — Multi-format I/O and additive loading. 
+ +Demonstrates: + - Saving instance graphs in Turtle, JSON-LD, and N-Triples + - Loading data additively into an existing complex + - Directory-based export/load for full round-trips + - String serialization with dump_graph() + +Run: + python examples/06_io_roundtrip/io_roundtrip.py +""" + +import tempfile +from pathlib import Path + +from knowledgecomplex import ( + SchemaBuilder, KnowledgeComplex, vocab, + save_graph, load_graph, dump_graph, +) + +# ── Build a simple complex ───────────────────────────────────────────────── + +sb = SchemaBuilder(namespace="io") +sb.add_vertex_type("Node") +sb.add_edge_type("Link", attributes={"weight": vocab("light", "heavy")}) +sb.add_face_type("Triangle") + +kc = KnowledgeComplex(schema=sb) +kc.add_vertex("a", type="Node") +kc.add_vertex("b", type="Node") +kc.add_vertex("c", type="Node") +kc.add_edge("ab", type="Link", vertices={"a", "b"}, weight="heavy") +kc.add_edge("bc", type="Link", vertices={"b", "c"}, weight="light") +kc.add_edge("ac", type="Link", vertices={"a", "c"}, weight="light") +kc.add_face("f1", type="Triangle", boundary=["ab", "bc", "ac"]) + +print(f"Built complex: {len(kc.element_ids())} elements") +print() + +with tempfile.TemporaryDirectory() as tmpdir: + tmp = Path(tmpdir) + + # ── Multi-format export ──────────────────────────────────────────────── + + print("=== Saving in multiple formats ===") + save_graph(kc, tmp / "data.ttl") + save_graph(kc, tmp / "data.jsonld", format="json-ld") + save_graph(kc, tmp / "data.nt", format="ntriples") + + for f in sorted(tmp.glob("data.*")): + size = f.stat().st_size + print(f" {f.name:15s} {size:>6} bytes") + print() + + # ── JSON-LD string output ────────────────────────────────────────────── + + print("=== JSON-LD string (first 500 chars) ===") + jsonld = dump_graph(kc, format="json-ld") + print(jsonld[:500]) + print("...") + print() + + # ── Load into a fresh KC ─────────────────────────────────────────────── + + print("=== Loading from JSON-LD into fresh KC ===") + 
fresh = KnowledgeComplex(schema=sb) + print(f" Fresh KC before load: {len(fresh.element_ids())} elements") + load_graph(fresh, tmp / "data.jsonld") + print(f" Fresh KC after load: {len(fresh.element_ids())} elements") + print() + + # ── Additive loading (merge two datasets) ────────────────────────────── + + print("=== Additive loading: merging two datasets ===") + + # Build a second dataset with different elements + kc2 = KnowledgeComplex(schema=sb) + kc2.add_vertex("d", type="Node") + kc2.add_vertex("e", type="Node") + kc2.add_edge("de", type="Link", vertices={"d", "e"}, weight="heavy") + save_graph(kc2, tmp / "dataset2.ttl") + + # Load both into one KC + merged = KnowledgeComplex(schema=sb) + load_graph(merged, tmp / "data.ttl") + count_after_first = len(merged.element_ids()) + load_graph(merged, tmp / "dataset2.ttl") + count_after_second = len(merged.element_ids()) + + print(f" After loading dataset 1: {count_after_first} elements") + print(f" After loading dataset 2: {count_after_second} elements (additive)") + print() + + # ── Directory-based export/load ──────────────────────────────────────── + + print("=== Directory export/load (full round-trip) ===") + export_dir = tmp / "exported" + kc.export(export_dir) + print(f" Exported to {export_dir}:") + for f in sorted(export_dir.rglob("*")): + if f.is_file(): + print(f" {f.relative_to(export_dir)}") + + loaded = KnowledgeComplex.load(export_dir) + print(f" Loaded back: {len(loaded.element_ids())} elements") + print() + +print("Done.") diff --git a/examples/07_filtration/filtration_evolution.py b/examples/07_filtration/filtration_evolution.py new file mode 100644 index 0000000..993e66d --- /dev/null +++ b/examples/07_filtration/filtration_evolution.py @@ -0,0 +1,140 @@ +""" +filtration_evolution.py — Proper filtration: a startup growing over quarters. + +A filtration is a strictly growing sequence of subcomplexes. Each quarter +new people join and form collaborations, but nobody leaves. 
The birth() +index tells us when each element first appeared, enabling persistence +analysis and understanding which structures are foundational vs. late-stage. + +Run: + pip install knowledgecomplex[analysis] + python examples/07_filtration/filtration_evolution.py +""" + +from knowledgecomplex import SchemaBuilder, KnowledgeComplex, Filtration, vocab + +# ── Schema ────────────────────────────────────────────────────────────────── + +sb = SchemaBuilder(namespace="startup") +sb.add_vertex_type("Person", attributes={"role": vocab("eng", "design", "sales", "founder")}) +sb.add_edge_type("Collaborates") +sb.add_face_type("Team") + +# ── Build the full complex (all quarters) ─────────────────────────────────── + +kc = KnowledgeComplex(schema=sb) + +# All people +kc.add_vertex("alice", type="Person", role="founder") +kc.add_vertex("bob", type="Person", role="founder") +kc.add_vertex("carol", type="Person", role="eng") +kc.add_vertex("dave", type="Person", role="design") +kc.add_vertex("eve", type="Person", role="sales") + +# All collaborations +kc.add_edge("c-ab", type="Collaborates", vertices={"alice", "bob"}) +kc.add_edge("c-ac", type="Collaborates", vertices={"alice", "carol"}) +kc.add_edge("c-bc", type="Collaborates", vertices={"bob", "carol"}) +kc.add_edge("c-ad", type="Collaborates", vertices={"alice", "dave"}) +kc.add_edge("c-bd", type="Collaborates", vertices={"bob", "dave"}) +kc.add_edge("c-cd", type="Collaborates", vertices={"carol", "dave"}) +kc.add_edge("c-ae", type="Collaborates", vertices={"alice", "eve"}) +kc.add_edge("c-be", type="Collaborates", vertices={"bob", "eve"}) + +# Teams (triangles of collaboration) +kc.add_face("eng-team", type="Team", boundary=["c-ab", "c-ac", "c-bc"]) +kc.add_face("design-team", type="Team", boundary=["c-ab", "c-ad", "c-bd"]) +kc.add_face("cross-team", type="Team", boundary=["c-ac", "c-ad", "c-cd"]) + +# ── Build the filtration (strictly growing) ──────────────────────────────── + +filt = Filtration(kc) + +# Q0: Founders 
meet and start collaborating +filt.append_closure({"alice", "bob", "c-ab"}) + +# Q1: First hire (carol) — forms engineering triangle +filt.append_closure({"carol", "c-ac", "c-bc", "eng-team"}) + +# Q2: Second hire (dave) — design team + cross-functional team +filt.append_closure({"dave", "c-ad", "c-bd", "c-cd", "design-team", "cross-team"}) + +# Q3: Third hire (eve) — sales edges, no new triangle +filt.append_closure({"eve", "c-ae", "c-be"}) + +# ── Inspect the filtration ───────────────────────────────────────────────── + +print("=== Filtration: Startup Growth ===") +print(f" {len(filt)} quarters, complete={filt.is_complete}") +print() + +for i, step in enumerate(filt): + print(f" Q{i}: {len(step)} elements") + new = filt.new_at(i) + if new: + print(f" new: {sorted(new)}") +print() + +# ── Birth tracking ───────────────────────────────────────────────────────── + +print("=== When did each element first appear? ===") +for eid in sorted(kc.element_ids()): + try: + b = filt.birth(eid) + print(f" {eid:15s} born Q{b}") + except ValueError: + print(f" {eid:15s} never (not in filtration)") +print() + +# ── Topological evolution (Betti numbers at each step) ───────────────────── + +try: + from knowledgecomplex import betti_numbers + + print("=== Betti numbers at each quarter ===") + print(" (beta_0=components, beta_1=cycles, beta_2=voids)") + + # Build temporary KCs for each step to compute Betti numbers + for i, step in enumerate(filt): + # Create a sub-KC with just this step's elements + sub_kc = KnowledgeComplex(schema=sb) + # Add elements in dimension order + for eid in sorted(step): + elem = kc.element(eid) + kind = kc._schema._types[elem.type]["kind"] + if kind == "vertex": + sub_kc.add_vertex(eid, type=elem.type, **elem.attrs) + for eid in sorted(step): + elem = kc.element(eid) + kind = kc._schema._types[elem.type]["kind"] + if kind == "edge": + boundary = sorted(kc.boundary(eid)) + sub_kc.add_edge(eid, type=elem.type, vertices=set(boundary), **elem.attrs) + for 
eid in sorted(step):
+            elem = kc.element(eid)
+            kind = kc._schema._types[elem.type]["kind"]
+            if kind == "face":
+                boundary = sorted(kc.boundary(eid))
+                sub_kc.add_face(eid, type=elem.type, boundary=boundary, **elem.attrs)
+
+        betti = betti_numbers(sub_kc)
+        v = len([e for e in step if kc._schema._types[kc.element(e).type]["kind"] == "vertex"])
+        e = len([e for e in step if kc._schema._types[kc.element(e).type]["kind"] == "edge"])
+        f = len([e for e in step if kc._schema._types[kc.element(e).type]["kind"] == "face"])
+        print(f"  Q{i}: V={v} E={e} F={f}  ->  beta={betti}")
+    print()
+except ImportError:
+    print("  (install knowledgecomplex[analysis] for Betti numbers)")
+    print()
+
+# ── Filtration from a function ─────────────────────────────────────────────
+
+print("=== Filtration from dimension function ===")
+dim_filt = Filtration.from_function(kc, lambda eid: {
+    "vertex": 0, "edge": 1, "face": 2,
+}[kc._schema._types[kc.element(eid).type]["kind"]])

+for i, step in enumerate(dim_filt):
+    print(f"  Step {i}: {len(step)} elements (dim <= {i})")
+print()
+print("Done.")
diff --git a/examples/08_temporal_sweep/temporal_sweep.py b/examples/08_temporal_sweep/temporal_sweep.py
new file mode 100644
index 0000000..c3ef8ba
--- /dev/null
+++ b/examples/08_temporal_sweep/temporal_sweep.py
@@ -0,0 +1,142 @@
+"""
+temporal_sweep.py — Parameterized subcomplex sweep over time.
+
+The complex contains ALL elements that ever existed, each with temporal
+metadata (active_from, active_until). Sweeping a time parameter over those
+attributes selects the subcomplex of elements "active" at each time point.
+The complex itself doesn't change — we're just slicing it at different times.
+
+This is NOT a filtration (subcomplexes can shrink when people leave).
+It demonstrates attribute-based time slicing and subcomplex validation.
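The activity predicate this sweep relies on is a half-open interval test, with a caveat worth noting: comparing quarters as strings agrees with numeric order only while they stay single-digit (plus the "9999" open-ended sentinel). A minimal sketch, with `active_at` as a hypothetical helper (not part of the library):

```python
# Half-open activity interval: active_from <= t < active_until.
# Lexicographic string comparison matches numeric order here only because
# quarters are single-digit ("1".."9") with "9999" as the open-ended
# sentinel; multi-digit quarters (e.g. "10" < "2") would require integers.
def active_at(active_from: str, active_until: str, t: str) -> bool:
    return active_from <= t < active_until

assert active_at("1", "4", "1")        # bob is active in Q1
assert not active_at("1", "4", "4")    # gone by Q4 — the interval is half-open
assert active_at("2", "9999", "5")     # carol is still active in Q5
```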
+
+Run:
+    python examples/08_temporal_sweep/temporal_sweep.py
+"""
+
+from knowledgecomplex import SchemaBuilder, KnowledgeComplex, vocab, text
+
+# ── Schema with temporal metadata ──────────────────────────────────────────
+
+sb = SchemaBuilder(namespace="proj")
+sb.add_vertex_type("Person", attributes={
+    "role": vocab("eng", "pm", "qa"),
+    "active_from": text(),
+    "active_until": text(),  # "9999" means still active
+})
+sb.add_edge_type("WorksWith", attributes={
+    "active_from": text(),
+    "active_until": text(),
+})
+sb.add_face_type("Squad")
+
+# The sweep below slices the complex at a time t by filtering on these
+# temporal attributes: an element is active when active_from <= t < active_until.
+
+kc = KnowledgeComplex(schema=sb)
+
+# ── Build the complete timeline ────────────────────────────────────────────
+
+# People with join/leave dates (quarters as single-digit strings, so plain
+# string comparison agrees with numeric order)
+people = [
+    ("alice", "eng", "1", "9999"),  # joined Q1, still here
+    ("bob",   "eng", "1", "4"),     # joined Q1, left Q4
+    ("carol", "pm",  "2", "9999"),  # joined Q2, still here
+    ("dave",  "qa",  "3", "9999"),  # joined Q3, still here
+    ("eve",   "eng", "4", "9999"),  # joined Q4 (replacing bob)
+]
+
+for name, role, af, au in people:
+    kc.add_vertex(name, type="Person", role=role, active_from=af, active_until=au)
+
+# Collaborations with their own time ranges
+collabs = [
+    ("w-ab", {"alice", "bob"},   "1", "4"),     # ends when bob leaves
+    ("w-ac", {"alice", "carol"}, "2", "9999"),
+    ("w-bc", {"bob", "carol"},   "2", "4"),     # ends when bob leaves
+    ("w-cd", {"carol", "dave"},  "3", "9999"),
+    ("w-ad", {"alice", "dave"},  "3", "9999"),
+    ("w-ae", {"alice", "eve"},   "4", "9999"),
+    ("w-ce", {"carol", "eve"},   "4", "9999"),
+]
+
+for eid, verts, af, au in collabs:
+    kc.add_edge(eid, type="WorksWith", vertices=verts, active_from=af, active_until=au)
+
+# Squad alice-carol-dave active from Q3 (uses edges w-ac, w-cd, w-ad)
+kc.add_face("squad-1", type="Squad", boundary=["w-ac", "w-cd", "w-ad"]) + +print(f"Built timeline: {len(kc.element_ids())} total elements across all time") +print() + +# ── Manual parameterized sweep ───────────────────────────────────────────── + +# Since we store active_from/active_until as string attributes, we can +# query for elements active at a specific time by comparing attribute values. + +print("=== Active subcomplex at each quarter ===") +for t in ["1", "2", "3", "4", "5"]: + # Get active people at time t + active_people = set() + for pid in kc.element_ids(type="Person"): + elem = kc.element(pid) + af = elem.attrs.get("active_from", "0") + au = elem.attrs.get("active_until", "9999") + if af <= t < au: + active_people.add(pid) + + # Get active edges at time t + active_edges = set() + for eid in kc.element_ids(type="WorksWith"): + elem = kc.element(eid) + af = elem.attrs.get("active_from", "0") + au = elem.attrs.get("active_until", "9999") + if af <= t < au: + # Only include if both endpoints are active + boundary = kc.boundary(eid) + if boundary <= active_people: + active_edges.add(eid) + + # Get active faces + active_faces = set() + for fid in kc.element_ids(type="Squad"): + boundary = kc.boundary(fid) + if boundary <= active_edges: + active_faces.add(fid) + + active = active_people | active_edges | active_faces + is_sub = kc.is_subcomplex(active) + + print(f" Q{t}: {len(active_people)} people, " + f"{len(active_edges)} collabs, " + f"{len(active_faces)} squads " + f"(valid subcomplex: {is_sub})") + print(f" people: {sorted(active_people)}") + + # Show who's new and who left + if t != "1": + prev_t = str(int(t) - 1) + prev_people = set() + for pid in kc.element_ids(type="Person"): + elem = kc.element(pid) + af = elem.attrs.get("active_from", "0") + au = elem.attrs.get("active_until", "9999") + if af <= prev_t < au: + prev_people.add(pid) + joined = active_people - prev_people + left = prev_people - active_people + if joined: + print(f" joined: 
{sorted(joined)}")
+        if left:
+            print(f"      left:   {sorted(left)}")
+    print()
+
+# ── Key insight ────────────────────────────────────────────────────────────
+
+print("=== Key insight ===")
+print("  This is NOT a filtration — the subcomplex at Q4 is not a superset")
+print("  of Q3 (bob left). But each time-slice is a valid subcomplex,")
+print("  and the full complex contains the complete history.")
+print()
+print("  The complex is the territory; the time-slice queries are the maps.")
+print("Done.")
diff --git a/examples/09_diff_sequence/diff_sequence.py b/examples/09_diff_sequence/diff_sequence.py
new file mode 100644
index 0000000..70402bb
--- /dev/null
+++ b/examples/09_diff_sequence/diff_sequence.py
@@ -0,0 +1,146 @@
+"""
+diff_sequence.py — Diff-based complex sequences with SPARQL export.
+
+Models a project evolving sprint-by-sprint via explicit diffs.
+Each sprint's changes are recorded as a ComplexDiff — additions and
+removals of elements. The diffs can be exported to SPARQL UPDATE
+strings for interoperability with RDF-native systems like Flexo MMS,
+or imported from remote SPARQL updates.
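For a sense of what such an export looks like: a diff maps naturally onto a SPARQL UPDATE with a DELETE DATA block for removals and an INSERT DATA block for additions. The prefix and property names below are illustrative only — the actual vocabulary is whatever `to_sparql()` emits:

```sparql
# Hypothetical serialization of a diff that removes one edge and adds another.
PREFIX kc: <https://example.org/kc#>
DELETE DATA {
  kc:bob-auth a kc:WorksOn .
};
INSERT DATA {
  kc:bob-api a kc:WorksOn ;
             kc:boundary kc:bob, kc:api .
}
```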
+ +Run: + python examples/09_diff_sequence/diff_sequence.py +""" + +from knowledgecomplex import ( + SchemaBuilder, KnowledgeComplex, vocab, + ComplexDiff, ComplexSequence, +) + +# ── Schema ────────────────────────────────────────────────────────────────── + +sb = SchemaBuilder(namespace="sprint") +sb.add_vertex_type("Feature", attributes={"status": vocab("planned", "active", "done")}) +sb.add_vertex_type("Engineer") +sb.add_edge_type("WorksOn") +sb.add_edge_type("DependsOn") +sb.add_face_type("WorkPackage") + +# ── Base state (Sprint 0) ────────────────────────────────────────────────── + +kc = KnowledgeComplex(schema=sb) + +kc.add_vertex("auth", type="Feature", status="active") +kc.add_vertex("api", type="Feature", status="planned") +kc.add_vertex("alice", type="Engineer") +kc.add_vertex("bob", type="Engineer") +kc.add_edge("alice-auth", type="WorksOn", vertices={"alice", "auth"}) +kc.add_edge("bob-auth", type="WorksOn", vertices={"bob", "auth"}) +kc.add_edge("alice-bob", type="WorksOn", vertices={"alice", "bob"}) + +print("=== Sprint 0 (base state) ===") +print(f" Elements: {sorted(kc.element_ids())}") +print() + +# ── Sprint 1: API work begins, auth wraps up ────────────────────────────── + +sprint1 = ( + ComplexDiff() + .add_vertex("carol", type="Engineer") + .add_edge("carol-api", type="WorksOn", vertices={"carol", "api"}) + .add_edge("alice-api", type="WorksOn", vertices={"alice", "api"}) + .add_edge("dep-api-auth", type="DependsOn", vertices={"api", "auth"}) +) + +# ── Sprint 2: Bob moves to API, triangle forms ──────────────────────────── + +sprint2 = ( + ComplexDiff() + .remove("bob-auth") # Bob stops working on auth + .add_edge("bob-api", type="WorksOn", vertices={"bob", "api"}) + .add_edge("bob-carol", type="WorksOn", vertices={"bob", "carol"}) + .add_face("wp-api", type="WorkPackage", + boundary=["alice-api", "bob-api", "alice-bob"]) +) + +# ── Sprint 3: Auth feature done, alice moves off ────────────────────────── + +sprint3 = ( + ComplexDiff() + 
.remove("wp-api") # work package dissolves + .remove("alice-auth") # alice leaves auth +) + +# ── Build the sequence ───────────────────────────────────────────────────── + +seq = ComplexSequence(kc, [sprint1, sprint2, sprint3]) + +print("=== Complex evolution across sprints ===") +base_ids = set(kc.element_ids()) +print(f" Sprint 0 (base): {len(base_ids)} elements") +for i in range(len(seq)): + step = seq[i] + new = seq.new_at(i) + removed = seq.removed_at(i) + print(f" Sprint {i+1}: {len(step)} elements (+{len(new)} -{len(removed)})") + if new: + print(f" added: {sorted(new)}") + if removed: + print(f" removed: {sorted(removed)}") +print() + +# ── Apply diffs to see the actual complex at each state ──────────────────── + +print("=== Applying diffs sequentially ===") + +# Sprint 1 +sprint1.apply(kc) +print(f" After Sprint 1: {sorted(kc.element_ids())}") + +# Sprint 2 +sprint2.apply(kc) +print(f" After Sprint 2: {sorted(kc.element_ids())}") + +# Sprint 3 +sprint3.apply(kc) +print(f" After Sprint 3: {sorted(kc.element_ids())}") +print() + +# ── SPARQL export (for flexo MMS interoperability) ───────────────────────── + +# Re-build to export sprint2's SPARQL from the right state +kc2 = KnowledgeComplex(schema=sb) +kc2.add_vertex("auth", type="Feature", status="active") +kc2.add_vertex("api", type="Feature", status="planned") +kc2.add_vertex("alice", type="Engineer") +kc2.add_vertex("bob", type="Engineer") +kc2.add_vertex("carol", type="Engineer") +kc2.add_edge("alice-auth", type="WorksOn", vertices={"alice", "auth"}) +kc2.add_edge("bob-auth", type="WorksOn", vertices={"bob", "auth"}) +kc2.add_edge("alice-bob", type="WorksOn", vertices={"alice", "bob"}) +kc2.add_edge("carol-api", type="WorksOn", vertices={"carol", "api"}) +kc2.add_edge("alice-api", type="WorksOn", vertices={"alice", "api"}) +kc2.add_edge("dep-api-auth", type="DependsOn", vertices={"api", "auth"}) + +print("=== SPARQL UPDATE for Sprint 2 ===") +sparql = sprint2.to_sparql(kc2) +print(sparql[:600]) +if 
len(sparql) > 600: + print("...") +print() + +# ── Round-trip: import the SPARQL back ───────────────────────────────────── + +print("=== SPARQL round-trip ===") +imported = ComplexDiff.from_sparql(sparql, kc2) +print(f" Original: {sprint2}") +print(f" Imported: {imported}") +print(f" Additions match: {len(imported.additions) == len(sprint2.additions)}") +print(f" Removals match: {len(imported.removals) >= 1}") +print() + +# Apply the imported diff +imported.apply(kc2) +print(f" After applying imported diff: {len(kc2.element_ids())} elements") +print() + +print("Done.") diff --git a/examples/10_markdown_codec/markdown_codec.py b/examples/10_markdown_codec/markdown_codec.py new file mode 100644 index 0000000..1559bf1 --- /dev/null +++ b/examples/10_markdown_codec/markdown_codec.py @@ -0,0 +1,266 @@ +""" +markdown_codec.py — Round-trip between KC elements and YAML+markdown files. + +Models a research literature domain: + - 3 Concepts (vertices): attention, transformers, scaling + - 2 Papers (vertices): "Attention Is All You Need", "BERT" + - 3 edge types covering all vertex-pair combinations: + * Considers (Paper × Concept): how a paper engages a concept + * Cites (Paper × Paper): how papers build on each other + * Relates (Concept × Concept): how concepts connect + - 2 face types for the rich triangular semantics: + * Debate (Paper × Paper × Concept): two papers' agreement/disagreement on one concept + * Comparison (Paper × Concept × Concept): how a paper interrelates two concepts + +Demonstrates: + 1. Build KC with elements and URIs pointing to markdown files + 2. Register MarkdownCodec for each type + 3. Compile all elements to markdown files + 4. Simulate an external edit (Obsidian user adds notes) + 5. Decompile: read changes back into the KC + 6. 
Verify KC ↔ filesystem consistency + +Run: + python examples/10_markdown_codec/markdown_codec.py +""" + +import tempfile +from pathlib import Path + +from knowledgecomplex import SchemaBuilder, KnowledgeComplex, text, vocab +from knowledgecomplex.codecs import MarkdownCodec, verify_documents + +# ── Schema ────────────────────────────────────────────────────────────────── + +sb = SchemaBuilder(namespace="lit") + +# Vertex types +sb.add_vertex_type("Concept", attributes={ + "name": text(), + "description": text(), + "notes": text(required=False), +}) +sb.add_vertex_type("Paper", attributes={ + "name": text(), + "author": text(), + "abstract": text(), + "notes": text(required=False), +}) + +# Edge types — all 3 vertex-pair combinations +sb.add_edge_type("Considers", attributes={ + "context": text(), + "notes": text(required=False), +}) +sb.add_edge_type("Cites", attributes={ + "context": text(), + "notes": text(required=False), +}) +sb.add_edge_type("Relates", attributes={ + "context": text(), + "notes": text(required=False), +}) + +# Face types — rich triangular semantics +sb.add_face_type("Debate", attributes={ + "summary": text(), + "notes": text(required=False), +}) +sb.add_face_type("Comparison", attributes={ + "summary": text(), + "notes": text(required=False), +}) + +# ── Build KC in a temp directory ──────────────────────────────────────────── + +with tempfile.TemporaryDirectory() as tmpdir: + root = Path(tmpdir) + + kc = KnowledgeComplex(schema=sb) + + # Helper to build file URIs + def uri(name: str) -> str: + return f"file://{root / name}.md" + + # --- Vertices --- + + kc.add_vertex("attention", type="Concept", uri=uri("concept-attention"), + name="Attention Mechanisms", + description="Weighted aggregation of sequence elements", + notes="") + + kc.add_vertex("transformers", type="Concept", uri=uri("concept-transformers"), + name="Transformer Architecture", + description="Self-attention based sequence model", + notes="") + + kc.add_vertex("scaling", 
type="Concept", uri=uri("concept-scaling"), + name="Scaling Laws", + description="Performance as a function of model and data size", + notes="") + + kc.add_vertex("vaswani", type="Paper", uri=uri("paper-vaswani"), + name="Attention Is All You Need", + author="Vaswani et al.", + abstract="Proposes the Transformer, dispensing with recurrence entirely", + notes="") + + kc.add_vertex("devlin", type="Paper", uri=uri("paper-devlin"), + name="BERT: Pre-training of Deep Bidirectional Transformers", + author="Devlin et al.", + abstract="Bidirectional pre-training via masked language modeling", + notes="") + + # --- Edges (all 3 pairings) --- + + # Paper × Concept + kc.add_edge("vaswani-attention", type="Considers", + vertices={"vaswani", "attention"}, uri=uri("considers-vaswani-attention"), + context="Introduces multi-head attention as the core mechanism", + notes="") + + kc.add_edge("vaswani-transformers", type="Considers", + vertices={"vaswani", "transformers"}, uri=uri("considers-vaswani-transformers"), + context="Defines the Transformer architecture", + notes="") + + kc.add_edge("devlin-attention", type="Considers", + vertices={"devlin", "attention"}, uri=uri("considers-devlin-attention"), + context="Extends attention to bidirectional context", + notes="") + + # Paper × Paper + kc.add_edge("devlin-cites-vaswani", type="Cites", + vertices={"devlin", "vaswani"}, uri=uri("cites-devlin-vaswani"), + context="BERT builds directly on the Transformer architecture", + notes="") + + # Concept × Concept + kc.add_edge("attention-transformers", type="Relates", + vertices={"attention", "transformers"}, uri=uri("relates-attention-transformers"), + context="Attention is the fundamental building block of transformers", + notes="") + + # --- Faces --- + + # Debate: Paper × Paper × Concept (two papers on one concept) + kc.add_face("debate-attention", type="Debate", + boundary=["vaswani-attention", "devlin-attention", "devlin-cites-vaswani"], + uri=uri("debate-attention"), + 
summary="Vaswani introduces attention; Devlin extends it bidirectionally. " + "The debate: is unidirectional attention sufficient?", + notes="") + + # Comparison: Paper × Concept × Concept (one paper, two concepts) + kc.add_face("comparison-vaswani", type="Comparison", + boundary=["vaswani-attention", "vaswani-transformers", "attention-transformers"], + uri=uri("comparison-vaswani"), + summary="Vaswani bridges attention and transformers — showing that " + "attention alone is sufficient for state-of-the-art sequence modeling", + notes="") + + print(f"Built KC: {len(kc.element_ids())} elements") + print() + + # ── Register codecs ──────────────────────────────────────────────────── + + concept_codec = MarkdownCodec( + frontmatter_attrs=["name", "description"], + section_attrs=["notes"], + ) + paper_codec = MarkdownCodec( + frontmatter_attrs=["name", "author", "abstract"], + section_attrs=["notes"], + ) + edge_codec = MarkdownCodec( + frontmatter_attrs=["context"], + section_attrs=["notes"], + ) + face_codec = MarkdownCodec( + frontmatter_attrs=["summary"], + section_attrs=["notes"], + ) + + kc.register_codec("Concept", concept_codec) + kc.register_codec("Paper", paper_codec) + kc.register_codec("Considers", edge_codec) + kc.register_codec("Cites", edge_codec) + kc.register_codec("Relates", edge_codec) + kc.register_codec("Debate", face_codec) + kc.register_codec("Comparison", face_codec) + + # ── Compile: KC → markdown files ─────────────────────────────────────── + + print("=== Compiling all elements to markdown ===") + for eid in sorted(kc.element_ids()): + elem = kc.element(eid) + if elem.uri: + elem.compile() + + # Show what was written + for f in sorted(root.glob("*.md")): + print(f" {f.name} ({f.stat().st_size} bytes)") + print() + + # Show one file's content + sample = root / "paper-vaswani.md" + print(f"=== Content of {sample.name} ===") + print(sample.read_text()) + + # ── Verify: KC ↔ filesystem ──────────────────────────────────────────── + + print("=== 
Verify (should be clean) ===") + issues = verify_documents(kc, root) + if not issues: + print(" All elements match their files.") + else: + for issue in issues: + print(f" {issue}") + print() + + # ── Simulate external edit (Obsidian user adds notes) ────────────────── + + print("=== Simulating external edit ===") + vaswani_file = root / "paper-vaswani.md" + original = vaswani_file.read_text() + modified = original.replace( + "## Notes\n\n(empty)", + "## Notes\n\nSeminal paper — introduced positional encoding and " + "multi-head attention. Key insight: parallelizable training." + ) + vaswani_file.write_text(modified) + print(f" Modified {vaswani_file.name} — added notes") + print() + + # ── Verify: detects the mismatch ─────────────────────────────────────── + + print("=== Verify (should detect mismatch) ===") + issues = verify_documents(kc, root) + for issue in issues: + print(f" {issue}") + print() + + # ── Decompile: read changes back into KC ─────────────────────────────── + + print("=== Decompile: reading changes back ===") + before = kc.element("vaswani").attrs.get("notes", "(none)") + print(f" Before decompile: notes = '{before}'") + + kc.element("vaswani").decompile() + + after = kc.element("vaswani").attrs.get("notes", "(none)") + print(f" After decompile: notes = '{after}'") + print() + + # ── Verify again: should be clean ────────────────────────────────────── + + print("=== Verify (should be clean again) ===") + issues = verify_documents(kc, root) + if not issues: + print(" All elements match their files.") + else: + for issue in issues: + print(f" {issue}") + print() + + print("Done.") diff --git a/examples/README.md b/examples/README.md new file mode 100644 index 0000000..38bb1a5 --- /dev/null +++ b/examples/README.md @@ -0,0 +1,28 @@ +# Examples + +Each example is a self-contained script in its own directory. +Generated outputs (PNGs, temp files) go into an `output/` subdirectory. 
+ +| # | Directory | What it demonstrates | +|---|-----------|---------------------| +| 01 | `01_quickstart/` | Load a pre-built complex, discover triangles via clique detection, add faces, see topology change | +| 02 | `02_construction/` | Build a complex from scratch with schema, vertices, edges, faces, Betti numbers, and visualization | +| 03 | `03_topology/` | All 8 topological operators (boundary, coboundary, star, closure, link, etc.) with set algebra | +| 04 | `04_clique_inference/` | Explore-inspect-type workflow: find_cliques, infer_faces, edge_type filtering | +| 05 | `05_constraints/` | Schema-level query registration and topological constraint escalation to SHACL | +| 06 | `06_io_roundtrip/` | Multi-format I/O (Turtle, JSON-LD, N-Triples), additive loading, directory export/load | +| 07 | `07_filtration/` | Proper filtration: startup growing over quarters, birth tracking, Betti number evolution | +| 08 | `08_temporal_sweep/` | Non-filtration time slicing: elements with active_from/active_until, parameterized queries | +| 09 | `09_diff_sequence/` | ComplexDiff and ComplexSequence: sprint-by-sprint evolution with SPARQL UPDATE export/import | +| 10 | `10_markdown_codec/` | MarkdownCodec: round-trip KC elements to YAML+markdown files, verify consistency | + +## Running + +From the repository root: + +```sh +pip install knowledgecomplex[analysis,viz] +python examples/01_quickstart/quickstart.py +``` + +Each script prints its results and saves any generated images to its `output/` directory. diff --git a/knowledgecomplex/__init__.py b/knowledgecomplex/__init__.py index 173015c..2fb39c1 100644 --- a/knowledgecomplex/__init__.py +++ b/knowledgecomplex/__init__.py @@ -2,17 +2,90 @@ # Internal dependencies: rdflib, pyshacl, owlrl # These are never re-exported. The public API is schema.py and graph.py only. 
-from knowledgecomplex.schema import SchemaBuilder, vocab, text, TextDescriptor -from knowledgecomplex.graph import KnowledgeComplex +from knowledgecomplex.schema import SchemaBuilder, vocab, text, TextDescriptor, Codec +from knowledgecomplex.graph import KnowledgeComplex, Element +from knowledgecomplex.filtration import Filtration from knowledgecomplex.exceptions import ValidationError, SchemaError, UnknownQueryError +from knowledgecomplex.audit import AuditReport, AuditViolation, audit_file +from knowledgecomplex.io import save_graph, load_graph, dump_graph +from knowledgecomplex.clique import find_cliques, infer_faces, fill_cliques +from knowledgecomplex.diff import ComplexDiff, ComplexSequence +from knowledgecomplex.codecs import MarkdownCodec, verify_documents +from knowledgecomplex.viz import ( + to_networkx, verify_networkx, type_color_map, + plot_hasse, plot_hasse_star, plot_hasse_skeleton, + plot_geometric, plot_geometric_interactive, + plot_complex, plot_star, plot_skeleton, # deprecated aliases +) + +try: + from knowledgecomplex.analysis import ( + boundary_matrices, + betti_numbers, + euler_characteristic, + hodge_laplacian, + edge_pagerank, + edge_pagerank_all, + hodge_decomposition, + edge_influence, + hodge_analysis, + graph_laplacian, + approximate_pagerank, + heat_kernel_pagerank, + sweep_cut, + local_partition, + edge_sweep_cut, + edge_local_partition, + BoundaryMatrices, + HodgeDecomposition, + EdgeInfluence, + HodgeAnalysisResults, + SweepCut, + EdgeSweepCut, + ) + _HAS_ANALYSIS = True +except ImportError: + _HAS_ANALYSIS = False __all__ = [ - "SchemaBuilder", - "vocab", - "text", - "TextDescriptor", - "KnowledgeComplex", - "ValidationError", - "SchemaError", - "UnknownQueryError", + # Schema authoring + "SchemaBuilder", "vocab", "text", "TextDescriptor", "Codec", + # Instance management + "KnowledgeComplex", "Element", + # Filtrations + "Filtration", + # Exceptions + "ValidationError", "SchemaError", "UnknownQueryError", + # File I/O + 
"save_graph", "load_graph", "dump_graph", + # Visualization — Hasse diagrams + "to_networkx", "verify_networkx", "type_color_map", + "plot_hasse", "plot_hasse_star", "plot_hasse_skeleton", + # Visualization — geometric realization + "plot_geometric", "plot_geometric_interactive", + # Visualization — deprecated aliases + "plot_complex", "plot_star", "plot_skeleton", + # Clique inference + "find_cliques", "infer_faces", "fill_cliques", + # Diffs and sequences + "ComplexDiff", "ComplexSequence", + # Codecs + "MarkdownCodec", "verify_documents", + # Audit + "AuditReport", "AuditViolation", "audit_file", ] + +if _HAS_ANALYSIS: + __all__ += [ + # Algebraic topology + "boundary_matrices", "betti_numbers", "euler_characteristic", + "hodge_laplacian", "edge_pagerank", "edge_pagerank_all", + "hodge_decomposition", "edge_influence", "hodge_analysis", + "BoundaryMatrices", "HodgeDecomposition", "EdgeInfluence", + "HodgeAnalysisResults", + # Graph analysis + "graph_laplacian", "approximate_pagerank", "heat_kernel_pagerank", + "sweep_cut", "local_partition", + "edge_sweep_cut", "edge_local_partition", + "SweepCut", "EdgeSweepCut", + ] diff --git a/knowledgecomplex/analysis.py b/knowledgecomplex/analysis.py new file mode 100644 index 0000000..b0b6a76 --- /dev/null +++ b/knowledgecomplex/analysis.py @@ -0,0 +1,1177 @@ +""" +knowledgecomplex.analysis — Algebraic topology over knowledge complexes. + +Boundary matrices, Betti numbers, Hodge Laplacians, edge PageRank, +and Hodge decomposition of edge flows. + +Requires: numpy, scipy (install with ``pip install knowledgecomplex[analysis]``). 
+""" + +from __future__ import annotations +from dataclasses import dataclass +from typing import TYPE_CHECKING + +import numpy as np +import scipy.sparse as sp +from scipy.sparse.linalg import cg, splu +from scipy.linalg import expm + +if TYPE_CHECKING: + from knowledgecomplex.graph import KnowledgeComplex + + +# --------------------------------------------------------------------------- +# Dataclasses +# --------------------------------------------------------------------------- + +@dataclass +class BoundaryMatrices: + """Boundary operators and element-to-index mappings.""" + B1: sp.csr_matrix # (n_vertices, n_edges) + B2: sp.csr_matrix # (n_edges, n_faces) + vertex_index: dict[str, int] + edge_index: dict[str, int] + face_index: dict[str, int] + index_vertex: dict[int, str] + index_edge: dict[int, str] + index_face: dict[int, str] + + def __repr__(self) -> str: + return (f"BoundaryMatrices(vertices={len(self.vertex_index)}, " + f"edges={len(self.edge_index)}, faces={len(self.face_index)})") + + +@dataclass +class HodgeDecomposition: + """Orthogonal decomposition of an edge flow.""" + gradient: np.ndarray # im(B1ᵀ) — flows from vertices + curl: np.ndarray # im(B2) — flows from faces + harmonic: np.ndarray # ker(L₁) — topological cycles + + +@dataclass +class EdgeInfluence: + """Influence measures for an edge's PageRank vector.""" + edge_id: str + spread: float # ||v||₂ / ||v||₁ + absolute_influence: float # ||v||₁ + penetration: float # ||v||₂ + relative_influence: float # Σv + + +@dataclass +class SweepCut: + """Result of a vertex sweep cut.""" + vertices: set[str] + conductance: float + volume: int + boundary_edges: int + + def __repr__(self) -> str: + return f"SweepCut(vertices={len(self.vertices)}, conductance={self.conductance:.4f})" + + +@dataclass +class EdgeSweepCut: + """Result of an edge sweep cut.""" + edges: set[str] + conductance: float + volume: int + + def __repr__(self) -> str: + return f"EdgeSweepCut(edges={len(self.edges)}, 
conductance={self.conductance:.4f})" + + +@dataclass +class HodgeAnalysisResults: + """Complete Hodge analysis output.""" + betti: list[int] + euler_characteristic: int + boundary_matrices: BoundaryMatrices + laplacian: sp.csr_matrix + pagerank: np.ndarray # (n_edges, n_edges) + decompositions: dict[str, HodgeDecomposition] + influences: dict[str, EdgeInfluence] + + def __repr__(self) -> str: + ne = len(self.decompositions) + return f"HodgeAnalysisResults(betti={self.betti}, edges={ne})" + + +# --------------------------------------------------------------------------- +# Weight matrices +# --------------------------------------------------------------------------- + +def _weight_matrices( + bm: BoundaryMatrices, + weights: dict[str, float] | None, +) -> tuple[sp.dia_matrix, sp.dia_matrix, sp.dia_matrix]: + """Build diagonal weight matrices W₀, W₁, W₂ from a weights dict. + + Returns identity matrices when weights is None. Missing elements + default to weight 1.0. + """ + nv = len(bm.vertex_index) + ne = len(bm.edge_index) + nf = len(bm.face_index) + + if weights is None: + return ( + sp.eye(nv, format="dia"), + sp.eye(ne, format="dia"), + sp.eye(nf, format="dia"), + ) + + w0 = np.array([weights.get(bm.index_vertex[i], 1.0) for i in range(nv)]) + w1 = np.array([weights.get(bm.index_edge[i], 1.0) for i in range(ne)]) + w2 = np.array([weights.get(bm.index_face[i], 1.0) for i in range(nf)]) if nf > 0 else np.array([]) + + return ( + sp.diags(w0, format="dia") if nv > 0 else sp.eye(0, format="dia"), + sp.diags(w1, format="dia") if ne > 0 else sp.eye(0, format="dia"), + sp.diags(w2, format="dia") if nf > 0 else sp.eye(0, format="dia"), + ) + + +# --------------------------------------------------------------------------- +# Boundary matrices +# --------------------------------------------------------------------------- + +def boundary_matrices(kc: "KnowledgeComplex") -> BoundaryMatrices: + """ + Build the boundary operator matrices B1 (∂₁) and B2 (∂₂). 
+
+    B1 is (n_vertices × n_edges) with entries ±1 encoding which vertices
+    bound each edge. B2 is (n_edges × n_faces) with entries ±1 encoding
+    which edges bound each face.
+
+    Parameters
+    ----------
+    kc : KnowledgeComplex
+
+    Returns
+    -------
+    BoundaryMatrices
+    """
+    # Enumerate elements by dimension, using the schema's kind tags
+    all_v = set()
+    all_e = set()
+    all_f = set()
+    for eid in kc.element_ids():
+        elem = kc.element(eid)
+        kind = kc._schema._types.get(elem.type, {}).get("kind")
+        if kind == "vertex":
+            all_v.add(eid)
+        elif kind == "edge":
+            all_e.add(eid)
+        elif kind == "face":
+            all_f.add(eid)
+
+    vertices = sorted(all_v)
+    edges = sorted(all_e)
+    faces = sorted(all_f)
+
+    vertex_index = {v: i for i, v in enumerate(vertices)}
+    edge_index = {e: i for i, e in enumerate(edges)}
+    face_index = {f: i for i, f in enumerate(faces)}
+
+    nv, ne, nf = len(vertices), len(edges), len(faces)
+
+    # B1: vertices × edges
+    # For each edge, find its 2 boundary vertices.
+    # Convention: for edge e = {v_i, v_j} with i < j, B1[i,e] = -1, B1[j,e] = +1
+    rows1, cols1, vals1 = [], [], []
+    for e_id in edges:
+        bnd = sorted(kc.boundary(e_id), key=lambda v: vertex_index.get(v, 0))
+        if len(bnd) == 2:
+            r0 = vertex_index[bnd[0]]
+            r1 = vertex_index[bnd[1]]
+            c = edge_index[e_id]
+            rows1.extend([r0, r1])
+            cols1.extend([c, c])
+            vals1.extend([-1.0, 1.0])
+
+    B1 = sp.csr_matrix(
+        (vals1, (rows1, cols1)), shape=(nv, ne), dtype=np.float64
+    ) if ne > 0 else sp.csr_matrix((nv, 0), dtype=np.float64)
+
+    # B2: edges × faces
+    # For each face, find its 3 boundary edges.
+    # Orientation: assign signs so that ∂₁∘∂₂ = 0.
+    # We pick a consistent orientation per face by walking the vertex cycle.
+ rows2, cols2, vals2 = [], [], [] + for f_id in faces: + bnd_edges = list(kc.boundary(f_id)) + if len(bnd_edges) == 3: + c = face_index[f_id] + # Get the vertex sets for each boundary edge + edge_verts = {} + for be in bnd_edges: + edge_verts[be] = kc.boundary(be) + + # Orient: find a vertex ordering (v_a, v_b, v_c) and assign signs + # to edges based on whether they agree with the cycle orientation + signs = _orient_face(bnd_edges, edge_verts, vertex_index) + for be, sign in zip(bnd_edges, signs): + rows2.append(edge_index[be]) + cols2.append(c) + vals2.append(sign) + + B2 = sp.csr_matrix( + (vals2, (rows2, cols2)), shape=(ne, nf), dtype=np.float64 + ) if nf > 0 else sp.csr_matrix((ne, 0), dtype=np.float64) + + return BoundaryMatrices( + B1=B1, B2=B2, + vertex_index=vertex_index, + edge_index=edge_index, + face_index=face_index, + index_vertex={v: k for k, v in vertex_index.items()}, + index_edge={v: k for k, v in edge_index.items()}, + index_face={v: k for k, v in face_index.items()}, + ) + + +def _orient_face( + edges: list[str], + edge_verts: dict[str, set[str]], + vertex_index: dict[str, int], +) -> list[float]: + """Assign ±1 signs to face boundary edges for a consistent orientation. + + Given a triangular face with 3 edges, find a vertex cycle (a, b, c) and + assign +1 to edges traversed in cycle order, -1 to those against. + This ensures ∂₁ ∘ ∂₂ = 0. 
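    For a single triangle this sign convention can be checked by hand. A
    minimal numpy sketch (matrices written out manually, not produced by
    this module) confirming that the boundary of a boundary vanishes:

```python
import numpy as np

# Triangle a-b-c; edges sorted as (a,b), (a,c), (b,c).
# B1 convention: -1 at the lower-index vertex, +1 at the higher.
B1 = np.array([
    [-1.0, -1.0,  0.0],   # a
    [ 1.0,  0.0, -1.0],   # b
    [ 0.0,  1.0,  1.0],   # c
])
# Face signs from the cycle a -> b -> c -> a:
# +1 for (a,b), -1 for (a,c), +1 for (b,c).
B2 = np.array([[1.0], [-1.0], [1.0]])

print(np.allclose(B1 @ B2, 0))  # True
```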
+ """ + # Collect all vertices of the face + all_verts: set[str] = set() + for vs in edge_verts.values(): + all_verts |= vs + verts = sorted(all_verts, key=lambda v: vertex_index.get(v, 0)) + + if len(verts) != 3: + return [1.0] * len(edges) + + a, b, c = verts # sorted by index + + # Define cycle: a → b → c → a + # For each edge, check if it goes with or against the cycle + signs = [] + for e in edges: + ev = edge_verts[e] + ev_sorted = sorted(ev, key=lambda v: vertex_index.get(v, 0)) + if len(ev_sorted) != 2: + signs.append(1.0) + continue + v0, v1 = ev_sorted # v0 has lower index + + # Cycle pairs in order: (a,b), (b,c), (a,c) + # Edge orientation convention: B1[lower, e] = -1, B1[higher, e] = +1 + # So the edge "points" from lower-index to higher-index vertex. + # Cycle: a→b→c→a + # (a,b): cycle goes a→b, edge goes a→b (same) → +1 + # (b,c): cycle goes b→c, edge goes b→c (same) → +1 + # (a,c): cycle goes c→a, edge goes a→c (opposite) → -1 + if (v0, v1) == (a, b): + signs.append(1.0) + elif (v0, v1) == (b, c): + signs.append(1.0) + elif (v0, v1) == (a, c): + signs.append(-1.0) + else: + signs.append(1.0) + + return signs + + +# --------------------------------------------------------------------------- +# Betti numbers +# --------------------------------------------------------------------------- + +def betti_numbers(kc: "KnowledgeComplex") -> list[int]: + """ + Compute Betti numbers [β₀, β₁, β₂] of the complex. 
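    As a concrete sanity check, independent of any KnowledgeComplex
    instance: a hollow triangle has one component and one cycle, and
    filling in the face kills the cycle. A hand-built sketch:

```python
import numpy as np

def rank(M, tol=1e-10):
    # matrix rank via SVD, mirroring _matrix_rank
    return int(np.sum(np.linalg.svd(M, compute_uv=False) > tol))

B1 = np.array([[-1.0, -1.0, 0.0], [1.0, 0.0, -1.0], [0.0, 1.0, 1.0]])
B2 = np.array([[1.0], [-1.0], [1.0]])

# Hollow triangle: 3 vertices, 3 edges, no faces
beta0 = 3 - rank(B1)             # connected components
beta1 = (3 - rank(B1)) - 0       # independent 1-cycles
print(beta0, beta1)              # 1 1

# Filled triangle: the face's boundary spans the cycle
beta1_filled = (3 - rank(B1)) - rank(B2)
print(beta1_filled)              # 0
```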
+
+    β_k = nullity(∂_k) - rank(∂_{k+1})
+
+    Parameters
+    ----------
+    kc : KnowledgeComplex
+
+    Returns
+    -------
+    list[int]
+        [β₀, β₁, β₂]
+    """
+    bm = boundary_matrices(kc)
+    nv = bm.B1.shape[0]
+    ne = bm.B1.shape[1]
+    nf = bm.B2.shape[1]
+
+    rank_B1 = _matrix_rank(bm.B1) if ne > 0 else 0
+    rank_B2 = _matrix_rank(bm.B2) if nf > 0 else 0
+
+    # β₀ = nullity(∂₀) - rank(∂₁); ∂₀ is the zero map,
+    # so β₀ = n_vertices - rank(∂₁)
+    beta0 = nv - rank_B1
+
+    # β₁ = nullity(∂₁) - rank(∂₂) = (n_edges - rank_B1) - rank_B2
+    beta1 = (ne - rank_B1) - rank_B2 if ne > 0 else 0
+
+    # β₂ = nullity(∂₂) - rank(∂₃) = (n_faces - rank_B2) - 0
+    beta2 = nf - rank_B2 if nf > 0 else 0
+
+    return [beta0, beta1, beta2]
+
+
+def euler_characteristic(kc: "KnowledgeComplex") -> int:
+    """
+    Compute the Euler characteristic χ = V - E + F.
+
+    Parameters
+    ----------
+    kc : KnowledgeComplex
+
+    Returns
+    -------
+    int
+    """
+    bm = boundary_matrices(kc)
+    return bm.B1.shape[0] - bm.B1.shape[1] + bm.B2.shape[1]
+
+
+def _matrix_rank(M: sp.csr_matrix, tol: float = 1e-10) -> int:
+    """Compute rank of a sparse matrix via SVD."""
+    if M.shape[0] == 0 or M.shape[1] == 0:
+        return 0
+    dense = M.toarray()
+    s = np.linalg.svd(dense, compute_uv=False)
+    return int(np.sum(s > tol))
+
+
+# ---------------------------------------------------------------------------
+# Hodge Laplacian
+# ---------------------------------------------------------------------------
+
+def hodge_laplacian(
+    kc: "KnowledgeComplex",
+    weighted: bool = False,
+    weights: dict[str, float] | None = None,
+) -> sp.csr_matrix:
+    """
+    Compute the edge Hodge Laplacian L₁.
+
+    Combinatorial (default):
+        L₁ = B1ᵀ W₀ B1 + B2 W₂ B2ᵀ
+
+    where W₀ and W₂ are diagonal simplex weight matrices (identity when
+    weights is None).
+
+    Degree-weighted (symmetrized):
+        L₁ = B1ᵀ D₀⁻¹ W₀ B1 + D₁^{-1/2} B2 W₂ B2ᵀ D₁^{-1/2}
+
+    Parameters
+    ----------
+    kc : KnowledgeComplex
+    weighted : bool
+        If True, also apply degree normalization.
+ weights : dict[str, float], optional + Map from element IDs to scalar weights. Missing elements default + to 1.0. Vertex weights enter W₀, face weights enter W₂. + + Returns + ------- + scipy.sparse.csr_matrix + (n_edges, n_edges) + """ + bm = boundary_matrices(kc) + ne = bm.B1.shape[1] + + if ne == 0: + return sp.csr_matrix((0, 0), dtype=np.float64) + + W0, _W1, W2 = _weight_matrices(bm, weights) + + if not weighted: + down = bm.B1.T @ W0 @ bm.B1 + up = bm.B2 @ W2 @ bm.B2.T if bm.B2.shape[1] > 0 else sp.csr_matrix((ne, ne), dtype=np.float64) + L = (down + up).tocsr() + return ((L + L.T) / 2).tocsr() + else: + # D₀: diagonal vertex degrees + vertex_degrees = np.array(np.abs(bm.B1).sum(axis=1)).flatten() + vertex_degrees[vertex_degrees == 0] = 1.0 + D0_inv = sp.diags(1.0 / vertex_degrees, format="csr") + + # D₁: diagonal edge face-degrees + if bm.B2.shape[1] > 0: + edge_face_degrees = np.array(np.abs(bm.B2).sum(axis=1)).flatten() + else: + edge_face_degrees = np.zeros(ne) + edge_face_degrees[edge_face_degrees == 0] = 1.0 + D1_inv_sqrt = sp.diags(1.0 / np.sqrt(edge_face_degrees), format="csr") + + down = bm.B1.T @ D0_inv @ W0 @ bm.B1 + up = D1_inv_sqrt @ bm.B2 @ W2 @ bm.B2.T @ D1_inv_sqrt if bm.B2.shape[1] > 0 else sp.csr_matrix((ne, ne), dtype=np.float64) + L = (down + up).tocsr() + return ((L + L.T) / 2).tocsr() + + +# --------------------------------------------------------------------------- +# Edge PageRank +# --------------------------------------------------------------------------- + +def edge_pagerank( + kc: "KnowledgeComplex", + edge_id: str, + beta: float = 0.1, + weighted: bool = False, + weights: dict[str, float] | None = None, +) -> np.ndarray: + """ + Compute personalized edge PageRank for a single edge. + + PR_e = (βI + L₁)⁻¹ χ_e + + Parameters + ---------- + kc : KnowledgeComplex + edge_id : str + beta : float + weighted : bool + weights : dict[str, float], optional + Simplex weights (see hodge_laplacian). 
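    The regularized solve is easy to demonstrate with numpy alone. For a
    two-edge path graph (no faces, so L₁ = B1ᵀB1), a hand-built sketch of
    the same linear system:

```python
import numpy as np

# Path a-b-c: two edges, no faces, so L1 = B1.T @ B1
B1 = np.array([[-1.0, 0.0], [1.0, -1.0], [0.0, 1.0]])
L1 = B1.T @ B1                      # [[2, -1], [-1, 2]]

beta = 0.1
A = beta * np.eye(2) + L1
chi = np.array([1.0, 0.0])          # indicator of the first edge

pr = np.linalg.solve(A, chi)
print(pr[0] > pr[1] > 0)            # True: mass concentrates on the seed edge
```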
+ + Returns + ------- + np.ndarray + (n_edges,) + """ + bm = boundary_matrices(kc) + L1 = hodge_laplacian(kc, weighted=weighted, weights=weights) + ne = L1.shape[0] + + A = beta * sp.eye(ne, format="csr") + L1 + indicator = np.zeros(ne) + indicator[bm.edge_index[edge_id]] = 1.0 + + return _solve_spd(A, indicator) + + +def edge_pagerank_all( + kc: "KnowledgeComplex", + beta: float = 0.1, + weighted: bool = False, + weights: dict[str, float] | None = None, +) -> np.ndarray: + """ + Compute edge PageRank for all edges via matrix factorization. + + Factorizes (βI + L₁) once, then solves for each column of the identity. + Equivalent to computing (βI + L₁)⁻¹. + + Parameters + ---------- + kc : KnowledgeComplex + beta : float + weighted : bool + weights : dict[str, float], optional + Simplex weights (see hodge_laplacian). + + Returns + ------- + np.ndarray + (n_edges, n_edges) — column i is the PageRank vector for edge i. + """ + L1 = hodge_laplacian(kc, weighted=weighted, weights=weights) + ne = L1.shape[0] + + if ne == 0: + return np.empty((0, 0)) + + A = beta * sp.eye(ne, format="csc") + L1.tocsc() + + # Factor once (SPD → LU on sparse, or Cholesky) + factor = splu(A) + result = np.zeros((ne, ne)) + for i in range(ne): + rhs = np.zeros(ne) + rhs[i] = 1.0 + result[:, i] = factor.solve(rhs) + + return result + + +def _solve_spd(A: sp.csr_matrix, b: np.ndarray) -> np.ndarray: + """Solve Ax = b for SPD matrix A. Try CG, fall back to dense.""" + x, info = cg(A, b, atol=1e-12, maxiter=1000) + if info != 0: + x = np.linalg.solve(A.toarray(), b) + return x + + +# --------------------------------------------------------------------------- +# Hodge decomposition +# --------------------------------------------------------------------------- + +def hodge_decomposition( + kc: "KnowledgeComplex", + flow: np.ndarray, + weights: dict[str, float] | None = None, +) -> HodgeDecomposition: + """ + Decompose an edge flow into gradient + curl + harmonic components. 
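    On a filled triangle the pieces are easy to verify by hand: im(B1ᵀ)
    and im(B2) are orthogonal because B1 B2 = 0, and together they span
    all of edge space, so the harmonic part vanishes. A least-squares
    sketch in plain numpy (standing in for _project_onto_image):

```python
import numpy as np

B1 = np.array([[-1.0, -1.0, 0.0], [1.0, 0.0, -1.0], [0.0, 1.0, 1.0]])
B2 = np.array([[1.0], [-1.0], [1.0]])

def project(A, v):
    # orthogonal projection of v onto im(A) via least squares
    x, *_ = np.linalg.lstsq(A, v, rcond=None)
    return A @ x

flow = np.array([1.0, 2.0, 0.0])
gradient = project(B1.T, flow)
curl = project(B2, flow)
harmonic = flow - gradient - curl

print(np.allclose(harmonic, 0))       # True: filled triangle, no harmonic part
print(abs(gradient @ curl) < 1e-9)    # True: components are orthogonal
```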
+
+    flow = gradient + curl + harmonic
+
+    where:
+    - gradient ∈ im(B1ᵀ W₀^{1/2}) — vertex-driven flow
+    - curl ∈ im(B2 W₂^{1/2}) — face-driven circulation
+    - harmonic ∈ ker(L₁) — topological cycles
+
+    When weights is None, W₀ and W₂ are identity (standard decomposition).
+
+    Parameters
+    ----------
+    kc : KnowledgeComplex
+    flow : np.ndarray
+        (n_edges,)
+    weights : dict[str, float], optional
+        Simplex weights. Affects the inner product used for projection.
+
+    Returns
+    -------
+    HodgeDecomposition
+    """
+    bm = boundary_matrices(kc)
+    W0, _W1, W2 = _weight_matrices(bm, weights)
+
+    # Weighted projection operators
+    # gradient lives in im(B1ᵀ W₀^{1/2}), curl in im(B2 W₂^{1/2});
+    # for the orthogonal decomposition with a weighted inner product,
+    # we project onto im(B1ᵀ) with W₀-weighted inner product on vertices.
+    # Practically: project onto im(B1ᵀ sqrt(W₀)) in the standard inner product
+    if weights is not None:
+        w0_sqrt = sp.diags(np.sqrt(np.array(W0.diagonal())), format="csr")
+        w2_sqrt = sp.diags(np.sqrt(np.array(W2.diagonal())), format="csr") if W2.shape[0] > 0 else W2
+        # B1.T is (ne × nv), W0_sqrt is (nv × nv) → B1.T @ W0_sqrt is (ne × nv)
+        grad_op = bm.B1.T @ w0_sqrt if bm.B1.shape[1] > 0 else bm.B1.T
+        curl_op = bm.B2 @ w2_sqrt if bm.B2.shape[1] > 0 else bm.B2
+    else:
+        grad_op = bm.B1.T
+        curl_op = bm.B2
+
+    gradient = _project_onto_image(grad_op, flow)
+    curl = _project_onto_image(curl_op, flow)
+    harmonic = flow - gradient - curl
+
+    return HodgeDecomposition(
+        gradient=gradient,
+        curl=curl,
+        harmonic=harmonic,
+    )
+
+
+def _project_onto_image(
+    A: sp.csr_matrix,
+    v: np.ndarray,
+    regularization: float = 1e-10,
+) -> np.ndarray:
+    """Project v onto im(A): proj = A (AᵀA + λI)⁻¹ Aᵀ v."""
+    if A.shape[1] == 0:
+        return np.zeros_like(v)
+
+    ATA = A.T @ A
+    ATA_reg = ATA + regularization * sp.eye(ATA.shape[0], format="csr")
+    ATv = A.T @ v
+
+    x, info = cg(ATA_reg, ATv, atol=1e-12, maxiter=1000)
+    if info != 0:
+        x = np.linalg.lstsq(ATA_reg.toarray(), ATv, rcond=None)[0]
+
+    return A @ x
+
+
+# ---------------------------------------------------------------------------
+# Edge influence
+# ---------------------------------------------------------------------------
+
+def edge_influence(edge_id: str, pr_vector: np.ndarray) -> EdgeInfluence:
+    """
+    Compute influence measures from a PageRank vector.
+
+    Parameters
+    ----------
+    edge_id : str
+    pr_vector : np.ndarray
+
+    Returns
+    -------
+    EdgeInfluence
+    """
+    l1 = float(np.sum(np.abs(pr_vector)))
+    l2 = float(np.linalg.norm(pr_vector))
+    spread = l2 / l1 if l1 > 0 else 0.0
+    return EdgeInfluence(
+        edge_id=edge_id,
+        spread=spread,
+        absolute_influence=l1,
+        penetration=l2,
+        relative_influence=float(np.sum(pr_vector)),
+    )
+
+
+# ---------------------------------------------------------------------------
+# Full analysis
+# ---------------------------------------------------------------------------
+
+def hodge_analysis(
+    kc: "KnowledgeComplex",
+    beta: float = 0.1,
+    weighted: bool = False,
+    weights: dict[str, float] | None = None,
+) -> HodgeAnalysisResults:
+    """
+    Run complete Hodge analysis on a knowledge complex.
+
+    Computes boundary matrices, Betti numbers, Hodge Laplacian,
+    edge PageRank for all edges, Hodge decomposition, and influence measures.
+
+    Parameters
+    ----------
+    kc : KnowledgeComplex
+    beta : float
+    weighted : bool
+    weights : dict[str, float], optional
+        Simplex weights (see hodge_laplacian).
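    The influence measures collected here reduce to simple norm
    arithmetic: spread is at most 1 and peaks when all PageRank mass sits
    on a single edge. A standalone sketch mirroring edge_influence:

```python
import numpy as np

def influence_norms(v):
    # mirrors edge_influence: (spread, l1, l2, sum)
    l1 = float(np.sum(np.abs(v)))
    l2 = float(np.linalg.norm(v))
    return (l2 / l1 if l1 > 0 else 0.0, l1, l2, float(np.sum(v)))

spread_flat, *_ = influence_norms(np.array([0.25, 0.25, 0.25, 0.25]))
spread_peaked, *_ = influence_norms(np.array([1.0, 0.0, 0.0, 0.0]))
print(spread_flat, spread_peaked)   # 0.5 1.0
```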
+ + Returns + ------- + HodgeAnalysisResults + """ + bm = boundary_matrices(kc) + betti = betti_numbers(kc) + chi = euler_characteristic(kc) + L1 = hodge_laplacian(kc, weighted=weighted, weights=weights) + pr = edge_pagerank_all(kc, beta=beta, weighted=weighted, weights=weights) + + decomps: dict[str, HodgeDecomposition] = {} + infls: dict[str, EdgeInfluence] = {} + for eid, idx in bm.edge_index.items(): + pr_vec = pr[:, idx] + decomps[eid] = hodge_decomposition(kc, pr_vec, weights=weights) + infls[eid] = edge_influence(eid, pr_vec) + + return HodgeAnalysisResults( + betti=betti, + euler_characteristic=chi, + boundary_matrices=bm, + laplacian=L1, + pagerank=pr, + decompositions=decomps, + influences=infls, + ) + + +# --------------------------------------------------------------------------- +# Graph Laplacian (on the 1-skeleton) +# --------------------------------------------------------------------------- + +def graph_laplacian(kc: "KnowledgeComplex") -> sp.csr_matrix: + """ + Compute the normalized graph Laplacian L = I - D⁻¹A on the 1-skeleton. 
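    For a three-vertex path the construction, including the final
    symmetrization, works out as follows (dense numpy stand-in for the
    sparse code below):

```python
import numpy as np

A = np.array([[0.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 0.0]])  # path a-b-c
D_inv = np.diag(1.0 / A.sum(axis=1))
L = np.eye(3) - D_inv @ A
L_sym = (L + L.T) / 2

print(np.allclose(L_sym, L_sym.T))       # True: symmetric by construction
print(np.allclose(np.diag(L_sym), 1.0))  # True: unit diagonal
```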
+ + Parameters + ---------- + kc : KnowledgeComplex + + Returns + ------- + scipy.sparse.csr_matrix + (n_vertices, n_vertices) + """ + bm = boundary_matrices(kc) + nv = len(bm.vertex_index) + + if nv == 0: + return sp.csr_matrix((0, 0), dtype=np.float64) + + # Build adjacency matrix from B1 + # A = |B1| |B1|ᵀ - D but simpler: walk the edges directly + rows, cols, vals = [], [], [] + for e_id, e_idx in bm.edge_index.items(): + bnd = list(kc.boundary(e_id)) + if len(bnd) == 2: + i = bm.vertex_index[bnd[0]] + j = bm.vertex_index[bnd[1]] + rows.extend([i, j]) + cols.extend([j, i]) + vals.extend([1.0, 1.0]) + + A = sp.csr_matrix((vals, (rows, cols)), shape=(nv, nv), dtype=np.float64) + degrees = np.array(A.sum(axis=1)).flatten() + degrees[degrees == 0] = 1.0 + D_inv = sp.diags(1.0 / degrees, format="csr") + + L = sp.eye(nv, format="csr") - D_inv @ A + return ((L + L.T) / 2).tocsr() + + +def _adjacency_and_degrees(kc: "KnowledgeComplex", bm: BoundaryMatrices): + """Build adjacency matrix and degree dict for the 1-skeleton.""" + nv = len(bm.vertex_index) + rows, cols, vals = [], [], [] + for e_id in bm.edge_index: + bnd = list(kc.boundary(e_id)) + if len(bnd) == 2: + i = bm.vertex_index[bnd[0]] + j = bm.vertex_index[bnd[1]] + rows.extend([i, j]) + cols.extend([j, i]) + vals.extend([1.0, 1.0]) + + A = sp.csr_matrix((vals, (rows, cols)), shape=(nv, nv), dtype=np.float64) + degrees = np.array(A.sum(axis=1)).flatten().astype(int) + + # Build id->degree map + deg_map = {} + for vid, idx in bm.vertex_index.items(): + deg_map[vid] = int(degrees[idx]) + + return A, deg_map + + +# --------------------------------------------------------------------------- +# Approximate PageRank (Andersen-Chung-Lang push algorithm) +# --------------------------------------------------------------------------- + +def approximate_pagerank( + kc: "KnowledgeComplex", + seed: str, + alpha: float = 0.15, + epsilon: float = 1e-4, +) -> tuple[dict[str, float], dict[str, float]]: + """ + Compute 
approximate PageRank via the push algorithm. + + Follows Andersen-Chung-Lang (FOCS 2006). Uses lazy random walk + W = (I + D⁻¹A)/2. Maintains invariant p + pr(α, r) = pr(α, χ_seed). + + Parameters + ---------- + kc : KnowledgeComplex + seed : str + Starting vertex. + alpha : float + Teleportation constant (higher = more local). + epsilon : float + Convergence threshold: stops when max r(u)/d(u) < epsilon. + + Returns + ------- + tuple[dict[str, float], dict[str, float]] + (p, r) — approximate PageRank vector and residual. + """ + bm = boundary_matrices(kc) + _, deg_map = _adjacency_and_degrees(kc, bm) + + # Neighbor lookup + neighbors: dict[str, list[str]] = {v: [] for v in bm.vertex_index} + for e_id in bm.edge_index: + bnd = list(kc.boundary(e_id)) + if len(bnd) == 2: + neighbors[bnd[0]].append(bnd[1]) + neighbors[bnd[1]].append(bnd[0]) + + p: dict[str, float] = {} + r: dict[str, float] = {seed: 1.0} + + # Push loop + while True: + # Find vertex with max r(u)/d(u) + best_u = None + best_ratio = 0.0 + for u, rv in r.items(): + d = max(deg_map.get(u, 1), 1) + ratio = rv / d + if ratio > best_ratio: + best_ratio = ratio + best_u = u + + if best_ratio < epsilon or best_u is None: + break + + # Push operation at best_u + u = best_u + ru = r[u] + d_u = max(deg_map.get(u, 1), 1) + + # Move alpha fraction to p + p[u] = p.get(u, 0) + alpha * ru + + # Spread (1-alpha) fraction via lazy walk: half stays, half spreads + r[u] = (1 - alpha) * ru / 2 + + spread = (1 - alpha) * ru / (2 * d_u) + for v in neighbors.get(u, []): + r[v] = r.get(v, 0) + spread + + return p, r + + +# --------------------------------------------------------------------------- +# Heat kernel PageRank +# --------------------------------------------------------------------------- + +def heat_kernel_pagerank( + kc: "KnowledgeComplex", + seed: str, + t: float = 5.0, + num_terms: int = 30, +) -> dict[str, float]: + """ + Compute heat kernel PageRank ρ_{t,seed} on the 1-skeleton. 
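    Because I and W commute, the truncated series below is just a Taylor
    approximation of χ_u e^{-t(I - W)}. A cross-check against scipy's
    dense matrix exponential on a hand-built 3-vertex path:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 0.0]])
W = np.diag(1.0 / A.sum(axis=1)) @ A     # random-walk transition matrix

t, chi = 2.0, np.array([1.0, 0.0, 0.0])

# Taylor expansion, as in heat_kernel_pagerank
result, current, factorial = np.zeros(3), chi.copy(), 1.0
for k in range(30):
    if k > 0:
        factorial *= k
        current = current @ W
    result += (t ** k / factorial) * current
result *= np.exp(-t)

print(np.allclose(result, chi @ expm(-t * (np.eye(3) - W))))  # True
```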
+ + ρ_{t,u} = e^{-t} Σ_{k=0}^{N} (t^k / k!) χ_u W^k + + where W = D⁻¹A is the random walk transition matrix. + + Parameters + ---------- + kc : KnowledgeComplex + seed : str + Starting vertex. + t : float + Heat parameter (temperature). Small t = local, large t = global. + num_terms : int + Number of terms in the Taylor expansion. + + Returns + ------- + dict[str, float] + Mapping from vertex IDs to PageRank values. + """ + bm = boundary_matrices(kc) + nv = len(bm.vertex_index) + + if nv == 0: + return {} + + # Build W = D⁻¹A (random walk transition matrix) + rows, cols, vals = [], [], [] + for e_id in bm.edge_index: + bnd = list(kc.boundary(e_id)) + if len(bnd) == 2: + i = bm.vertex_index[bnd[0]] + j = bm.vertex_index[bnd[1]] + rows.extend([i, j]) + cols.extend([j, i]) + vals.extend([1.0, 1.0]) + + A = sp.csr_matrix((vals, (rows, cols)), shape=(nv, nv), dtype=np.float64) + degrees = np.array(A.sum(axis=1)).flatten() + degrees[degrees == 0] = 1.0 + D_inv = sp.diags(1.0 / degrees, format="csr") + W = D_inv @ A + + # Compute ρ = e^{-t} Σ (t^k / k!) χ_u W^k via Taylor expansion + seed_idx = bm.vertex_index[seed] + chi = np.zeros(nv) + chi[seed_idx] = 1.0 + + result = np.zeros(nv) + current = chi.copy() # χ_u W^0 = χ_u + factorial = 1.0 + + for k in range(num_terms): + if k > 0: + factorial *= k + current = current @ W.toarray() + result += (t ** k / factorial) * current + + result *= np.exp(-t) + + return {bm.index_vertex[i]: float(result[i]) for i in range(nv)} + + +# --------------------------------------------------------------------------- +# Sweep cut (graph version) +# --------------------------------------------------------------------------- + +def sweep_cut( + kc: "KnowledgeComplex", + distribution: dict[str, float], + max_volume: int | None = None, +) -> SweepCut: + """ + Sweep a vertex distribution to find a cut with minimum conductance. 
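    Conductance itself is simple arithmetic: boundary edge count over the
    volume of the smaller side. For a 4-vertex path split down the middle:

```python
# Path a-b-c-d, candidate set S = {a, b}
degrees = {"a": 1, "b": 2, "c": 2, "d": 1}
boundary_edges = 1                      # only b-c crosses the cut
vol_S = degrees["a"] + degrees["b"]     # 3
total_volume = sum(degrees.values())    # 6

conductance = boundary_edges / min(vol_S, total_volume - vol_S)
print(conductance)  # 0.3333333333333333
```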
+ + Sorts vertices by p(v)/d(v) descending, computes conductance of each + prefix set, returns the cut with minimum conductance. + + Parameters + ---------- + kc : KnowledgeComplex + distribution : dict[str, float] + Vertex distribution (e.g., from approximate_pagerank). + max_volume : int, optional + Maximum volume for the small side of the cut. + + Returns + ------- + SweepCut + """ + bm = boundary_matrices(kc) + _, deg_map = _adjacency_and_degrees(kc, bm) + + # Neighbor lookup + neighbors: dict[str, set[str]] = {v: set() for v in bm.vertex_index} + for e_id in bm.edge_index: + bnd = list(kc.boundary(e_id)) + if len(bnd) == 2: + neighbors[bnd[0]].add(bnd[1]) + neighbors[bnd[1]].add(bnd[0]) + + total_volume = sum(deg_map.values()) + + # Sort vertices by p(v)/d(v) descending + scored = [] + for vid in bm.vertex_index: + pv = distribution.get(vid, 0.0) + dv = max(deg_map.get(vid, 1), 1) + scored.append((vid, pv / dv)) + scored.sort(key=lambda x: -x[1]) + + # Sweep: incrementally build S, track boundary edges and volume + best_cut = SweepCut(vertices=set(), conductance=float("inf"), volume=0, boundary_edges=0) + S: set[str] = set() + vol_S = 0 + boundary = 0 + + for vid, _ in scored: + d_v = deg_map.get(vid, 0) + # Update boundary: edges from vid to S decrease boundary, + # edges from vid to outside S increase boundary + edges_to_S = len(neighbors[vid] & S) + edges_to_outside = d_v - edges_to_S + boundary = boundary - edges_to_S + edges_to_outside + + S.add(vid) + vol_S += d_v + + if vol_S == 0 or vol_S >= total_volume: + continue + + if max_volume is not None and vol_S > max_volume: + break + + denom = min(vol_S, total_volume - vol_S) + cond = boundary / denom if denom > 0 else float("inf") + + if cond < best_cut.conductance: + best_cut = SweepCut( + vertices=set(S), + conductance=cond, + volume=vol_S, + boundary_edges=boundary, + ) + + return best_cut + + +# --------------------------------------------------------------------------- +# Local partition (graph 
version) +# --------------------------------------------------------------------------- + +def local_partition( + kc: "KnowledgeComplex", + seed: str, + target_conductance: float = 0.5, + target_volume: int | None = None, + method: str = "pagerank", +) -> SweepCut: + """ + Find a local partition near a seed vertex. + + Parameters + ---------- + kc : KnowledgeComplex + seed : str + Starting vertex. + target_conductance : float + Target conductance for setting alpha/t. + target_volume : int, optional + Maximum volume for the small side. + method : str + "pagerank" — approximate PageRank (Andersen-Chung-Lang). + "heat_kernel" — heat kernel PageRank (Chung). + + Returns + ------- + SweepCut + """ + if method == "pagerank": + alpha = target_conductance ** 2 / (16 * np.log(sum( + max(kc.degree(v), 1) for v in kc.element_ids(type=None) + if kc._schema._types.get(kc.element(v).type, {}).get("kind") == "vertex" + ) + 1)) + alpha = max(min(alpha, 0.5), 0.01) + p, r = approximate_pagerank(kc, seed, alpha=alpha) + return sweep_cut(kc, p, max_volume=target_volume) + + elif method == "heat_kernel": + t = max(1.0, 4.0 / (target_conductance ** 2)) + rho = heat_kernel_pagerank(kc, seed, t=t) + return sweep_cut(kc, rho, max_volume=target_volume) + + else: + raise ValueError(f"Unknown method '{method}'. Use 'pagerank' or 'heat_kernel'.") + + +# --------------------------------------------------------------------------- +# Edge sweep cut (simplicial version) +# --------------------------------------------------------------------------- + +def edge_sweep_cut( + kc: "KnowledgeComplex", + edge_distribution: np.ndarray, + bm: BoundaryMatrices | None = None, +) -> EdgeSweepCut: + """ + Sweep an edge distribution to find an edge partition with minimum conductance. + + Sorts edges by |distribution(e)|/degree(e) descending, computes edge + conductance of each prefix. Edge conductance measures how many + vertex-boundary connections cross the partition. 
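    Edge adjacency (two edges sharing a vertex) can also be read off
    |B1|ᵀ|B1|: off-diagonal nonzeros mark edge pairs with a common
    endpoint, which is the adjacency the sweep walks. A numpy
    illustration on the two-edge path:

```python
import numpy as np

B1 = np.array([[-1.0, 0.0], [1.0, -1.0], [0.0, 1.0]])  # path a-b-c
overlap = np.abs(B1).T @ np.abs(B1)

# Diagonal = number of endpoints (always 2); off-diagonal = shared vertices
adjacent = (overlap - np.diag(np.diag(overlap))) > 0
print(adjacent[0, 1])   # True: the two edges share vertex b
```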
+ + Parameters + ---------- + kc : KnowledgeComplex + edge_distribution : np.ndarray + (n_edges,) vector of edge values. + bm : BoundaryMatrices, optional + Pre-computed boundary matrices. + + Returns + ------- + EdgeSweepCut + """ + if bm is None: + bm = boundary_matrices(kc) + + ne = len(bm.edge_index) + if ne == 0: + return EdgeSweepCut(edges=set(), conductance=float("inf"), volume=0) + + # Edge degree: number of faces incident to each edge + number of vertices + # Use coboundary size as a measure of "degree" for edges + edge_degrees = np.array(np.abs(bm.B2).sum(axis=1)).flatten() + 2 # +2 for boundary vertices + + # Sort edges by |distribution(e)| / degree(e) descending + scored = [] + for eid, idx in bm.edge_index.items(): + val = abs(edge_distribution[idx]) + deg = max(edge_degrees[idx], 1) + scored.append((eid, idx, val / deg)) + scored.sort(key=lambda x: -x[2]) + + # Edge adjacency: two edges are adjacent if they share a vertex + # Build edge adjacency from B1 + edge_adj: dict[str, set[str]] = {e: set() for e in bm.edge_index} + # For each vertex, collect incident edges + vertex_edges: dict[int, list[str]] = {} + for eid, eidx in bm.edge_index.items(): + col = bm.B1[:, eidx] + for vidx in col.nonzero()[0]: + vertex_edges.setdefault(vidx, []).append(eid) + + for vidx, eids in vertex_edges.items(): + for i, e1 in enumerate(eids): + for e2 in eids[i + 1:]: + edge_adj[e1].add(e2) + edge_adj[e2].add(e1) + + total_edge_vol = int(sum(edge_degrees)) + S: set[str] = set() + vol_S = 0 + boundary = 0 + + best = EdgeSweepCut(edges=set(), conductance=float("inf"), volume=0) + + for eid, eidx, _ in scored: + d_e = int(edge_degrees[eidx]) + adj_in_S = len(edge_adj[eid] & S) + adj_outside = len(edge_adj[eid]) - adj_in_S + boundary = boundary - adj_in_S + adj_outside + + S.add(eid) + vol_S += d_e + + if vol_S == 0 or vol_S >= total_edge_vol: + continue + + denom = min(vol_S, total_edge_vol - vol_S) + cond = boundary / denom if denom > 0 else float("inf") + + if cond < 
best.conductance: + best = EdgeSweepCut(edges=set(S), conductance=cond, volume=vol_S) + + return best + + +# --------------------------------------------------------------------------- +# Edge local partition (simplicial version) +# --------------------------------------------------------------------------- + +def edge_local_partition( + kc: "KnowledgeComplex", + seed_edge: str, + t: float = 5.0, + beta: float = 0.1, + method: str = "hodge_heat", + weights: dict[str, float] | None = None, +) -> EdgeSweepCut: + """ + Find a local edge partition using the Hodge Laplacian. + + Parameters + ---------- + kc : KnowledgeComplex + seed_edge : str + Starting edge. + t : float + Heat parameter (for hodge_heat method). + beta : float + Regularization (for hodge_pagerank method). + method : str + "hodge_heat" — e^{-tL₁} χ_e (heat kernel on edges). + "hodge_pagerank" — (βI + L₁)⁻¹ χ_e (existing edge PageRank). + weights : dict[str, float], optional + Simplex weights. + + Returns + ------- + EdgeSweepCut + """ + bm = boundary_matrices(kc) + ne = len(bm.edge_index) + + if ne == 0: + return EdgeSweepCut(edges=set(), conductance=float("inf"), volume=0) + + L1 = hodge_laplacian(kc, weights=weights) + + if method == "hodge_pagerank": + dist = edge_pagerank(kc, seed_edge, beta=beta, weights=weights) + elif method == "hodge_heat": + # Compute e^{-tL₁} χ_e via dense matrix exponential + L1_dense = L1.toarray() + heat = expm(-t * L1_dense) + seed_idx = bm.edge_index[seed_edge] + dist = heat[:, seed_idx] + else: + raise ValueError(f"Unknown method '{method}'. Use 'hodge_heat' or 'hodge_pagerank'.") + + return edge_sweep_cut(kc, dist, bm=bm) diff --git a/knowledgecomplex/audit.py b/knowledgecomplex/audit.py new file mode 100644 index 0000000..bc5129a --- /dev/null +++ b/knowledgecomplex/audit.py @@ -0,0 +1,173 @@ +""" +knowledgecomplex.audit — Verification and audit tooling. + +Terminology: + - **verify**: deterministic, automated checks (SHACL constraints, type + guards, cardinality). 
These are pass/fail with no human judgment. + - **validate**: human review and approval of fitness for purpose. + Not implemented here — this module is for verification only. + +Provides: + - ``AuditReport`` / ``AuditViolation`` — structured verification results + - ``audit_file()`` — verify a serialized RDF file against SHACL shapes +""" + +from __future__ import annotations +import re +from dataclasses import dataclass, field +from pathlib import Path +from typing import TYPE_CHECKING + +if TYPE_CHECKING: + pass + + +@dataclass +class AuditViolation: + """A single verification violation from a SHACL report.""" + element_id: str | None + constraint: str + message: str + + def __str__(self) -> str: + eid = self.element_id or "?" + return f"[{eid}] {self.constraint}: {self.message}" + + +@dataclass +class AuditReport: + """Structured result of a SHACL verification pass. + + Attributes + ---------- + conforms : bool + True if no violations were found. + text : str + Full human-readable SHACL report text. + violations : list[AuditViolation] + Parsed individual violations. 
+ """ + conforms: bool + text: str + violations: list[AuditViolation] = field(default_factory=list) + + def __str__(self) -> str: + if self.conforms: + return "Verification passed: no violations" + lines = [f"Verification failed: {len(self.violations)} violation(s)"] + for v in self.violations: + lines.append(f" {v}") + return "\n".join(lines) + + def __bool__(self) -> bool: + return self.conforms + + +def _parse_shacl_report(results_text: str, namespace: str = "") -> list[AuditViolation]: + """Extract individual violations from a pyshacl results text string.""" + violations = [] + + # Split on "Constraint Violation" or "Result" blocks + # pyshacl output format: blocks separated by blank lines + blocks = re.split(r"\n\n+", results_text) + + for block in blocks: + if "Violation" not in block and "Result" not in block: + continue + + element_id = None + constraint = "" + message = "" + + for line in block.strip().split("\n"): + line = line.strip() + if line.startswith("Focus Node:"): + # Extract element ID from IRI + match = re.search(r"#(\S+)", line) + if match: + element_id = match.group(1) + elif line.startswith("Source Shape:"): + match = re.search(r"[#/](\S+?)>?\s*$", line) + if match: + constraint = match.group(1) + elif line.startswith("Message:"): + message = line.replace("Message:", "").strip() + elif line.startswith("Severity:"): + pass # could capture severity if needed + + if constraint or message: + violations.append(AuditViolation( + element_id=element_id, + constraint=constraint, + message=message, + )) + + return violations + + +def _build_report(conforms: bool, results_text: str, namespace: str = "") -> AuditReport: + """Build an AuditReport from pyshacl output.""" + violations = _parse_shacl_report(results_text, namespace) if not conforms else [] + return AuditReport(conforms=conforms, text=results_text, violations=violations) + + +def audit_file( + instance_path: str | Path, + shapes: str | Path, + ontology: str | Path | None = None, +) -> 
AuditReport: + """Verify a serialized RDF file against SHACL shapes. + + Runs pyshacl on static files without instantiating a KnowledgeComplex. + Useful for CI pipelines and pre-commit hooks. + + Parameters + ---------- + instance_path : str or Path + Path to the instance graph (Turtle, JSON-LD, etc.). + shapes : str or Path + Path to the SHACL shapes file. + ontology : str or Path, optional + Path to the OWL ontology file. If not provided, only SHACL + shapes are used (no RDFS inference). + + Returns + ------- + AuditReport + """ + import pyshacl + from rdflib import Graph + + instance_path = Path(instance_path) + shapes = Path(shapes) + + if not instance_path.exists(): + raise FileNotFoundError(f"Instance file not found: {instance_path}") + if not shapes.exists(): + raise FileNotFoundError(f"Shapes file not found: {shapes}") + + data_graph = Graph() + data_graph.parse(str(instance_path)) + + shacl_graph = Graph() + shacl_graph.parse(str(shapes)) + + ont_graph = None + if ontology is not None: + ontology = Path(ontology) + if not ontology.exists(): + raise FileNotFoundError(f"Ontology file not found: {ontology}") + ont_graph = Graph() + ont_graph.parse(str(ontology)) + # Also parse ontology into data graph for type inference + data_graph.parse(str(ontology)) + + conforms, _, results_text = pyshacl.validate( + data_graph=data_graph, + shacl_graph=shacl_graph, + ont_graph=ont_graph, + inference="rdfs" if ont_graph else None, + abort_on_first=False, + ) + + return _build_report(conforms, results_text) diff --git a/knowledgecomplex/clique.py b/knowledgecomplex/clique.py new file mode 100644 index 0000000..474a4c9 --- /dev/null +++ b/knowledgecomplex/clique.py @@ -0,0 +1,399 @@ +"""knowledgecomplex.clique — Clique complex and flagification methods. + +Two workflows for inferring higher-order simplices from the edge graph: + +**Generic exploration** (``fill_cliques``) + Discover what higher-order structure exists before knowing the semantics. 
+ Fill in generic simplices for all cliques up to a given order. Inspect + what shows up, then decide what types to declare. + +**Typed inference** (``infer_faces``) + Once you've declared a face type with semantic meaning, fill in all + instances of that type from the edge graph. The face type is required — + you declare the type, then run inference to populate it. + +``find_cliques`` is a pure query that returns vertex cliques without +modifying the complex. + +Typical workflow:: + + # Phase 1: Explore — what triangles exist? + sb.add_face_type("_clique") + kc = KnowledgeComplex(schema=sb) + # ... add vertices and edges ... + result = fill_cliques(kc, max_order=2) + + # Phase 2: Inspect + for fid in result[2]: + edge_types = {kc.element(e).type for e in kc.boundary(fid)} + print(f"{fid}: {edge_types}") + + # Phase 3: Typed inference with a real schema + sb2 = SchemaBuilder(namespace="ex") + sb2.add_face_type("operation", attributes={...}) + kc2 = KnowledgeComplex(schema=sb2) + # ... add vertices and edges ... + infer_faces(kc2, "operation", edge_type="performs") +""" +from __future__ import annotations + +from itertools import combinations +from typing import TYPE_CHECKING + +if TYPE_CHECKING: + from knowledgecomplex.graph import KnowledgeComplex + +from knowledgecomplex.exceptions import SchemaError + + +# ── Helpers ───────────────────────────────────────────────────────────────── + + +def _edges_between( + kc: "KnowledgeComplex", + v1: str, + v2: str, + edge_type: str | None = None, +) -> list[str]: + """Find KC edges whose boundary is exactly {v1, v2}. + + Parameters + ---------- + kc : KnowledgeComplex + v1, v2 : str + Vertex IDs. + edge_type : str, optional + Only return edges of this type. 
+ + Returns + ------- + list[str] + """ + # Edges incident to v1 ∩ edges incident to v2 + cb1 = kc.coboundary(v1, type=edge_type) + cb2 = kc.coboundary(v2, type=edge_type) + shared = cb1 & cb2 + + # Filter to edges (dim=1) whose boundary is exactly {v1, v2} + result = [] + for eid in shared: + kind = kc._schema._types.get(kc.element(eid).type, {}).get("kind") + if kind == "edge" and kc.boundary(eid) == {v1, v2}: + result.append(eid) + return result + + +def _has_face_for_edges(kc: "KnowledgeComplex", edge_ids: frozenset[str]) -> bool: + """Check if a face already exists whose boundary is exactly edge_ids.""" + # Check coboundary of any edge — faces containing it + if not edge_ids: + return False + first = next(iter(edge_ids)) + candidates = kc.coboundary(first) + for cid in candidates: + kind = kc._schema._types.get(kc.element(cid).type, {}).get("kind") + if kind == "face" and kc.boundary(cid) == set(edge_ids): + return True + return False + + +# ── find_cliques ──────────────────────────────────────────────────────────── + + +def find_cliques( + kc: "KnowledgeComplex", + k: int = 3, + *, + edge_type: str | None = None, +) -> list[frozenset[str]]: + """Find all k-cliques of KC vertices in the edge graph. + + A k-clique is a set of k vertices where every pair is connected by + an edge. This is a pure query — it does not modify the complex. + + Parameters + ---------- + kc : KnowledgeComplex + k : int + Clique size (default 3 for triangles). + edge_type : str, optional + Only consider edges of this type when building the adjacency graph. + + Returns + ------- + list[frozenset[str]] + Each element is a frozenset of k vertex IDs. 
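
    The pairwise adjacency test is easy to see in isolation. A standalone
    sketch over plain dicts (``brute_force_cliques`` is a hypothetical
    helper for illustration, not part of this module):

    ```python
    from itertools import combinations

    # Build an adjacency map from an edge list, then keep every k-subset
    # of vertices in which all pairs are adjacent.
    def brute_force_cliques(edges: list[tuple[str, str]], k: int) -> list[frozenset[str]]:
        adj: dict[str, set[str]] = {}
        for v1, v2 in edges:
            adj.setdefault(v1, set()).add(v2)
            adj.setdefault(v2, set()).add(v1)
        verts = sorted(adj)
        return [
            frozenset(combo)
            for combo in combinations(verts, k)
            if all(b in adj[a] for a, b in combinations(combo, 2))
        ]

    # A triangle a-b-c plus a pendant edge c-d: exactly one 3-clique.
    tris = brute_force_cliques([("a", "b"), ("b", "c"), ("a", "c"), ("c", "d")], k=3)
    # tris == [frozenset({"a", "b", "c"})]
    ```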
+ """ + if k < 2: + raise ValueError(f"Clique size must be >= 2, got {k}") + + # Build adjacency from vertex-edge structure + vertices = sorted(kc.skeleton(0)) + adj: dict[str, set[str]] = {v: set() for v in vertices} + + edge_ids = kc.skeleton(1) - kc.skeleton(0) + for eid in edge_ids: + if edge_type is not None and kc.element(eid).type != edge_type: + continue + boundary = kc.boundary(eid) + if len(boundary) == 2: + v1, v2 = sorted(boundary) + adj[v1].add(v2) + adj[v2].add(v1) + + # Enumerate cliques via Bron-Kerbosch or brute-force for small k + cliques: list[frozenset[str]] = [] + sorted_verts = sorted(vertices) + + if k == 2: + # k=2 cliques are just edges + for i, v1 in enumerate(sorted_verts): + for v2 in sorted_verts[i + 1:]: + if v2 in adj[v1]: + cliques.append(frozenset([v1, v2])) + else: + # General: enumerate (k-1)-subsets and check + for combo in combinations(sorted_verts, k): + if all(combo[j] in adj[combo[i]] + for i in range(k) for j in range(i + 1, k)): + cliques.append(frozenset(combo)) + + return cliques + + +# ── infer_faces ───────────────────────────────────────────────────────────── + + +def infer_faces( + kc: "KnowledgeComplex", + face_type: str, + *, + edge_type: str | None = None, + id_prefix: str = "face", + dry_run: bool = False, +) -> list[str]: + """Infer and add faces of a declared type from 3-cliques in the edge graph. + + Finds all triangles (3-cliques of KC vertices), resolves the 3 boundary + edges for each, and calls ``kc.add_face()`` with the specified type. + Skips triangles that already have a face. + + Parameters + ---------- + kc : KnowledgeComplex + face_type : str + A registered face type to assign to inferred faces. + edge_type : str, optional + Only consider edges of this type when finding triangles. + id_prefix : str + Prefix for auto-generated face IDs (e.g. ``"face-0"``). + dry_run : bool + If ``True``, return the list of would-be face IDs and their + boundaries without adding them to the complex. 
+ + Returns + ------- + list[str] + IDs of newly added (or would-be) faces. + + Raises + ------ + SchemaError + If *face_type* is not a registered face type. + """ + if face_type not in kc._schema._types: + raise SchemaError(f"Type '{face_type}' is not registered") + if kc._schema._types[face_type].get("kind") != "face": + raise SchemaError(f"Type '{face_type}' is not a face type") + + triangles = find_cliques(kc, k=3, edge_type=edge_type) + added: list[str] = [] + counter = 0 + + for tri in triangles: + verts = sorted(tri) + # Find edges for each pair + edges = [] + valid = True + for i in range(3): + for j in range(i + 1, 3): + e = _edges_between(kc, verts[i], verts[j], edge_type=edge_type) + if not e: + valid = False + break + edges.append(e[0]) # take first matching edge + if not valid: + break + + if not valid or len(edges) != 3: + continue + + # Skip if face already exists for these edges + if _has_face_for_edges(kc, frozenset(edges)): + continue + + face_id = f"{id_prefix}-{counter}" + counter += 1 + + if not dry_run: + kc.add_face(face_id, type=face_type, boundary=edges) + added.append(face_id) + + return added + + +# ── fill_cliques ──────────────────────────────────────────────────────────── + + +def fill_cliques( + kc: "KnowledgeComplex", + max_order: int = 2, + *, + edge_type: str | None = None, + id_prefix: str = "clique", +) -> dict[int, list[str]]: + """Fill generic simplices for all cliques up to max_order. + + Discovers what higher-order structure exists without requiring semantic + type declarations. For k=2 (faces), uses the first declared face type. + For k>2, uses ``_assert_element`` directly with the base ``kc:Element`` + type — these are generic, untyped simplices. + + This is an exploration tool. Once you've inspected the structure, + declare typed face types and use :func:`infer_faces` for semantic + inference. + + Parameters + ---------- + kc : KnowledgeComplex + max_order : int + Maximum simplex dimension to fill (default 2 = faces). 
+ edge_type : str, optional + Only consider edges of this type when finding cliques. + id_prefix : str + Prefix for auto-generated IDs. + + Returns + ------- + dict[int, list[str]] + Mapping from dimension to list of newly added element IDs. + E.g. ``{2: ["clique-0", "clique-1"], 3: ["clique-4"]}``. + """ + if max_order < 2: + raise ValueError(f"max_order must be >= 2, got {max_order}") + + result: dict[int, list[str]] = {} + + # k=2: faces — use first declared face type + if max_order >= 2: + face_types = kc._schema.type_names(kind="face") + if not face_types: + raise SchemaError( + "No face types declared. Add at least one face type " + "(e.g. sb.add_face_type('_clique')) before calling fill_cliques." + ) + face_type = face_types[0] + result[2] = infer_faces( + kc, face_type, edge_type=edge_type, id_prefix=id_prefix, + ) + + # k>2: higher-order generic simplices + if max_order >= 3: + for dim in range(3, max_order + 1): + cliques = find_cliques(kc, k=dim + 1, edge_type=edge_type) + added: list[str] = [] + counter = 0 + + for clique in cliques: + # Find the k boundary (dim-1)-simplices + # For a (dim)-simplex, boundary is (dim+1 choose dim) = dim+1 + # (dim-1)-simplices from subsets of size dim + boundary_ids = [] + valid = True + + for sub in combinations(sorted(clique), dim): + # Find the (dim-1)-simplex with these vertices in its closure + sub_set = frozenset(sub) + # For dim=3: find face whose closure vertices = sub (3 vertices) + found = _find_simplex_with_vertices(kc, sub_set, dim - 1) + if found is None: + valid = False + break + boundary_ids.append(found) + + if not valid: + continue + + # Check no duplicate + elem_id = f"{id_prefix}-{dim}d-{counter}" + + if not dry_run_check(kc, boundary_ids): + counter += 1 + # Use _assert_element directly for generic higher-order + kc._assert_element( + elem_id, + face_types[0], # reuse face type as best available + boundary_ids=boundary_ids, + attributes={}, + ) + added.append(elem_id) + + result[dim] = added + + 
return result + + +def _find_simplex_with_vertices( + kc: "KnowledgeComplex", + vertices: frozenset[str], + dim: int, +) -> str | None: + """Find a simplex of given dimension whose closure contains exactly these vertices.""" + if dim == 1: + # Find edge between 2 vertices + verts = sorted(vertices) + if len(verts) != 2: + return None + edges = _edges_between(kc, verts[0], verts[1]) + return edges[0] if edges else None + elif dim == 2: + # Find face whose 3 vertices match + verts = sorted(vertices) + if len(verts) != 3: + return None + # Check edges between each pair + for i in range(3): + for j in range(i + 1, 3): + edges = _edges_between(kc, verts[i], verts[j]) + if not edges: + return None + # Find face containing all three edges + edge_sets = [] + for i in range(3): + for j in range(i + 1, 3): + edge_sets.append(set(_edges_between(kc, verts[i], verts[j]))) + # Check coboundary of first edge for faces + if edge_sets: + for e in edge_sets[0]: + for cand in kc.coboundary(e): + kind = kc._schema._types.get(kc.element(cand).type, {}).get("kind") + if kind == "face": + face_boundary = kc.boundary(cand) + # Check if boundary edges connect these vertices + face_verts = set() + for be in face_boundary: + face_verts |= kc.boundary(be) + if face_verts == set(verts): + return cand + return None + + +def dry_run_check(kc, boundary_ids): + """Check if an element with this boundary already exists.""" + if not boundary_ids: + return False + first = boundary_ids[0] + for cand in kc.coboundary(first): + if set(kc.boundary(cand)) == set(boundary_ids): + return True + return False diff --git a/knowledgecomplex/codecs/__init__.py b/knowledgecomplex/codecs/__init__.py new file mode 100644 index 0000000..614a2c7 --- /dev/null +++ b/knowledgecomplex/codecs/__init__.py @@ -0,0 +1,9 @@ +"""knowledgecomplex.codecs — Built-in codec implementations. + +Codecs bridge KC elements and external artifact formats. +Each codec implements the :class:`~knowledgecomplex.schema.Codec` protocol. 
+""" + +from knowledgecomplex.codecs.markdown import MarkdownCodec, verify_documents + +__all__ = ["MarkdownCodec", "verify_documents"] diff --git a/knowledgecomplex/codecs/markdown.py b/knowledgecomplex/codecs/markdown.py new file mode 100644 index 0000000..b5c8857 --- /dev/null +++ b/knowledgecomplex/codecs/markdown.py @@ -0,0 +1,220 @@ +"""knowledgecomplex.codecs.markdown — YAML-frontmatter + markdown codec. + +Implements the :class:`~knowledgecomplex.schema.Codec` protocol for +knowledge complexes where each element is a markdown file with YAML +frontmatter (structured metadata) and a markdown body with predefined +section headers (prose content). + +This follows the pattern used in production knowledge complexes authored +in Obsidian — each element is a ``.md`` file, the YAML header holds +structured attributes, and ``##`` sections hold prose content. + +Usage:: + + from knowledgecomplex.codecs import MarkdownCodec + + codec = MarkdownCodec( + frontmatter_attrs=["name", "author", "abstract"], + section_attrs=["notes", "methodology"], + ) + kc.register_codec("Paper", codec) + + # Compile: KC element -> markdown file at its URI + kc.element("paper-1").compile() + + # Decompile: markdown file -> KC element attributes + kc.element("paper-1").decompile() +""" +from __future__ import annotations + +import re +from pathlib import Path +from typing import TYPE_CHECKING, Any + +import yaml + +if TYPE_CHECKING: + from knowledgecomplex.graph import KnowledgeComplex + + +class MarkdownCodec: + """Codec for YAML-frontmatter + markdown files. + + Each element maps to a single ``.md`` file. Attributes are stored in + two places: + + - **YAML frontmatter** (between ``---`` delimiters): structured metadata + fields like ``name``, ``author``, ``description``. These map 1:1 to + KC element attributes. + + - **Markdown body sections** (``## Header`` blocks): prose content like + notes or analysis. 
The section header becomes the attribute name + (lowercased, spaces replaced with underscores), and the section body + becomes the attribute value. + + Parameters + ---------- + frontmatter_attrs : list[str] + Attribute names stored in the YAML frontmatter. + section_attrs : list[str] + Attribute names stored as ``## Section`` blocks in the body. + """ + + def __init__( + self, + frontmatter_attrs: list[str], + section_attrs: list[str], + ) -> None: + self.frontmatter_attrs = list(frontmatter_attrs) + self.section_attrs = list(section_attrs) + + def compile(self, element: dict) -> None: + """Write an element record to a markdown file at its URI. + + Parameters + ---------- + element : dict + Keys: ``id``, ``type``, ``uri``, plus all attribute key-value pairs. + """ + uri = element["uri"] + path = Path(uri.replace("file://", "")) + path.parent.mkdir(parents=True, exist_ok=True) + + # Build YAML frontmatter + fm: dict[str, Any] = { + "id": element["id"], + "type": element["type"], + } + for attr in self.frontmatter_attrs: + if attr in element: + fm[attr] = element[attr] + + # Build markdown body + title = element.get("name", element["id"]) + lines = [f"# {title}", ""] + for attr in self.section_attrs: + header = attr.replace("_", " ").title() + content = element.get(attr, "") + lines.append(f"## {header}") + lines.append("") + lines.append(content if content else "(empty)") + lines.append("") + + # Write file + fm_str = yaml.dump(fm, default_flow_style=False, sort_keys=False).strip() + body = "\n".join(lines) + path.write_text(f"---\n{fm_str}\n---\n\n{body}\n") + + def decompile(self, uri: str) -> dict: + """Read a markdown file and return attribute key-value pairs. + + Parameters + ---------- + uri : str + File URI (``file://`` prefix stripped automatically). + + Returns + ------- + dict + Attribute key-value pairs (no ``id``, ``type``, or ``uri``). 
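
        The two parsing passes (frontmatter split, then ``##`` section
        slicing) can be sketched standalone. ``split_markdown`` is a
        hypothetical helper that returns the frontmatter raw rather than
        YAML-parsed:

        ```python
        import re

        # Split the YAML frontmatter from the body, then slice the body
        # into ## sections keyed by normalized header name.
        def split_markdown(text: str) -> tuple[str, dict[str, str]]:
            m = re.match(r"^---\s*\n(.*?)\n---\s*\n(.*)$", text, re.DOTALL)
            if not m:
                raise ValueError("No YAML frontmatter found")
            fm_raw, body = m.group(1), m.group(2)
            headers = list(re.finditer(r"^## (.+)$", body, re.MULTILINE))
            sections: dict[str, str] = {}
            for i, h in enumerate(headers):
                name = h.group(1).strip().lower().replace(" ", "_")
                end = headers[i + 1].start() if i + 1 < len(headers) else len(body)
                sections[name] = body[h.end():end].strip()
            return fm_raw, sections

        doc = "---\nid: p1\n---\n\n# P1\n\n## Key Findings\n\nShort note.\n"
        fm, secs = split_markdown(doc)
        # fm.strip() == "id: p1"; secs == {"key_findings": "Short note."}
        ```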
+ """ + path = Path(uri.replace("file://", "")) + text = path.read_text() + + # Split frontmatter from body + fm_match = re.match(r"^---\s*\n(.*?)\n---\s*\n(.*)$", text, re.DOTALL) + if not fm_match: + raise ValueError(f"No YAML frontmatter found in {path}") + + fm_raw = fm_match.group(1) + body = fm_match.group(2) + + # Parse YAML frontmatter + fm = yaml.safe_load(fm_raw) or {} + attrs: dict[str, str] = {} + for attr in self.frontmatter_attrs: + if attr in fm: + attrs[attr] = str(fm[attr]) + + # Parse ## sections from body + section_pattern = re.compile(r"^## (.+)$", re.MULTILINE) + sections = {} + matches = list(section_pattern.finditer(body)) + for i, m in enumerate(matches): + header = m.group(1).strip().lower().replace(" ", "_") + start = m.end() + end = matches[i + 1].start() if i + 1 < len(matches) else len(body) + content = body[start:end].strip() + if content == "(empty)": + content = "" + sections[header] = content + + for attr in self.section_attrs: + if attr in sections: + attrs[attr] = sections[attr] + + return attrs + + +def verify_documents( + kc: "KnowledgeComplex", + directory: str | Path, +) -> list[str]: + """Check consistency between KC elements and markdown files on disk. + + Verifies: + + - Every element with a URI has a corresponding file. + - Every ``.md`` file in the directory has a corresponding element. + - Attribute values in files match the KC (via decompile). + + Parameters + ---------- + kc : KnowledgeComplex + directory : str or Path + Root directory containing the markdown files. + + Returns + ------- + list[str] + Discrepancy messages. Empty list means everything is consistent. 
+ """ + directory = Path(directory) + issues: list[str] = [] + + # Collect URIs from KC elements + uri_to_id: dict[str, str] = {} + for eid in kc.element_ids(): + elem = kc.element(eid) + if elem.uri: + uri_to_id[elem.uri] = eid + fpath = Path(elem.uri.replace("file://", "")) + if not fpath.exists(): + issues.append(f"MISSING FILE: {eid} -> {fpath}") + + # Check for orphan files (in directory but not in KC) + for md_file in sorted(directory.rglob("*.md")): + file_uri = f"file://{md_file}" + if file_uri not in uri_to_id: + issues.append(f"ORPHAN FILE: {md_file} (no element in KC)") + + # Check attribute consistency + for uri, eid in sorted(uri_to_id.items()): + fpath = Path(uri.replace("file://", "")) + if not fpath.exists(): + continue + elem = kc.element(eid) + try: + codec = kc._resolve_codec(elem.type) + file_attrs = codec.decompile(uri) + kc_attrs = elem.attrs + for key in file_attrs: + if key in kc_attrs and file_attrs[key] != kc_attrs[key]: + issues.append( + f"MISMATCH: {eid}.{key} — " + f"KC='{kc_attrs[key][:40]}' vs file='{file_attrs[key][:40]}'" + ) + except Exception as e: + issues.append(f"ERROR reading {eid}: {e}") + + return issues diff --git a/knowledgecomplex/diff.py b/knowledgecomplex/diff.py new file mode 100644 index 0000000..d6cab49 --- /dev/null +++ b/knowledgecomplex/diff.py @@ -0,0 +1,429 @@ +"""knowledgecomplex.diff — Complex diffs and sequences for time-varying complexes. + +A ``ComplexDiff`` records element additions and removals. It can be applied +to a ``KnowledgeComplex`` to mutate it, exported to a SPARQL UPDATE string +for interoperability with RDF-native systems (e.g. flexo MMS), or imported +from a SPARQL UPDATE string received from a remote system. + +A ``ComplexSequence`` wraps a base complex and an ordered list of diffs, +representing a time series of complex states. Element ID sets at each step +are computed by applying diffs cumulatively. 
+ +Example:: + + diff = ComplexDiff() + diff.add_vertex("eve", type="Person") + diff.add_edge("e-ae", type="Link", vertices={"alice", "eve"}) + diff.remove("old-edge") + + diff.apply(kc) # mutate the complex + sparql = diff.to_sparql(kc) # export as SPARQL UPDATE + + # Import a diff from a remote system + remote_diff = ComplexDiff.from_sparql(sparql, kc) + remote_diff.apply(kc2) +""" +from __future__ import annotations + +import re +from typing import TYPE_CHECKING, Any + +from rdflib import Graph, Namespace, URIRef, Literal, RDF, XSD + +if TYPE_CHECKING: + from knowledgecomplex.graph import KnowledgeComplex + +_KC = Namespace("https://example.org/kc#") + + +class ComplexDiff: + """A set of element additions and removals that transform a complex. + + Build a diff by chaining ``add_vertex``, ``add_edge``, ``add_face``, + and ``remove`` calls. Then apply it to a ``KnowledgeComplex`` via + :meth:`apply`, or export it to a SPARQL UPDATE string via :meth:`to_sparql`. + """ + + def __init__(self) -> None: + self._additions: list[dict[str, Any]] = [] + self._removals: list[str] = [] + + @property + def additions(self) -> list[dict[str, Any]]: + """Element additions: list of dicts with id, type, kind, boundary, attrs.""" + return list(self._additions) + + @property + def removals(self) -> list[str]: + """Element removals: list of element IDs.""" + return list(self._removals) + + # ── Builder methods (chainable) ──────────────────────────────────────── + + def add_vertex( + self, id: str, type: str, uri: str | None = None, **attrs: Any + ) -> "ComplexDiff": + """Record a vertex addition.""" + self._additions.append({ + "id": id, "type": type, "kind": "vertex", + "boundary": None, "uri": uri, "attrs": attrs, + }) + return self + + def add_edge( + self, id: str, type: str, vertices: set[str] | list[str], + uri: str | None = None, **attrs: Any, + ) -> "ComplexDiff": + """Record an edge addition.""" + self._additions.append({ + "id": id, "type": type, "kind": "edge", + 
"boundary": list(vertices), "uri": uri, "attrs": attrs, + }) + return self + + def add_face( + self, id: str, type: str, boundary: list[str], + uri: str | None = None, **attrs: Any, + ) -> "ComplexDiff": + """Record a face addition.""" + self._additions.append({ + "id": id, "type": type, "kind": "face", + "boundary": list(boundary), "uri": uri, "attrs": attrs, + }) + return self + + def remove(self, id: str) -> "ComplexDiff": + """Record an element removal.""" + self._removals.append(id) + return self + + # ── Apply ────────────────────────────────────────────────────────────── + + def apply(self, kc: "KnowledgeComplex", validate: bool = True) -> None: + """Apply this diff to a KnowledgeComplex. + + Removals are processed first (highest dimension first to avoid + boundary-closure violations), then additions. + + Parameters + ---------- + kc : KnowledgeComplex + validate : bool + If True (default), run SHACL validation after all changes. + Raises ``ValidationError`` on failure (changes are NOT rolled back). 
+ """ + # Sort removals: higher-dim first to avoid intermediate violations + # Determine dimension of each removal from the live complex + dim_order = {"face": 2, "edge": 1, "vertex": 0} + removals_with_dim = [] + for rid in self._removals: + try: + elem = kc.element(rid) + kind = kc._schema._types.get(elem.type, {}).get("kind", "vertex") + removals_with_dim.append((dim_order.get(kind, 0), rid)) + except ValueError: + pass # already removed or doesn't exist — skip + removals_with_dim.sort(key=lambda x: -x[0]) # highest dim first + + for _, rid in removals_with_dim: + kc.remove_element(rid) + + # Process additions sorted by dimension (vertices first, edges, then faces) + dim_order = {"vertex": 0, "edge": 1, "face": 2} + sorted_additions = sorted( + self._additions, key=lambda a: dim_order.get(a["kind"], 0) + ) + for add in sorted_additions: + kind = add["kind"] + if kind == "vertex": + kc.add_vertex( + add["id"], type=add["type"], uri=add["uri"], **add["attrs"] + ) + elif kind == "edge": + kc.add_edge( + add["id"], type=add["type"], vertices=add["boundary"], + uri=add["uri"], **add["attrs"], + ) + elif kind == "face": + kc.add_face( + add["id"], type=add["type"], boundary=add["boundary"], + uri=add["uri"], **add["attrs"], + ) + + # ── SPARQL export ────────────────────────────────────────────────────── + + def to_sparql(self, kc: "KnowledgeComplex") -> str: + """Export this diff as a SPARQL UPDATE string. + + Generates ``DELETE DATA`` blocks for removals and ``INSERT DATA`` + blocks for additions, using the KC's namespace for IRI construction. + + Parameters + ---------- + kc : KnowledgeComplex + Used to read existing triples for removals and to resolve namespaces. + + Returns + ------- + str + A SPARQL UPDATE string. 
+ """ + ns = kc._schema._base_iri + parts = [] + + # DELETE DATA for removals + if self._removals: + delete_triples = [] + for rid in self._removals: + iri = URIRef(f"{ns}{rid}") + # Collect all triples involving this element + for s, p, o in kc._instance_graph.triples((iri, None, None)): + delete_triples.append(f" <{s}> <{p}> {_sparql_obj(o)} .") + for s, p, o in kc._instance_graph.triples((None, None, iri)): + delete_triples.append(f" <{s}> <{p}> <{o}> .") + if delete_triples: + parts.append( + "DELETE DATA {\n" + "\n".join(delete_triples) + "\n}" + ) + + # INSERT DATA for additions + if self._additions: + insert_triples = [] + for add in self._additions: + iri = f"<{ns}{add['id']}>" + type_iri = f"<{ns}{add['type']}>" + insert_triples.append(f" {iri} <{RDF.type}> {type_iri} .") + + if add.get("boundary"): + for bid in add["boundary"]: + b_iri = f"<{ns}{bid}>" + insert_triples.append(f" {iri} <{_KC.boundedBy}> {b_iri} .") + + if add.get("uri"): + insert_triples.append( + f' {iri} <{_KC.uri}> "{add["uri"]}"^^<{XSD.anyURI}> .' + ) + + for attr_name, attr_value in add.get("attrs", {}).items(): + attr_iri = f"<{ns}{attr_name}>" + if isinstance(attr_value, (list, tuple)): + for v in attr_value: + insert_triples.append(f' {iri} {attr_iri} "{v}" .') + else: + insert_triples.append(f' {iri} {attr_iri} "{attr_value}" .') + + # Add to complex + complex_iri = f"<{ns}_complex>" + insert_triples.append( + f" {complex_iri} <{_KC.hasElement}> {iri} ." + ) + + parts.append( + "INSERT DATA {\n" + "\n".join(insert_triples) + "\n}" + ) + + return " ;\n".join(parts) + + # ── SPARQL import ────────────────────────────────────────────────────── + + @classmethod + def from_sparql(cls, sparql: str, kc: "KnowledgeComplex") -> "ComplexDiff": + """Parse a SPARQL UPDATE string into a ComplexDiff. + + Extracts ``INSERT DATA`` and ``DELETE DATA`` blocks, parses their + triple content, and reconstructs element additions and removals. 
+ + Parameters + ---------- + sparql : str + SPARQL UPDATE string (as produced by :meth:`to_sparql`). + kc : KnowledgeComplex + Used to resolve namespaces and determine element kinds. + + Returns + ------- + ComplexDiff + """ + ns = kc._schema._base_iri + diff = cls() + + # Extract DELETE DATA blocks → removals + for match in re.finditer( + r"DELETE\s+DATA\s*\{(.*?)\}", sparql, re.DOTALL | re.IGNORECASE + ): + block = match.group(1) + removed_ids = set() + for triple_match in re.finditer(r"<([^>]+)>\s+<[^>]+>\s+", block): + subj = triple_match.group(1) + if subj.startswith(ns) and not subj.endswith("_complex"): + removed_ids.add(subj[len(ns):]) + for rid in sorted(removed_ids): + diff.remove(rid) + + # Extract INSERT DATA blocks → additions + for match in re.finditer( + r"INSERT\s+DATA\s*\{(.*?)\}", sparql, re.DOTALL | re.IGNORECASE + ): + block = match.group(1) + # Parse triples to reconstruct elements + g = Graph() + # Convert to N-Triples-like format for parsing + nt_lines = [] + for line in block.strip().split("\n"): + line = line.strip() + if line: + nt_lines.append(line) + nt_data = "\n".join(nt_lines) + try: + g.parse(data=nt_data, format="nt") + except Exception: + continue + + # Find elements (subjects with rdf:type in model namespace) + has_element = _KC.hasElement + bounded_by = _KC.boundedBy + kc_uri = _KC.uri + + for subj in set(g.subjects(RDF.type, None)): + subj_str = str(subj) + if not subj_str.startswith(ns): + continue + elem_id = subj_str[len(ns):] + + # Get type + type_iri = g.value(subj, RDF.type) + if type_iri is None: + continue + type_str = str(type_iri) + if not type_str.startswith(ns): + continue + type_name = type_str[len(ns):] + + # Determine kind from schema + kind = kc._schema._types.get(type_name, {}).get("kind", "vertex") + + # Get boundary + boundary = [] + for _, _, o in g.triples((subj, bounded_by, None)): + bid = str(o) + if bid.startswith(ns): + boundary.append(bid[len(ns):]) + + # Get uri + uri_val = g.value(subj, kc_uri) + uri 
= str(uri_val) if uri_val else None + + # Get model attributes + attrs = {} + for _, p, o in g.triples((subj, None, None)): + p_str = str(p) + if (p_str.startswith(ns) and p_str != str(type_iri) + and p != RDF.type and p != bounded_by + and p != kc_uri and p != has_element): + attr_name = p_str[len(ns):] + attrs[attr_name] = str(o) + + if kind == "vertex": + diff.add_vertex(elem_id, type=type_name, uri=uri, **attrs) + elif kind == "edge": + diff.add_edge( + elem_id, type=type_name, vertices=boundary, + uri=uri, **attrs, + ) + elif kind == "face": + diff.add_face( + elem_id, type=type_name, boundary=boundary, + uri=uri, **attrs, + ) + + return diff + + def __repr__(self) -> str: + return ( + f"ComplexDiff(+{len(self._additions)} additions, " + f"-{len(self._removals)} removals)" + ) + + +def _sparql_obj(obj) -> str: + """Format an RDF object for SPARQL DATA blocks.""" + if isinstance(obj, URIRef): + return f"<{obj}>" + elif isinstance(obj, Literal): + if obj.datatype: + return f'"{obj}"^^<{obj.datatype}>' + return f'"{obj}"' + return f'"{obj}"' + + +class ComplexSequence: + """A base complex + ordered list of diffs, representing a time series. + + Computes element ID sets at each step by applying diffs cumulatively + to the base complex's element set. This is a lightweight representation + that does not reconstruct full ``KnowledgeComplex`` instances at each step. + + Parameters + ---------- + kc : KnowledgeComplex + The base complex (state at step -1, before any diffs). + diffs : list[ComplexDiff] + Ordered sequence of diffs to apply. 
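
    Step computation in miniature, with plain sets standing in for the base
    complex's element IDs and dicts standing in for diffs:

    ```python
    # Fold each diff's removals and additions into a running element set,
    # snapshotting after every step.
    base = {"a", "b"}
    diffs = [
        {"add": {"c"}, "remove": set()},
        {"add": {"d"}, "remove": {"a"}},
    ]
    steps = []
    current = set(base)
    for d in diffs:
        current -= d["remove"]
        current |= d["add"]
        steps.append(frozenset(current))
    # steps == [frozenset({"a", "b", "c"}), frozenset({"b", "c", "d"})]
    ```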
+ """ + + def __init__( + self, + kc: "KnowledgeComplex", + diffs: list[ComplexDiff], + ) -> None: + self._kc = kc + self._diffs = list(diffs) + # Pre-compute element ID sets at each step + self._steps: list[frozenset[str]] = [] + current = set(kc.element_ids()) + for diff in diffs: + for rid in diff.removals: + current.discard(rid) + for add in diff.additions: + current.add(add["id"]) + self._steps.append(frozenset(current)) + + @property + def complex(self) -> "KnowledgeComplex": + """The base KnowledgeComplex.""" + return self._kc + + @property + def diffs(self) -> list[ComplexDiff]: + """The ordered list of diffs.""" + return list(self._diffs) + + def __len__(self) -> int: + return len(self._steps) + + def __getitem__(self, index: int) -> set[str]: + """Element IDs present at step ``index``.""" + return set(self._steps[index]) + + def __iter__(self): + for step in self._steps: + yield set(step) + + def new_at(self, index: int) -> set[str]: + """Elements added at step ``index`` (not present in previous step).""" + current = self._steps[index] + if index == 0: + base = frozenset(self._kc.element_ids()) + return set(current - base) + return set(current - self._steps[index - 1]) + + def removed_at(self, index: int) -> set[str]: + """Elements removed at step ``index`` (present in previous, absent now).""" + current = self._steps[index] + if index == 0: + base = frozenset(self._kc.element_ids()) + return set(base - current) + return set(self._steps[index - 1] - current) + + def __repr__(self) -> str: + return f"ComplexSequence({len(self._steps)} steps)" diff --git a/knowledgecomplex/filtration.py b/knowledgecomplex/filtration.py new file mode 100644 index 0000000..0c805a3 --- /dev/null +++ b/knowledgecomplex/filtration.py @@ -0,0 +1,226 @@ +""" +knowledgecomplex.filtration — Filtrations over knowledge complexes. + +A filtration F = (C₀, C₁, …, Cₘ) is a nested sequence of subcomplexes +where each Cₚ is a valid simplicial complex (closed under boundary) and +Cₚ₋₁ ⊆ Cₚ. 
Filtrations are semantics-agnostic — they could represent +temporal evolution, thematic layers, trust levels, or any ordering. +""" + +from __future__ import annotations +from collections import defaultdict +from typing import Callable, Iterator, TYPE_CHECKING + +if TYPE_CHECKING: + from knowledgecomplex.graph import KnowledgeComplex + + +class Filtration: + """ + An indexed sequence of nested subcomplexes over a KnowledgeComplex. + + Each step is a valid subcomplex (closed under boundary) and each step + contains all elements from the previous step (monotone nesting). + + Parameters + ---------- + kc : KnowledgeComplex + The parent complex that this filtration is defined over. + + Example + ------- + >>> filt = Filtration(kc) + >>> filt.append({"v1"}) + >>> filt.append({"v1", "v2", "e12"}) + >>> filt.append({"v1", "v2", "v3", "e12", "e23", "e13", "f123"}) + >>> len(filt) + 3 + >>> filt.birth("e12") + 1 + """ + + def __init__(self, kc: "KnowledgeComplex") -> None: + self._kc = kc + self._steps: list[frozenset[str]] = [] + + def __repr__(self) -> str: + return f"Filtration(steps={len(self._steps)}, complete={self.is_complete})" + + @property + def complex(self) -> "KnowledgeComplex": + """The parent KnowledgeComplex.""" + return self._kc + + @property + def length(self) -> int: + """Number of steps in the filtration.""" + return len(self._steps) + + @property + def is_complete(self) -> bool: + """True if the last step contains all elements in the complex.""" + if not self._steps: + return False + all_ids = set(self._kc.element_ids()) + return set(self._steps[-1]) == all_ids + + def append(self, ids: set[str]) -> "Filtration": + """ + Append a subcomplex to the filtration. + + Parameters + ---------- + ids : set[str] + Element IDs forming the next step. Must be a valid subcomplex + and a superset of the previous step. + + Returns + ------- + Filtration (self, for chaining) + + Raises + ------ + ValueError + If ids is not a valid subcomplex or violates monotonicity. 
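
        Both invariants can be sketched with plain sets and a hand-written
        boundary map (``is_closed`` is a hypothetical stand-in for the
        complex's ``is_subcomplex`` check):

        ```python
        # Boundary map for one filled triangle: face f is bounded by three
        # edges, each edge by two vertices.
        boundary = {"f": {"e1", "e2", "e3"}, "e1": {"v1", "v2"},
                    "e2": {"v2", "v3"}, "e3": {"v1", "v3"}}

        def is_closed(ids: set[str]) -> bool:
            # Closed under boundary: every element's boundary is inside ids.
            return all(boundary.get(x, set()) <= ids for x in ids)

        steps = [{"v1"}, {"v1", "v2", "e1"}]
        new_step = {"v1", "v2", "v3", "e1", "e2", "e3", "f"}
        # Valid append: closed under boundary AND a superset of the last step.
        ok = is_closed(new_step) and new_step >= steps[-1]
        # ok is True; {"f", "e1"} alone would fail the closure check.
        ```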
+ """ + ids_set = set(ids) + + if not self._kc.is_subcomplex(ids_set): + raise ValueError( + "Cannot append: the given element set is not a valid subcomplex " + "(not closed under boundary)" + ) + + if self._steps and not ids_set >= set(self._steps[-1]): + raise ValueError( + "Cannot append: monotone nesting violated — new step must be " + "a superset of the previous step" + ) + + self._steps.append(frozenset(ids_set)) + return self + + def append_closure(self, ids: set[str]) -> "Filtration": + """ + Append the closure of a set of elements, unioned with the previous step. + + Takes the closure of ids (ensuring a valid subcomplex), unions it + with the previous step (ensuring monotonicity), and appends. + + Parameters + ---------- + ids : set[str] + Element IDs to close over. + + Returns + ------- + Filtration (self, for chaining) + """ + closed = self._kc.closure(ids) + if self._steps: + closed = closed | set(self._steps[-1]) + self._steps.append(frozenset(closed)) + return self + + @classmethod + def from_function( + cls, + kc: "KnowledgeComplex", + fn: Callable[[str], int | float], + ) -> "Filtration": + """ + Build a filtration by grouping elements by a function value. + + Calls fn(id) for every element in the complex, groups by return + value, sorts groups, and builds closure at each cumulative step. + + Parameters + ---------- + kc : KnowledgeComplex + The parent complex. + fn : Callable[[str], int | float] + Function mapping element IDs to filtration values. 
+ + Returns + ------- + Filtration + """ + all_ids = kc.element_ids() + groups: dict[int | float, list[str]] = defaultdict(list) + for eid in all_ids: + groups[fn(eid)].append(eid) + + filt = cls(kc) + accumulated: set[str] = set() + for key in sorted(groups.keys()): + accumulated = accumulated | set(groups[key]) + closed = kc.closure(accumulated) + filt._steps.append(frozenset(closed)) + + return filt + + def __getitem__(self, index: int) -> set[str]: + return set(self._steps[index]) + + def __len__(self) -> int: + return len(self._steps) + + def __iter__(self) -> Iterator[set[str]]: + for step in self._steps: + yield set(step) + + def birth(self, id: str) -> int: + """ + Return the index of the first step containing this element. + + Parameters + ---------- + id : str + Element identifier. + + Returns + ------- + int + + Raises + ------ + ValueError + If the element does not appear in any step. + """ + for i, step in enumerate(self._steps): + if id in step: + return i + raise ValueError(f"Element '{id}' not found in any filtration step") + + def new_at(self, index: int) -> set[str]: + """ + Return elements added at step index (Cₚ \\ Cₚ₋₁). + + Parameters + ---------- + index : int + Step index. + + Returns + ------- + set[str] + """ + current = set(self._steps[index]) + if index == 0: + return current + return current - set(self._steps[index - 1]) + + def elements_at(self, index: int) -> set[str]: + """ + Return all elements at step index (same as self[index]). + + Parameters + ---------- + index : int + Step index. 
+ + Returns + ------- + set[str] + """ + return set(self._steps[index]) diff --git a/knowledgecomplex/graph.py b/knowledgecomplex/graph.py index fff4cb0..7b57c85 100644 --- a/knowledgecomplex/graph.py +++ b/knowledgecomplex/graph.py @@ -30,14 +30,17 @@ from __future__ import annotations from pathlib import Path -from typing import Any +from typing import Any, TYPE_CHECKING + +if TYPE_CHECKING: + from knowledgecomplex.audit import AuditReport import pandas as pd import pyshacl from rdflib import Graph, Namespace, URIRef, Literal, RDF, RDFS, OWL, XSD -from knowledgecomplex.exceptions import ValidationError, UnknownQueryError -from knowledgecomplex.schema import SchemaBuilder +from knowledgecomplex.exceptions import ValidationError, UnknownQueryError, SchemaError +from knowledgecomplex.schema import SchemaBuilder, Codec _FRAMEWORK_QUERIES_DIR = Path(__file__).parent / "queries" @@ -62,6 +65,112 @@ def _load_query_templates( return templates +class _DeferredVerification: + """Context manager that suppresses per-write SHACL, verifies on exit.""" + + def __init__(self, kc: "KnowledgeComplex") -> None: + self._kc = kc + + def __enter__(self) -> "KnowledgeComplex": + self._kc._defer_verification = True + return self._kc + + def __exit__(self, exc_type, exc_val, exc_tb) -> None: + self._kc._defer_verification = False + if exc_type is None: + self._kc.verify() + return None + + +class Element: + """ + Lightweight proxy for an element in a KnowledgeComplex. + + Provides read-only access to element properties and compile/decompile + methods that delegate to the codec registered for this element's type. + Properties read live from the instance graph on each access. + """ + + def __init__(self, kc: "KnowledgeComplex", id: str) -> None: + self._kc = kc + self._id = id + self._iri = URIRef(f"{kc._schema._base_iri}{id}") + + def __repr__(self) -> str: + try: + t = self.type + except ValueError: + t = "?" 
+ return f"Element({self._id!r}, type={t!r})" + + @property + def id(self) -> str: + return self._id + + @property + def type(self) -> str: + ns_str = self._kc._schema._base_iri + for _, _, o in self._kc._instance_graph.triples((self._iri, RDF.type, None)): + type_str = str(o) + if type_str.startswith(ns_str): + return type_str[len(ns_str):] + raise ValueError(f"Element '{self._id}' has no user type") + + @property + def uri(self) -> str | None: + obj = self._kc._instance_graph.value(self._iri, _KC.uri) + return str(obj) if obj is not None else None + + @property + def attrs(self) -> dict[str, Any]: + ns_str = self._kc._schema._base_iri + attrs: dict[str, Any] = {} + for _, p, o in self._kc._instance_graph.triples((self._iri, None, None)): + p_str = str(p) + if p_str.startswith(ns_str): + attr_name = p_str[len(ns_str):] + attrs[attr_name] = str(o) + return attrs + + def compile(self) -> None: + """Write this element's record to the artifact at its URI.""" + codec = self._kc._resolve_codec(self.type) + uri = self.uri + if uri is None: + raise ValueError(f"Element '{self._id}' has no kc:uri — cannot compile") + element_dict = {"id": self._id, "type": self.type, "uri": uri, **self.attrs} + codec.compile(element_dict) + + def decompile(self) -> None: + """Read the artifact at this element's URI and update attributes.""" + codec = self._kc._resolve_codec(self.type) + uri = self.uri + if uri is None: + raise ValueError(f"Element '{self._id}' has no kc:uri — cannot decompile") + new_attrs = codec.decompile(uri) + + # Remove existing model-namespace attribute triples + ns_str = self._kc._schema._base_iri + to_remove = [] + for s, p, o in self._kc._instance_graph.triples((self._iri, None, None)): + if str(p).startswith(ns_str): + to_remove.append((s, p, o)) + for triple in to_remove: + self._kc._instance_graph.remove(triple) + + # Add new attribute triples + for attr_name, attr_value in new_attrs.items(): + attr_iri = self._kc._ns[attr_name] + if isinstance(attr_value, 
(list, tuple)): + for v in attr_value: + self._kc._instance_graph.add((self._iri, attr_iri, Literal(v))) + else: + self._kc._instance_graph.add((self._iri, attr_iri, Literal(attr_value))) + + # Re-validate + self._kc._validate(self._id) + + class KnowledgeComplex: """ Manage a knowledge complex instance: add elements, validate, query. @@ -107,8 +216,14 @@ def __init__( self._instance_graph: Any = None # rdflib.Graph, populated in _init_graph() self._complex_iri: Any = None # URIRef for the kc:Complex individual self._ns = schema._ns + self._codecs: dict[str, Codec] = {} + self._defer_verification = False self._init_graph() + def __repr__(self) -> str: + n = len(self.element_ids()) + return f"KnowledgeComplex(namespace={self._schema._namespace!r}, elements={n})" + def _init_graph(self) -> None: """ Initialize the instance graph and create the kc:Complex individual. @@ -144,20 +259,22 @@ def _validate(self, focus_node_id: str | None = None) -> None: """ Run pyshacl against the current instance graph. - Validates both element-level shapes (EdgeShape, FaceShape) and - complex-level shapes (ComplexShape boundary-closure). + Skipped when deferred_verification() context manager is active. Parameters ---------- focus_node_id : str, optional If provided, used in the error message to identify which element - triggered the failure. Validation always covers the entire graph. + triggered the failure. Raises ------ ValidationError - If validation fails. report attribute contains human-readable text. + If verification fails. report attribute contains human-readable text. 
""" + if self._defer_verification: + return + conforms, _, results_text = pyshacl.validate( data_graph=self._instance_graph, shacl_graph=self._shacl_graph, @@ -171,6 +288,71 @@ def _validate(self, focus_node_id: str | None = None) -> None: msg += f" (after adding '{focus_node_id}')" raise ValidationError(msg, report=results_text) + def verify(self) -> None: + """ + Run SHACL verification on the current instance graph. + + Checks all topological and ontological constraints. Raises on failure. + Use :meth:`audit` for a non-throwing alternative. + + Raises + ------ + ValidationError + If any SHACL constraint is violated. + """ + # Bypass the deferral flag — verify() is an explicit user request + conforms, _, results_text = pyshacl.validate( + data_graph=self._instance_graph, + shacl_graph=self._shacl_graph, + ont_graph=self._ont_graph, + inference="rdfs", + abort_on_first=False, + ) + if not conforms: + raise ValidationError("SHACL verification failed", report=results_text) + + def audit(self) -> "AuditReport": + """ + Run SHACL verification and return a structured report. + + Unlike :meth:`verify`, this never raises — it returns an + :class:`~knowledgecomplex.audit.AuditReport` with ``conforms``, + ``violations``, and ``text`` fields. + + Returns + ------- + AuditReport + """ + from knowledgecomplex.audit import _build_report + conforms, _, results_text = pyshacl.validate( + data_graph=self._instance_graph, + shacl_graph=self._shacl_graph, + ont_graph=self._ont_graph, + inference="rdfs", + abort_on_first=False, + ) + return _build_report(conforms, results_text, self._schema._namespace) + + def deferred_verification(self) -> "_DeferredVerification": + """ + Context manager that suppresses per-write SHACL verification. + + Inside the context, ``add_vertex``, ``add_edge``, and ``add_face`` + skip SHACL checks. On exit, a single verification pass runs over + the entire graph. If verification fails, ``ValidationError`` is raised. 
+ + This is much faster for bulk construction — one SHACL pass instead + of one per element. + + Example + ------- + >>> with kc.deferred_verification(): + ... kc.add_vertex("v1", type="Node") + ... kc.add_vertex("v2", type="Node") + ... kc.add_edge("e1", type="Link", vertices={"v1", "v2"}) + """ + return _DeferredVerification(self) + def _assert_element( self, id: str, @@ -347,6 +529,38 @@ def add_face( raise ValueError(f"add_face requires exactly 3 boundary edges; got {len(boundary)}") self._assert_element(id, type, boundary_ids=boundary, attributes=attributes, uri=uri) + def remove_element(self, id: str) -> None: + """Remove an element and all its triples from the complex. + + Removes the element's type assertion, boundary relations (both + directions), attributes, kc:uri, and kc:hasElement membership. + + No validation is performed after removal — the caller is responsible + for ensuring the resulting complex is valid (e.g. removing faces + before their boundary edges). + + Parameters + ---------- + id : str + Element identifier to remove. + + Raises + ------ + ValueError + If no element with that ID exists. + """ + iri = URIRef(f"{self._schema._base_iri}{id}") + if (iri, RDF.type, None) not in self._instance_graph: + raise ValueError(f"No element with id '{id}' in the complex") + + # Remove all triples where element is subject + for s, p, o in list(self._instance_graph.triples((iri, None, None))): + self._instance_graph.remove((s, p, o)) + + # Remove all triples where element is object (coboundary, hasElement) + for s, p, o in list(self._instance_graph.triples((None, None, iri))): + self._instance_graph.remove((s, p, o)) + def query(self, template_name: str, **kwargs: Any) -> pd.DataFrame: """ Execute a named SPARQL template and return results as a DataFrame. 
@@ -376,6 +590,10 @@ def query(self, template_name: str, **kwargs: Any) -> pd.DataFrame: ) sparql = self._query_templates[template_name] + # Substitute {placeholder} tokens with kwargs values + for key, value in kwargs.items(): + sparql = sparql.replace(f"{{{key}}}", str(value)) + # Provide namespace bindings for queries that may not declare all prefixes init_ns = { "kc": _KC, @@ -393,6 +611,39 @@ def query(self, template_name: str, **kwargs: Any) -> pd.DataFrame: rows.append([str(val) if val is not None else None for val in row]) return pd.DataFrame(rows, columns=columns) + def query_ids(self, template_name: str, **kwargs: Any) -> set[str]: + """Execute a named SPARQL template and return the first column as element IDs. + + Like :meth:`query` but returns a ``set[str]`` of element IDs + (namespace prefix stripped) instead of a DataFrame. Useful for + obtaining subcomplexes from parameterized queries. + + Parameters + ---------- + template_name : str + Name of a registered query template. + **kwargs : Any + Substitution values for ``{placeholder}`` tokens in the template. + + Returns + ------- + set[str] + + Raises + ------ + UnknownQueryError + If template_name is not registered. + """ + if template_name not in self._query_templates: + raise UnknownQueryError( + f"No query template named '{template_name}'. 
" + f"Available: {sorted(self._query_templates)}" + ) + sparql = self._query_templates[template_name] + for key, value in kwargs.items(): + sparql = sparql.replace(f"{{{key}}}", str(value)) + return self._ids_from_query(sparql) + def dump_graph(self) -> str: """Return the instance graph as a Turtle string.""" return self._instance_graph.serialize(format="turtle") @@ -445,3 +696,412 @@ def load(cls, path: str | Path) -> "KnowledgeComplex": if instance_file.exists(): kc._instance_graph.parse(str(instance_file), format="turtle") return kc + + # --- Element handles and listing --- + + def element(self, id: str) -> Element: + """ + Get an Element handle for the given element ID. + + Parameters + ---------- + id : str + Local identifier of the element. + + Returns + ------- + Element + + Raises + ------ + ValueError + If no element with that ID exists in the graph. + """ + iri = URIRef(f"{self._schema._base_iri}{id}") + if (iri, RDF.type, None) not in self._instance_graph: + raise ValueError(f"No element with id '{id}' in the complex") + return Element(self, id) + + def element_ids(self, type: str | None = None) -> list[str]: + """ + List element IDs, optionally filtered by type (includes subtypes). + + Parameters + ---------- + type : str, optional + Filter to elements of this type or any subtype. + + Returns + ------- + list[str] + """ + ns_str = self._schema._base_iri + if type is not None: + if type not in self._schema._types: + raise SchemaError(f"Type '{type}' is not registered") + type_iri = self._ns[type] + # Use SPARQL with subClassOf* to include subtypes + sparql = f""" + SELECT ?elem WHERE {{ + ?elem a/rdfs:subClassOf* <{type_iri}> . + <{self._complex_iri}> ?elem . 
+ }} + """ + results = self._instance_graph.query( + sparql, initNs={"rdfs": RDFS, "rdf": RDF} + ) + ids = [] + for row in results: + elem_str = str(row[0]) + if elem_str.startswith(ns_str): + ids.append(elem_str[len(ns_str):]) + return sorted(ids) + else: + # All elements in the complex + ids = [] + for _, _, o in self._instance_graph.triples( + (self._complex_iri, _KC.hasElement, None) + ): + elem_str = str(o) + if elem_str.startswith(ns_str): + ids.append(elem_str[len(ns_str):]) + return sorted(ids) + + def elements(self, type: str | None = None) -> list[Element]: + """ + List Element handles, optionally filtered by type (includes subtypes). + + Parameters + ---------- + type : str, optional + Filter to elements of this type or any subtype. + + Returns + ------- + list[Element] + """ + return [Element(self, id) for id in self.element_ids(type=type)] + + def is_subcomplex(self, ids: set[str]) -> bool: + """ + Check whether a set of element IDs forms a valid subcomplex. + + A set is a valid subcomplex iff it is closed under the boundary + operator: for every element in the set, all its boundary elements + are also in the set. + + Parameters + ---------- + ids : set[str] + Element identifiers to check. + + Returns + ------- + bool + """ + if not ids: + return True + return set(ids) == self.closure(ids) + + # --- Topological query helpers --- + + def _iri(self, id: str) -> str: + """Return the full IRI string for an element ID.""" + return f"{self._schema._base_iri}{id}" + + def _type_filter_clause(self, var: str, type: str | None) -> str: + """Return a SPARQL clause filtering ?var by type, or empty string.""" + if type is None: + return "" + if type not in self._schema._types: + raise SchemaError(f"Type '{type}' is not registered") + type_iri = self._ns[type] + return f"?{var} a/rdfs:subClassOf* <{type_iri}> ." 
+
+    def _ids_from_query(self, sparql: str) -> set[str]:
+        """Execute SPARQL and return the first column as a set of element IDs."""
+        ns_str = self._schema._base_iri
+        init_ns = {
+            "kc": _KC, "rdf": RDF, "rdfs": RDFS,
+            "owl": OWL, "xsd": XSD,
+            self._schema._namespace: self._ns,
+        }
+        results = self._instance_graph.query(sparql, initNs=init_ns)
+        ids: set[str] = set()
+        for row in results:
+            val = str(row[0])
+            if val.startswith(ns_str):
+                ids.add(val[len(ns_str):])
+        return ids
+
+    # --- Topological query methods ---
+
+    def boundary(self, id: str, *, type: str | None = None) -> set[str]:
+        """Return ∂(id): the direct faces of element id via kc:boundedBy.
+
+        For a vertex, returns the empty set.
+        For an edge, returns its 2 boundary vertices.
+        For a face, returns its 3 boundary edges.
+
+        Parameters
+        ----------
+        id : str
+            Element identifier.
+        type : str, optional
+            Filter results to this type (including subtypes).
+
+        Returns
+        -------
+        set[str]
+        """
+        sparql = (
+            self._query_templates["boundary"]
+            .replace("{simplex}", f"<{self._iri(id)}>")
+            .replace("{type_filter}", self._type_filter_clause("boundary", type))
+        )
+        return self._ids_from_query(sparql)
+
+    def coboundary(self, id: str, *, type: str | None = None) -> set[str]:
+        """Return the cofaces of id: all simplices whose boundary contains id.
+
+        Computes {τ ∈ K : id ∈ ∂(τ)} — the set of (k+1)-simplices that
+        have id as a boundary element. This is the combinatorial coface
+        relation, not the algebraic coboundary operator δ on cochains.
+
+        Parameters
+        ----------
+        id : str
+            Element identifier.
+        type : str, optional
+            Filter results to this type (including subtypes).
+
+        Returns
+        -------
+        set[str]
+        """
+        tf = self._type_filter_clause("coboundary", type)
+        sparql = f"""\
+PREFIX kc: <{_KC}>
+PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
+SELECT ?coboundary WHERE {{
+    ?coboundary kc:boundedBy <{self._iri(id)}> .
+    {tf}
+}}"""
+        return self._ids_from_query(sparql)
+
+    def star(self, id: str, *, type: str | None = None) -> set[str]:
+        """Return St(id): all simplices containing id as a face (transitive coboundary + self).
+
+        Parameters
+        ----------
+        id : str
+            Element identifier.
+        type : str, optional
+            Filter results to this type (including subtypes).
+
+        Returns
+        -------
+        set[str]
+        """
+        sparql = (
+            self._query_templates["star"]
+            .replace("{simplex}", f"<{self._iri(id)}>")
+            .replace("{type_filter}", self._type_filter_clause("star", type))
+        )
+        return self._ids_from_query(sparql)
+
+    def closure(self, ids: str | set[str], *, type: str | None = None) -> set[str]:
+        """Return Cl(ids): the smallest subcomplex containing ids.
+
+        Accepts a single ID or a set of IDs. When given a set, returns the
+        union of closures.
+
+        Parameters
+        ----------
+        ids : str or set[str]
+            Element identifier(s).
+        type : str, optional
+            Filter results to this type (including subtypes).
+
+        Returns
+        -------
+        set[str]
+        """
+        if isinstance(ids, str):
+            sparql = (
+                self._query_templates["closure"]
+                .replace("{simplex}", f"<{self._iri(ids)}>")
+                .replace("{type_filter}", self._type_filter_clause("closure", type))
+            )
+            return self._ids_from_query(sparql)
+        # Set input: use VALUES clause
+        values = " ".join(f"(<{self._iri(i)}>)" for i in ids)
+        tf = self._type_filter_clause("closure", type)
+        sparql = f"""\
+PREFIX kc: <{_KC}>
+PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
+SELECT ?closure WHERE {{
+    VALUES (?sigma) {{ {values} }}
+    ?sigma kc:boundedBy* ?closure .
+    {tf}
+}}"""
+        return self._ids_from_query(sparql)
+
+    def closed_star(self, id: str, *, type: str | None = None) -> set[str]:
+        """Return Cl(St(id)): the closure of the star.
+
+        Always a valid subcomplex.
+
+        Parameters
+        ----------
+        id : str
+            Element identifier.
+        type : str, optional
+            Filter results to this type (including subtypes).
+ + Returns + ------- + set[str] + """ + return self.closure(self.star(id), type=type) + + def link(self, id: str, *, type: str | None = None) -> set[str]: + """Return Lk(id): Cl(St(id)) \\ St(id). + + The link is the set of simplices in the closed star that do not + themselves contain id as a face. + + Parameters + ---------- + id : str + Element identifier. + type : str, optional + Filter results to this type (including subtypes). + + Returns + ------- + set[str] + """ + result = self.closed_star(id) - self.star(id) + if type is not None: + typed = set(self.element_ids(type=type)) + result &= typed + return result + + def skeleton(self, k: int) -> set[str]: + """Return sk_k(K): all elements of dimension <= k. + + k=0: vertices only + k=1: vertices and edges + k=2: vertices, edges, and faces (everything) + + Parameters + ---------- + k : int + Maximum dimension (0, 1, or 2). + + Returns + ------- + set[str] + + Raises + ------ + ValueError + If k < 0 or k > 2. + """ + if k < 0 or k > 2: + raise ValueError(f"skeleton dimension must be 0, 1, or 2; got {k}") + dim_classes_map = { + 0: [_KC.Vertex], + 1: [_KC.Vertex, _KC.Edge], + 2: [_KC.Vertex, _KC.Edge, _KC.Face], + } + classes = dim_classes_map[k] + unions = " UNION ".join( + f"{{ ?elem a/rdfs:subClassOf* <{c}> }}" for c in classes + ) + sparql = ( + self._query_templates["skeleton"] + .replace("{complex}", f"<{self._complex_iri}>") + .replace("{dim_classes}", unions) + ) + return self._ids_from_query(sparql) + + def degree(self, id: str) -> int: + """Return deg(id): the number of edges incident to vertex id. + + Parameters + ---------- + id : str + Vertex identifier. 
+ + Returns + ------- + int + """ + sparql = ( + self._query_templates["degree"] + .replace("{simplex}", f"<{self._iri(id)}>") + ) + init_ns = { + "kc": _KC, "rdf": RDF, "rdfs": RDFS, + "owl": OWL, "xsd": XSD, + self._schema._namespace: self._ns, + } + results = self._instance_graph.query(sparql, initNs=init_ns) + for row in results: + return int(row[0]) + return 0 + + # --- Codec registration and resolution --- + + def register_codec(self, type_name: str, codec: Codec) -> None: + """ + Register a codec for the given type. + + Parameters + ---------- + type_name : str + Must be a registered type in the schema. + codec : Codec + Object implementing compile() and decompile(). + """ + if type_name not in self._schema._types: + raise SchemaError(f"Type '{type_name}' is not registered") + if not isinstance(codec, Codec): + raise TypeError( + f"Expected a Codec instance, got {type(codec).__name__}" + ) + self._codecs[type_name] = codec + + def _resolve_codec(self, type_name: str) -> Codec: + """Walk type hierarchy to find the nearest registered codec.""" + current: str | None = type_name + while current is not None: + if current in self._codecs: + return self._codecs[current] + current = self._schema._types.get(current, {}).get("parent") + raise SchemaError( + f"No codec registered for type '{type_name}' or any of its ancestors" + ) + + def decompile_uri(self, type_name: str, uri: str) -> dict: + """ + Decompile an artifact at a URI without adding it to the graph. + + Parameters + ---------- + type_name : str + The element type (used to resolve the codec). + uri : str + URI of the artifact to read. + + Returns + ------- + dict + Attribute key-value pairs. 
+ """ + if type_name not in self._schema._types: + raise SchemaError(f"Type '{type_name}' is not registered") + codec = self._resolve_codec(type_name) + return codec.decompile(uri) diff --git a/knowledgecomplex/io.py b/knowledgecomplex/io.py new file mode 100644 index 0000000..e4b365c --- /dev/null +++ b/knowledgecomplex/io.py @@ -0,0 +1,167 @@ +"""File-based import/export utilities for KnowledgeComplex. + +Functions +--------- +save_graph Serialize the instance graph to a file. +load_graph Parse a file into the instance graph (additive). +dump_graph Return the instance graph as a string in a given format. + +Design notes +------------ +* Turtle (``.ttl``) is the default format — human-readable and consistent + with the ontology/shapes patterns used throughout the codebase. +* All load functions are **additive**: they call ``graph.parse()`` which + adds triples to the existing graph. Load into a fresh + ``KnowledgeComplex`` for a clean restore. +* No TriG (``.trig``) or N-Quads (``.nq``) — ``KnowledgeComplex`` uses a + plain ``rdflib.Graph``, not a ``ConjunctiveGraph``. + +Adapted from discourse_graph.io (multi-agent-dg). +""" +from __future__ import annotations + +from pathlib import Path +from typing import TYPE_CHECKING, Optional + +if TYPE_CHECKING: + from knowledgecomplex.graph import KnowledgeComplex + +from knowledgecomplex.exceptions import ValidationError + +# ── Format registry ───────────────────────────────────────────────────────── + +_FORMAT_BY_EXT: dict[str, str] = { + ".ttl": "turtle", + ".jsonld": "json-ld", + ".nt": "ntriples", + ".n3": "n3", + ".rdf": "xml", + ".xml": "xml", +} + + +def _detect_format(path: Path, given: Optional[str]) -> str: + """Return the serialization format string. + + Parameters + ---------- + path : + File path whose suffix is used for auto-detection. + given : + Explicit format override; returned as-is when not ``None``. + + Raises + ------ + ValueError + If *given* is ``None`` and the file suffix is not in the registry. 
+ """ + if given is not None: + return given + ext = path.suffix.lower() + if ext not in _FORMAT_BY_EXT: + raise ValueError( + f"Cannot auto-detect RDF format for extension {ext!r}. " + f"Pass format= explicitly. Known extensions: " + f"{', '.join(sorted(_FORMAT_BY_EXT))}." + ) + return _FORMAT_BY_EXT[ext] + + +# ── Instance graph I/O ────────────────────────────────────────────────────── + + +def save_graph( + kc: "KnowledgeComplex", + path: "Path | str", + format: str = "turtle", +) -> None: + """Serialize ``kc._instance_graph`` to a file. + + Parameters + ---------- + kc : + The ``KnowledgeComplex`` whose instance graph is serialized. + path : + Destination file path. The file is created or overwritten. + format : + rdflib serialization format string. Defaults to ``"turtle"``. + Other useful values: ``"json-ld"``, ``"ntriples"``, ``"n3"``, + ``"xml"`` (RDF/XML). + """ + path = Path(path) + kc._instance_graph.serialize(destination=str(path), format=format) + + +def load_graph( + kc: "KnowledgeComplex", + path: "Path | str", + format: Optional[str] = None, + validate: bool = False, +) -> None: + """Parse a file into ``kc._instance_graph`` (additive). + + Parameters + ---------- + kc : + The ``KnowledgeComplex`` whose instance graph receives the + parsed triples. + path : + Source file path. + format : + rdflib serialization format string. When ``None``, auto-detected + from the file extension using the built-in registry. + validate : + If ``True``, run SHACL validation after parsing. On failure the + newly added triples are rolled back and ``ValidationError`` is + raised. Defaults to ``False`` because the data may have been + exported from a validated ``KnowledgeComplex`` and re-validating + is expensive. + + Notes + ----- + This operation is **additive**: existing triples in the instance graph + are retained. For a clean restore, load into a freshly constructed + ``KnowledgeComplex``. + + The instance graph includes TBox (ontology) triples. 
Loading a file + that was saved from a KC with the same schema is harmless — rdflib + deduplicates triples. Loading data from a different schema will merge + ontologies; the caller is responsible for schema compatibility. + """ + path = Path(path) + fmt = _detect_format(path, format) + + if validate: + before = set(kc._instance_graph) + + kc._instance_graph.parse(source=str(path), format=fmt) + + if validate: + try: + kc._validate() + except ValidationError: + added = set(kc._instance_graph) - before + for triple in added: + kc._instance_graph.remove(triple) + raise + + +def dump_graph( + kc: "KnowledgeComplex", + format: str = "turtle", +) -> str: + """Return the instance graph as a string in the requested format. + + Parameters + ---------- + kc : + The ``KnowledgeComplex`` whose instance graph is serialized. + format : + rdflib serialization format string. Defaults to ``"turtle"``. + + Returns + ------- + str + The serialized graph. + """ + return kc._instance_graph.serialize(format=format) diff --git a/knowledgecomplex/ontologies/README.md b/knowledgecomplex/ontologies/README.md new file mode 100644 index 0000000..60d7bb7 --- /dev/null +++ b/knowledgecomplex/ontologies/README.md @@ -0,0 +1,182 @@ +# Prebuilt Ontologies + +This directory contains ready-to-use domain ontologies that ship with the package. +Each module exposes a `schema()` function returning a configured `SchemaBuilder`. 
+ +## Available ontologies + +| Module | Domain | Vertex types | Edge types | Face types | +|--------|--------|-------------|------------|------------| +| `operations` | Actor/activity/resource workflows | actor, activity, resource | performs, requires, produces, accesses, responsible | operation, production | +| `brand` | Audience/theme brand strategy | audience, theme | resonance, interplay, overlap | thematic_alignment, audience_bridge | +| `research` | Paper/concept literature review | paper, concept, note | discusses, cites, connects, references | synthesis, annotation | + +## Usage + +```python +from knowledgecomplex import KnowledgeComplex +from knowledgecomplex.ontologies import operations + +sb = operations.schema(namespace="my_project") +kc = KnowledgeComplex(schema=sb) + +kc.add_vertex("alice", type="actor", name="Alice") +kc.add_vertex("etl", type="activity", name="Daily ETL") +kc.add_edge("e1", type="performs", vertices={"alice", "etl"}, role="lead") +``` + +Each ontology is a thin Python module — typically under 30 lines. They exist as +convenience starting points, not as comprehensive domain models. Extend them by +calling additional `add_*_type()` methods on the returned `SchemaBuilder`. + +## Philosophy: don't hoard ontologies + +This package deliberately ships very few ontologies. Domain ontologies belong +to their domain communities, not to a framework package. The prebuilt ontologies +here are examples and starting points — they demonstrate the API patterns and +give new users something to run immediately. + +For production use, we recommend: + +1. **Define your ontology in Python** using `SchemaBuilder` +2. **Export it** as standard OWL + SHACL Turtle files +3. **Publish it** at a persistent URI following semantic web best practices +4. 
**Share the Python module** alongside the Turtle files for programmatic use + +## Creating and exporting ontologies + +### Define in Python + +```python +from knowledgecomplex import SchemaBuilder, vocab, text + +sb = SchemaBuilder(namespace="mydom") +sb.add_vertex_type("Document", attributes={"title": text(), "status": vocab("draft", "final")}) +sb.add_edge_type("References") +sb.add_face_type("Collection") +``` + +### Export as Turtle + +```python +# Export to a directory +sb.export("my_ontology/") +# my_ontology/ontology.ttl — OWL classes, properties, axioms +# my_ontology/shapes.ttl — SHACL validation shapes +# my_ontology/queries/ — SPARQL templates (if any registered) + +# Or get the raw Turtle strings +owl_turtle = sb.dump_owl() +shacl_turtle = sb.dump_shacl() +``` + +### Load from Turtle + +```python +from knowledgecomplex import SchemaBuilder + +sb = SchemaBuilder.load("my_ontology/") # reads ontology.ttl + shapes.ttl +``` + +This makes ontologies portable — they can be shared as a directory of `.ttl` +files, version-controlled, and loaded by any `knowledgecomplex` user. + +## Publishing ontologies on the semantic web + +The standard approach for making ontologies discoverable and citable: + +### 1. Choose a persistent URI + +Use a redirect service that outlives any single hosting provider: + +- **[w3id.org](https://w3id.org/)** — community-maintained persistent identifiers. + Register by submitting a PR to [perma-id/w3id.org](https://github.com/perma-id/w3id.org) + with an `.htaccess` file for content negotiation. +- **[purl.org](https://purl.archive.org/)** — Internet Archive's persistent URL service. + +Example: `https://w3id.org/myorg/myontology/` resolves to your Turtle file +when requested with `Accept: text/turtle`, or to HTML documentation otherwise. + +### 2. Host the Turtle files + +The simplest approach: host on GitHub and point the persistent URI at the raw file. 
+ +``` +# .htaccess for w3id.org +RewriteEngine On +RewriteCond %{HTTP_ACCEPT} text/turtle +RewriteRule ^$ https://raw.githubusercontent.com/myorg/myrepo/main/ontology.ttl [R=303,L] +RewriteRule ^$ https://myorg.github.io/myrepo/ [R=303,L] +``` + +### 3. Use dereferenceable URIs in your schema + +Once your persistent URI is live, use it as your namespace: + +```python +sb = SchemaBuilder(namespace="mydom") +# Currently generates: https://example.org/mydom# +# For production: update _base_iri to your w3id.org URI +``` + +Note: the current framework uses `https://example.org/{namespace}#` as the +default base IRI. For production deployments, this should be replaced with +your registered persistent URI. This is tracked as a pre-release task. + +## Existing ontologies as design references + +Standard semantic web ontologies (Dublin Core, SKOS, PROV-O, schema.org, etc.) +are **not directly compatible** with knowledge complex ontologies. They define +flat class hierarchies and properties — they don't inherit from `kc:Vertex`, +`kc:Edge`, or `kc:Face`, and they have no concept of boundary operators or +simplicial structure. + +That said, they are valuable as **design references** when deciding what types +and attributes your KC ontology should have: + +- **[Linked Open Vocabularies (LOV)](https://lov.linkeddata.es/)** — searchable + index of RDF vocabularies. Useful for finding standard property names and + attribute patterns before defining your own. +- **[schema.org](https://schema.org/)** — web-scale vocabulary. Good reference + for entity types (Person, Organization, CreativeWork) when modeling actors. +- **[SKOS](https://www.w3.org/2004/02/skos/)** — knowledge organization. + Relevant when your vertices represent concepts in a taxonomy. +- **[PROV-O](https://www.w3.org/TR/prov-o/)** — provenance. Useful patterns + for activity/agent/entity relationships that map naturally to KC edges. 
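As a concrete instance of this pattern-mining, PROV-O's agent/activity/entity core maps almost one-to-one onto KC types; the bundled `operations` ontology follows the same shape. The sketch below is plain, illustrative Python: the PROV-O terms are real, but the KC type names and the rendered `SchemaBuilder` calls are hypothetical choices for a new schema, not part of any shipped module.

```python
# Illustrative sketch: mining PROV-O's core pattern as a design reference.
# The PROV-O terms (prov:Agent, prov:used, ...) are real; the KC type
# names on the right are hypothetical choices for a new schema.
PROV_TO_KC = {
    "prov:Agent":             ("vertex", "agent"),
    "prov:Activity":          ("vertex", "activity"),
    "prov:Entity":            ("vertex", "entity"),
    "prov:wasAssociatedWith": ("edge", "associated"),  # agent <-> activity
    "prov:used":              ("edge", "used"),        # activity <-> entity
    "prov:wasGeneratedBy":    ("edge", "generated"),   # entity <-> activity
}

def schema_calls(mapping: dict) -> list[str]:
    """Render each mapping row as the SchemaBuilder call it suggests."""
    return [f'sb.add_{kind}_type("{name}")' for kind, name in mapping.values()]

for call in schema_calls(PROV_TO_KC):
    print(call)
```

Vertex types come from PROV-O classes and edge types from its object properties; attributes would then be filled in with `vocab()`/`text()` descriptors as in the bundled ontologies.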
+ +The workflow: browse existing ontologies for naming conventions and attribute +patterns, then declare your KC types using `SchemaBuilder`. The KC types +carry the simplicial structure that flat ontologies lack. + +## The KC ontology registry (future) + +Since KC ontologies extend `kc:Element` with typed simplices, they form their +own ecosystem distinct from the broader semantic web. A KC ontology authored +by one team can be loaded by another via `SchemaBuilder.load()`, but only if +it was exported by `SchemaBuilder.export()`. + +We plan to host a public registry of KC-compatible ontologies at +**knowledgecomplex.org**. The registry will serve as the canonical place to: + +- Discover published KC ontologies by domain +- Resolve persistent URIs for KC ontology namespaces +- Host documentation and SHACL shape files for registered ontologies +- Provide content negotiation (Turtle vs. HTML) following semantic web conventions + +This is not yet live. For now, share ontologies as Git repositories containing +the exported `ontology.ttl` + `shapes.ttl` files and the Python module that +generates them. Community contributions of domain ontologies are welcome — +when the registry launches, published ontologies will be the first entries. + +## Loading KC ontologies from Turtle + +If you have a KC ontology exported by `SchemaBuilder.export()`: + +```python +sb = SchemaBuilder.load("path/to/ontology_dir/") +# Reads ontology.ttl + shapes.ttl, reconstructs the type registry +``` + +This only works for KC-native ontologies (those extending `kc:Vertex`, +`kc:Edge`, `kc:Face`). Arbitrary OWL ontologies from external sources +cannot be loaded this way — they would need to be re-authored as KC types. 
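To make the KC-native restriction concrete, the following plain-Python sketch (an assumed illustration, not the library's actual loader) shows the kind of registry reconstruction `SchemaBuilder.load()` performs from `rdfs:subClassOf` triples: every local class must trace back to a `kc:` root class, which is exactly what arbitrary OWL ontologies fail to do.

```python
# Schematic sketch (NOT the library's actual loader): rebuilding a type
# registry from rdfs:subClassOf triples. Each local class is classified
# by walking its parent chain; classes that never reach a kc:* root are
# rejected, which is why arbitrary OWL ontologies cannot be loaded.
KC_ROOTS = {"kc:Vertex": "vertex", "kc:Edge": "edge", "kc:Face": "face"}

def classify_types(subclass_of: dict[str, str]) -> dict[str, str]:
    """Map each local class to its kind ('vertex'/'edge'/'face')."""
    registry = {}
    for cls in subclass_of:
        current, seen = cls, set()
        while current not in KC_ROOTS:
            if current in seen or current not in subclass_of:
                raise ValueError(
                    f"{cls} does not derive from a kc root "
                    "-- not a KC-native ontology")
            seen.add(current)
            current = subclass_of[current]
        registry[cls] = KC_ROOTS[current]
    return registry

# A KC-native ontology: every class reaches kc:Vertex/Edge/Face.
triples = {
    "ex:paper": "kc:Vertex",
    "ex:preprint": "ex:paper",   # subtype chains are followed
    "ex:cites": "kc:Edge",
    "ex:synthesis": "kc:Face",
}
print(classify_types(triples))
# {'ex:paper': 'vertex', 'ex:preprint': 'vertex', 'ex:cites': 'edge', 'ex:synthesis': 'face'}
```

A class like `foaf:Person` rooted at `owl:Thing` would raise: its parent chain never reaches a `kc:` root, so there is no KC kind to assign it.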
diff --git a/knowledgecomplex/ontologies/__init__.py b/knowledgecomplex/ontologies/__init__.py new file mode 100644 index 0000000..f209364 --- /dev/null +++ b/knowledgecomplex/ontologies/__init__.py @@ -0,0 +1,24 @@ +""" +knowledgecomplex.ontologies — Pre-built ontologies for common domains. + +Each module exports a ``schema()`` function returning a configured +:class:`~knowledgecomplex.schema.SchemaBuilder`. + +Available ontologies: + +- **operations** — actor / activity / resource workflows +- **brand** — audience / theme brand strategy +- **research** — paper / concept / note for literature review + +Usage:: + + from knowledgecomplex.ontologies import brand + from knowledgecomplex import KnowledgeComplex + + sb = brand.schema() + kc = KnowledgeComplex(schema=sb) +""" + +from knowledgecomplex.ontologies import operations, brand, research + +__all__ = ["operations", "brand", "research"] diff --git a/knowledgecomplex/ontologies/brand.py b/knowledgecomplex/ontologies/brand.py new file mode 100644 index 0000000..39eda7b --- /dev/null +++ b/knowledgecomplex/ontologies/brand.py @@ -0,0 +1,67 @@ +""" +Brand ontology — audience / theme brand strategy. + +Models how audiences relate to themes and how themes relate to each +other, capturing both synergies and tensions. + +Vertex types: + audience — a target audience segment (name, description) + theme — a brand theme or topic (name, description) + +Edge types: + resonance — audience↔theme: how this theme lands with this audience + valence: positive/negative/mixed + intensity: strong/moderate/weak + + interplay — theme↔theme: relationship between themes + nature: synergy/tension/paradox + direction: reinforcing/undermining/independent + + overlap — audience↔audience: shared or contested ground + alignment: agreement/disagreement/partial + significance: high/moderate/low + +Face types: + thematic_alignment — theme↔theme↔audience: two interplaying themes + both resonate with the same audience. 
Captures whether synergies + or tensions between themes play out positively or negatively for + a given audience. + + audience_bridge — audience↔audience↔theme: two overlapping audiences + both engage with the same theme. Captures whether a theme unites + or divides audiences. +""" + +from knowledgecomplex.schema import SchemaBuilder, vocab, text + + +def schema(namespace: str = "brand") -> SchemaBuilder: + """Return a SchemaBuilder configured for the brand ontology.""" + sb = SchemaBuilder(namespace=namespace) + + sb.add_vertex_type("audience", attributes={ + "name": text(), + "description": text(required=False), + }) + sb.add_vertex_type("theme", attributes={ + "name": text(), + "description": text(required=False), + }) + + sb.add_edge_type("resonance", attributes={ + "valence": vocab("positive", "negative", "mixed"), + "intensity": vocab("strong", "moderate", "weak"), + }) + sb.add_edge_type("interplay", attributes={ + "nature": vocab("synergy", "tension", "paradox"), + "direction": vocab("reinforcing", "undermining", "independent"), + }) + sb.add_edge_type("overlap", attributes={ + "alignment": vocab("agreement", "disagreement", "partial"), + "significance": vocab("high", "moderate", "low"), + }) + + sb.add_face_type("thematic_alignment") + sb.add_face_type("audience_bridge") + + return sb diff --git a/knowledgecomplex/ontologies/operations.py b/knowledgecomplex/ontologies/operations.py new file mode 100644 index 0000000..010f98c --- /dev/null +++ b/knowledgecomplex/ontologies/operations.py @@ -0,0 +1,44 @@ +""" +Operations ontology — actor / activity / resource workflows. + +Models operational processes where actors perform activities that +require and produce resources. 
+ +Vertex types: + actor — a person, team, or agent (name) + activity — a process, task, or workflow (name) + resource — a dataset, document, or artifact (name) + +Edge types: + performs — actor↔activity (role: lead/support) + requires — activity↔resource, input dependency (mode: read/write) + produces — activity↔resource, output artifact (mode: read/write) + accesses — actor↔resource, direct access (mode: read/write) + responsible — actor↔resource, ownership (level: owner/steward) + +Face types: + operation — actor + activity + input resource triad + production — actor + activity + output resource triad +""" + +from knowledgecomplex.schema import SchemaBuilder, vocab, text + + +def schema(namespace: str = "ops") -> SchemaBuilder: + """Return a SchemaBuilder configured for the operations ontology.""" + sb = SchemaBuilder(namespace=namespace) + + sb.add_vertex_type("actor", attributes={"name": text()}) + sb.add_vertex_type("activity", attributes={"name": text()}) + sb.add_vertex_type("resource", attributes={"name": text()}) + + sb.add_edge_type("performs", attributes={"role": vocab("lead", "support")}) + sb.add_edge_type("requires", attributes={"mode": vocab("read", "write")}) + sb.add_edge_type("produces", attributes={"mode": vocab("read", "write")}) + sb.add_edge_type("accesses", attributes={"mode": vocab("read", "write")}) + sb.add_edge_type("responsible", attributes={"level": vocab("owner", "steward")}) + + sb.add_face_type("operation") + sb.add_face_type("production") + + return sb diff --git a/knowledgecomplex/ontologies/research.py b/knowledgecomplex/ontologies/research.py new file mode 100644 index 0000000..a13a956 --- /dev/null +++ b/knowledgecomplex/ontologies/research.py @@ -0,0 +1,73 @@ +""" +Research ontology — paper / concept / note for literature review. + +Models research literature and personal annotations. Papers discuss +concepts, cite each other, and notes connect ideas across papers. +Suitable for Obsidian-style knowledge management. 
+ +Vertex types: + paper — a research paper or article (title, year) + concept — an idea, method, or theory (name) + note — a personal note or annotation (title) + +Edge types: + discusses — paper↔concept: paper engages with this concept + depth: primary/secondary + + cites — paper↔paper: citation relationship. Since KC edges are + unoriented, direction is encoded as an attribute. + role: cites/cited_by + context: supports/extends/critiques + + connects — note↔concept: note links to a concept + relation: defines/questions/applies + + references — note↔paper: note references this paper + purpose: summarizes/responds_to/builds_on + +Face types: + synthesis — paper↔paper↔concept: two papers in a citation + relationship that both discuss the same concept. Captures + intellectual lineage through shared conceptual ground. + + annotation — note↔paper↔concept: a note references a paper + and connects to a concept the paper discusses. Captures + the reader's engagement with a specific idea in a specific work. 
+""" + +from knowledgecomplex.schema import SchemaBuilder, vocab, text + + +def schema(namespace: str = "res") -> SchemaBuilder: + """Return a SchemaBuilder configured for the research ontology.""" + sb = SchemaBuilder(namespace=namespace) + + sb.add_vertex_type("paper", attributes={ + "title": text(), + "year": text(required=False), + }) + sb.add_vertex_type("concept", attributes={ + "name": text(), + }) + sb.add_vertex_type("note", attributes={ + "title": text(), + }) + + sb.add_edge_type("discusses", attributes={ + "depth": vocab("primary", "secondary"), + }) + sb.add_edge_type("cites", attributes={ + "role": vocab("cites", "cited_by"), + "context": vocab("supports", "extends", "critiques"), + }) + sb.add_edge_type("connects", attributes={ + "relation": vocab("defines", "questions", "applies"), + }) + sb.add_edge_type("references", attributes={ + "purpose": vocab("summarizes", "responds_to", "builds_on"), + }) + + sb.add_face_type("synthesis") + sb.add_face_type("annotation") + + return sb diff --git a/knowledgecomplex/queries/boundary.sparql b/knowledgecomplex/queries/boundary.sparql new file mode 100644 index 0000000..78922be --- /dev/null +++ b/knowledgecomplex/queries/boundary.sparql @@ -0,0 +1,17 @@ +# knowledgecomplex/queries/boundary.sparql +# Boundary operator: direct faces of a simplex via kc:boundedBy. +# +# Boundary of a Vertex = {} (empty) +# Boundary of an Edge = {v1, v2} (two boundary vertices) +# Boundary of a Face = {e1, e2, e3} (three boundary edges) +# +# Usage: substitute {simplex} with the IRI of the element. +# substitute {type_filter} with a type constraint or empty string. + +PREFIX kc: +PREFIX rdfs: + +SELECT ?boundary WHERE { + {simplex} kc:boundedBy ?boundary . 
+ {type_filter} +} diff --git a/knowledgecomplex/queries/closure.sparql b/knowledgecomplex/queries/closure.sparql new file mode 100644 index 0000000..3d459e0 --- /dev/null +++ b/knowledgecomplex/queries/closure.sparql @@ -0,0 +1,18 @@ +# knowledgecomplex/queries/closure.sparql +# Closure operator: smallest subcomplex containing sigma. +# +# Cl({sigma}) = sigma plus all faces of sigma (transitive boundary). +# +# Uses the forward property path kc:boundedBy* to walk zero-or-more +# boundary hops from sigma downward. +# +# Usage: substitute {simplex} with the IRI of the element. +# substitute {type_filter} with a type constraint or empty string. + +PREFIX kc: +PREFIX rdfs: + +SELECT ?closure WHERE { + {simplex} kc:boundedBy* ?closure . + {type_filter} +} diff --git a/knowledgecomplex/queries/degree.sparql b/knowledgecomplex/queries/degree.sparql new file mode 100644 index 0000000..79375e6 --- /dev/null +++ b/knowledgecomplex/queries/degree.sparql @@ -0,0 +1,12 @@ +# knowledgecomplex/queries/degree.sparql +# Degree of a vertex: count of incident edges (coboundary cardinality). +# +# deg(v) = |{e in K : v in boundary(e)}| +# +# Usage: substitute {simplex} with the IRI of the vertex. + +PREFIX kc: + +SELECT (COUNT(?edge) AS ?degree) WHERE { + ?edge kc:boundedBy {simplex} . +} diff --git a/knowledgecomplex/queries/skeleton.sparql b/knowledgecomplex/queries/skeleton.sparql new file mode 100644 index 0000000..23c3c71 --- /dev/null +++ b/knowledgecomplex/queries/skeleton.sparql @@ -0,0 +1,17 @@ +# knowledgecomplex/queries/skeleton.sparql +# k-Skeleton: all elements of the complex with dimension <= k. +# +# sk_0(K) = vertices +# sk_1(K) = vertices + edges +# sk_2(K) = vertices + edges + faces (everything) +# +# Usage: substitute {complex} with the IRI of the kc:Complex individual. +# substitute {dim_classes} with a UNION of type constraints. + +PREFIX kc: +PREFIX rdfs: + +SELECT ?elem WHERE { + {complex} kc:hasElement ?elem . 
+ {dim_classes} +} diff --git a/knowledgecomplex/queries/star.sparql b/knowledgecomplex/queries/star.sparql new file mode 100644 index 0000000..e659cc9 --- /dev/null +++ b/knowledgecomplex/queries/star.sparql @@ -0,0 +1,19 @@ +# knowledgecomplex/queries/star.sparql +# Star operator: all simplices containing sigma as a face (transitive coboundary + self). +# +# St(sigma) = {tau in K : sigma is a face of tau} +# +# Uses kc:boundedBy* from ?star to {simplex}: since boundedBy points +# downward (high-dim → low-dim), ?star can only reach {simplex} if +# {simplex} is a face of ?star — exactly the star definition. +# +# Usage: substitute {simplex} with the IRI of the element. +# substitute {type_filter} with a type constraint or empty string. + +PREFIX kc: +PREFIX rdfs: + +SELECT ?star WHERE { + ?star kc:boundedBy* {simplex} . + {type_filter} +} diff --git a/knowledgecomplex/schema.py b/knowledgecomplex/schema.py index 328497e..3858919 100644 --- a/knowledgecomplex/schema.py +++ b/knowledgecomplex/schema.py @@ -17,9 +17,9 @@ from __future__ import annotations import shutil -from dataclasses import dataclass, field +from dataclasses import dataclass from pathlib import Path -from typing import Any +from typing import Any, Protocol, runtime_checkable # rdflib is an internal implementation detail. # Do not re-export any rdflib types through the public API. @@ -36,6 +36,44 @@ _SH = Namespace("http://www.w3.org/ns/shacl#") +@runtime_checkable +class Codec(Protocol): + """ + Bidirectional bridge between element records and artifacts at URIs. + + A codec pairs compile (map → territory) and decompile (territory → map) + for a given element type. Registered on KnowledgeComplex instances via + register_codec(), and inherited by child types. + """ + + def compile(self, element: dict) -> None: + """ + Write an element record to the artifact at its URI. + + Parameters + ---------- + element : dict + Keys: id, type, uri, plus all attribute key-value pairs. + """ + ... 
+ + def decompile(self, uri: str) -> dict: + """ + Read the artifact at a URI and return an attribute dict. + + Parameters + ---------- + uri : str + The URI of the artifact to read. + + Returns + ------- + dict + Attribute key-value pairs suitable for add_vertex/add_edge/add_face kwargs. + """ + ... + + @dataclass(frozen=True) class VocabDescriptor: """ @@ -161,8 +199,12 @@ def __init__(self, namespace: str) -> None: self._shacl_graph: Any = None # rdflib.Graph, populated in _init_graphs() self._types: dict[str, dict] = {} # registry: name -> {kind, attributes} self._attr_domains: dict[str, URIRef | None] = {} # attr name → first domain or None if shared + self._queries: dict[str, str] = {} # name -> SPARQL template string self._init_graphs() + def __repr__(self) -> str: + return f"SchemaBuilder(namespace={self._namespace!r}, types={len(self._types)})" + def _init_graphs(self) -> None: """Load core OWL and SHACL static resources into internal graphs.""" self._owl_graph = Graph() @@ -301,10 +343,71 @@ def _dispatch_attr( else: raise TypeError(f"Unknown attribute spec type: {type(attr_spec)}") + def _validate_parent(self, parent: str | None, expected_kind: str) -> None: + """Validate parent type exists and has the correct kind.""" + from knowledgecomplex.exceptions import SchemaError + if parent is None: + return + if parent not in self._types: + raise SchemaError(f"Parent type '{parent}' is not registered") + if self._types[parent]["kind"] != expected_kind: + raise SchemaError( + f"Parent type '{parent}' is kind '{self._types[parent]['kind']}', " + f"expected '{expected_kind}'" + ) + + def _collect_inherited_attributes(self, type_name: str) -> dict: + """Walk the parent chain and collect all inherited attributes.""" + inherited = {} + current = self._types[type_name].get("parent") + while current is not None: + parent_attrs = self._types[current].get("attributes", {}) + # Earlier ancestors are overridden by closer ancestors + for k, v in parent_attrs.items(): + if 
k not in inherited: + inherited[k] = v + current = self._types[current].get("parent") + return inherited + + def _validate_bind( + self, + bind: dict[str, str], + all_attributes: dict, + ) -> None: + """Validate that bind keys exist in all_attributes and values are legal.""" + from knowledgecomplex.exceptions import SchemaError + for attr_name, bound_value in bind.items(): + if attr_name not in all_attributes: + raise SchemaError( + f"Cannot bind '{attr_name}': attribute not found on this type or its ancestors" + ) + descriptor = all_attributes[attr_name] + # Unwrap dict-style descriptors + if isinstance(descriptor, dict): + descriptor = descriptor.get("vocab") or descriptor.get("text") + if isinstance(descriptor, VocabDescriptor): + if bound_value not in descriptor.values: + raise SchemaError( + f"Cannot bind '{attr_name}' to '{bound_value}': " + f"not in allowed values {descriptor.values}" + ) + + def _apply_bind(self, shape_iri: URIRef, bind: dict[str, str]) -> None: + """Add sh:hasValue + sh:minCount 1 constraints for bound attributes.""" + for attr_name, bound_value in bind.items(): + attr_iri = self._ns[attr_name] + prop_shape = BNode() + self._shacl_graph.add((shape_iri, _SH.property, prop_shape)) + self._shacl_graph.add((prop_shape, _SH.path, attr_iri)) + self._shacl_graph.add((prop_shape, _SH.hasValue, Literal(bound_value))) + self._shacl_graph.add((prop_shape, _SH.minCount, Literal(1))) + def add_vertex_type( self, name: str, attributes: dict[str, VocabDescriptor | TextDescriptor | Any] | None = None, + parent: str | None = None, + bind: dict[str, str] | None = None, ) -> "SchemaBuilder": """ Declare a new vertex type (OWL subclass of KC:Vertex + SHACL node shape). @@ -316,6 +419,10 @@ def add_vertex_type( attributes : dict, optional Mapping of attribute name to descriptor (VocabDescriptor, TextDescriptor, or dict with "vocab"/"text" key and optional "required" flag). + parent : str, optional + Name of a registered vertex type to inherit from. 
+ bind : dict, optional + Mapping of attribute names to fixed string values (sh:hasValue). Returns ------- @@ -324,14 +431,30 @@ def add_vertex_type( from knowledgecomplex.exceptions import SchemaError if name in self._types: raise SchemaError(f"Type '{name}' is already registered") + self._validate_parent(parent, "vertex") attributes = attributes or {} - self._types[name] = {"kind": "vertex", "attributes": dict(attributes)} + bind = bind or {} + + self._types[name] = { + "kind": "vertex", + "attributes": dict(attributes), + "parent": parent, + "bind": dict(bind), + } + + # Validate bind against all attributes (own + inherited) + if bind: + inherited = self._collect_inherited_attributes(name) + all_attrs = {**inherited, **attributes} + self._validate_bind(bind, all_attrs) + type_iri = self._ns[name] shape_iri = self._nss[f"{name}Shape"] # OWL + superclass = self._ns[parent] if parent else _KC.Vertex self._owl_graph.add((type_iri, RDF.type, OWL.Class)) - self._owl_graph.add((type_iri, RDFS.subClassOf, _KC.Vertex)) + self._owl_graph.add((type_iri, RDFS.subClassOf, superclass)) # SHACL self._shacl_graph.add((shape_iri, RDF.type, _SH.NodeShape)) @@ -340,12 +463,17 @@ def add_vertex_type( for attr_name, attr_spec in attributes.items(): self._dispatch_attr(type_iri, shape_iri, attr_name, attr_spec) + if bind: + self._apply_bind(shape_iri, bind) + return self def add_edge_type( self, name: str, attributes: dict[str, VocabDescriptor | TextDescriptor | Any] | None = None, + parent: str | None = None, + bind: dict[str, str] | None = None, ) -> "SchemaBuilder": """ Declare a new edge type (OWL subclass of KC:Edge + SHACL property shapes). @@ -357,6 +485,10 @@ def add_edge_type( attributes : dict, optional Mapping of attribute name to descriptor (VocabDescriptor, TextDescriptor, or dict with "vocab"/"text" key and optional "required" flag). + parent : str, optional + Name of a registered edge type to inherit from. 
+ bind : dict, optional + Mapping of attribute names to fixed string values (sh:hasValue). Returns ------- @@ -365,14 +497,29 @@ def add_edge_type( from knowledgecomplex.exceptions import SchemaError if name in self._types: raise SchemaError(f"Type '{name}' is already registered") + self._validate_parent(parent, "edge") attributes = attributes or {} - self._types[name] = {"kind": "edge", "attributes": dict(attributes)} + bind = bind or {} + + self._types[name] = { + "kind": "edge", + "attributes": dict(attributes), + "parent": parent, + "bind": dict(bind), + } + + if bind: + inherited = self._collect_inherited_attributes(name) + all_attrs = {**inherited, **attributes} + self._validate_bind(bind, all_attrs) + type_iri = self._ns[name] shape_iri = self._nss[f"{name}Shape"] # OWL + superclass = self._ns[parent] if parent else _KC.Edge self._owl_graph.add((type_iri, RDF.type, OWL.Class)) - self._owl_graph.add((type_iri, RDFS.subClassOf, _KC.Edge)) + self._owl_graph.add((type_iri, RDFS.subClassOf, superclass)) # SHACL self._shacl_graph.add((shape_iri, RDF.type, _SH.NodeShape)) @@ -381,12 +528,17 @@ def add_edge_type( for attr_name, attr_spec in attributes.items(): self._dispatch_attr(type_iri, shape_iri, attr_name, attr_spec) + if bind: + self._apply_bind(shape_iri, bind) + return self def add_face_type( self, name: str, attributes: dict[str, Any] | None = None, + parent: str | None = None, + bind: dict[str, str] | None = None, ) -> "SchemaBuilder": """ Declare a new face type (OWL subclass of KC:Face + SHACL property shapes). @@ -400,6 +552,10 @@ def add_face_type( attributes : dict, optional Mapping of attribute name to descriptor (VocabDescriptor, TextDescriptor, or dict with "vocab"/"text" key and optional "required" flag). + parent : str, optional + Name of a registered face type to inherit from. + bind : dict, optional + Mapping of attribute names to fixed string values (sh:hasValue). 
Returns ------- @@ -408,14 +564,29 @@ def add_face_type( from knowledgecomplex.exceptions import SchemaError if name in self._types: raise SchemaError(f"Type '{name}' is already registered") + self._validate_parent(parent, "face") attributes = attributes or {} - self._types[name] = {"kind": "face", "attributes": dict(attributes)} + bind = bind or {} + + self._types[name] = { + "kind": "face", + "attributes": dict(attributes), + "parent": parent, + "bind": dict(bind), + } + + if bind: + inherited = self._collect_inherited_attributes(name) + all_attrs = {**inherited, **attributes} + self._validate_bind(bind, all_attrs) + type_iri = self._ns[name] shape_iri = self._nss[f"{name}Shape"] # OWL + superclass = self._ns[parent] if parent else _KC.Face self._owl_graph.add((type_iri, RDF.type, OWL.Class)) - self._owl_graph.add((type_iri, RDFS.subClassOf, _KC.Face)) + self._owl_graph.add((type_iri, RDFS.subClassOf, superclass)) # SHACL self._shacl_graph.add((shape_iri, RDF.type, _SH.NodeShape)) @@ -424,8 +595,73 @@ def add_face_type( for attr_name, attr_spec in attributes.items(): self._dispatch_attr(type_iri, shape_iri, attr_name, attr_spec) + if bind: + self._apply_bind(shape_iri, bind) + return self + def describe_type(self, name: str) -> dict: + """ + Inspect a registered type's attributes, parent, and bindings. + + Parameters + ---------- + name : str + The registered type name. + + Returns + ------- + dict + Keys: name, kind, parent, own_attributes, inherited_attributes, + all_attributes, bound. 
+ """ + from knowledgecomplex.exceptions import SchemaError + if name not in self._types: + raise SchemaError(f"Type '{name}' is not registered") + + info = self._types[name] + own_attrs = dict(info.get("attributes", {})) + inherited_attrs = self._collect_inherited_attributes(name) + # Collect bindings from ancestors + inherited_bind = {} + current = info.get("parent") + while current is not None: + parent_bind = self._types[current].get("bind", {}) + for k, v in parent_bind.items(): + if k not in inherited_bind: + inherited_bind[k] = v + current = self._types[current].get("parent") + own_bind = dict(info.get("bind", {})) + all_bind = {**inherited_bind, **own_bind} + + all_attrs = {**inherited_attrs, **own_attrs} + return { + "name": name, + "kind": info["kind"], + "parent": info.get("parent"), + "own_attributes": own_attrs, + "inherited_attributes": inherited_attrs, + "all_attributes": all_attrs, + "bound": all_bind, + } + + def type_names(self, kind: str | None = None) -> list[str]: + """ + List registered type names, optionally filtered by kind. + + Parameters + ---------- + kind : str, optional + Filter by "vertex", "edge", or "face". + + Returns + ------- + list[str] + """ + if kind is None: + return list(self._types.keys()) + return [n for n, info in self._types.items() if info["kind"] == kind] + def promote_to_attribute( self, type: str, @@ -542,6 +778,247 @@ def add_sparql_constraint( self._shacl_graph.add((constraint, _SH.message, Literal(message))) return self + # --- Topological query registration and constraint escalation --- + + _TOPO_PATTERNS: dict[str, tuple[str, str]] = { + # operation -> (graph_pattern_template, result_variable) + # {simplex_iri} is replaced by the target IRI, + # {type_filter} by a type constraint or "". + "boundary": ( + "{simplex_iri} kc:boundedBy ?result . {type_filter}", + "result", + ), + "coboundary": ( + "?result kc:boundedBy {simplex_iri} . 
{type_filter}", + "result", + ), + "star": ( + "?result kc:boundedBy* {simplex_iri} . {type_filter}", + "result", + ), + "closure": ( + "{simplex_iri} kc:boundedBy* ?result . {type_filter}", + "result", + ), + "link": ( + # closed_star minus star: elements reachable from star's closure + # but not in the star itself + "?star_elem kc:boundedBy* {simplex_iri} . " + "?star_elem kc:boundedBy* ?result . " + "FILTER NOT EXISTS {{ ?result kc:boundedBy* {simplex_iri} }} " + "{type_filter}", + "result", + ), + "degree": ( + "?result kc:boundedBy {simplex_iri} .", + "result", + ), + } + + def _build_topo_sparql( + self, + operation: str, + *, + simplex_iri: str = "{simplex}", + target_type: str | None = None, + ) -> str: + """Build a complete SPARQL SELECT from a topological operation. + + Parameters + ---------- + operation : + One of: boundary, coboundary, star, closure, link, degree. + simplex_iri : + IRI or placeholder for the focus element. + target_type : + Optional type name to filter results. + + Returns + ------- + str + A complete SPARQL SELECT query string. + """ + from knowledgecomplex.exceptions import SchemaError + if operation not in self._TOPO_PATTERNS: + raise SchemaError( + f"Unknown topological operation '{operation}'. " + f"Valid: {sorted(self._TOPO_PATTERNS)}" + ) + pattern_tmpl, result_var = self._TOPO_PATTERNS[operation] + + if target_type is not None: + if target_type not in self._types: + raise SchemaError(f"Type '{target_type}' is not registered") + type_iri = self._ns[target_type] + tf = f"?{result_var} a/rdfs:subClassOf* <{type_iri}> ." 
+ else: + tf = "" + + pattern = ( + pattern_tmpl + .replace("{simplex_iri}", simplex_iri) + .replace("{type_filter}", tf) + ) + + return ( + f"PREFIX kc: \n" + f"PREFIX rdfs: \n" + f"SELECT ?{result_var} WHERE {{\n" + f" {pattern}\n" + f"}}\n" + ) + + def add_query( + self, + name: str, + operation: str, + *, + target_type: str | None = None, + ) -> "SchemaBuilder": + """Register a named topological query template on this schema. + + The query is generated from a topological operation and optional type + filter, then stored internally. It is exported as a ``.sparql`` file + by :meth:`export` and automatically loaded by + :class:`~knowledgecomplex.graph.KnowledgeComplex` at runtime. + + Parameters + ---------- + name : str + Query template name (becomes the filename stem, e.g. ``"spec_coboundary"`` + exports as ``queries/spec_coboundary.sparql``). + operation : str + Topological operation: ``"boundary"``, ``"coboundary"``, ``"star"``, + ``"closure"``, ``"link"``, or ``"degree"``. + target_type : str, optional + Filter results to this type (including subtypes via OWL class hierarchy). + + Returns + ------- + SchemaBuilder (self, for chaining) + + Example + ------- + >>> sb.add_query("spec_coboundary", "coboundary", target_type="verification") + """ + sparql = self._build_topo_sparql( + operation, simplex_iri="{simplex}", target_type=target_type, + ) + self._queries[name] = sparql + return self + + def add_topological_constraint( + self, + type_name: str, + operation: str, + *, + target_type: str | None = None, + predicate: str = "min_count", + min_count: int = 1, + max_count: int | None = None, + message: str | None = None, + ) -> "SchemaBuilder": + """Escalate a topological query to a SHACL constraint. + + Generates a ``sh:sparql`` constraint that, for each focus node of + *type_name*, evaluates a topological operation and checks a cardinality + predicate. Delegates to :meth:`add_sparql_constraint`. 
+ + Parameters + ---------- + type_name : str + The type to constrain (must be registered). + operation : str + Topological operation: ``"boundary"``, ``"coboundary"``, ``"star"``, + ``"closure"``, ``"link"``, or ``"degree"``. + target_type : str, optional + Filter the topological result to this type. + predicate : str + ``"min_count"`` — at least *min_count* results (default). + ``"max_count"`` — at most *max_count* results. + ``"exact_count"`` — exactly *min_count* results. + min_count : int + Minimum count (used by ``"min_count"`` and ``"exact_count"``). + max_count : int, optional + Maximum count (used by ``"max_count"``). + message : str, optional + Custom violation message. Auto-generated if not provided. + + Returns + ------- + SchemaBuilder (self, for chaining) + + Example + ------- + >>> sb.add_topological_constraint( + ... "spec", "coboundary", + ... target_type="verification", + ... predicate="min_count", min_count=1, + ... message="Every spec must have at least one verification edge", + ... ) + """ + from knowledgecomplex.exceptions import SchemaError + if type_name not in self._types: + raise SchemaError(f"Type '{type_name}' is not registered") + if operation not in self._TOPO_PATTERNS: + raise SchemaError( + f"Unknown topological operation '{operation}'. " + f"Valid: {sorted(self._TOPO_PATTERNS)}" + ) + + pattern_tmpl, result_var = self._TOPO_PATTERNS[operation] + + if target_type is not None: + if target_type not in self._types: + raise SchemaError(f"Type '{target_type}' is not registered") + type_iri = self._ns[target_type] + tf = f"?{result_var} a/rdfs:subClassOf* <{type_iri}> ." 
+        else:
+            tf = ""
+
+        pattern = (
+            pattern_tmpl
+            .replace("{simplex_iri}", "$this")
+            .replace("{type_filter}", tf)
+        )
+
+        # Build the HAVING clause based on predicate
+        if predicate == "min_count":
+            having = f"HAVING (COUNT(DISTINCT ?{result_var}) < {min_count})"
+        elif predicate == "max_count":
+            if max_count is None:
+                raise SchemaError("max_count predicate requires max_count parameter")
+            having = f"HAVING (COUNT(DISTINCT ?{result_var}) > {max_count})"
+        elif predicate == "exact_count":
+            having = f"HAVING (COUNT(DISTINCT ?{result_var}) != {min_count})"
+        else:
+            raise SchemaError(
+                f"Unknown predicate '{predicate}'. "
+                f"Valid: min_count, max_count, exact_count"
+            )
+
+        # Wrap pattern in OPTIONAL so GROUP BY produces a row even when
+        # there are zero matches (otherwise HAVING never fires for empty results)
+        sparql = (
+            f"PREFIX kc: \n"
+            f"PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>\n"
+            f"SELECT $this WHERE {{\n"
+            f"  OPTIONAL {{ {pattern} }}\n"
+            f"}}\n"
+            f"GROUP BY $this\n"
+            f"{having}\n"
+        )
+
+        if message is None:
+            target_desc = f" of type '{target_type}'" if target_type else ""
+            message = (
+                f"Topological constraint violated: {operation}{target_desc} "
+                f"on '{type_name}' failed {predicate} check "
+                f"(min={min_count}, max={max_count})"
+            )
+
+        return self.add_sparql_constraint(type_name, sparql, message)
+
     def dump_owl(self) -> str:
         """Return merged OWL graph (core + user schema) as a Turtle string."""
         return self._owl_graph.serialize(format="turtle")

@@ -577,12 +1054,16 @@ def export(
         p.mkdir(parents=True, exist_ok=True)
         (p / "ontology.ttl").write_text(self.dump_owl())
         (p / "shapes.ttl").write_text(self.dump_shacl())
-        if query_dirs:
+        # Write schema-generated query templates and copy external query dirs
+        if self._queries or query_dirs:
             qdir = p / "queries"
             qdir.mkdir(exist_ok=True)
-            for d in query_dirs:
-                for sparql_file in d.glob("*.sparql"):
-                    shutil.copy2(sparql_file, qdir / sparql_file.name)
+            for name, sparql_text in self._queries.items():
+                (qdir /
f"{name}.sparql").write_text(sparql_text) + if query_dirs: + for d in query_dirs: + for sparql_file in d.glob("*.sparql"): + shutil.copy2(sparql_file, qdir / sparql_file.name) return p @classmethod @@ -642,6 +1123,7 @@ def load(cls, path: str | Path) -> "SchemaBuilder": sb._owl_graph = owl_graph sb._shacl_graph = shacl_graph sb._attr_domains = {} + sb._queries = {} # Reconstruct _types registry from OWL subclass triples sb._types = {} diff --git a/knowledgecomplex/viz.py b/knowledgecomplex/viz.py new file mode 100644 index 0000000..25515b8 --- /dev/null +++ b/knowledgecomplex/viz.py @@ -0,0 +1,786 @@ +"""knowledgecomplex.viz — NetworkX export and visualization helpers. + +Two complementary views of a knowledge complex are provided: + +**Hasse diagram** (``plot_hasse``, ``plot_hasse_star``, ``plot_hasse_skeleton``) + Every element (vertex, edge, face) becomes a graph node. Directed edges + represent the boundary operator, pointing from each element to its boundary + elements (higher dimension → lower dimension). Faces have out-degree 3 + and in-degree 0; edges have out-degree 2; vertices have out-degree 0. + Nodes are colored by type and sized by dimension. + +**Geometric realization** (``plot_geometric``, ``plot_geometric_interactive``) + Only KC vertices become points in 3D space. KC edges become line segments + connecting their two boundary vertices. KC faces become filled, + semi-transparent triangular patches spanning their three boundary vertices. + This is the classical geometric realization of the abstract simplicial + complex — the view a topologist would draw. + +``to_networkx`` exports a ``DiGraph`` that backs the Hasse plots. +``verify_networkx`` validates that a DiGraph satisfies simplicial complex +cardinality and closure invariants at runtime. 
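The boundary-operator orientation described above can be checked on a toy example with plain networkx, independent of any KC instance (all ids below are illustrative):

```python
import networkx as nx

# One filled triangle as a Hasse DiGraph: face -> edges -> vertices.
G = nx.DiGraph()
for v in ("v1", "v2", "v3"):
    G.add_node(v, dim=0)
for e, (a, b) in {"e12": ("v1", "v2"),
                  "e23": ("v2", "v3"),
                  "e13": ("v1", "v3")}.items():
    G.add_node(e, dim=1)
    G.add_edge(e, a)
    G.add_edge(e, b)          # edge points to its 2 boundary vertices
G.add_node("f123", dim=2)
for e in ("e12", "e23", "e13"):
    G.add_edge("f123", e)     # face points to its 3 boundary edges

assert G.out_degree("f123") == 3                              # face -> 3 edges
assert all(G.out_degree(e) == 2 for e in ("e12", "e23", "e13"))  # edge -> 2 vertices
assert all(G.out_degree(v) == 0 for v in ("v1", "v2", "v3"))     # empty boundary
# Closed-triangle: the face's 3 boundary edges span exactly 3 distinct vertices.
assert len({v for e in G.successors("f123") for v in G.successors(e)}) == 3
```

These are exactly the invariants `verify_networkx` enforces.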
+ +Requires optional dependencies:: + + pip install knowledgecomplex[viz] # matplotlib + networkx + pip install knowledgecomplex[viz-interactive] # + plotly for interactive 3D +""" +from __future__ import annotations + +import warnings +from typing import TYPE_CHECKING, Any + +if TYPE_CHECKING: + from knowledgecomplex.graph import KnowledgeComplex + +_DIM_BY_KIND = {"vertex": 0, "edge": 1, "face": 2} +_SIZE_BY_DIM = {0: 400, 1: 200, 2: 100} + +_INSTALL_HINT = ( + "networkx and matplotlib are required for visualization.\n" + "Install them with: pip install knowledgecomplex[viz]" +) + +_PLOTLY_HINT = ( + "plotly is required for interactive 3D visualization.\n" + "Install it with: pip install knowledgecomplex[viz-interactive]" +) + + +def _require_nx(): + try: + import networkx as nx + return nx + except ImportError: + raise ImportError(_INSTALL_HINT) from None + + +def _require_mpl(): + try: + import matplotlib + import matplotlib.pyplot as plt + return matplotlib, plt + except ImportError: + raise ImportError(_INSTALL_HINT) from None + + +def _require_plotly(): + try: + import plotly.graph_objects as go + return go + except ImportError: + raise ImportError(_PLOTLY_HINT) from None + + +# ── NetworkX export ───────────────────────────────────────────────────────── + + +def to_networkx(kc: "KnowledgeComplex") -> Any: + """Convert a KnowledgeComplex to a directed networkx DiGraph. + + Every element (vertex, edge, face) becomes a node. Directed edges + represent the boundary operator ``kc:boundedBy``, pointing **from each + element to its boundary elements** (higher dimension → lower dimension). + + In the resulting DiGraph: + + - **Face** nodes have out-degree 3 (→ 3 boundary edges) and in-degree 0. + - **Edge** nodes have out-degree 2 (→ 2 boundary vertices). + - **Vertex** nodes have out-degree 0 (empty boundary). + + Each node carries attributes: + + - ``type``: element type name (e.g. 
``"Node"``, ``"Link"``) + - ``kind``: ``"vertex"``, ``"edge"``, or ``"face"`` + - ``dim``: 0, 1, or 2 + - ``uri``: file URI if present, else ``None`` + - All model-namespace attributes from the element + + Parameters + ---------- + kc : KnowledgeComplex + + Returns + ------- + networkx.DiGraph + """ + nx = _require_nx() + G = nx.DiGraph(name=kc._schema._namespace) + + for elem_id in kc.element_ids(): + elem = kc.element(elem_id) + type_name = elem.type + kind = kc._schema._types.get(type_name, {}).get("kind", "vertex") + attrs = { + "type": type_name, + "kind": kind, + "dim": _DIM_BY_KIND.get(kind, 0), + "uri": elem.uri, + **elem.attrs, + } + G.add_node(elem_id, **attrs) + + # Directed boundary edges: element → boundary element (high dim → low dim) + for elem_id in kc.element_ids(): + for boundary_id in kc.boundary(elem_id): + G.add_edge(elem_id, boundary_id) + + return G + + +# ── Verification ──────────────────────────────────────────────────────────── + + +def verify_networkx(G: Any) -> bool: + """Validate that a DiGraph satisfies simplicial complex invariants. + + Checks cardinality constraints and boundary closure: + + - Every node has ``kind`` and ``dim`` attributes. + - **Vertices** (dim=0): out-degree = 0. + - **Edges** (dim=1): out-degree = 2, both targets are vertices (dim=0). + - **Faces** (dim=2): out-degree = 3, all targets are edges (dim=1). + - **Closed-triangle**: for each face, the 3 boundary edges share exactly + 3 distinct vertices (forming a closed triangle, not an open fan). + + Parameters + ---------- + G : networkx.DiGraph + A DiGraph produced by :func:`to_networkx`. + + Returns + ------- + bool + ``True`` if all invariants hold. + + Raises + ------ + ValueError + On the first invariant violation, with a descriptive message. + TypeError + If *G* is not a ``DiGraph``. 
+ """ + nx = _require_nx() + if not isinstance(G, nx.DiGraph): + raise TypeError(f"Expected nx.DiGraph, got {type(G).__name__}") + + for node in G.nodes: + data = G.nodes[node] + if "kind" not in data or "dim" not in data: + raise ValueError(f"Node '{node}' missing 'kind' or 'dim' attribute") + + dim = data["dim"] + out_deg = G.out_degree(node) + successors = list(G.successors(node)) + + if dim == 0: # vertex + if out_deg != 0: + raise ValueError( + f"Vertex '{node}' has out-degree {out_deg}, expected 0" + ) + + elif dim == 1: # edge + if out_deg != 2: + raise ValueError( + f"Edge '{node}' has out-degree {out_deg}, expected 2" + ) + for s in successors: + if G.nodes[s].get("dim") != 0: + raise ValueError( + f"Edge '{node}' boundary target '{s}' is not a vertex " + f"(dim={G.nodes[s].get('dim')})" + ) + + elif dim == 2: # face + if out_deg != 3: + raise ValueError( + f"Face '{node}' has out-degree {out_deg}, expected 3" + ) + for s in successors: + if G.nodes[s].get("dim") != 1: + raise ValueError( + f"Face '{node}' boundary target '{s}' is not an edge " + f"(dim={G.nodes[s].get('dim')})" + ) + # Closed-triangle: 3 edges must share exactly 3 distinct vertices + face_vertices = set() + for edge_node in successors: + for v in G.successors(edge_node): + face_vertices.add(v) + if len(face_vertices) != 3: + raise ValueError( + f"Face '{node}' boundary edges span {len(face_vertices)} " + f"distinct vertices, expected 3 (closed triangle)" + ) + + return True + + +# ── Color mapping ────────────────────────────────────────────────────────── + + +def type_color_map(kc: "KnowledgeComplex") -> dict[str, str]: + """Build a type-name to hex-color mapping from the schema's type registry. + + Uses matplotlib's ``tab10`` colormap (or ``tab20`` if > 10 types) + for distinct, visually separable colors. + + Parameters + ---------- + kc : KnowledgeComplex + + Returns + ------- + dict[str, str] + Mapping from type name to hex color string. 
+ """ + _, plt = _require_mpl() + import matplotlib.colors as mcolors + + type_names = sorted(kc._schema._types.keys()) + cmap_name = "tab10" if len(type_names) <= 10 else "tab20" + cmap = plt.get_cmap(cmap_name) + + colors = {} + for i, name in enumerate(type_names): + colors[name] = mcolors.to_hex(cmap(i % cmap.N)) + return colors + + +# ── Hasse diagram helpers ────────────────────────────────────────────────── + + +def _prepare_ax(ax, figsize): + """Create or reuse a matplotlib Axes.""" + _, plt = _require_mpl() + if ax is None: + fig, ax = plt.subplots(1, 1, figsize=figsize) + else: + fig = ax.get_figure() + return fig, ax + + +def _layout(G): + """Choose a 2D layout for the graph (converts DiGraph to undirected).""" + nx = _require_nx() + if len(G) == 0: + return {} + undirected = G.to_undirected() if G.is_directed() else G + try: + return nx.kamada_kawai_layout(undirected) + except Exception: + return nx.spring_layout(undirected, seed=42) + + +# ── Hasse diagram plots ─────────────────────────────────────────────────── + + +def plot_hasse( + kc: "KnowledgeComplex", + *, + ax: Any = None, + figsize: tuple[float, float] = (10, 8), + with_labels: bool = True, + node_size_by_dim: bool = True, +) -> tuple[Any, Any]: + """Plot the Hasse diagram of the complex with type-based color coding. + + Every element (vertex, edge, face) is drawn as a node. Directed arrows + represent the boundary operator, pointing from each element to its + boundary elements (higher dimension → lower dimension). Nodes are colored + by type and sized by dimension (vertices largest, faces smallest). + + This is **not** a geometric picture of the complex — it is the partially + ordered set of simplices. For a geometric view where vertices are points, + edges are line segments, and faces are filled triangles, see + :func:`plot_geometric`. + + Parameters + ---------- + kc : KnowledgeComplex + ax : matplotlib Axes, optional + Axes to draw on. Created if not provided. 
+ figsize : tuple + Figure size if creating a new figure. + with_labels : bool + Show element ID labels on nodes. + node_size_by_dim : bool + Scale node size by dimension (vertex=large, face=small). + + Returns + ------- + (fig, ax) + The matplotlib Figure and Axes. + """ + nx = _require_nx() + _, plt = _require_mpl() + + G = to_networkx(kc) + fig, ax = _prepare_ax(ax, figsize) + pos = _layout(G) + colors = type_color_map(kc) + + if len(G) == 0: + ax.set_title("Empty complex") + ax.axis("off") + return fig, ax + + node_colors = [colors.get(G.nodes[n].get("type", ""), "#999999") for n in G] + if node_size_by_dim: + node_sizes = [_SIZE_BY_DIM.get(G.nodes[n].get("dim", 0), 200) for n in G] + else: + node_sizes = 300 + + nx.draw_networkx_edges( + G, pos, ax=ax, edge_color="#cccccc", width=1.5, + arrows=True, arrowstyle="-|>", arrowsize=12, + connectionstyle="arc3,rad=0.05", + ) + nx.draw_networkx_nodes( + G, pos, ax=ax, + node_color=node_colors, + node_size=node_sizes, + edgecolors="#333333", + linewidths=0.5, + ) + if with_labels: + nx.draw_networkx_labels(G, pos, ax=ax, font_size=8) + + # Legend + from matplotlib.lines import Line2D + legend_handles = [] + for type_name in sorted(colors): + kind = kc._schema._types.get(type_name, {}).get("kind", "?") + legend_handles.append( + Line2D([0], [0], marker="o", color="w", + markerfacecolor=colors[type_name], markersize=10, + label=f"{type_name} ({kind})") + ) + if legend_handles: + ax.legend(handles=legend_handles, loc="best", fontsize=8) + + ax.set_title(f"Hasse Diagram: {kc._schema._namespace}") + ax.axis("off") + return fig, ax + + +def plot_hasse_star( + kc: "KnowledgeComplex", + id: str, + *, + ax: Any = None, + figsize: tuple[float, float] = (10, 8), + with_labels: bool = True, +) -> tuple[Any, Any]: + """Plot the Hasse diagram with the star of an element highlighted. + + Elements in ``St(id)`` are drawn in full color with directed arrows; + all other elements are dimmed to light gray. 
This is the Hasse-diagram + view — see :func:`plot_hasse` for details on what that means. + + Parameters + ---------- + kc : KnowledgeComplex + id : str + Element whose star to highlight. + ax : matplotlib Axes, optional + figsize : tuple + with_labels : bool + + Returns + ------- + (fig, ax) + """ + nx = _require_nx() + _, plt = _require_mpl() + + G = to_networkx(kc) + fig, ax = _prepare_ax(ax, figsize) + pos = _layout(G) + colors = type_color_map(kc) + star_ids = kc.star(id) + + highlighted = [n for n in G if n in star_ids] + dimmed = [n for n in G if n not in star_ids] + + if dimmed: + nx.draw_networkx_nodes( + G, pos, nodelist=dimmed, ax=ax, + node_color="#dddddd", node_size=150, + edgecolors="#cccccc", linewidths=0.5, + ) + + star_edges = [(u, v) for u, v in G.edges() if u in star_ids and v in star_ids] + dim_edges = [(u, v) for u, v in G.edges() if (u, v) not in set(star_edges)] + if dim_edges: + nx.draw_networkx_edges( + G, pos, edgelist=dim_edges, ax=ax, edge_color="#eeeeee", width=1.0, + arrows=True, arrowstyle="-|>", arrowsize=8, + ) + if star_edges: + nx.draw_networkx_edges( + G, pos, edgelist=star_edges, ax=ax, edge_color="#666666", width=2.0, + arrows=True, arrowstyle="-|>", arrowsize=14, + ) + + if highlighted: + h_colors = [colors.get(G.nodes[n].get("type", ""), "#999999") for n in highlighted] + h_sizes = [_SIZE_BY_DIM.get(G.nodes[n].get("dim", 0), 200) for n in highlighted] + nx.draw_networkx_nodes( + G, pos, nodelist=highlighted, ax=ax, + node_color=h_colors, node_size=h_sizes, + edgecolors="#333333", linewidths=1.0, + ) + + if with_labels: + nx.draw_networkx_labels(G, pos, ax=ax, font_size=8) + + ax.set_title(f"Hasse Star({id})") + ax.axis("off") + return fig, ax + + +def plot_hasse_skeleton( + kc: "KnowledgeComplex", + k: int, + *, + ax: Any = None, + figsize: tuple[float, float] = (10, 8), + with_labels: bool = True, +) -> tuple[Any, Any]: + """Plot the Hasse diagram of the k-skeleton only. 
+ + Shows only elements of dimension ≤ k, with directed boundary arrows. + This is the Hasse-diagram view — see :func:`plot_hasse` for details. + + k=0: vertices only, k=1: vertices + edges, k=2: everything. + + Parameters + ---------- + kc : KnowledgeComplex + k : int + Maximum dimension (0, 1, or 2). + ax : matplotlib Axes, optional + figsize : tuple + with_labels : bool + + Returns + ------- + (fig, ax) + """ + nx = _require_nx() + _, plt = _require_mpl() + + G = to_networkx(kc) + skel_ids = kc.skeleton(k) + subG = G.subgraph(skel_ids).copy() + + fig, ax = _prepare_ax(ax, figsize) + pos = _layout(subG) + colors = type_color_map(kc) + + if len(subG) == 0: + ax.set_title(f"Hasse Skeleton({k}) — empty") + ax.axis("off") + return fig, ax + + node_colors = [colors.get(subG.nodes[n].get("type", ""), "#999999") for n in subG] + node_sizes = [_SIZE_BY_DIM.get(subG.nodes[n].get("dim", 0), 200) for n in subG] + + nx.draw_networkx_edges( + subG, pos, ax=ax, edge_color="#cccccc", width=1.5, + arrows=True, arrowstyle="-|>", arrowsize=12, + ) + nx.draw_networkx_nodes( + subG, pos, ax=ax, + node_color=node_colors, + node_size=node_sizes, + edgecolors="#333333", + linewidths=0.5, + ) + if with_labels: + nx.draw_networkx_labels(subG, pos, ax=ax, font_size=8) + + ax.set_title(f"Hasse Skeleton({k})") + ax.axis("off") + return fig, ax + + +# ── Deprecated aliases ───────────────────────────────────────────────────── + + +def plot_complex(kc, **kwargs): + """Deprecated: use :func:`plot_hasse` instead.""" + warnings.warn("plot_complex is deprecated, use plot_hasse", DeprecationWarning, stacklevel=2) + return plot_hasse(kc, **kwargs) + + +def plot_star(kc, id, **kwargs): + """Deprecated: use :func:`plot_hasse_star` instead.""" + warnings.warn("plot_star is deprecated, use plot_hasse_star", DeprecationWarning, stacklevel=2) + return plot_hasse_star(kc, id, **kwargs) + + +def plot_skeleton(kc, k, **kwargs): + """Deprecated: use :func:`plot_hasse_skeleton` instead.""" + 
warnings.warn("plot_skeleton is deprecated, use plot_hasse_skeleton", DeprecationWarning, stacklevel=2) + return plot_hasse_skeleton(kc, k, **kwargs) + + +# ── Geometric realization helpers ────────────────────────────────────────── + + +def _face_vertices(kc: "KnowledgeComplex", face_id: str) -> list[str]: + """Get the 3 vertices of a face by walking boundary twice. + + boundary(face) → 3 edges → boundary(each edge) → deduplicate → 3 vertices. + """ + verts: set[str] = set() + for edge_id in kc.boundary(face_id): + verts |= kc.boundary(edge_id) + return sorted(verts) + + +def _vertex_positions_3d( + kc: "KnowledgeComplex", +) -> dict[str, tuple[float, float, float]]: + """Compute 3D positions for KC vertices using force-directed layout. + + Builds a networkx graph of only KC vertices, with an edge between + vertices that share a KC edge, then runs spring_layout in 3D. + """ + nx = _require_nx() + G = nx.Graph() + + # Add vertex nodes + vertex_ids = list(kc.skeleton(0)) + for vid in vertex_ids: + G.add_node(vid) + + # Connect vertices that share a KC edge + edge_ids = kc.skeleton(1) - kc.skeleton(0) + for eid in edge_ids: + boundary = list(kc.boundary(eid)) + if len(boundary) == 2: + G.add_edge(boundary[0], boundary[1]) + + if len(G) == 0: + return {} + + pos = nx.spring_layout(G, dim=3, seed=42) + return {vid: tuple(pos[vid]) for vid in pos} + + +# ── Geometric realization: matplotlib ────────────────────────────────────── + + +def plot_geometric( + kc: "KnowledgeComplex", + *, + ax: Any = None, + figsize: tuple[float, float] = (10, 8), + with_labels: bool = True, +) -> tuple[Any, Any]: + """Plot the geometric realization of the complex in 3D. + + KC vertices become points in 3D space (positioned by force-directed + layout). KC edges become line segments connecting their two boundary + vertices. KC faces become filled, semi-transparent triangular patches + spanning their three boundary vertices. 
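The 3D embedding strategy can be reproduced with networkx alone: build a vertex-only graph whose links mirror the KC edges, then request a force-directed layout with `dim=3` (node ids below are illustrative):

```python
import networkx as nx

# Vertex skeleton of a double triangle; links mirror KC edges.
skeleton = nx.Graph()
skeleton.add_edges_from([("v1", "v2"), ("v2", "v3"), ("v1", "v3"),
                         ("v2", "v4"), ("v3", "v4")])

# dim=3 yields (x, y, z) coordinates; a fixed seed keeps layouts reproducible.
pos = nx.spring_layout(skeleton, dim=3, seed=42)
assert set(pos) == {"v1", "v2", "v3", "v4"}
assert all(len(p) == 3 for p in pos.values())
```

Faces then inherit their geometry from the three vertex positions they span, which is why only vertices need to be embedded.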
+ + This is the classical geometric realization — the view a topologist + would draw. For the Hasse diagram where every element is a node and + boundary relations are directed edges, see :func:`plot_hasse`. + + Parameters + ---------- + kc : KnowledgeComplex + ax : matplotlib Axes3D, optional + A 3D axes to draw on. Created if not provided. + figsize : tuple + Figure size if creating a new figure. + with_labels : bool + Show vertex ID labels. + + Returns + ------- + (fig, ax) + The matplotlib Figure and Axes3D. + """ + _, plt = _require_mpl() + from mpl_toolkits.mplot3d.art3d import Poly3DCollection + + colors = type_color_map(kc) + pos = _vertex_positions_3d(kc) + + if ax is None: + fig = plt.figure(figsize=figsize) + ax = fig.add_subplot(111, projection="3d") + else: + fig = ax.get_figure() + + if not pos: + ax.set_title("Empty complex") + return fig, ax + + # Draw faces as filled triangles + face_ids = kc.skeleton(2) - kc.skeleton(1) + for fid in face_ids: + verts = _face_vertices(kc, fid) + if len(verts) == 3 and all(v in pos for v in verts): + tri = [pos[v] for v in verts] + face_type = kc.element(fid).type + color = colors.get(face_type, "#999999") + poly = Poly3DCollection([tri], alpha=0.25, facecolor=color, + edgecolor=color, linewidths=0.5) + ax.add_collection3d(poly) + + # Draw edges as line segments + edge_ids = kc.skeleton(1) - kc.skeleton(0) + for eid in edge_ids: + boundary = list(kc.boundary(eid)) + if len(boundary) == 2 and all(v in pos for v in boundary): + p0, p1 = pos[boundary[0]], pos[boundary[1]] + edge_type = kc.element(eid).type + color = colors.get(edge_type, "#999999") + ax.plot3D( + [p0[0], p1[0]], [p0[1], p1[1]], [p0[2], p1[2]], + color=color, linewidth=2, + ) + + # Draw vertices as scatter points + vertex_ids = list(kc.skeleton(0)) + for vid in vertex_ids: + if vid in pos: + x, y, z = pos[vid] + vtype = kc.element(vid).type + color = colors.get(vtype, "#999999") + ax.scatter3D(x, y, z, color=color, s=80, edgecolors="#333333", + 
linewidths=0.5, zorder=5, depthshade=False) + + # Labels + if with_labels: + for vid in vertex_ids: + if vid in pos: + x, y, z = pos[vid] + ax.text(x, y, z, f" {vid}", fontsize=7) + + # Legend + from matplotlib.lines import Line2D + legend_handles = [] + for type_name in sorted(colors): + kind = kc._schema._types.get(type_name, {}).get("kind", "?") + legend_handles.append( + Line2D([0], [0], marker="o", color="w", + markerfacecolor=colors[type_name], markersize=8, + label=f"{type_name} ({kind})") + ) + if legend_handles: + ax.legend(handles=legend_handles, loc="best", fontsize=7) + + ax.set_title(f"Geometric Realization: {kc._schema._namespace}") + return fig, ax + + +# ── Geometric realization: plotly ────────────────────────────────────────── + + +def plot_geometric_interactive( + kc: "KnowledgeComplex", +) -> Any: + """Plot an interactive 3D geometric realization of the complex. + + Same geometry as :func:`plot_geometric` — KC vertices are points, KC edges + are line segments, KC faces are filled triangles — but rendered with + Plotly for interactive rotation, zoom, and hover inspection. + + Requires plotly:: + + pip install knowledgecomplex[viz-interactive] + + Parameters + ---------- + kc : KnowledgeComplex + + Returns + ------- + plotly.graph_objects.Figure + Call ``.show()`` to display or ``.write_html("file.html")`` to save. 
+ """ + go = _require_plotly() + + colors = type_color_map(kc) + pos = _vertex_positions_3d(kc) + fig = go.Figure() + + if not pos: + fig.update_layout(title="Empty complex") + return fig + + # Faces as Mesh3d triangles + face_ids = kc.skeleton(2) - kc.skeleton(1) + for fid in face_ids: + verts = _face_vertices(kc, fid) + if len(verts) == 3 and all(v in pos for v in verts): + xs = [pos[v][0] for v in verts] + ys = [pos[v][1] for v in verts] + zs = [pos[v][2] for v in verts] + face_type = kc.element(fid).type + color = colors.get(face_type, "#999999") + fig.add_trace(go.Mesh3d( + x=xs, y=ys, z=zs, + i=[0], j=[1], k=[2], + color=color, opacity=0.3, + hoverinfo="text", + hovertext=f"{fid} ({face_type})", + showlegend=False, + )) + + # Edges as line segments + edge_ids = kc.skeleton(1) - kc.skeleton(0) + for eid in edge_ids: + boundary = list(kc.boundary(eid)) + if len(boundary) == 2 and all(v in pos for v in boundary): + p0, p1 = pos[boundary[0]], pos[boundary[1]] + edge_type = kc.element(eid).type + color = colors.get(edge_type, "#999999") + fig.add_trace(go.Scatter3d( + x=[p0[0], p1[0]], y=[p0[1], p1[1]], z=[p0[2], p1[2]], + mode="lines", + line=dict(color=color, width=4), + hoverinfo="text", + hovertext=f"{eid} ({edge_type})", + showlegend=False, + )) + + # Vertices as markers + vertex_ids = [v for v in kc.skeleton(0) if v in pos] + xs = [pos[v][0] for v in vertex_ids] + ys = [pos[v][1] for v in vertex_ids] + zs = [pos[v][2] for v in vertex_ids] + vtypes = [kc.element(v).type for v in vertex_ids] + vcolors = [colors.get(t, "#999999") for t in vtypes] + hover = [f"{vid} ({vt})" for vid, vt in zip(vertex_ids, vtypes)] + + fig.add_trace(go.Scatter3d( + x=xs, y=ys, z=zs, + mode="markers+text", + marker=dict(size=6, color=vcolors, line=dict(width=1, color="#333333")), + text=vertex_ids, + textposition="top center", + textfont=dict(size=8), + hoverinfo="text", + hovertext=hover, + showlegend=False, + )) + + fig.update_layout( + title=f"Geometric Realization: 
{kc._schema._namespace}", + scene=dict( + xaxis=dict(showgrid=False, zeroline=False, showticklabels=False, title=""), + yaxis=dict(showgrid=False, zeroline=False, showticklabels=False, title=""), + zaxis=dict(showgrid=False, zeroline=False, showticklabels=False, title=""), + ), + showlegend=False, + ) + return fig diff --git a/pyproject.toml b/pyproject.toml index 97bb835..877689a 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -16,6 +16,9 @@ dependencies = [ [project.optional-dependencies] dev = ["pytest>=8.0", "pytest-cov", "ruff", "mypy"] docs = ["mkdocs-material", "mkdocstrings[python]"] +viz = ["networkx>=3.0", "matplotlib>=3.7"] +viz-interactive = ["networkx>=3.0", "matplotlib>=3.7", "plotly>=5.0"] +analysis = ["numpy>=1.24", "scipy>=1.10"] [build-system] requires = ["hatchling"] diff --git a/tests/test_analysis.py b/tests/test_analysis.py new file mode 100644 index 0000000..658e487 --- /dev/null +++ b/tests/test_analysis.py @@ -0,0 +1,405 @@ +""" +tests/test_analysis.py + +Tests for topological analysis: boundary matrices, Betti numbers, +Hodge Laplacian, edge PageRank, Hodge decomposition, edge influence. 
+ +Fixtures: + - double_triangle: 4 vertices, 5 edges, 2 faces (contractible, β₁=0) + - cycle_only: 3 vertices, 3 edges, 0 faces (one cycle, β₁=1) + - disconnected: 2 vertices, 0 edges (β₀=2) +""" + +import pytest +import numpy as np +from numpy.testing import assert_allclose + +from knowledgecomplex.schema import SchemaBuilder, vocab +from knowledgecomplex.graph import KnowledgeComplex +from knowledgecomplex.analysis import ( + boundary_matrices, + betti_numbers, + euler_characteristic, + hodge_laplacian, + edge_pagerank, + edge_pagerank_all, + hodge_decomposition, + edge_influence, + hodge_analysis, + BoundaryMatrices, + HodgeDecomposition, + EdgeInfluence, + HodgeAnalysisResults, +) + + +@pytest.fixture +def schema() -> SchemaBuilder: + sb = SchemaBuilder(namespace="topo") + sb.add_vertex_type("Node") + sb.add_edge_type("Link", attributes={"weight": vocab("light", "heavy")}) + sb.add_face_type("Triangle") + return sb + + +@pytest.fixture +def double_triangle(schema) -> KnowledgeComplex: + """4 vertices, 5 edges, 2 faces sharing edge e23. Contractible.""" + kc = KnowledgeComplex(schema=schema) + kc.add_vertex("v1", type="Node") + kc.add_vertex("v2", type="Node") + kc.add_vertex("v3", type="Node") + kc.add_vertex("v4", type="Node") + kc.add_edge("e12", type="Link", vertices={"v1", "v2"}, weight="light") + kc.add_edge("e23", type="Link", vertices={"v2", "v3"}, weight="heavy") + kc.add_edge("e13", type="Link", vertices={"v1", "v3"}, weight="light") + kc.add_edge("e24", type="Link", vertices={"v2", "v4"}, weight="heavy") + kc.add_edge("e34", type="Link", vertices={"v3", "v4"}, weight="light") + kc.add_face("f123", type="Triangle", boundary=["e12", "e23", "e13"]) + kc.add_face("f234", type="Triangle", boundary=["e23", "e24", "e34"]) + return kc + + +@pytest.fixture +def cycle_only(schema) -> KnowledgeComplex: + """3 vertices, 3 edges, 0 faces. 
One independent cycle (β₁=1).""" + kc = KnowledgeComplex(schema=schema) + kc.add_vertex("v1", type="Node") + kc.add_vertex("v2", type="Node") + kc.add_vertex("v3", type="Node") + kc.add_edge("e12", type="Link", vertices={"v1", "v2"}, weight="light") + kc.add_edge("e23", type="Link", vertices={"v2", "v3"}, weight="light") + kc.add_edge("e13", type="Link", vertices={"v1", "v3"}, weight="light") + return kc + + +@pytest.fixture +def disconnected(schema) -> KnowledgeComplex: + """2 vertices, 0 edges. β₀=2.""" + kc = KnowledgeComplex(schema=schema) + kc.add_vertex("v1", type="Node") + kc.add_vertex("v2", type="Node") + return kc + + +# =========================================================================== +# Boundary matrices +# =========================================================================== + +class TestBoundaryMatrices: + + def test_shapes(self, double_triangle): + bm = boundary_matrices(double_triangle) + assert bm.B1.shape == (4, 5) # 4 vertices, 5 edges + assert bm.B2.shape == (5, 2) # 5 edges, 2 faces + + def test_B1_two_nonzeros_per_column(self, double_triangle): + """Each edge has exactly 2 boundary vertices.""" + bm = boundary_matrices(double_triangle) + for col in range(bm.B1.shape[1]): + nnz = bm.B1[:, col].nnz + assert nnz == 2 + + def test_B2_three_nonzeros_per_column(self, double_triangle): + """Each face has exactly 3 boundary edges.""" + bm = boundary_matrices(double_triangle) + for col in range(bm.B2.shape[1]): + nnz = bm.B2[:, col].nnz + assert nnz == 3 + + def test_boundary_of_boundary_is_zero(self, double_triangle): + """∂₁ ∘ ∂₂ = 0 (fundamental theorem of simplicial homology).""" + bm = boundary_matrices(double_triangle) + product = bm.B1 @ bm.B2 + assert_allclose(product.toarray(), 0, atol=1e-12) + + def test_index_maps_bijective(self, double_triangle): + bm = boundary_matrices(double_triangle) + assert len(bm.vertex_index) == 4 + assert len(bm.edge_index) == 5 + assert len(bm.face_index) == 2 + for k, v in bm.vertex_index.items(): 
+ assert bm.index_vertex[v] == k + for k, v in bm.edge_index.items(): + assert bm.index_edge[v] == k + + def test_no_faces(self, cycle_only): + """Complex with no faces has B2 with 0 columns.""" + bm = boundary_matrices(cycle_only) + assert bm.B1.shape == (3, 3) + assert bm.B2.shape == (3, 0) + + def test_no_edges(self, disconnected): + """Complex with no edges has B1 with 0 columns.""" + bm = boundary_matrices(disconnected) + assert bm.B1.shape == (2, 0) + assert bm.B2.shape == (0, 0) + + +# =========================================================================== +# Betti numbers +# =========================================================================== + +class TestBettiNumbers: + + def test_double_triangle_contractible(self, double_triangle): + """Double triangle is contractible: β = [1, 0, 0].""" + b = betti_numbers(double_triangle) + assert b == [1, 0, 0] + + def test_cycle_has_hole(self, cycle_only): + """Triangle boundary has one cycle: β = [1, 1, 0].""" + b = betti_numbers(cycle_only) + assert b == [1, 1, 0] + + def test_disconnected_components(self, disconnected): + """Two disconnected vertices: β = [2, 0, 0].""" + b = betti_numbers(disconnected) + assert b == [2, 0, 0] + + def test_euler_characteristic(self, double_triangle): + """χ = V - E + F = 4 - 5 + 2 = 1.""" + chi = euler_characteristic(double_triangle) + assert chi == 1 + + def test_euler_equals_alternating_betti(self, double_triangle): + """χ = β₀ - β₁ + β₂.""" + b = betti_numbers(double_triangle) + chi = euler_characteristic(double_triangle) + assert chi == b[0] - b[1] + b[2] + + def test_euler_cycle(self, cycle_only): + """χ = 3 - 3 + 0 = 0.""" + assert euler_characteristic(cycle_only) == 0 + + +# =========================================================================== +# Hodge Laplacian +# =========================================================================== + +class TestHodgeLaplacian: + + def test_shape(self, double_triangle): + L1 = hodge_laplacian(double_triangle) + assert 
L1.shape == (5, 5) + + def test_symmetric(self, double_triangle): + L1 = hodge_laplacian(double_triangle) + diff = L1 - L1.T + assert_allclose(diff.toarray(), 0, atol=1e-12) + + def test_positive_semidefinite(self, double_triangle): + L1 = hodge_laplacian(double_triangle) + eigenvalues = np.linalg.eigvalsh(L1.toarray()) + assert np.all(eigenvalues >= -1e-10) + + def test_kernel_dimension_equals_beta1(self, double_triangle): + """dim ker(L₁) = β₁ for combinatorial Laplacian.""" + L1 = hodge_laplacian(double_triangle) + eigenvalues = np.linalg.eigvalsh(L1.toarray()) + kernel_dim = np.sum(np.abs(eigenvalues) < 1e-8) + b = betti_numbers(double_triangle) + assert kernel_dim == b[1] + + def test_kernel_dimension_cycle(self, cycle_only): + """Cycle has β₁=1 so L₁ has 1D kernel.""" + L1 = hodge_laplacian(cycle_only) + eigenvalues = np.linalg.eigvalsh(L1.toarray()) + kernel_dim = np.sum(np.abs(eigenvalues) < 1e-8) + assert kernel_dim == 1 + + def test_weighted_symmetric(self, double_triangle): + L1 = hodge_laplacian(double_triangle, weighted=True) + diff = L1 - L1.T + assert_allclose(diff.toarray(), 0, atol=1e-12) + + def test_weighted_psd(self, double_triangle): + L1 = hodge_laplacian(double_triangle, weighted=True) + eigenvalues = np.linalg.eigvalsh(L1.toarray()) + assert np.all(eigenvalues >= -1e-10) + + +# =========================================================================== +# Edge PageRank +# =========================================================================== + +class TestEdgePageRank: + + def test_single_edge_shape(self, double_triangle): + bm = boundary_matrices(double_triangle) + pr = edge_pagerank(double_triangle, "e12", beta=0.1) + assert pr.shape == (5,) + + def test_nonzero_at_target(self, double_triangle): + """The target edge has nonzero PageRank.""" + bm = boundary_matrices(double_triangle) + pr = edge_pagerank(double_triangle, "e12", beta=0.1) + assert pr[bm.edge_index["e12"]] > 0 + + def test_all_edges_shape(self, double_triangle): + pr = 
edge_pagerank_all(double_triangle, beta=0.1) + assert pr.shape == (5, 5) + + def test_all_edges_symmetric(self, double_triangle): + """PR matrix is symmetric since L₁ is symmetric.""" + pr = edge_pagerank_all(double_triangle, beta=0.1) + assert_allclose(pr, pr.T, atol=1e-10) + + def test_single_matches_all(self, double_triangle): + """Single edge PR matches column of all-edges matrix.""" + bm = boundary_matrices(double_triangle) + pr_single = edge_pagerank(double_triangle, "e12", beta=0.1) + pr_all = edge_pagerank_all(double_triangle, beta=0.1) + col_idx = bm.edge_index["e12"] + assert_allclose(pr_single, pr_all[:, col_idx], atol=1e-10) + + +# =========================================================================== +# Hodge decomposition +# =========================================================================== + +class TestHodgeDecomposition: + + def test_exact_decomposition(self, double_triangle): + """gradient + curl + harmonic = original vector.""" + flow = edge_pagerank(double_triangle, "e12", beta=0.1) + decomp = hodge_decomposition(double_triangle, flow) + reconstructed = decomp.gradient + decomp.curl + decomp.harmonic + assert_allclose(reconstructed, flow, atol=1e-8) + + def test_orthogonality(self, double_triangle): + """Components are mutually orthogonal.""" + flow = edge_pagerank(double_triangle, "e12", beta=0.1) + decomp = hodge_decomposition(double_triangle, flow) + assert abs(np.dot(decomp.gradient, decomp.curl)) < 1e-8 + assert abs(np.dot(decomp.gradient, decomp.harmonic)) < 1e-8 + assert abs(np.dot(decomp.curl, decomp.harmonic)) < 1e-8 + + def test_contractible_no_harmonic(self, double_triangle): + """Contractible complex (β₁=0) → harmonic component is zero.""" + flow = edge_pagerank(double_triangle, "e12", beta=0.1) + decomp = hodge_decomposition(double_triangle, flow) + assert_allclose(decomp.harmonic, 0, atol=1e-8) + + def test_cycle_has_harmonic(self, cycle_only): + """Complex with β₁=1 → harmonic component is nonzero for generic flow.""" 
+ flow = np.ones(3) # uniform flow + decomp = hodge_decomposition(cycle_only, flow) + assert np.linalg.norm(decomp.harmonic) > 1e-8 + + +# =========================================================================== +# Edge influence +# =========================================================================== + +class TestEdgeInfluence: + + def test_non_negative(self, double_triangle): + pr = edge_pagerank(double_triangle, "e12", beta=0.1) + inf = edge_influence("e12", pr) + assert inf.spread >= 0 + assert inf.absolute_influence >= 0 + assert inf.penetration >= 0 + + def test_spread_range(self, double_triangle): + pr = edge_pagerank(double_triangle, "e12", beta=0.1) + inf = edge_influence("e12", pr) + assert 0 <= inf.spread <= 1 + + +# =========================================================================== +# Full analysis +# =========================================================================== + +class TestHodgeAnalysis: + + def test_returns_results(self, double_triangle): + results = hodge_analysis(double_triangle, beta=0.1) + assert isinstance(results, HodgeAnalysisResults) + assert results.betti == [1, 0, 0] + assert results.euler_characteristic == 1 + assert results.pagerank.shape == (5, 5) + assert len(results.decompositions) == 5 + assert len(results.influences) == 5 + + def test_cycle_analysis(self, cycle_only): + results = hodge_analysis(cycle_only, beta=0.1) + assert results.betti == [1, 1, 0] + assert results.euler_characteristic == 0 + + +# =========================================================================== +# Simplex weights +# =========================================================================== + +class TestWeights: + + def test_none_matches_unweighted(self, double_triangle): + """weights=None produces identical results to default.""" + L_default = hodge_laplacian(double_triangle) + L_none = hodge_laplacian(double_triangle, weights=None) + assert_allclose(L_default.toarray(), L_none.toarray()) + + def 
test_uniform_weights_match_unweighted(self, double_triangle): + """All weights=1.0 produces identical results to default.""" + all_ids = double_triangle.element_ids() + uniform = {eid: 1.0 for eid in all_ids} + L_default = hodge_laplacian(double_triangle) + L_uniform = hodge_laplacian(double_triangle, weights=uniform) + assert_allclose(L_default.toarray(), L_uniform.toarray(), atol=1e-12) + + def test_vertex_weights_change_laplacian(self, double_triangle): + """Non-uniform vertex weights produce a different Laplacian.""" + w = {"v1": 2.0, "v2": 0.5} # others default to 1.0 + L_default = hodge_laplacian(double_triangle) + L_weighted = hodge_laplacian(double_triangle, weights=w) + assert not np.allclose(L_default.toarray(), L_weighted.toarray()) + + def test_face_weights_change_laplacian(self, double_triangle): + """Non-uniform face weights produce a different Laplacian.""" + w = {"f123": 3.0, "f234": 0.1} + L_default = hodge_laplacian(double_triangle) + L_weighted = hodge_laplacian(double_triangle, weights=w) + assert not np.allclose(L_default.toarray(), L_weighted.toarray()) + + def test_weighted_laplacian_symmetric(self, double_triangle): + """Weighted Laplacian is symmetric.""" + w = {"v1": 2.0, "v2": 0.5, "f123": 3.0} + L = hodge_laplacian(double_triangle, weights=w) + assert_allclose(L.toarray(), L.T.toarray(), atol=1e-12) + + def test_weighted_laplacian_psd(self, double_triangle): + """Weighted Laplacian is positive semidefinite.""" + w = {"v1": 2.0, "v2": 0.5, "f123": 3.0} + L = hodge_laplacian(double_triangle, weights=w) + eigenvalues = np.linalg.eigvalsh(L.toarray()) + assert np.all(eigenvalues >= -1e-10) + + def test_betti_unchanged_by_weights(self, double_triangle): + """Betti numbers are topological invariants — weights don't change them.""" + b_default = betti_numbers(double_triangle) + # Betti numbers only depend on boundary matrices, not weights + assert b_default == [1, 0, 0] + + def test_weighted_pagerank_differs(self, double_triangle): + 
"""Weighted PageRank differs from unweighted.""" + w = {"v1": 5.0, "v3": 0.1, "f123": 2.0} + pr_default = edge_pagerank(double_triangle, "e12", beta=0.1) + pr_weighted = edge_pagerank(double_triangle, "e12", beta=0.1, weights=w) + assert not np.allclose(pr_default, pr_weighted) + + def test_weighted_decomposition_exact(self, double_triangle): + """Weighted Hodge decomposition still sums to original flow.""" + w = {"v1": 2.0, "f234": 3.0} + flow = edge_pagerank(double_triangle, "e12", beta=0.1, weights=w) + decomp = hodge_decomposition(double_triangle, flow, weights=w) + reconstructed = decomp.gradient + decomp.curl + decomp.harmonic + assert_allclose(reconstructed, flow, atol=1e-8) + + def test_weighted_hodge_analysis(self, double_triangle): + """Full hodge_analysis with weights runs without error.""" + w = {"v1": 2.0, "v2": 0.5, "f123": 3.0, "f234": 0.5} + results = hodge_analysis(double_triangle, beta=0.1, weights=w) + assert isinstance(results, HodgeAnalysisResults) + assert results.betti == [1, 0, 0] + assert results.pagerank.shape == (5, 5) diff --git a/tests/test_audit.py b/tests/test_audit.py new file mode 100644 index 0000000..7e5da4c --- /dev/null +++ b/tests/test_audit.py @@ -0,0 +1,191 @@ +""" +tests/test_audit.py + +Tests for verification and audit tooling: + - kc.verify() — public throwing verification + - kc.audit() — non-throwing structured report + - deferred_verification() — context manager for bulk construction + - audit_file() — static file verification +""" + +import pytest +from pathlib import Path + +from knowledgecomplex.schema import SchemaBuilder, vocab +from knowledgecomplex.graph import KnowledgeComplex +from knowledgecomplex.audit import AuditReport, AuditViolation, audit_file +from knowledgecomplex.exceptions import ValidationError + + +@pytest.fixture +def schema() -> SchemaBuilder: + sb = SchemaBuilder(namespace="topo") + sb.add_vertex_type("Node") + sb.add_edge_type("Link") + sb.add_face_type("Triangle") + return sb + + 
+@pytest.fixture
+def valid_kc(schema) -> KnowledgeComplex:
+    """A valid 3-vertex, 3-edge, 1-face complex."""
+    kc = KnowledgeComplex(schema=schema)
+    kc.add_vertex("v1", type="Node")
+    kc.add_vertex("v2", type="Node")
+    kc.add_vertex("v3", type="Node")
+    kc.add_edge("e12", type="Link", vertices={"v1", "v2"})
+    kc.add_edge("e23", type="Link", vertices={"v2", "v3"})
+    kc.add_edge("e13", type="Link", vertices={"v1", "v3"})
+    kc.add_face("f123", type="Triangle", boundary=["e12", "e23", "e13"])
+    return kc
+
+
+@pytest.fixture
+def invalid_kc(schema) -> KnowledgeComplex:
+    """A complex with a dangling edge (boundary vertex missing from complex).
+
+    We bypass normal validation by removing the vertex's triples directly
+    from the underlying instance graph.
+    """
+    kc = KnowledgeComplex(schema=schema)
+    kc.add_vertex("v1", type="Node")
+    kc.add_vertex("v2", type="Node")
+    kc.add_edge("e12", type="Link", vertices={"v1", "v2"})
+    # Manually remove v2 from the complex to create an invalid state
+    from rdflib import URIRef
+    v2_iri = URIRef(f"{schema._base_iri}v2")
+    kc._instance_graph.remove((kc._complex_iri, None, v2_iri))
+    return kc
+
+
+# ===========================================================================
+# kc.verify()
+# ===========================================================================
+
+class TestVerify:
+
+    def test_valid_no_exception(self, valid_kc):
+        valid_kc.verify()  # should not raise
+
+    def test_invalid_raises(self, invalid_kc):
+        with pytest.raises(ValidationError):
+            invalid_kc.verify()
+
+
+# ===========================================================================
+# kc.audit()
+# ===========================================================================
+
+class TestAudit:
+
+    def test_valid_conforms(self, valid_kc):
+        report = valid_kc.audit()
+        assert isinstance(report, AuditReport)
+        assert report.conforms is True
+        assert len(report.violations) == 0
+
+    def test_valid_bool(self, valid_kc):
+        report = valid_kc.audit()
+        
assert bool(report) is True + + def test_invalid_does_not_raise(self, invalid_kc): + report = invalid_kc.audit() # should NOT raise + assert isinstance(report, AuditReport) + + def test_invalid_conforms_false(self, invalid_kc): + report = invalid_kc.audit() + assert report.conforms is False + + def test_invalid_has_violations(self, invalid_kc): + report = invalid_kc.audit() + assert len(report.violations) > 0 + + def test_violations_have_message(self, invalid_kc): + report = invalid_kc.audit() + for v in report.violations: + assert isinstance(v, AuditViolation) + assert isinstance(v.message, str) or isinstance(v.constraint, str) + + def test_report_text(self, invalid_kc): + report = invalid_kc.audit() + assert len(report.text) > 0 + + def test_report_str(self, valid_kc): + report = valid_kc.audit() + s = str(report) + assert "passed" in s.lower() or "no violation" in s.lower() + + +# =========================================================================== +# deferred_verification +# =========================================================================== + +class TestDeferredVerification: + + def test_valid_bulk_construction(self, schema): + """Bulk add inside context manager, verifies once at exit.""" + kc = KnowledgeComplex(schema=schema) + with kc.deferred_verification(): + kc.add_vertex("v1", type="Node") + kc.add_vertex("v2", type="Node") + kc.add_vertex("v3", type="Node") + kc.add_edge("e12", type="Link", vertices={"v1", "v2"}) + kc.add_edge("e23", type="Link", vertices={"v2", "v3"}) + kc.add_edge("e13", type="Link", vertices={"v1", "v3"}) + kc.add_face("f123", type="Triangle", boundary=["e12", "e23", "e13"]) + # If we got here, verification passed on exit + + def test_flag_reset_after_exit(self, schema): + kc = KnowledgeComplex(schema=schema) + with kc.deferred_verification(): + kc.add_vertex("v1", type="Node") + assert kc._defer_verification is False + + def test_flag_reset_on_exception(self, schema): + kc = KnowledgeComplex(schema=schema) + try: + 
with kc.deferred_verification(): + kc.add_vertex("v1", type="Node") + raise RuntimeError("simulated error") + except RuntimeError: + pass + assert kc._defer_verification is False + + def test_returns_kc(self, schema): + """Context manager yields the KC for convenience.""" + kc = KnowledgeComplex(schema=schema) + with kc.deferred_verification() as ctx: + assert ctx is kc + + +# =========================================================================== +# audit_file +# =========================================================================== + +class TestAuditFile: + + def test_valid_file(self, valid_kc, tmp_path): + valid_kc.export(tmp_path / "export") + report = audit_file( + tmp_path / "export" / "instance.ttl", + shapes=tmp_path / "export" / "shapes.ttl", + ontology=tmp_path / "export" / "ontology.ttl", + ) + assert isinstance(report, AuditReport) + assert report.conforms is True + + def test_missing_file_raises(self, tmp_path): + with pytest.raises(FileNotFoundError): + audit_file( + tmp_path / "nonexistent.ttl", + shapes=tmp_path / "shapes.ttl", + ) + + def test_valid_without_ontology(self, valid_kc, tmp_path): + """Without ontology, RDFS inference is skipped but basic shapes still work.""" + valid_kc.export(tmp_path / "export") + report = audit_file( + tmp_path / "export" / "instance.ttl", + shapes=tmp_path / "export" / "shapes.ttl", + ) + assert isinstance(report, AuditReport) diff --git a/tests/test_clique.py b/tests/test_clique.py new file mode 100644 index 0000000..f664dfb --- /dev/null +++ b/tests/test_clique.py @@ -0,0 +1,236 @@ +""" +tests/test_clique.py + +Tests for knowledgecomplex.clique — clique detection, typed face inference, +and generic flagification. 
+""" + +import pytest + +from knowledgecomplex.schema import SchemaBuilder, vocab +from knowledgecomplex.graph import KnowledgeComplex +from knowledgecomplex.clique import find_cliques, infer_faces, fill_cliques, _edges_between +from knowledgecomplex.exceptions import SchemaError + + +@pytest.fixture +def schema() -> SchemaBuilder: + """Schema with one vertex type, one edge type, one face type.""" + sb = SchemaBuilder(namespace="cq") + sb.add_vertex_type("Node") + sb.add_edge_type("Link") + sb.add_face_type("Triangle") + return sb + + +@pytest.fixture +def schema_multi_edge() -> SchemaBuilder: + """Schema with two edge types for filtering tests.""" + sb = SchemaBuilder(namespace="cq") + sb.add_vertex_type("Node") + sb.add_edge_type("Link") + sb.add_edge_type("Special") + sb.add_face_type("Triangle") + return sb + + +@pytest.fixture +def triangle(schema) -> KnowledgeComplex: + """3 vertices, 3 edges forming a single triangle. No face added.""" + kc = KnowledgeComplex(schema=schema) + kc.add_vertex("v1", type="Node") + kc.add_vertex("v2", type="Node") + kc.add_vertex("v3", type="Node") + kc.add_edge("e12", type="Link", vertices={"v1", "v2"}) + kc.add_edge("e23", type="Link", vertices={"v2", "v3"}) + kc.add_edge("e13", type="Link", vertices={"v1", "v3"}) + return kc + + +@pytest.fixture +def k4(schema) -> KnowledgeComplex: + """Complete graph K4: 4 vertices, 6 edges, no faces.""" + kc = KnowledgeComplex(schema=schema) + for i in range(1, 5): + kc.add_vertex(f"v{i}", type="Node") + kc.add_edge("e12", type="Link", vertices={"v1", "v2"}) + kc.add_edge("e13", type="Link", vertices={"v1", "v3"}) + kc.add_edge("e14", type="Link", vertices={"v1", "v4"}) + kc.add_edge("e23", type="Link", vertices={"v2", "v3"}) + kc.add_edge("e24", type="Link", vertices={"v2", "v4"}) + kc.add_edge("e34", type="Link", vertices={"v3", "v4"}) + return kc + + +# --- _edges_between --- + + +class TestEdgesBetween: + def test_finds_edge(self, triangle): + edges = _edges_between(triangle, "v1", "v2") 
+ assert edges == ["e12"] + + def test_no_edge(self, schema): + kc = KnowledgeComplex(schema=schema) + kc.add_vertex("a", type="Node") + kc.add_vertex("b", type="Node") + assert _edges_between(kc, "a", "b") == [] + + def test_edge_type_filter(self, schema_multi_edge): + kc = KnowledgeComplex(schema=schema_multi_edge) + kc.add_vertex("v1", type="Node") + kc.add_vertex("v2", type="Node") + kc.add_edge("e1", type="Link", vertices={"v1", "v2"}) + kc.add_edge("e2", type="Special", vertices={"v1", "v2"}) + assert len(_edges_between(kc, "v1", "v2")) == 2 + assert len(_edges_between(kc, "v1", "v2", edge_type="Link")) == 1 + assert len(_edges_between(kc, "v1", "v2", edge_type="Special")) == 1 + + +# --- find_cliques --- + + +class TestFindCliques: + def test_triangle_has_one_3clique(self, triangle): + cliques = find_cliques(triangle, k=3) + assert len(cliques) == 1 + assert cliques[0] == frozenset(["v1", "v2", "v3"]) + + def test_k4_has_four_3cliques(self, k4): + cliques = find_cliques(k4, k=3) + assert len(cliques) == 4 + + def test_k4_has_one_4clique(self, k4): + cliques = find_cliques(k4, k=4) + assert len(cliques) == 1 + assert cliques[0] == frozenset(["v1", "v2", "v3", "v4"]) + + def test_no_cliques_in_path(self, schema): + """A path graph v1-v2-v3 has no triangles.""" + kc = KnowledgeComplex(schema=schema) + kc.add_vertex("v1", type="Node") + kc.add_vertex("v2", type="Node") + kc.add_vertex("v3", type="Node") + kc.add_edge("e12", type="Link", vertices={"v1", "v2"}) + kc.add_edge("e23", type="Link", vertices={"v2", "v3"}) + assert find_cliques(kc, k=3) == [] + + def test_edge_type_filter(self, schema_multi_edge): + """Only edges of specified type form cliques.""" + kc = KnowledgeComplex(schema=schema_multi_edge) + kc.add_vertex("v1", type="Node") + kc.add_vertex("v2", type="Node") + kc.add_vertex("v3", type="Node") + kc.add_edge("e12", type="Link", vertices={"v1", "v2"}) + kc.add_edge("e23", type="Link", vertices={"v2", "v3"}) + kc.add_edge("e13", type="Special", 
vertices={"v1", "v3"}) + # With all edges: 1 triangle + assert len(find_cliques(kc, k=3)) == 1 + # With only Link edges: no triangle (e13 is Special) + assert len(find_cliques(kc, k=3, edge_type="Link")) == 0 + + def test_k_less_than_2_raises(self, triangle): + with pytest.raises(ValueError, match="Clique size"): + find_cliques(triangle, k=1) + + +# --- infer_faces --- + + +class TestInferFaces: + def test_adds_one_face(self, triangle): + added = infer_faces(triangle, "Triangle") + assert len(added) == 1 + # Face was actually added + assert len(triangle.skeleton(2) - triangle.skeleton(1)) == 1 + + def test_k4_adds_four_faces(self, k4): + added = infer_faces(k4, "Triangle") + assert len(added) == 4 + + def test_dry_run_adds_nothing(self, triangle): + added = infer_faces(triangle, "Triangle", dry_run=True) + assert len(added) == 1 # one would-be face + # Nothing actually added + assert len(triangle.skeleton(2) - triangle.skeleton(1)) == 0 + + def test_no_duplicates(self, triangle): + first = infer_faces(triangle, "Triangle") + assert len(first) == 1 + second = infer_faces(triangle, "Triangle") + assert len(second) == 0 # already exists + + def test_edge_type_filter(self, schema_multi_edge): + kc = KnowledgeComplex(schema=schema_multi_edge) + kc.add_vertex("v1", type="Node") + kc.add_vertex("v2", type="Node") + kc.add_vertex("v3", type="Node") + kc.add_edge("e12", type="Link", vertices={"v1", "v2"}) + kc.add_edge("e23", type="Link", vertices={"v2", "v3"}) + kc.add_edge("e13", type="Special", vertices={"v1", "v3"}) + # Filter to Link only — no triangle (e13 is Special) + added = infer_faces(kc, "Triangle", edge_type="Link") + assert len(added) == 0 + # No filter — triangle found + added = infer_faces(kc, "Triangle") + assert len(added) == 1 + + def test_unregistered_type_raises(self, triangle): + with pytest.raises(SchemaError, match="not registered"): + infer_faces(triangle, "Bogus") + + def test_non_face_type_raises(self, triangle): + with pytest.raises(SchemaError, 
match="not a face type"): + infer_faces(triangle, "Node") + + def test_custom_id_prefix(self, triangle): + added = infer_faces(triangle, "Triangle", id_prefix="tri") + assert added[0].startswith("tri-") + + def test_inferred_face_passes_validation(self, triangle): + """Inferred face passes SHACL validation (closed triangle).""" + infer_faces(triangle, "Triangle") + # If we got here, validation passed during add_face + face_ids = list(triangle.skeleton(2) - triangle.skeleton(1)) + assert len(face_ids) == 1 + # Boundary should be 3 edges + assert len(triangle.boundary(face_ids[0])) == 3 + + +# --- fill_cliques --- + + +class TestFillCliques: + def test_fills_faces(self, triangle): + result = fill_cliques(triangle, max_order=2) + assert 2 in result + assert len(result[2]) == 1 + + def test_k4_fills_four_faces(self, k4): + result = fill_cliques(k4, max_order=2) + assert len(result[2]) == 4 + + def test_idempotent(self, triangle): + first = fill_cliques(triangle, max_order=2) + assert len(first[2]) == 1 + second = fill_cliques(triangle, max_order=2) + assert len(second[2]) == 0 + + def test_no_face_type_raises(self): + sb = SchemaBuilder(namespace="cq") + sb.add_vertex_type("Node") + sb.add_edge_type("Link") + # No face type declared + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("v1", type="Node") + kc.add_vertex("v2", type="Node") + kc.add_vertex("v3", type="Node") + kc.add_edge("e12", type="Link", vertices={"v1", "v2"}) + kc.add_edge("e23", type="Link", vertices={"v2", "v3"}) + kc.add_edge("e13", type="Link", vertices={"v1", "v3"}) + with pytest.raises(SchemaError, match="No face types"): + fill_cliques(kc, max_order=2) + + def test_max_order_too_low_raises(self, triangle): + with pytest.raises(ValueError, match="max_order"): + fill_cliques(triangle, max_order=1) diff --git a/tests/test_codec.py b/tests/test_codec.py new file mode 100644 index 0000000..c332c26 --- /dev/null +++ b/tests/test_codec.py @@ -0,0 +1,310 @@ +""" +tests/test_codec.py + +Tests for codec 
registration, element.compile(), element.decompile(), +kc.decompile_uri(), and codec inheritance. +""" + +import json +import pytest +from pathlib import Path + +from knowledgecomplex.schema import SchemaBuilder, Codec, vocab, text +from knowledgecomplex.graph import KnowledgeComplex +from knowledgecomplex.exceptions import SchemaError + + +# --------------------------------------------------------------------------- +# Test codecs +# --------------------------------------------------------------------------- + +class MockCodec: + """Records calls for assertion.""" + + def __init__(self): + self.compile_calls: list[dict] = [] + self.decompile_calls: list[str] = [] + self.decompile_return: dict = {"title": "From Codec"} + + def compile(self, element: dict) -> None: + self.compile_calls.append(dict(element)) + + def decompile(self, uri: str) -> dict: + self.decompile_calls.append(uri) + return dict(self.decompile_return) + + +class JsonFileCodec: + """Real codec that writes/reads JSON files for round-trip tests.""" + + def compile(self, element: dict) -> None: + uri = element["uri"] + path = uri.replace("file://", "") + data = {k: v for k, v in element.items() if k not in ("id", "type", "uri")} + Path(path).write_text(json.dumps(data)) + + def decompile(self, uri: str) -> dict: + path = uri.replace("file://", "") + return json.loads(Path(path).read_text()) + + +# --------------------------------------------------------------------------- +# Fixtures +# --------------------------------------------------------------------------- + +@pytest.fixture +def qa_kc() -> KnowledgeComplex: + sb = SchemaBuilder(namespace="qa") + sb.add_vertex_type("document", attributes={"title": text()}) + sb.add_vertex_type("specification", parent="document", attributes={"format": text()}) + sb.add_vertex_type("guidance", parent="document", attributes={"criteria": text()}) + sb.add_edge_type("verification", attributes={"status": vocab("pass", "fail", "pending")}) + 
sb.add_face_type("assurance") + + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("spec-001", type="specification", title="Spec A", format="PDF", + uri="file:///tmp/test-spec-001.json") + kc.add_vertex("guid-001", type="guidance", title="Guide A", criteria="Accuracy", + uri="file:///tmp/test-guid-001.json") + kc.add_vertex("doc-001", type="document", title="Doc A") + return kc + + +@pytest.fixture +def deep_kc() -> KnowledgeComplex: + """Three-deep inheritance for codec inheritance tests.""" + sb = SchemaBuilder(namespace="deep") + sb.add_vertex_type("document", attributes={"title": text()}) + sb.add_vertex_type("specification", parent="document", attributes={"format": text()}) + sb.add_vertex_type("detailed_spec", parent="specification", + attributes={"section": text()}) + + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("ds-001", type="detailed_spec", + title="DS", format="PDF", section="S1", + uri="file:///tmp/test-ds-001.json") + kc.add_vertex("spec-001", type="specification", + title="Spec", format="PDF", + uri="file:///tmp/test-spec-001.json") + kc.add_vertex("doc-001", type="document", title="Doc", + uri="file:///tmp/test-doc-001.json") + return kc + + +# =========================================================================== +# Codec registration tests +# =========================================================================== + +class TestCodecRegistration: + + def test_register_valid_type(self, qa_kc): + codec = MockCodec() + qa_kc.register_codec("specification", codec) + + def test_register_nonexistent_type_raises(self, qa_kc): + codec = MockCodec() + with pytest.raises(SchemaError): + qa_kc.register_codec("nonexistent", codec) + + def test_register_non_codec_raises(self, qa_kc): + with pytest.raises(TypeError): + qa_kc.register_codec("specification", "not a codec") + + def test_register_parent_type(self, qa_kc): + codec = MockCodec() + qa_kc.register_codec("document", codec) + + +# 
=========================================================================== +# element.compile() tests +# =========================================================================== + +class TestCompile: + + def test_compile_calls_codec(self, qa_kc): + codec = MockCodec() + qa_kc.register_codec("specification", codec) + elem = qa_kc.element("spec-001") + elem.compile() + assert len(codec.compile_calls) == 1 + call = codec.compile_calls[0] + assert call["id"] == "spec-001" + assert call["type"] == "specification" + assert call["uri"] == "file:///tmp/test-spec-001.json" + assert call["title"] == "Spec A" + assert call["format"] == "PDF" + + def test_compile_no_uri_raises(self, qa_kc): + codec = MockCodec() + qa_kc.register_codec("document", codec) + elem = qa_kc.element("doc-001") + with pytest.raises(ValueError): + elem.compile() + + def test_compile_no_codec_raises(self, qa_kc): + elem = qa_kc.element("spec-001") + with pytest.raises(SchemaError): + elem.compile() + + +# =========================================================================== +# element.decompile() tests +# =========================================================================== + +class TestDecompile: + + def test_decompile_calls_codec(self, qa_kc): + codec = MockCodec() + codec.decompile_return = {"title": "Updated Title", "format": "DOCX"} + qa_kc.register_codec("specification", codec) + elem = qa_kc.element("spec-001") + elem.decompile() + assert len(codec.decompile_calls) == 1 + assert codec.decompile_calls[0] == "file:///tmp/test-spec-001.json" + + def test_decompile_updates_attrs(self, qa_kc): + codec = MockCodec() + codec.decompile_return = {"title": "Updated Title", "format": "DOCX"} + qa_kc.register_codec("specification", codec) + elem = qa_kc.element("spec-001") + elem.decompile() + assert elem.attrs["title"] == "Updated Title" + assert elem.attrs["format"] == "DOCX" + + def test_decompile_no_uri_raises(self, qa_kc): + codec = MockCodec() + qa_kc.register_codec("document", codec) + 
elem = qa_kc.element("doc-001") + with pytest.raises(ValueError): + elem.decompile() + + def test_decompile_no_codec_raises(self, qa_kc): + elem = qa_kc.element("spec-001") + with pytest.raises(SchemaError): + elem.decompile() + + +# =========================================================================== +# decompile_uri() tests (standalone) +# =========================================================================== + +class TestDecompileUri: + + def test_returns_attr_dict(self, qa_kc): + codec = MockCodec() + codec.decompile_return = {"title": "From URI", "format": "HTML"} + qa_kc.register_codec("specification", codec) + result = qa_kc.decompile_uri("specification", "file:///some/path.json") + assert result == {"title": "From URI", "format": "HTML"} + assert codec.decompile_calls == ["file:///some/path.json"] + + def test_nonexistent_type_raises(self, qa_kc): + with pytest.raises(SchemaError): + qa_kc.decompile_uri("nonexistent", "file:///x") + + def test_no_codec_raises(self, qa_kc): + with pytest.raises(SchemaError): + qa_kc.decompile_uri("specification", "file:///x") + + +# =========================================================================== +# Codec inheritance tests +# =========================================================================== + +class TestCodecInheritance: + + def test_child_inherits_parent_codec(self, qa_kc): + """Register on document, compile on specification — uses document's codec.""" + codec = MockCodec() + qa_kc.register_codec("document", codec) + elem = qa_kc.element("spec-001") + elem.compile() + assert len(codec.compile_calls) == 1 + assert codec.compile_calls[0]["type"] == "specification" + + def test_child_override(self, qa_kc): + """Register on both — specification uses its own.""" + parent_codec = MockCodec() + child_codec = MockCodec() + qa_kc.register_codec("document", parent_codec) + qa_kc.register_codec("specification", child_codec) + elem = qa_kc.element("spec-001") + elem.compile() + assert 
len(child_codec.compile_calls) == 1 + assert len(parent_codec.compile_calls) == 0 + + def test_grandchild_inherits_from_root(self, deep_kc): + """Only root has codec — grandchild uses it.""" + codec = MockCodec() + deep_kc.register_codec("document", codec) + elem = deep_kc.element("ds-001") + elem.compile() + assert len(codec.compile_calls) == 1 + assert codec.compile_calls[0]["type"] == "detailed_spec" + + def test_grandchild_uses_middle_override(self, deep_kc): + """Register on root and middle — grandchild uses middle's.""" + root_codec = MockCodec() + mid_codec = MockCodec() + deep_kc.register_codec("document", root_codec) + deep_kc.register_codec("specification", mid_codec) + elem = deep_kc.element("ds-001") + elem.compile() + assert len(mid_codec.compile_calls) == 1 + assert len(root_codec.compile_calls) == 0 + + def test_decompile_inherits(self, qa_kc): + """Decompile also inherits codecs.""" + codec = MockCodec() + codec.decompile_return = {"title": "Inherited", "format": "TXT"} + qa_kc.register_codec("document", codec) + elem = qa_kc.element("spec-001") + elem.decompile() + assert elem.attrs["title"] == "Inherited" + + def test_decompile_uri_inherits(self, qa_kc): + """decompile_uri also inherits codecs.""" + codec = MockCodec() + codec.decompile_return = {"title": "Via Parent"} + qa_kc.register_codec("document", codec) + result = qa_kc.decompile_uri("specification", "file:///x") + assert result == {"title": "Via Parent"} + + +# =========================================================================== +# Round-trip test +# =========================================================================== + +class TestRoundTrip: + + def test_compile_then_decompile(self, qa_kc, tmp_path): + """Write to file via compile, read back via decompile, attrs match.""" + uri = f"file://{tmp_path / 'spec.json'}" + + # Re-create with temp URI + sb = SchemaBuilder(namespace="qa") + sb.add_vertex_type("document", attributes={"title": text()}) + 
sb.add_vertex_type("specification", parent="document", + attributes={"format": text()}) + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("spec-001", type="specification", + title="Round Trip Spec", format="PDF", uri=uri) + + codec = JsonFileCodec() + kc.register_codec("specification", codec) + + # Compile: write to file + elem = kc.element("spec-001") + elem.compile() + written = json.loads(Path(tmp_path / "spec.json").read_text()) + assert written["title"] == "Round Trip Spec" + assert written["format"] == "PDF" + + # Modify the file externally + written["title"] = "Modified Externally" + Path(tmp_path / "spec.json").write_text(json.dumps(written)) + + # Decompile: read back + elem.decompile() + assert elem.attrs["title"] == "Modified Externally" + assert elem.attrs["format"] == "PDF" diff --git a/tests/test_diff.py b/tests/test_diff.py new file mode 100644 index 0000000..b9ef9b5 --- /dev/null +++ b/tests/test_diff.py @@ -0,0 +1,258 @@ +""" +tests/test_diff.py + +Tests for knowledgecomplex.diff — ComplexDiff and ComplexSequence. 
+"""
+
+import pytest
+
+from knowledgecomplex.schema import SchemaBuilder, vocab
+from knowledgecomplex.graph import KnowledgeComplex
+from knowledgecomplex.diff import ComplexDiff, ComplexSequence
+from knowledgecomplex.exceptions import ValidationError
+
+
+@pytest.fixture
+def schema() -> SchemaBuilder:
+    sb = SchemaBuilder(namespace="df")
+    sb.add_vertex_type("Node")
+    sb.add_edge_type("Link")
+    sb.add_face_type("Triangle")
+    return sb
+
+
+@pytest.fixture
+def base_kc(schema) -> KnowledgeComplex:
+    """Triangle: v1-v2-v3, edges e12/e23/e13, face f123."""
+    kc = KnowledgeComplex(schema=schema)
+    kc.add_vertex("v1", type="Node")
+    kc.add_vertex("v2", type="Node")
+    kc.add_vertex("v3", type="Node")
+    kc.add_edge("e12", type="Link", vertices={"v1", "v2"})
+    kc.add_edge("e23", type="Link", vertices={"v2", "v3"})
+    kc.add_edge("e13", type="Link", vertices={"v1", "v3"})
+    kc.add_face("f123", type="Triangle", boundary=["e12", "e23", "e13"])
+    return kc
+
+
+# --- ComplexDiff.apply ---
+
+
+class TestComplexDiffApply:
+    def test_add_vertex(self, schema):
+        kc = KnowledgeComplex(schema=schema)
+        diff = ComplexDiff().add_vertex("v1", type="Node")
+        diff.apply(kc)
+        assert "v1" in kc.element_ids()
+
+    def test_add_edge(self, schema):
+        kc = KnowledgeComplex(schema=schema)
+        kc.add_vertex("v1", type="Node")
+        kc.add_vertex("v2", type="Node")
+        diff = ComplexDiff().add_edge("e12", type="Link", vertices={"v1", "v2"})
+        diff.apply(kc)
+        assert "e12" in kc.element_ids()
+
+    def test_remove_element(self, base_kc):
+        # Remove face first, then can remove edges
+        diff = ComplexDiff().remove("f123")
+        diff.apply(base_kc)
+        assert "f123" not in base_kc.element_ids()
+        assert "e12" in base_kc.element_ids()  # edges still there
+
+    def test_remove_orders_by_dimension(self, base_kc):
+        """Removing face + edge in one diff: face removed first (higher dim)."""
+        diff = ComplexDiff().remove("e12").remove("f123")
+        diff.apply(base_kc)
+        assert "f123" not in base_kc.element_ids()
+        assert "e12" not in base_kc.element_ids()
+
+    def test_add_and_remove(self, base_kc):
+        diff = (
+            ComplexDiff()
+            .remove("f123")
+            .add_vertex("v4", type="Node")
+        )
+        diff.apply(base_kc)
+        assert "f123" not in base_kc.element_ids()
+        assert "v4" in base_kc.element_ids()
+
+    def test_chaining(self):
+        diff = (
+            ComplexDiff()
+            .add_vertex("a", type="Node")
+            .add_vertex("b", type="Node")
+            .remove("c")
+        )
+        assert len(diff.additions) == 2
+        assert len(diff.removals) == 1
+
+
+# --- remove_element on KnowledgeComplex ---
+
+
+class TestRemoveElement:
+    def test_remove_vertex(self, schema):
+        kc = KnowledgeComplex(schema=schema)
+        kc.add_vertex("v1", type="Node")
+        kc.remove_element("v1")
+        assert "v1" not in kc.element_ids()
+
+    def test_remove_nonexistent_raises(self, schema):
+        kc = KnowledgeComplex(schema=schema)
+        with pytest.raises(ValueError, match="No element"):
+            kc.remove_element("nope")
+
+    def test_remove_face_preserves_edges(self, base_kc):
+        base_kc.remove_element("f123")
+        assert "e12" in base_kc.element_ids()
+        assert "e23" in base_kc.element_ids()
+
+
+# --- ComplexDiff.to_sparql ---
+
+
+class TestToSparql:
+    def test_insert_only(self, schema):
+        kc = KnowledgeComplex(schema=schema)
+        diff = ComplexDiff().add_vertex("v1", type="Node")
+        sparql = diff.to_sparql(kc)
+        assert "INSERT DATA" in sparql
+        assert "DELETE DATA" not in sparql
+        assert "df#v1" in sparql
+
+    def test_delete_only(self, base_kc):
+        diff = ComplexDiff().remove("f123")
+        sparql = diff.to_sparql(base_kc)
+        assert "DELETE DATA" in sparql
+        assert "df#f123" in sparql
+
+    def test_both_insert_and_delete(self, base_kc):
+        diff = ComplexDiff().remove("f123").add_vertex("v4", type="Node")
+        sparql = diff.to_sparql(base_kc)
+        assert "DELETE DATA" in sparql
+        assert "INSERT DATA" in sparql
+
+
+# --- ComplexDiff.from_sparql round-trip ---
+
+
+class TestFromSparql:
+    def test_roundtrip_additions(self, schema):
+        kc = KnowledgeComplex(schema=schema)
+        original = (
+            ComplexDiff()
+            .add_vertex("v1", type="Node")
+            .add_vertex("v2", type="Node")
+        )
+        sparql = original.to_sparql(kc)
+        restored = ComplexDiff.from_sparql(sparql, kc)
+        assert len(restored.additions) == len(original.additions)
+
+    def test_roundtrip_removals(self, base_kc):
+        original = ComplexDiff().remove("f123")
+        sparql = original.to_sparql(base_kc)
+        restored = ComplexDiff.from_sparql(sparql, base_kc)
+        assert len(restored.removals) >= 1
+        # f123 should appear in removals
+        assert "f123" in restored.removals
+
+    def test_roundtrip_apply_equivalence(self, schema):
+        """Applying original and roundtripped diff produces same result."""
+        kc1 = KnowledgeComplex(schema=schema)
+        kc2 = KnowledgeComplex(schema=schema)
+
+        diff = (
+            ComplexDiff()
+            .add_vertex("a", type="Node")
+            .add_vertex("b", type="Node")
+            .add_edge("ab", type="Link", vertices={"a", "b"})
+        )
+
+        # Apply original
+        diff.apply(kc1)
+
+        # Roundtrip through SPARQL
+        sparql = diff.to_sparql(kc2)
+        restored = ComplexDiff.from_sparql(sparql, kc2)
+        restored.apply(kc2)
+
+        assert set(kc1.element_ids()) == set(kc2.element_ids())
+
+
+# --- query() substitution fix ---
+
+
+class TestQuerySubstitution:
+    def test_query_substitutes_placeholders(self, base_kc):
+        """query() now performs {placeholder} substitution."""
+        # The coboundary template uses {simplex}
+        iri = f"<{base_kc._schema._base_iri}v1>"
+        df = base_kc.query("coboundary", simplex=iri)
+        assert len(df) > 0  # v1 has coboundary edges
+
+    def test_query_ids_returns_set(self, base_kc):
+        """query_ids() returns set[str] of element IDs."""
+        iri = f"<{base_kc._schema._base_iri}v1>"
+        ids = base_kc.query_ids("coboundary", simplex=iri)
+        assert isinstance(ids, set)
+        assert len(ids) > 0
+        # Should contain edges incident to v1
+        for eid in ids:
+            assert isinstance(eid, str)
+
+
+# --- ComplexSequence ---
+
+
+class TestComplexSequence:
+    def test_basic_sequence(self, schema):
+        kc = KnowledgeComplex(schema=schema)
+        kc.add_vertex("v1", type="Node")
+        kc.add_vertex("v2", type="Node")
+
+        d1 = ComplexDiff().add_vertex("v3", type="Node")
+        d2 = ComplexDiff().add_edge("e12", type="Link", vertices={"v1", "v2"})
+
+        seq = ComplexSequence(kc, [d1, d2])
+        assert len(seq) == 2
+        assert "v3" in seq[0]
+        assert "e12" in seq[1]
+
+    def test_new_at(self, schema):
+        kc = KnowledgeComplex(schema=schema)
+        kc.add_vertex("v1", type="Node")
+
+        d1 = ComplexDiff().add_vertex("v2", type="Node")
+        d2 = ComplexDiff().add_vertex("v3", type="Node")
+
+        seq = ComplexSequence(kc, [d1, d2])
+        assert seq.new_at(0) == {"v2"}
+        assert seq.new_at(1) == {"v3"}
+
+    def test_removed_at(self, schema):
+        kc = KnowledgeComplex(schema=schema)
+        kc.add_vertex("v1", type="Node")
+        kc.add_vertex("v2", type="Node")
+
+        d1 = ComplexDiff().remove("v2")
+        seq = ComplexSequence(kc, [d1])
+        assert seq.removed_at(0) == {"v2"}
+
+    def test_iteration(self, schema):
+        kc = KnowledgeComplex(schema=schema)
+        kc.add_vertex("v1", type="Node")
+
+        d1 = ComplexDiff().add_vertex("v2", type="Node")
+        d2 = ComplexDiff().add_vertex("v3", type="Node")
+
+        seq = ComplexSequence(kc, [d1, d2])
+        steps = list(seq)
+        assert len(steps) == 2
+        assert "v2" in steps[0]
+        assert "v3" in steps[1]
+
+    def test_repr(self, schema):
+        kc = KnowledgeComplex(schema=schema)
+        seq = ComplexSequence(kc, [ComplexDiff(), ComplexDiff()])
+        assert "2 steps" in repr(seq)
diff --git a/tests/test_element.py b/tests/test_element.py
new file mode 100644
index 0000000..71f180e
--- /dev/null
+++ b/tests/test_element.py
@@ -0,0 +1,198 @@
+"""
+tests/test_element.py
+
+Tests for Element handles (kc.element()) and element listing
+(kc.element_ids(), kc.elements()).
+
+Uses the QA domain schema with inheritance.
+"""
+
+import pytest
+
+from knowledgecomplex.schema import SchemaBuilder, vocab, text
+from knowledgecomplex.graph import KnowledgeComplex, Element
+from knowledgecomplex.exceptions import SchemaError
+
+
+@pytest.fixture
+def qa_kc() -> KnowledgeComplex:
+    """QA domain with a few elements added."""
+    sb = SchemaBuilder(namespace="qa")
+    sb.add_vertex_type("document", attributes={"title": text()})
+    sb.add_vertex_type("specification", parent="document", attributes={"format": text()})
+    sb.add_vertex_type("guidance", parent="document", attributes={"criteria": text()})
+    sb.add_edge_type("typing", attributes={"scope": text()})
+    sb.add_edge_type("verification", attributes={"status": vocab("pass", "fail", "pending")})
+    sb.add_edge_type("validation", attributes={"status": vocab("pass", "fail", "pending")})
+    sb.add_face_type("assurance")
+
+    kc = KnowledgeComplex(schema=sb)
+    kc.add_vertex("spec-001", type="specification", title="Spec A", format="PDF")
+    kc.add_vertex("guid-001", type="guidance", title="Guide A", criteria="Accuracy")
+    kc.add_vertex("doc-001", type="document", title="Doc A")
+    kc.add_edge("typ-001", type="typing",
+                vertices={"spec-001", "guid-001"}, scope="type A")
+    kc.add_edge("ver-001", type="verification",
+                vertices={"doc-001", "spec-001"}, status="pass")
+    kc.add_edge("val-001", type="validation",
+                vertices={"doc-001", "guid-001"}, status="pass")
+    kc.add_face("assur-001", type="assurance",
+                boundary=["typ-001", "ver-001", "val-001"])
+    return kc
+
+
+@pytest.fixture
+def qa_kc_with_uri() -> KnowledgeComplex:
+    """QA domain with a vertex that has a URI."""
+    sb = SchemaBuilder(namespace="qa")
+    sb.add_vertex_type("document", attributes={"title": text()})
+    sb.add_vertex_type("specification", parent="document", attributes={"format": text()})
+
+    kc = KnowledgeComplex(schema=sb)
+    kc.add_vertex("spec-001", type="specification", title="Spec A", format="PDF",
+                  uri="file:///docs/spec-001.md")
+    kc.add_vertex("doc-001", type="document", title="Doc A")
+    return kc
+
+
+@pytest.fixture
+def empty_kc() -> KnowledgeComplex:
+    """Empty KC with schema but no elements."""
+    sb = SchemaBuilder(namespace="qa")
+    sb.add_vertex_type("document", attributes={"title": text()})
+    return KnowledgeComplex(schema=sb)
+
+
+# ===========================================================================
+# Element handle tests
+# ===========================================================================
+
+class TestElementHandle:
+
+    def test_element_id(self, qa_kc):
+        elem = qa_kc.element("spec-001")
+        assert elem.id == "spec-001"
+
+    def test_element_type(self, qa_kc):
+        elem = qa_kc.element("spec-001")
+        assert elem.type == "specification"
+
+    def test_element_type_parent(self, qa_kc):
+        elem = qa_kc.element("doc-001")
+        assert elem.type == "document"
+
+    def test_element_uri_present(self, qa_kc_with_uri):
+        elem = qa_kc_with_uri.element("spec-001")
+        assert elem.uri == "file:///docs/spec-001.md"
+
+    def test_element_uri_absent(self, qa_kc):
+        elem = qa_kc.element("spec-001")
+        assert elem.uri is None
+
+    def test_element_attrs(self, qa_kc):
+        elem = qa_kc.element("spec-001")
+        attrs = elem.attrs
+        assert attrs["title"] == "Spec A"
+        assert attrs["format"] == "PDF"
+
+    def test_element_attrs_inherited(self, qa_kc):
+        """Child element attrs include inherited attributes."""
+        elem = qa_kc.element("spec-001")
+        assert "title" in elem.attrs  # inherited from document
+
+    def test_element_attrs_parent_only(self, qa_kc):
+        """Parent element has only its own attrs."""
+        elem = qa_kc.element("doc-001")
+        assert "title" in elem.attrs
+        assert "format" not in elem.attrs
+
+    def test_element_nonexistent_raises(self, qa_kc):
+        with pytest.raises(ValueError):
+            qa_kc.element("nonexistent")
+
+    def test_element_is_element_type(self, qa_kc):
+        elem = qa_kc.element("spec-001")
+        assert isinstance(elem, Element)
+
+    def test_edge_element(self, qa_kc):
+        elem = qa_kc.element("typ-001")
+        assert elem.type == "typing"
+        assert elem.attrs["scope"] == "type A"
+
+    def test_face_element(self, qa_kc):
+        elem = qa_kc.element("assur-001")
+        assert elem.type == "assurance"
+
+
+# ===========================================================================
+# element_ids() tests
+# ===========================================================================
+
+class TestElementIds:
+
+    def test_all_ids(self, qa_kc):
+        ids = qa_kc.element_ids()
+        assert "spec-001" in ids
+        assert "guid-001" in ids
+        assert "doc-001" in ids
+        assert "typ-001" in ids
+        assert "ver-001" in ids
+        assert "val-001" in ids
+        assert "assur-001" in ids
+        assert len(ids) == 7
+
+    def test_filter_by_type(self, qa_kc):
+        ids = qa_kc.element_ids(type="specification")
+        assert ids == ["spec-001"]
+
+    def test_filter_by_parent_includes_children(self, qa_kc):
+        """Filtering by 'document' includes specification and guidance."""
+        ids = qa_kc.element_ids(type="document")
+        assert "doc-001" in ids
+        assert "spec-001" in ids
+        assert "guid-001" in ids
+        assert len(ids) == 3
+
+    def test_filter_nonexistent_type_raises(self, qa_kc):
+        with pytest.raises(SchemaError):
+            qa_kc.element_ids(type="nonexistent")
+
+    def test_empty_complex(self, empty_kc):
+        assert empty_kc.element_ids() == []
+
+    def test_filter_edge_type(self, qa_kc):
+        ids = qa_kc.element_ids(type="typing")
+        assert ids == ["typ-001"]
+
+    def test_filter_face_type(self, qa_kc):
+        ids = qa_kc.element_ids(type="assurance")
+        assert ids == ["assur-001"]
+
+
+# ===========================================================================
+# elements() tests
+# ===========================================================================
+
+class TestElements:
+
+    def test_all_elements(self, qa_kc):
+        elems = qa_kc.elements()
+        assert len(elems) == 7
+        assert all(isinstance(e, Element) for e in elems)
+
+    def test_filter_by_type(self, qa_kc):
+        elems = qa_kc.elements(type="specification")
+        assert len(elems) == 1
+        assert elems[0].id == "spec-001"
+        assert elems[0].type == "specification"
+
+    def test_filter_by_parent_includes_children(self, qa_kc):
+        elems = qa_kc.elements(type="document")
+        ids = {e.id for e in elems}
+        assert ids == {"doc-001", "spec-001", "guid-001"}
+
+    def test_element_attrs_correct(self, qa_kc):
+        elems = qa_kc.elements(type="guidance")
+        assert len(elems) == 1
+        assert elems[0].attrs["criteria"] == "Accuracy"
+        assert elems[0].attrs["title"] == "Guide A"
diff --git a/tests/test_filtration.py b/tests/test_filtration.py
new file mode 100644
index 0000000..0d738a5
--- /dev/null
+++ b/tests/test_filtration.py
@@ -0,0 +1,376 @@
+"""
+tests/test_filtration.py
+
+Tests for is_subcomplex, Filtration class construction, indexing, iteration,
+query methods, and composability with topological queries.
+
+Fixture: double-triangle complex (4 vertices, 5 edges, 2 faces).
+"""
+
+import pytest
+
+from knowledgecomplex.schema import SchemaBuilder, vocab
+from knowledgecomplex.graph import KnowledgeComplex
+from knowledgecomplex.filtration import Filtration
+from knowledgecomplex.exceptions import SchemaError
+
+
+@pytest.fixture
+def schema() -> SchemaBuilder:
+    sb = SchemaBuilder(namespace="topo")
+    sb.add_vertex_type("Node")
+    sb.add_edge_type("Link", attributes={"weight": vocab("light", "heavy")})
+    sb.add_face_type("Triangle")
+    return sb
+
+
+@pytest.fixture
+def kc(schema) -> KnowledgeComplex:
+    """4 vertices, 5 edges, 2 faces sharing edge e23."""
+    kc = KnowledgeComplex(schema=schema)
+    kc.add_vertex("v1", type="Node")
+    kc.add_vertex("v2", type="Node")
+    kc.add_vertex("v3", type="Node")
+    kc.add_vertex("v4", type="Node")
+    kc.add_edge("e12", type="Link", vertices={"v1", "v2"}, weight="light")
+    kc.add_edge("e23", type="Link", vertices={"v2", "v3"}, weight="heavy")
+    kc.add_edge("e13", type="Link", vertices={"v1", "v3"}, weight="light")
+    kc.add_edge("e24", type="Link", vertices={"v2", "v4"}, weight="heavy")
+    kc.add_edge("e34", type="Link", vertices={"v3", "v4"}, weight="light")
+    kc.add_face("f123", type="Triangle", boundary=["e12", "e23", "e13"])
+    kc.add_face("f234", type="Triangle", boundary=["e23", "e24", "e34"])
+    return kc
+
+
+ALL_ELEMENTS = {
+    "v1", "v2", "v3", "v4",
+    "e12", "e23", "e13", "e24", "e34",
+    "f123", "f234",
+}
+
+
+# ===========================================================================
+# is_subcomplex tests
+# ===========================================================================
+
+class TestIsSubcomplex:
+
+    def test_single_vertex(self, kc):
+        assert kc.is_subcomplex({"v1"}) is True
+
+    def test_edge_with_vertices(self, kc):
+        assert kc.is_subcomplex({"v1", "v2", "e12"}) is True
+
+    def test_edge_without_vertices(self, kc):
+        assert kc.is_subcomplex({"e12"}) is False
+
+    def test_full_triangle(self, kc):
+        assert kc.is_subcomplex(
+            {"v1", "v2", "v3", "e12", "e13", "e23", "f123"}
+        ) is True
+
+    def test_face_without_edges(self, kc):
+        assert kc.is_subcomplex({"f123"}) is False
+
+    def test_empty_set(self, kc):
+        assert kc.is_subcomplex(set()) is True
+
+    def test_whole_complex(self, kc):
+        assert kc.is_subcomplex(ALL_ELEMENTS) is True
+
+    def test_edges_without_all_vertices(self, kc):
+        assert kc.is_subcomplex({"v1", "e12", "e13"}) is False
+
+    def test_two_vertices(self, kc):
+        assert kc.is_subcomplex({"v1", "v2"}) is True
+
+    def test_all_vertices(self, kc):
+        assert kc.is_subcomplex({"v1", "v2", "v3", "v4"}) is True
+
+
+# ===========================================================================
+# Filtration construction (append)
+# ===========================================================================
+
+class TestFiltrationAppend:
+
+    def test_valid_filtration(self, kc):
+        filt = Filtration(kc)
+        filt.append({"v1"})
+        filt.append({"v1", "v2", "e12"})
+        filt.append({"v1", "v2", "v3", "e12", "e23", "e13", "f123"})
+        assert len(filt) == 3
+
+    def test_non_superset_raises(self, kc):
+        filt = Filtration(kc)
+        filt.append({"v1", "v2", "e12"})
+        with pytest.raises(ValueError, match="monotone"):
+            filt.append({"v3"})  # doesn't contain v1, v2, e12
+
+    def test_non_subcomplex_raises(self, kc):
+        filt = Filtration(kc)
+        with pytest.raises(ValueError, match="subcomplex"):
+            filt.append({"e12"})  # missing boundary vertices
+
+    def test_single_step(self, kc):
+        filt = Filtration(kc)
+        filt.append({"v1"})
+        assert len(filt) == 1
+
+    def test_empty_first_step(self, kc):
+        filt = Filtration(kc)
+        filt.append(set())
+        assert len(filt) == 1
+        assert filt[0] == set()
+
+    def test_chaining(self, kc):
+        filt = Filtration(kc)
+        result = filt.append({"v1"}).append({"v1", "v2", "e12"})
+        assert result is filt
+        assert len(filt) == 2
+
+
+# ===========================================================================
+# Filtration construction (append_closure)
+# ===========================================================================
+
+class TestFiltrationAppendClosure:
+
+    def test_single_vertex(self, kc):
+        filt = Filtration(kc)
+        filt.append_closure({"v1"})
+        assert filt[0] == {"v1"}
+
+    def test_edge_auto_closes(self, kc):
+        filt = Filtration(kc)
+        filt.append_closure({"v1"})
+        filt.append_closure({"e12"})
+        # closure(e12) = {v1, v2, e12}, union with {v1} = {v1, v2, e12}
+        assert filt[1] == {"v1", "v2", "e12"}
+
+    def test_star_closure(self, kc):
+        filt = Filtration(kc)
+        filt.append_closure(kc.star("v1"))
+        # star(v1) includes v1, e12, e13, f123
+        # closure of that adds v2, v3, e23
+        step = filt[0]
+        assert "v1" in step
+        assert "f123" in step
+        assert "v2" in step  # from closure
+        assert "e23" in step  # from closure of f123
+
+    def test_builds_valid_filtration(self, kc):
+        filt = Filtration(kc)
+        filt.append_closure({"v1"})
+        filt.append_closure({"e12"})
+        filt.append_closure({"f123"})
+        filt.append_closure({"f234"})
+        # Each step should be a valid subcomplex
+        for i in range(len(filt)):
+            assert kc.is_subcomplex(filt[i])
+        # Monotone
+        for i in range(1, len(filt)):
+            assert filt[i - 1] <= filt[i]
+
+    def test_chaining(self, kc):
+        filt = Filtration(kc)
+        result = filt.append_closure({"v1"}).append_closure({"e12"})
+        assert result is filt
+
+
+# ===========================================================================
+# Filtration construction (from_function)
+# ===========================================================================
+
+class TestFiltrationFromFunction:
+
+    def test_monotone_function(self, kc):
+        # Assign vertices=0, edges=1, faces=2
+        def by_dimension(elem_id):
+            if elem_id.startswith("v"):
+                return 0
+            elif elem_id.startswith("e"):
+                return 1
+            else:
+                return 2
+
+        filt = Filtration.from_function(kc, by_dimension)
+        assert len(filt) == 3
+
+    def test_each_step_is_subcomplex(self, kc):
+        def by_dimension(elem_id):
+            if elem_id.startswith("v"):
+                return 0
+            elif elem_id.startswith("e"):
+                return 1
+            else:
+                return 2
+
+        filt = Filtration.from_function(kc, by_dimension)
+        for i in range(len(filt)):
+            assert kc.is_subcomplex(filt[i])
+
+    def test_all_elements_in_final_step(self, kc):
+        def by_dimension(elem_id):
+            if elem_id.startswith("v"):
+                return 0
+            elif elem_id.startswith("e"):
+                return 1
+            else:
+                return 2
+
+        filt = Filtration.from_function(kc, by_dimension)
+        assert filt[-1] == ALL_ELEMENTS
+
+    def test_distinct_values_count(self, kc):
+        # All elements get the same value → 1 step
+        filt = Filtration.from_function(kc, lambda _: 0)
+        assert len(filt) == 1
+
+    def test_monotone_nesting(self, kc):
+        def by_dimension(elem_id):
+            if elem_id.startswith("v"):
+                return 0
+            elif elem_id.startswith("e"):
+                return 1
+            else:
+                return 2
+
+        filt = Filtration.from_function(kc, by_dimension)
+        for i in range(1, len(filt)):
+            assert filt[i - 1] <= filt[i]
+
+
+# ===========================================================================
+# Indexing and iteration
+# ===========================================================================
+
+class TestIndexingIteration:
+
+    def _build_filt(self, kc):
+        filt = Filtration(kc)
+        filt.append({"v1"})
+        filt.append({"v1", "v2", "e12"})
+        filt.append({"v1", "v2", "v3", "e12", "e23", "e13", "f123"})
+        return filt
+
+    def test_getitem_first(self, kc):
+        filt = self._build_filt(kc)
+        assert filt[0] == {"v1"}
+
+    def test_getitem_last(self, kc):
+        filt = self._build_filt(kc)
+        assert filt[-1] == {"v1", "v2", "v3", "e12", "e23", "e13", "f123"}
+
+    def test_len(self, kc):
+        filt = self._build_filt(kc)
+        assert len(filt) == 3
+
+    def test_iteration(self, kc):
+        filt = self._build_filt(kc)
+        steps = list(filt)
+        assert len(steps) == 3
+        assert steps[0] == {"v1"}
+
+    def test_out_of_bounds_raises(self, kc):
+        filt = self._build_filt(kc)
+        with pytest.raises(IndexError):
+            filt[10]
+
+    def test_empty_filtration_len(self, kc):
+        filt = Filtration(kc)
+        assert len(filt) == 0
+
+
+# ===========================================================================
+# Query methods
+# ===========================================================================
+
+class TestQueryMethods:
+
+    def _build_filt(self, kc):
+        filt = Filtration(kc)
+        filt.append({"v1"})
+        filt.append({"v1", "v2", "e12"})
+        filt.append({"v1", "v2", "v3", "e12", "e23", "e13", "f123"})
+        return filt
+
+    def test_birth(self, kc):
+        filt = self._build_filt(kc)
+        assert filt.birth("v1") == 0
+        assert filt.birth("v2") == 1
+        assert filt.birth("e12") == 1
+        assert filt.birth("f123") == 2
+
+    def test_birth_nonexistent_raises(self, kc):
+        filt = self._build_filt(kc)
+        with pytest.raises(ValueError):
+            filt.birth("nonexistent")
+
+    def test_new_at_first(self, kc):
+        filt = self._build_filt(kc)
+        assert filt.new_at(0) == {"v1"}
+
+    def test_new_at_middle(self, kc):
+        filt = self._build_filt(kc)
+        assert filt.new_at(1) == {"v2", "e12"}
+
+    def test_new_at_last(self, kc):
+        filt = self._build_filt(kc)
+        assert filt.new_at(2) == {"v3", "e23", "e13", "f123"}
+
+    def test_elements_at(self, kc):
+        filt = self._build_filt(kc)
+        assert filt.elements_at(1) == {"v1", "v2", "e12"}
+
+    def test_is_complete_false(self, kc):
+        filt = self._build_filt(kc)
+        assert filt.is_complete is False
+
+    def test_is_complete_true(self, kc):
+        filt = Filtration(kc)
+        filt.append(ALL_ELEMENTS)
+        assert filt.is_complete is True
+
+    def test_complex_reference(self, kc):
+        filt = Filtration(kc)
+        assert filt.complex is kc
+
+    def test_length_property(self, kc):
+        filt = self._build_filt(kc)
+        assert filt.length == 3
+
+
+# ===========================================================================
+# Composability with topological queries
+# ===========================================================================
+
+class TestComposability:
+
+    def test_skeleton_filtration(self, kc):
+        """Build filtration from skeleton: sk₀, sk₁, sk₂."""
+        filt = Filtration(kc)
+        filt.append(kc.skeleton(0))
+        filt.append(kc.skeleton(1))
+        filt.append(kc.skeleton(2))
+        assert len(filt) == 3
+        assert filt[0] == {"v1", "v2", "v3", "v4"}
+        assert filt[-1] == ALL_ELEMENTS
+
+    def test_star_expansion(self, kc):
+        """Build filtration by expanding from a vertex using closures."""
+        filt = Filtration(kc)
+        filt.append_closure({"v1"})
+        filt.append_closure(kc.star("v1"))
+        filt.append_closure(kc.star("v2"))
+        for i in range(len(filt)):
+            assert kc.is_subcomplex(filt[i])
+
+    def test_closure_driven(self, kc):
+        """Build filtration using closure of growing element sets."""
+        filt = Filtration(kc)
+        filt.append_closure({"v1"})
+        filt.append_closure({"v2"})
+        filt.append_closure({"e12"})
+        # After 3 steps, should have at least {v1, v2, e12}
+        assert {"v1", "v2", "e12"} <= filt[-1]
diff --git a/tests/test_inheritance.py b/tests/test_inheritance.py
new file mode 100644
index 0000000..6cdc964
--- /dev/null
+++ b/tests/test_inheritance.py
@@ -0,0 +1,604 @@
+"""
+tests/test_inheritance.py
+
+Tests for user-defined type inheritance, attribute binding, and schema introspection.
+Uses a quality-assurance domain as the primary fixture.
+
+All tests call the public API only — no rdflib imports except in tests that
+inspect the OWL/SHACL graph output.
+"""
+
+import pytest
+from rdflib import Graph, URIRef
+from rdflib.namespace import RDFS
+
+from knowledgecomplex.schema import SchemaBuilder, vocab, text
+from knowledgecomplex.graph import KnowledgeComplex
+from knowledgecomplex.exceptions import SchemaError, ValidationError
+
+
+# ---------------------------------------------------------------------------
+# Fixtures
+# ---------------------------------------------------------------------------
+
+@pytest.fixture
+def qa_schema() -> SchemaBuilder:
+    """QA domain schema with document → specification/guidance inheritance."""
+    sb = SchemaBuilder(namespace="qa")
+    sb.add_vertex_type("document", attributes={"title": text(), "category": text()})
+    sb.add_vertex_type("specification", parent="document",
+                       attributes={"format": text()},
+                       bind={"category": "structural"})
+    sb.add_vertex_type("guidance", parent="document",
+                       attributes={"criteria": text()},
+                       bind={"category": "quality"})
+    sb.add_edge_type("typing", attributes={"scope": text()})
+    sb.add_edge_type("verification", attributes={"status": vocab("pass", "fail", "pending")})
+    sb.add_edge_type("validation", attributes={"status": vocab("pass", "fail", "pending")})
+    sb.add_face_type("assurance")
+    return sb
+
+
+@pytest.fixture
+def qa_schema_no_bind() -> SchemaBuilder:
+    """QA domain schema without bind — for testing inheritance without binding."""
+    sb = SchemaBuilder(namespace="qa")
+    sb.add_vertex_type("document", attributes={"title": text()})
+    sb.add_vertex_type("specification", parent="document", attributes={"format": text()})
+    sb.add_vertex_type("guidance", parent="document", attributes={"criteria": text()})
+    sb.add_edge_type("typing", attributes={"scope": text()})
+    sb.add_edge_type("verification", attributes={"status": vocab("pass", "fail", "pending")})
+    sb.add_edge_type("validation", attributes={"status": vocab("pass", "fail", "pending")})
+    sb.add_face_type("assurance")
+    return sb
+
+
+@pytest.fixture
+def deep_schema() -> SchemaBuilder:
+    """Three-deep inheritance chain for multi-level tests."""
+    sb = SchemaBuilder(namespace="deep")
+    sb.add_vertex_type("document", attributes={"title": text()})
+    sb.add_vertex_type("specification", parent="document",
+                       attributes={"format": text()},
+                       bind={"title": "Untitled Spec"})
+    sb.add_vertex_type("detailed_specification", parent="specification",
+                       attributes={"section": text()})
+    return sb
+
+
+# ===========================================================================
+# Schema-level tests (OWL/SHACL graph correctness)
+# ===========================================================================
+
+class TestSchemaOWL:
+
+    def test_child_subclass_of_parent(self, qa_schema):
+        """specification rdfs:subClassOf document in OWL graph."""
+        g = Graph()
+        g.parse(data=qa_schema.dump_owl(), format="turtle")
+        spec = URIRef("https://example.org/qa#specification")
+        doc = URIRef("https://example.org/qa#document")
+        assert (spec, RDFS.subClassOf, doc) in g
+
+    def test_parent_subclass_of_kc_vertex(self, qa_schema):
+        """document rdfs:subClassOf KC:Vertex in OWL graph."""
+        g = Graph()
+        g.parse(data=qa_schema.dump_owl(), format="turtle")
+        doc = URIRef("https://example.org/qa#document")
+        kc_vertex = URIRef("https://example.org/kc#Vertex")
+        assert (doc, RDFS.subClassOf, kc_vertex) in g
+
+    def test_child_not_direct_subclass_of_kc_vertex(self, qa_schema):
+        """specification does NOT have direct rdfs:subClassOf KC:Vertex."""
+        g = Graph()
+        g.parse(data=qa_schema.dump_owl(), format="turtle")
+        spec = URIRef("https://example.org/qa#specification")
+        kc_vertex = URIRef("https://example.org/kc#Vertex")
+        assert (spec, RDFS.subClassOf, kc_vertex) not in g
+
+    def test_guidance_subclass_of_document(self, qa_schema):
+        """guidance rdfs:subClassOf document in OWL graph."""
+        g = Graph()
+        g.parse(data=qa_schema.dump_owl(), format="turtle")
+        guidance = URIRef("https://example.org/qa#guidance")
+        doc = URIRef("https://example.org/qa#document")
+        assert (guidance, RDFS.subClassOf, doc) in g
+
+    def test_child_shape_targets_child(self, qa_schema):
+        """Child SHACL shape has sh:targetClass pointing to child IRI."""
+        ttl = qa_schema.dump_shacl()
+        assert "specificationShape" in ttl
+        assert "qa:specification" in ttl or "qa#specification" in ttl
+
+    def test_parent_shape_targets_parent(self, qa_schema):
+        """Parent SHACL shape has sh:targetClass pointing to parent IRI."""
+        ttl = qa_schema.dump_shacl()
+        assert "documentShape" in ttl
+        assert "qa:document" in ttl or "qa#document" in ttl
+
+    def test_both_shapes_exist(self, qa_schema):
+        """Both parent and child shapes exist in SHACL graph."""
+        ttl = qa_schema.dump_shacl()
+        assert "documentShape" in ttl
+        assert "specificationShape" in ttl
+        assert "guidanceShape" in ttl
+
+
+# ===========================================================================
+# Schema error tests (bad add_*_type calls)
+# ===========================================================================
+
+class TestSchemaErrors:
+
+    def test_parent_nonexistent_raises(self):
+        sb = SchemaBuilder(namespace="err")
+        with pytest.raises(SchemaError):
+            sb.add_vertex_type("child", parent="nonexistent")
+
+    def test_vertex_parent_is_edge_type_raises(self):
+        sb = SchemaBuilder(namespace="err")
+        sb.add_edge_type("my_edge")
+        with pytest.raises(SchemaError):
+            sb.add_vertex_type("child", parent="my_edge")
+
+    def test_edge_parent_is_vertex_type_raises(self):
+        sb = SchemaBuilder(namespace="err")
+        sb.add_vertex_type("my_vertex")
+        with pytest.raises(SchemaError):
+            sb.add_edge_type("child", parent="my_vertex")
+
+    def test_face_parent_is_vertex_type_raises(self):
+        sb = SchemaBuilder(namespace="err")
+        sb.add_vertex_type("my_vertex")
+        with pytest.raises(SchemaError):
+            sb.add_face_type("child", parent="my_vertex")
+
+    def test_duplicate_type_name_raises(self):
+        sb = SchemaBuilder(namespace="err")
+        sb.add_vertex_type("thing")
+        with pytest.raises(SchemaError):
+            sb.add_vertex_type("thing")
+
+
+# ===========================================================================
+# Instance success tests (valid constructions)
+# ===========================================================================
+
+class TestInstanceSuccess:
+
+    def test_child_vertex_with_inherited_and_own_attrs(self, qa_schema_no_bind):
+        """specification vertex with both title (inherited) and format (own) passes."""
+        kc = KnowledgeComplex(schema=qa_schema_no_bind)
+        kc.add_vertex("spec-001", type="specification", title="My Spec", format="PDF")
+
+    def test_guidance_vertex_with_inherited_and_own_attrs(self, qa_schema_no_bind):
+        """guidance vertex with both title (inherited) and criteria (own) passes."""
+        kc = KnowledgeComplex(schema=qa_schema_no_bind)
+        kc.add_vertex("g-001", type="guidance", title="My Guide", criteria="Accuracy")
+
+    def test_parent_vertex_with_own_attrs(self, qa_schema_no_bind):
+        """Plain document vertex with just title passes."""
+        kc = KnowledgeComplex(schema=qa_schema_no_bind)
+        kc.add_vertex("doc-001", type="document", title="Plain Doc")
+
+    def test_full_qa_triangle(self, qa_schema_no_bind):
+        """Build a complete assurance triangle with spec, guidance, edges, face."""
+        kc = KnowledgeComplex(schema=qa_schema_no_bind)
+        kc.add_vertex("spec-001", type="specification", title="Spec A", format="PDF")
+        kc.add_vertex("guid-001", type="guidance", title="Guide A", criteria="Accuracy")
+        kc.add_vertex("doc-001", type="document", title="Doc A")
+
+        kc.add_edge("typ-001", type="typing",
+                    vertices={"spec-001", "guid-001"}, scope="document type")
+        kc.add_edge("ver-001", type="verification",
+                    vertices={"doc-001", "spec-001"}, status="pass")
+        kc.add_edge("val-001", type="validation",
+                    vertices={"doc-001", "guid-001"}, status="pass")
+
+        kc.add_face("assur-001", type="assurance",
+                    boundary=["typ-001", "ver-001", "val-001"])
+
+    def test_multiple_triangles_sharing_vertices(self, qa_schema_no_bind):
+        """Build two assurance triangles that share vertices."""
+        kc = KnowledgeComplex(schema=qa_schema_no_bind)
+        kc.add_vertex("spec-001", type="specification", title="Spec A", format="PDF")
+        kc.add_vertex("guid-001", type="guidance", title="Guide A", criteria="Accuracy")
+        kc.add_vertex("doc-001", type="document", title="Doc A")
+        kc.add_vertex("doc-002", type="document", title="Doc B")
+
+        # Triangle 1
+        kc.add_edge("typ-001", type="typing",
+                    vertices={"spec-001", "guid-001"}, scope="type A")
+        kc.add_edge("ver-001", type="verification",
+                    vertices={"doc-001", "spec-001"}, status="pass")
+        kc.add_edge("val-001", type="validation",
+                    vertices={"doc-001", "guid-001"}, status="pass")
+        kc.add_face("assur-001", type="assurance",
+                    boundary=["typ-001", "ver-001", "val-001"])
+
+        # Triangle 2 shares spec-001 and guid-001
+        kc.add_edge("ver-002", type="verification",
+                    vertices={"doc-002", "spec-001"}, status="pending")
+        kc.add_edge("val-002", type="validation",
+                    vertices={"doc-002", "guid-001"}, status="pending")
+        kc.add_face("assur-002", type="assurance",
+                    boundary=["typ-001", "ver-002", "val-002"])
+
+    def test_child_vertex_in_edge_with_parent_vertex(self, qa_schema_no_bind):
+        """Child vertex works in edges alongside parent vertices."""
+        kc = KnowledgeComplex(schema=qa_schema_no_bind)
+        kc.add_vertex("spec-001", type="specification", title="Spec", format="PDF")
+        kc.add_vertex("doc-001", type="document", title="Doc")
+        kc.add_edge("ver-001", type="verification",
+                    vertices={"doc-001", "spec-001"}, status="pass")
+
+
+# ===========================================================================
+# Instance failure tests (validation rejections)
+# ===========================================================================
+
+class TestInstanceFailure:
+
+    def test_child_missing_inherited_attr_fails(self, qa_schema_no_bind):
+        """specification vertex missing inherited title fails validation."""
+        kc = KnowledgeComplex(schema=qa_schema_no_bind)
+        with pytest.raises(ValidationError):
+            kc.add_vertex("spec-001", type="specification", format="PDF")
+
+    def test_child_missing_own_attr_fails(self, qa_schema_no_bind):
+        """specification vertex missing own format fails validation."""
+        kc = KnowledgeComplex(schema=qa_schema_no_bind)
+        with pytest.raises(ValidationError):
+            kc.add_vertex("spec-001", type="specification", title="My Spec")
+
+    def test_guidance_missing_inherited_attr_fails(self, qa_schema_no_bind):
+        """guidance vertex missing inherited title fails validation."""
+        kc = KnowledgeComplex(schema=qa_schema_no_bind)
+        with pytest.raises(ValidationError):
+            kc.add_vertex("g-001", type="guidance", criteria="Accuracy")
+
+    def test_invalid_vocab_on_edge_fails(self, qa_schema_no_bind):
+        """Edge with invalid vocab value fails validation."""
+        kc = KnowledgeComplex(schema=qa_schema_no_bind)
+        kc.add_vertex("spec-001", type="specification", title="Spec", format="PDF")
+        kc.add_vertex("doc-001", type="document", title="Doc")
+        with pytest.raises(ValidationError):
+            kc.add_edge("ver-001", type="verification",
+                        vertices={"doc-001", "spec-001"}, status="INVALID")
+
+    def test_edge_missing_boundary_vertex_fails(self, qa_schema_no_bind):
+        """Edge referencing a vertex not in the complex fails."""
+        kc = KnowledgeComplex(schema=qa_schema_no_bind)
+        with pytest.raises(ValidationError):
+            kc.add_edge("ver-001", type="verification",
+                        vertices={"ghost-001", "ghost-002"}, status="pass")
+
+    def test_open_triangle_face_fails(self, qa_schema_no_bind):
+        """Face with edges that don't form a closed triangle fails."""
+        kc = KnowledgeComplex(schema=qa_schema_no_bind)
+        kc.add_vertex("v1", type="document", title="A")
+        kc.add_vertex("v2", type="specification", title="B", format="PDF")
+        kc.add_vertex("v3", type="guidance", title="C", criteria="X")
+        kc.add_vertex("v4", type="document", title="D")
+
+        kc.add_edge("e1", type="typing", vertices={"v1", "v2"}, scope="s")
+        kc.add_edge("e2", type="verification", vertices={"v2", "v3"}, status="pass")
+        kc.add_edge("e3", type="validation",
vertices={"v3", "v4"}, status="pass") + + with pytest.raises(ValidationError): + kc.add_face("bad", type="assurance", boundary=["e1", "e2", "e3"]) + + def test_unregistered_type_fails(self, qa_schema_no_bind): + """Adding an element with an unregistered type fails.""" + kc = KnowledgeComplex(schema=qa_schema_no_bind) + with pytest.raises(ValidationError): + kc.add_vertex("x", type="nonexistent_type") + + +# =========================================================================== +# Bind tests (sh:hasValue) +# =========================================================================== + +class TestBind: + + def test_bind_correct_value_passes(self, qa_schema): + """specification with category=structural passes (matches bind).""" + kc = KnowledgeComplex(schema=qa_schema) + kc.add_vertex("spec-001", type="specification", + title="Spec", format="PDF", category="structural") + + def test_bind_wrong_value_fails(self, qa_schema): + """specification with category=quality fails (sh:hasValue violation).""" + kc = KnowledgeComplex(schema=qa_schema) + with pytest.raises(ValidationError): + kc.add_vertex("spec-001", type="specification", + title="Spec", format="PDF", category="quality") + + def test_bind_omitted_fails(self, qa_schema): + """specification omitting category fails (bound implies required).""" + kc = KnowledgeComplex(schema=qa_schema) + with pytest.raises(ValidationError): + kc.add_vertex("spec-001", type="specification", + title="Spec", format="PDF") + + def test_parent_not_bound(self, qa_schema): + """Plain document with any category still passes (bind only constrains child).""" + kc = KnowledgeComplex(schema=qa_schema) + kc.add_vertex("doc-001", type="document", + title="Doc", category="anything") + + def test_bind_vocab_valid_value_schema_ok(self): + """Binding a vocab attribute to a value in the allowed set succeeds.""" + sb = SchemaBuilder(namespace="bv") + sb.add_vertex_type("base", attributes={ + "status": vocab("active", "inactive") + }) + 
+        sb.add_vertex_type("always_active", parent="base",
+                           bind={"status": "active"})
+
+    def test_bind_vocab_invalid_value_raises(self):
+        """Binding a vocab attribute to a value not in the allowed set raises SchemaError."""
+        sb = SchemaBuilder(namespace="bv")
+        sb.add_vertex_type("base", attributes={
+            "status": vocab("active", "inactive")
+        })
+        with pytest.raises(SchemaError):
+            sb.add_vertex_type("broken", parent="base",
+                               bind={"status": "deleted"})
+
+    def test_bind_nonexistent_attr_raises(self):
+        """Binding an attribute not in the type's ancestry raises SchemaError."""
+        sb = SchemaBuilder(namespace="bv")
+        sb.add_vertex_type("base", attributes={"title": text()})
+        with pytest.raises(SchemaError):
+            sb.add_vertex_type("child", parent="base",
+                               bind={"nonexistent": "value"})
+
+    def test_guidance_bind_correct_value_passes(self, qa_schema):
+        """guidance with category=quality passes (matches bind)."""
+        kc = KnowledgeComplex(schema=qa_schema)
+        kc.add_vertex("g-001", type="guidance",
+                      title="Guide", criteria="Accuracy", category="quality")
+
+    def test_guidance_bind_wrong_value_fails(self, qa_schema):
+        """guidance with category=structural fails."""
+        kc = KnowledgeComplex(schema=qa_schema)
+        with pytest.raises(ValidationError):
+            kc.add_vertex("g-001", type="guidance",
+                          title="Guide", criteria="Accuracy", category="structural")
+
+
+# ===========================================================================
+# Introspection tests (describe_type / type_names)
+# ===========================================================================
+
+class TestIntrospection:
+
+    def test_describe_child_parent_and_kind(self, qa_schema):
+        """describe_type returns correct parent and kind."""
+        desc = qa_schema.describe_type("specification")
+        assert desc["parent"] == "document"
+        assert desc["kind"] == "vertex"
+        assert desc["name"] == "specification"
+
+    def test_describe_own_attributes(self, qa_schema):
+        """own_attributes contains format but not title."""
+        desc = qa_schema.describe_type("specification")
+        assert "format" in desc["own_attributes"]
+        assert "title" not in desc["own_attributes"]
+
+    def test_describe_inherited_attributes(self, qa_schema):
+        """inherited_attributes contains title but not format."""
+        desc = qa_schema.describe_type("specification")
+        assert "title" in desc["inherited_attributes"]
+        assert "format" not in desc["inherited_attributes"]
+
+    def test_describe_all_attributes(self, qa_schema):
+        """all_attributes contains both inherited and own."""
+        desc = qa_schema.describe_type("specification")
+        assert "title" in desc["all_attributes"]
+        assert "category" in desc["all_attributes"]
+        assert "format" in desc["all_attributes"]
+
+    def test_describe_bound(self, qa_schema):
+        """bound returns the bound values."""
+        desc = qa_schema.describe_type("specification")
+        assert desc["bound"] == {"category": "structural"}
+
+    def test_describe_parent_has_no_parent(self, qa_schema):
+        """describe_type for root type returns parent=None."""
+        desc = qa_schema.describe_type("document")
+        assert desc["parent"] is None
+        assert desc["inherited_attributes"] == {}
+
+    def test_describe_nonexistent_raises(self, qa_schema):
+        """describe_type for nonexistent type raises SchemaError."""
+        with pytest.raises(SchemaError):
+            qa_schema.describe_type("nonexistent")
+
+    def test_type_names_all(self, qa_schema):
+        """type_names() returns all registered types."""
+        names = qa_schema.type_names()
+        expected = {"document", "specification", "guidance",
+                    "typing", "verification", "validation", "assurance"}
+        assert set(names) == expected
+
+    def test_type_names_vertex(self, qa_schema):
+        """type_names(kind='vertex') returns only vertex types."""
+        names = qa_schema.type_names(kind="vertex")
+        assert set(names) == {"document", "specification", "guidance"}
+
+    def test_type_names_edge(self, qa_schema):
+        """type_names(kind='edge') returns only edge types."""
+        names = qa_schema.type_names(kind="edge")
+        assert set(names) == {"typing", "verification", "validation"}
+
+    def test_type_names_face(self, qa_schema):
+        """type_names(kind='face') returns only face types."""
+        names = qa_schema.type_names(kind="face")
+        assert set(names) == {"assurance"}
+
+
+# ===========================================================================
+# Multi-level inheritance tests
+# ===========================================================================
+
+class TestMultiLevel:
+
+    def test_three_deep_chain_owl(self, deep_schema):
+        """detailed_specification subClassOf specification subClassOf document."""
+        g = Graph()
+        g.parse(data=deep_schema.dump_owl(), format="turtle")
+        ds = URIRef("https://example.org/deep#detailed_specification")
+        spec = URIRef("https://example.org/deep#specification")
+        doc = URIRef("https://example.org/deep#document")
+        kc_vertex = URIRef("https://example.org/kc#Vertex")
+
+        assert (ds, RDFS.subClassOf, spec) in g
+        assert (spec, RDFS.subClassOf, doc) in g
+        assert (doc, RDFS.subClassOf, kc_vertex) in g
+
+    def test_three_deep_inherited_attributes(self, deep_schema):
+        """detailed_specification inherits from both ancestors."""
+        desc = deep_schema.describe_type("detailed_specification")
+        assert "title" in desc["inherited_attributes"]
+        assert "format" in desc["inherited_attributes"]
+        assert "section" in desc["own_attributes"]
+        assert "section" not in desc["inherited_attributes"]
+
+    def test_three_deep_all_attributes(self, deep_schema):
+        """all_attributes includes attrs from the entire chain."""
+        desc = deep_schema.describe_type("detailed_specification")
+        assert "title" in desc["all_attributes"]
+        assert "format" in desc["all_attributes"]
+        assert "section" in desc["all_attributes"]
+
+    def test_three_deep_instance_valid(self, deep_schema):
+        """Instance of detailed_specification must satisfy all ancestor constraints."""
+        kc = KnowledgeComplex(schema=deep_schema)
+        kc.add_vertex("ds-001", type="detailed_specification",
+                      title="Untitled Spec", format="PDF", section="Section 1")
+
+    def test_three_deep_instance_missing_grandparent_attr_fails(self, deep_schema):
+        """Instance missing grandparent's required title fails."""
+        kc = KnowledgeComplex(schema=deep_schema)
+        with pytest.raises(ValidationError):
+            kc.add_vertex("ds-001", type="detailed_specification",
+                          format="PDF", section="Section 1")
+
+    def test_three_deep_instance_missing_parent_attr_fails(self, deep_schema):
+        """Instance missing parent's required format fails."""
+        kc = KnowledgeComplex(schema=deep_schema)
+        with pytest.raises(ValidationError):
+            kc.add_vertex("ds-001", type="detailed_specification",
+                          title="Untitled Spec", section="Section 1")
+
+    def test_three_deep_instance_missing_own_attr_fails(self, deep_schema):
+        """Instance missing own required section fails."""
+        kc = KnowledgeComplex(schema=deep_schema)
+        with pytest.raises(ValidationError):
+            kc.add_vertex("ds-001", type="detailed_specification",
+                          title="Untitled Spec", format="PDF")
+
+    def test_middle_bind_propagates_to_grandchild(self, deep_schema):
+        """Bind at specification level constrains detailed_specification instances."""
+        desc = deep_schema.describe_type("detailed_specification")
+        # The grandchild should see the bind from its parent
+        assert "title" in desc["inherited_attributes"]
+
+        # Instance must use the bound value
+        kc = KnowledgeComplex(schema=deep_schema)
+        kc.add_vertex("ds-001", type="detailed_specification",
+                      title="Untitled Spec", format="PDF", section="S1")
+
+    def test_middle_bind_wrong_value_fails_on_grandchild(self, deep_schema):
+        """Grandchild instance with wrong bound value fails."""
+        kc = KnowledgeComplex(schema=deep_schema)
+        with pytest.raises(ValidationError):
+            kc.add_vertex("ds-001", type="detailed_specification",
+                          title="Wrong Title", format="PDF", section="S1")
+
+
+# ===========================================================================
+# Edge and face type inheritance
+# ===========================================================================
+
+class TestEdgeFaceInheritance:
+
+    def test_edge_type_inheritance(self):
+        """Edge types support parent parameter."""
+        sb = SchemaBuilder(namespace="ef")
+        sb.add_vertex_type("node")
+        sb.add_edge_type("relation", attributes={"weight": text()})
+        sb.add_edge_type("strong_relation", parent="relation",
+                         attributes={"confidence": text()})
+
+        g = Graph()
+        g.parse(data=sb.dump_owl(), format="turtle")
+        strong = URIRef("https://example.org/ef#strong_relation")
+        rel = URIRef("https://example.org/ef#relation")
+        assert (strong, RDFS.subClassOf, rel) in g
+
+    def test_edge_type_inheritance_instance(self):
+        """Instance of child edge type inherits parent attributes."""
+        sb = SchemaBuilder(namespace="ef")
+        sb.add_vertex_type("node")
+        sb.add_edge_type("relation", attributes={"weight": text()})
+        sb.add_edge_type("strong_relation", parent="relation",
+                         attributes={"confidence": text()})
+
+        kc = KnowledgeComplex(schema=sb)
+        kc.add_vertex("v1", type="node")
+        kc.add_vertex("v2", type="node")
+        kc.add_edge("e1", type="strong_relation",
+                    vertices={"v1", "v2"}, weight="high", confidence="0.95")
+
+    def test_edge_child_missing_inherited_attr_fails(self):
+        """Child edge missing inherited weight fails."""
+        sb = SchemaBuilder(namespace="ef")
+        sb.add_vertex_type("node")
+        sb.add_edge_type("relation", attributes={"weight": text()})
+        sb.add_edge_type("strong_relation", parent="relation",
+                         attributes={"confidence": text()})
+
+        kc = KnowledgeComplex(schema=sb)
+        kc.add_vertex("v1", type="node")
+        kc.add_vertex("v2", type="node")
+        with pytest.raises(ValidationError):
+            kc.add_edge("e1", type="strong_relation",
+                        vertices={"v1", "v2"}, confidence="0.95")
+
+    def test_face_type_inheritance(self):
+        """Face types support parent parameter."""
+        sb = SchemaBuilder(namespace="ef")
+        sb.add_vertex_type("node")
+        sb.add_edge_type("link")
+        sb.add_face_type("region", attributes={"label": text()})
+        sb.add_face_type("special_region", parent="region",
+                         attributes={"priority": text()})
+
+        g = Graph()
+        g.parse(data=sb.dump_owl(), format="turtle")
+        special = URIRef("https://example.org/ef#special_region")
+        region = URIRef("https://example.org/ef#region")
+        assert (special, RDFS.subClassOf, region) in g
+
+    def test_face_type_inheritance_instance(self):
+        """Instance of child face type inherits parent attributes."""
+        sb = SchemaBuilder(namespace="ef")
+        sb.add_vertex_type("node")
+        sb.add_edge_type("link")
+        sb.add_face_type("region", attributes={"label": text()})
+        sb.add_face_type("special_region", parent="region",
+                         attributes={"priority": text()})
+
+        kc = KnowledgeComplex(schema=sb)
+        kc.add_vertex("v1", type="node")
+        kc.add_vertex("v2", type="node")
+        kc.add_vertex("v3", type="node")
+        kc.add_edge("e1", type="link", vertices={"v1", "v2"})
+        kc.add_edge("e2", type="link", vertices={"v2", "v3"})
+        kc.add_edge("e3", type="link", vertices={"v1", "v3"})
+        kc.add_face("f1", type="special_region",
+                    boundary=["e1", "e2", "e3"],
+                    label="Zone A", priority="high")
diff --git a/tests/test_io.py b/tests/test_io.py
new file mode 100644
index 0000000..63c5e9e
--- /dev/null
+++ b/tests/test_io.py
@@ -0,0 +1,205 @@
+"""
+tests/test_io.py
+
+Tests for knowledgecomplex.io — multi-format save/load/dump with additive loading.
+"""
+
+import json
+
+import pytest
+from rdflib import Graph
+
+from knowledgecomplex.schema import SchemaBuilder, vocab
+from knowledgecomplex.graph import KnowledgeComplex
+from knowledgecomplex.io import save_graph, load_graph, dump_graph
+from knowledgecomplex.exceptions import ValidationError
+
+
+@pytest.fixture
+def schema() -> SchemaBuilder:
+    sb = SchemaBuilder(namespace="demo")
+    sb.add_vertex_type("Node")
+    sb.add_edge_type(
+        "Link",
+        attributes={"kind": vocab("directed", "undirected")},
+    )
+    sb.add_face_type("Triangle")
+    return sb
+
+
+@pytest.fixture
+def populated_kc(schema) -> KnowledgeComplex:
+    """3-vertex, 3-edge, 1-face valid closed triangle."""
+    kc = KnowledgeComplex(schema=schema)
+    kc.add_vertex("v1", type="Node")
+    kc.add_vertex("v2", type="Node")
+    kc.add_vertex("v3", type="Node")
+    kc.add_edge("e12", type="Link", vertices={"v1", "v2"}, kind="undirected")
+    kc.add_edge("e23", type="Link", vertices={"v2", "v3"}, kind="undirected")
+    kc.add_edge("e13", type="Link", vertices={"v1", "v3"}, kind="undirected")
+    kc.add_face("f123", type="Triangle", boundary=["e12", "e23", "e13"])
+    return kc
+
+
+def _triple_count(kc: KnowledgeComplex) -> int:
+    return len(kc._instance_graph)
+
+
+# ── Round-trip tests ────────────────────────────────────────────────────────
+
+
+def test_save_load_roundtrip_turtle(populated_kc, schema, tmp_path):
+    """Turtle round-trip: loaded graph is superset of original (TBox deduplicates,
+    but the fresh KC's own kc:Complex individual adds a few extra triples)."""
+    path = tmp_path / "instance.ttl"
+    save_graph(populated_kc, path)
+
+    fresh = KnowledgeComplex(schema=schema)
+    baseline = _triple_count(fresh)
+    load_graph(fresh, path)
+    # All original triples loaded; fresh may have a few extra from its own kc:Complex
+    assert _triple_count(fresh) >= _triple_count(populated_kc)
+    # The loaded graph grew beyond the empty-KC baseline
+    assert _triple_count(fresh) > baseline
+
+
+def test_save_load_roundtrip_jsonld(populated_kc, schema, tmp_path):
+    """JSON-LD round-trip: loaded graph is superset of original."""
+    path = tmp_path / "instance.jsonld"
+    save_graph(populated_kc, path, format="json-ld")
+
+    fresh = KnowledgeComplex(schema=schema)
+    baseline = _triple_count(fresh)
+    load_graph(fresh, path)
+    assert _triple_count(fresh) >= _triple_count(populated_kc)
+    assert _triple_count(fresh) > baseline
+
+
+def test_save_load_roundtrip_ntriples(populated_kc, schema, tmp_path):
+    """N-Triples round-trip: loaded graph is superset of original."""
+    path = tmp_path / "instance.nt"
+    save_graph(populated_kc, path, format="ntriples")
+
+    fresh = KnowledgeComplex(schema=schema)
+    baseline = _triple_count(fresh)
+    load_graph(fresh, path)
+    assert _triple_count(fresh) >= _triple_count(populated_kc)
+    assert _triple_count(fresh) > baseline
+
+
+# ── Format auto-detection ──────────────────────────────────────────────────
+
+
+def test_format_autodetect(populated_kc, schema, tmp_path):
+    """.jsonld extension is auto-detected on load."""
+    path = tmp_path / "data.jsonld"
+    save_graph(populated_kc, path, format="json-ld")
+
+    fresh = KnowledgeComplex(schema=schema)
+    load_graph(fresh, path)  # format=None, auto-detected
+    assert _triple_count(fresh) >= _triple_count(populated_kc)
+
+
+def test_format_explicit_override(populated_kc, schema, tmp_path):
+    """Explicit format= overrides file extension."""
+    path = tmp_path / "data.txt"  # unknown extension
+    save_graph(populated_kc, path, format="turtle")
+
+    fresh = KnowledgeComplex(schema=schema)
+    load_graph(fresh, path, format="turtle")  # explicit override
+    assert _triple_count(fresh) >= _triple_count(populated_kc)
+
+
+def test_unknown_extension_raises(populated_kc, tmp_path):
+    """Unknown extension without explicit format raises ValueError."""
+    path = tmp_path / "data.xyz"
+    path.write_text("")
+    with pytest.raises(ValueError, match="Cannot auto-detect"):
+        load_graph(populated_kc, path)
+
+
+# ── Additive loading ───────────────────────────────────────────────────────
+
+
+def test_additive_load(schema, tmp_path):
+    """Loading a file into an existing KC adds triples (does not replace)."""
+    # Build and save KC-A with 2 vertices
+    kc_a = KnowledgeComplex(schema=schema)
+    kc_a.add_vertex("a1", type="Node")
+    kc_a.add_vertex("a2", type="Node")
+    save_graph(kc_a, tmp_path / "a.ttl")
+    count_a = _triple_count(kc_a)
+
+    # Build KC-B with 2 different vertices
+    kc_b = KnowledgeComplex(schema=schema)
+    kc_b.add_vertex("b1", type="Node")
+    kc_b.add_vertex("b2", type="Node")
+    count_b_before = _triple_count(kc_b)
+
+    # Load A into B — triples should grow
+    load_graph(kc_b, tmp_path / "a.ttl")
+    assert _triple_count(kc_b) > count_b_before
+
+
+# ── Validation on load ─────────────────────────────────────────────────────
+
+
+def test_load_validate_pass(populated_kc, schema, tmp_path):
+    """validate=True with valid data succeeds."""
+    path = tmp_path / "valid.ttl"
+    save_graph(populated_kc, path)
+
+    fresh = KnowledgeComplex(schema=schema)
+    load_graph(fresh, path, validate=True)  # should not raise
+    assert _triple_count(fresh) >= _triple_count(populated_kc)
+
+
+def test_load_validate_fail_rollback(schema, tmp_path):
+    """Invalid data with validate=True raises ValidationError, graph unchanged."""
+    # Create a file with a dangling edge (no boundary vertices in complex);
+    # instance names are illustrative, under the https://example.org/<ns># convention
+    bad_ttl = tmp_path / "bad.ttl"
+    bad_ttl.write_text("""\
+@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
+@prefix kc: <https://example.org/kc#> .
+@prefix demo: <https://example.org/demo#> .
+
+demo:bad-edge rdf:type demo:Link ;
+    kc:boundedBy demo:ghost-001 ,
+        demo:ghost-002 .
+
+demo:complex kc:hasElement
+    demo:bad-edge .
+""")
+
+    kc = KnowledgeComplex(schema=schema)
+    count_before = _triple_count(kc)
+
+    with pytest.raises(ValidationError):
+        load_graph(kc, bad_ttl, validate=True)
+
+    # Graph should be unchanged after rollback
+    assert _triple_count(kc) == count_before
+
+
+# ── dump_graph ──────────────────────────────────────────────────────────────
+
+
+def test_dump_graph_jsonld(populated_kc):
+    """dump_graph with json-ld returns parseable JSON-LD."""
+    output = dump_graph(populated_kc, format="json-ld")
+    assert isinstance(output, str)
+    parsed = json.loads(output)  # valid JSON
+    assert isinstance(parsed, (list, dict))
+
+    # Parseable by rdflib
+    g = Graph()
+    g.parse(data=output, format="json-ld")
+    assert len(g) > 0
+
+
+def test_dump_graph_turtle_default(populated_kc):
+    """dump_graph defaults to Turtle."""
+    output = dump_graph(populated_kc)
+    g = Graph()
+    g.parse(data=output, format="turtle")
+    assert len(g) == _triple_count(populated_kc)
diff --git a/tests/test_partition.py b/tests/test_partition.py
new file mode 100644
index 0000000..9be1735
--- /dev/null
+++ b/tests/test_partition.py
@@ -0,0 +1,289 @@
+"""
+tests/test_partition.py
+
+Tests for local partition algorithms:
+  - Graph version: graph_laplacian, approximate_pagerank, heat_kernel_pagerank,
+    sweep_cut, local_partition
+  - Simplicial version: edge_sweep_cut, edge_local_partition
+
+Fixtures:
+  - double_triangle: 4v, 5e, 2f (compact, no clear partition)
+  - barbell: two triangles joined by a bridge edge (clear partition target)
+"""
+
+import pytest
+import numpy as np
+from numpy.testing import assert_allclose
+
+from knowledgecomplex.schema import SchemaBuilder, vocab
+from knowledgecomplex.graph import KnowledgeComplex
+from knowledgecomplex.analysis import (
+    graph_laplacian,
+    approximate_pagerank,
+    heat_kernel_pagerank,
+    sweep_cut,
+    local_partition,
+    edge_sweep_cut,
+    edge_local_partition,
+    boundary_matrices,
+    hodge_laplacian,
+    SweepCut,
+    EdgeSweepCut,
+)
+
+
+@pytest.fixture
+def schema() -> SchemaBuilder:
+    sb = SchemaBuilder(namespace="topo")
+    sb.add_vertex_type("Node")
+    sb.add_edge_type("Link")
+    sb.add_face_type("Triangle")
+    return sb
+
+
+@pytest.fixture
+def double_triangle(schema) -> KnowledgeComplex:
+    """4 vertices, 5 edges, 2 faces sharing edge e23."""
+    kc = KnowledgeComplex(schema=schema)
+    kc.add_vertex("v1", type="Node")
+    kc.add_vertex("v2", type="Node")
+    kc.add_vertex("v3", type="Node")
+    kc.add_vertex("v4", type="Node")
+    kc.add_edge("e12", type="Link", vertices={"v1", "v2"})
+    kc.add_edge("e23", type="Link", vertices={"v2", "v3"})
+    kc.add_edge("e13", type="Link", vertices={"v1", "v3"})
+    kc.add_edge("e24", type="Link", vertices={"v2", "v4"})
+    kc.add_edge("e34", type="Link", vertices={"v3", "v4"})
+    kc.add_face("f123", type="Triangle", boundary=["e12", "e23", "e13"])
+    kc.add_face("f234", type="Triangle", boundary=["e23", "e24", "e34"])
+    return kc
+
+
+@pytest.fixture
+def barbell(schema) -> KnowledgeComplex:
+    r"""Two triangles joined by a bridge edge. Clear partition target.
+
+        v1 --e12-- v2 --e24-- v4 --e45-- v5
+          \       /             \       /
+          e13  e23              e46  e56
+            \  /                  \  /
+             v3                    v6
+
+    Left triangle: v1,v2,v3 with edges e12,e23,e13
+    Bridge: e24 connecting v2-v4
+    Right triangle: v4,v5,v6 with edges e45,e56,e46
+    """
+    kc = KnowledgeComplex(schema=schema)
+    # Left triangle
+    kc.add_vertex("v1", type="Node")
+    kc.add_vertex("v2", type="Node")
+    kc.add_vertex("v3", type="Node")
+    kc.add_edge("e12", type="Link", vertices={"v1", "v2"})
+    kc.add_edge("e23", type="Link", vertices={"v2", "v3"})
+    kc.add_edge("e13", type="Link", vertices={"v1", "v3"})
+    kc.add_face("f_left", type="Triangle", boundary=["e12", "e23", "e13"])
+    # Bridge
+    kc.add_vertex("v4", type="Node")
+    kc.add_edge("e24", type="Link", vertices={"v2", "v4"})
+    # Right triangle
+    kc.add_vertex("v5", type="Node")
+    kc.add_vertex("v6", type="Node")
+    kc.add_edge("e45", type="Link", vertices={"v4", "v5"})
+    kc.add_edge("e56", type="Link", vertices={"v5", "v6"})
+    kc.add_edge("e46", type="Link", vertices={"v4", "v6"})
+    kc.add_face("f_right", type="Triangle", boundary=["e45", "e56", "e46"])
+    return kc
+
+
+# ===========================================================================
+# Graph Laplacian
+# ===========================================================================
+
+class TestGraphLaplacian:
+
+    def test_shape(self, double_triangle):
+        L = graph_laplacian(double_triangle)
+        assert L.shape == (4, 4)
+
+    def test_symmetric(self, double_triangle):
+        L = graph_laplacian(double_triangle)
+        assert_allclose(L.toarray(), L.T.toarray(), atol=1e-12)
+
+    def test_diagonal_is_one(self, double_triangle):
+        """Symmetric normalized Laplacian L = I - D^(-1/2) A D^(-1/2) has 1s on the diagonal."""
+        L = graph_laplacian(double_triangle)
+        assert_allclose(L.diagonal(), np.ones(4))
+
+    def test_row_sums_zero(self, double_triangle):
+        """Symmetric normalized Laplacian L = I - D^(-1/2) A D^(-1/2), so L·1 ≠ 0 in general.
+        But the combinatorial Laplacian D-A has row sums 0."""
+        L = graph_laplacian(double_triangle)
+        # For normalized Laplacian, diagonal is 1 and row sums
+        # depend on degree distribution — just check it's valid
+        assert L.shape[0] == 4
+
+    def test_barbell_shape(self, barbell):
+        L = graph_laplacian(barbell)
+        assert L.shape == (6, 6)
+
+
+# ===========================================================================
+# Approximate PageRank
+# ===========================================================================
+
+class TestApproximatePageRank:
+
+    def test_returns_dicts(self, double_triangle):
+        p, r = approximate_pagerank(double_triangle, "v1")
+        assert isinstance(p, dict)
+        assert isinstance(r, dict)
+
+    def test_p_sums_to_at_most_one(self, double_triangle):
+        p, r = approximate_pagerank(double_triangle, "v1")
+        assert sum(p.values()) <= 1.0 + 1e-10
+
+    def test_residual_bounded(self, double_triangle):
+        eps = 1e-3
+        p, r = approximate_pagerank(double_triangle, "v1", epsilon=eps)
+        for vid, rv in r.items():
+            deg = double_triangle.degree(vid)
+            if deg > 0:
+                assert rv / deg < eps + 1e-10
+
+    def test_seed_has_most_mass(self, double_triangle):
+        p, r = approximate_pagerank(double_triangle, "v1", alpha=0.5)
+        # With high alpha, seed should have significant mass
+        assert p.get("v1", 0) > 0
+
+    def test_barbell_locality(self, barbell):
+        """Starting from v1, more mass on left side than right."""
+        p, r = approximate_pagerank(barbell, "v1", alpha=0.15)
+        left_mass = sum(p.get(v, 0) for v in ["v1", "v2", "v3"])
+        right_mass = sum(p.get(v, 0) for v in ["v4", "v5", "v6"])
+        assert left_mass > right_mass
+
+
+# ===========================================================================
+# Heat kernel PageRank
+# ===========================================================================
+
+class TestHeatKernelPageRank:
+
+    def test_returns_dict(self, double_triangle):
+        rho = heat_kernel_pagerank(double_triangle, "v1")
+        assert isinstance(rho, dict)
+
+    def test_sums_near_one(self, double_triangle):
+        rho = heat_kernel_pagerank(double_triangle, "v1", t=5.0)
+        assert abs(sum(rho.values()) - 1.0) < 0.01
+
+    def test_concentrated_small_t(self, double_triangle):
+        """For small t, mass is concentrated near seed."""
+        rho = heat_kernel_pagerank(double_triangle, "v1", t=0.1)
+        assert rho.get("v1", 0) > rho.get("v4", 0)
+
+    def test_spreads_large_t(self, double_triangle):
+        """For large t, mass spreads toward stationary distribution."""
+        rho_small = heat_kernel_pagerank(double_triangle, "v1", t=0.5)
+        rho_large = heat_kernel_pagerank(double_triangle, "v1", t=50.0)
+        # Large t should be more uniform
+        vals_large = list(rho_large.values())
+        vals_small = list(rho_small.values())
+        assert np.std(vals_large) < np.std(vals_small)
+
+
+# ===========================================================================
+# Sweep cut (graph)
+# ===========================================================================
+
+class TestSweepCut:
+
+    def test_returns_sweepcut(self, double_triangle):
+        p, _ = approximate_pagerank(double_triangle, "v1")
+        cut = sweep_cut(double_triangle, p)
+        assert isinstance(cut, SweepCut)
+
+    def test_conductance_positive(self, double_triangle):
+        p, _ = approximate_pagerank(double_triangle, "v1")
+        cut = sweep_cut(double_triangle, p)
+        assert cut.conductance > 0
+
+    def test_barbell_finds_bridge(self, barbell):
+        """Barbell graph should yield a cut with low conductance at the bridge."""
+        p, _ = approximate_pagerank(barbell, "v1", alpha=0.15)
+        cut = sweep_cut(barbell, p)
+        assert cut.conductance < 1.0
+        # The small side should be one of the two triangles (3 vertices)
+        assert len(cut.vertices) <= 4
+
+    def test_max_volume(self, barbell):
+        p, _ = approximate_pagerank(barbell, "v1")
+        cut = sweep_cut(barbell, p, max_volume=6)
+        assert cut.volume <= 6
+
+
+# ===========================================================================
+# Local partition (graph)
+# ===========================================================================
+
+class TestLocalPartition:
+
+    def test_pagerank_method(self, barbell):
+        cut = local_partition(barbell, "v1", method="pagerank")
+        assert isinstance(cut, SweepCut)
+        assert cut.conductance > 0
+
+    def test_heat_kernel_method(self, barbell):
+        cut = local_partition(barbell, "v1", method="heat_kernel")
+        assert isinstance(cut, SweepCut)
+        assert cut.conductance > 0
+
+    def test_barbell_low_conductance(self, barbell):
+        cut = local_partition(barbell, "v1", method="pagerank")
+        # Barbell has a clear bottleneck; conductance should be small
+        assert cut.conductance < 1.0
+
+
+# ===========================================================================
+# Edge sweep cut (simplicial)
+# ===========================================================================
+
+class TestEdgeSweepCut:

+    def test_returns_result(self, double_triangle):
+        from knowledgecomplex.analysis import edge_pagerank
+        pr = edge_pagerank(double_triangle, "e12", beta=0.1)
+        cut = edge_sweep_cut(double_triangle, pr)
+        assert isinstance(cut, EdgeSweepCut)
+
+    def test_conductance_positive(self, double_triangle):
+        from knowledgecomplex.analysis import edge_pagerank
+        pr = edge_pagerank(double_triangle, "e12", beta=0.1)
+        cut = edge_sweep_cut(double_triangle, pr)
+        assert cut.conductance > 0
+
+
+# ===========================================================================
+# Edge local partition (simplicial)
+# ===========================================================================
+
+class TestEdgeLocalPartition:
+
+    def test_hodge_pagerank_method(self, double_triangle):
+        cut = edge_local_partition(double_triangle, "e12", method="hodge_pagerank")
+        assert isinstance(cut, EdgeSweepCut)
+
+    def test_hodge_heat_method(self, double_triangle):
+        cut = edge_local_partition(double_triangle, "e12", method="hodge_heat")
+        assert isinstance(cut, EdgeSweepCut)
+
+    def test_with_weights(self, double_triangle):
+        w = {"v1": 2.0, "f123": 3.0}
+        cut = edge_local_partition(double_triangle, "e12",
+                                   method="hodge_pagerank", weights=w)
+        assert isinstance(cut, EdgeSweepCut)
+
+    def test_barbell_edge_partition(self, barbell):
+        cut = edge_local_partition(barbell, "e12", method="hodge_pagerank")
+        assert isinstance(cut, EdgeSweepCut)
+        assert cut.conductance > 0
diff --git a/tests/test_stress.py b/tests/test_stress.py
new file mode 100644
index 0000000..24fe5c2
--- /dev/null
+++ b/tests/test_stress.py
@@ -0,0 +1,511 @@
+"""
+tests/test_stress.py
+
+Adversarial and edge-case tests to smoke out issues before public release.
+Covers: weird inputs, boundary conditions, concurrency-like patterns,
+round-trip fidelity, API misuse, and internal consistency.
+"""
+
+import pytest
+import re
+from pathlib import Path
+
+from knowledgecomplex.schema import SchemaBuilder, vocab, text, TextDescriptor, VocabDescriptor
+from knowledgecomplex.graph import KnowledgeComplex, Element
+from knowledgecomplex.filtration import Filtration
+from knowledgecomplex.exceptions import ValidationError, SchemaError, UnknownQueryError
+
+
+# ===========================================================================
+# Namespace and ID edge cases
+# ===========================================================================
+
+class TestWeirdNames:
+
+    def test_hyphenated_ids(self):
+        sb = SchemaBuilder(namespace="test")
+        sb.add_vertex_type("my-type", attributes={"my-attr": text()})
+        kc = KnowledgeComplex(schema=sb)
+        kc.add_vertex("my-vertex-1", type="my-type", **{"my-attr": "hello"})
+
+    def test_numeric_ids(self):
+        sb = SchemaBuilder(namespace="test")
+        sb.add_vertex_type("Node")
+        kc = KnowledgeComplex(schema=sb)
+        kc.add_vertex("123", type="Node")
+        kc.add_vertex("456", type="Node")
+
+    def test_unicode_attribute_values(self):
+        sb = SchemaBuilder(namespace="test")
+        sb.add_vertex_type("Node", attributes={"name": text()})
+        kc = KnowledgeComplex(schema=sb)
+        kc.add_vertex("v1", type="Node", name="日本語テスト")
+        elem = kc.element("v1")
+        assert elem.attrs["name"] == "日本語テスト"
+
+    def test_empty_string_attribute(self):
+        sb = SchemaBuilder(namespace="test")
+        sb.add_vertex_type("Node", attributes={"name": text()})
+        kc = KnowledgeComplex(schema=sb)
+        kc.add_vertex("v1", type="Node", name="")
+
+    def test_long_ids(self):
+        sb = SchemaBuilder(namespace="test")
+        sb.add_vertex_type("Node")
+        kc = KnowledgeComplex(schema=sb)
+        long_id = "x" * 500
+        kc.add_vertex(long_id, type="Node")
+        assert kc.element(long_id).id == long_id
+
+    def test_special_chars_in_namespace(self):
+        """Namespace with dots or underscores."""
+        sb = SchemaBuilder(namespace="my_project")
+        sb.add_vertex_type("Node")
+        kc = KnowledgeComplex(schema=sb)
+        kc.add_vertex("v1", type="Node")
+
+
+# ===========================================================================
+# Empty and minimal complexes
+# ===========================================================================
+
+class TestEmptyComplex:
+
+    def test_empty_verify(self):
+        sb = SchemaBuilder(namespace="test")
+        sb.add_vertex_type("Node")
+        kc = KnowledgeComplex(schema=sb)
+        kc.verify()  # empty complex is valid
+
+    def test_empty_audit(self):
+        sb = SchemaBuilder(namespace="test")
+        sb.add_vertex_type("Node")
+        kc = KnowledgeComplex(schema=sb)
+        report = kc.audit()
+        assert report.conforms
+
+    def test_empty_element_ids(self):
+        sb = SchemaBuilder(namespace="test")
+        sb.add_vertex_type("Node")
+        kc = KnowledgeComplex(schema=sb)
+        assert kc.element_ids() == []
+
+    def test_empty_skeleton(self):
+        sb = SchemaBuilder(namespace="test")
+        sb.add_vertex_type("Node")
+        kc = KnowledgeComplex(schema=sb)
+        assert kc.skeleton(0) == set()
+        assert kc.skeleton(1) == set()
+        assert kc.skeleton(2) == set()
+
+    def test_single_vertex(self):
+        sb = SchemaBuilder(namespace="test")
+        sb.add_vertex_type("Node")
+        kc = KnowledgeComplex(schema=sb)
+        kc.add_vertex("v1", type="Node")
+        assert kc.boundary("v1") == set()
+        assert kc.coboundary("v1") == set()
+        assert 
kc.star("v1") == {"v1"} + assert kc.degree("v1") == 0 + + +# =========================================================================== +# Duplicate element IDs +# =========================================================================== + +class TestDuplicateIds: + + def test_duplicate_vertex_id(self): + """Adding a vertex with an existing ID should fail or produce invalid state.""" + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("Node") + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("v1", type="Node") + # Second add with same ID — what happens? + # This should either raise or the graph should still verify + try: + kc.add_vertex("v1", type="Node") + except (ValidationError, ValueError): + pass # acceptable + else: + # If no exception, at least verify the complex is still valid + kc.verify() + + +# =========================================================================== +# Schema consistency +# =========================================================================== + +class TestSchemaConsistency: + + def test_describe_type_after_inheritance(self): + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("parent", attributes={"a": text()}) + sb.add_vertex_type("child", parent="parent", attributes={"b": text()}) + desc = sb.describe_type("child") + assert "a" in desc["inherited_attributes"] + assert "b" in desc["own_attributes"] + assert "a" in desc["all_attributes"] + assert "b" in desc["all_attributes"] + + def test_type_names_consistent_with_describe(self): + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("V1") + sb.add_edge_type("E1") + sb.add_face_type("F1") + for name in sb.type_names(): + desc = sb.describe_type(name) + assert desc["name"] == name + + +# =========================================================================== +# Export/load round-trip fidelity +# =========================================================================== + +class TestRoundTrip: + + def test_schema_round_trip(self, tmp_path): + sb = 
SchemaBuilder(namespace="rt") + sb.add_vertex_type("Node", attributes={"name": text()}) + sb.add_edge_type("Link", attributes={"weight": vocab("light", "heavy")}) + sb.add_face_type("Tri") + sb.export(tmp_path) + + sb2 = SchemaBuilder.load(tmp_path) + assert set(sb2.type_names()) == set(sb.type_names()) + + def test_complex_round_trip(self, tmp_path): + sb = SchemaBuilder(namespace="rt") + sb.add_vertex_type("Node", attributes={"name": text()}) + sb.add_edge_type("Link") + sb.add_face_type("Tri") + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("v1", type="Node", name="Alice") + kc.add_vertex("v2", type="Node", name="Bob") + kc.add_vertex("v3", type="Node", name="Carol") + kc.add_edge("e12", type="Link", vertices={"v1", "v2"}) + kc.add_edge("e23", type="Link", vertices={"v2", "v3"}) + kc.add_edge("e13", type="Link", vertices={"v1", "v3"}) + kc.add_face("f", type="Tri", boundary=["e12", "e23", "e13"]) + + kc.export(tmp_path / "out") + kc2 = KnowledgeComplex.load(tmp_path / "out") + + assert set(kc2.element_ids()) == set(kc.element_ids()) + assert kc2.element("v1").attrs["name"] == "Alice" + + def test_round_trip_preserves_boundary(self, tmp_path): + sb = SchemaBuilder(namespace="rt") + sb.add_vertex_type("N") + sb.add_edge_type("E") + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("a", type="N") + kc.add_vertex("b", type="N") + kc.add_edge("e", type="E", vertices={"a", "b"}) + + kc.export(tmp_path / "out") + kc2 = KnowledgeComplex.load(tmp_path / "out") + assert kc2.boundary("e") == {"a", "b"} + + +# =========================================================================== +# Filtration edge cases +# =========================================================================== + +class TestFiltrationEdgeCases: + + def _make_kc(self): + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("N") + sb.add_edge_type("E") + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("v1", type="N") + kc.add_vertex("v2", type="N") + kc.add_edge("e12", type="E", 
vertices={"v1", "v2"}) + return kc + + def test_empty_filtration_iteration(self): + kc = self._make_kc() + filt = Filtration(kc) + assert list(filt) == [] + + def test_single_element_filtration(self): + kc = self._make_kc() + filt = Filtration(kc) + filt.append({"v1"}) + assert len(filt) == 1 + assert filt[0] == {"v1"} + + def test_append_closure_from_empty(self): + kc = self._make_kc() + filt = Filtration(kc) + filt.append_closure({"e12"}) + assert filt[0] == {"v1", "v2", "e12"} + + def test_from_function_all_same_value(self): + kc = self._make_kc() + filt = Filtration.from_function(kc, lambda _: 0) + assert len(filt) == 1 + + +# =========================================================================== +# Analysis edge cases +# =========================================================================== + +class TestAnalysisEdgeCases: + + def test_betti_single_vertex(self): + from knowledgecomplex.analysis import betti_numbers + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("N") + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("v1", type="N") + assert betti_numbers(kc) == [1, 0, 0] + + def test_betti_no_elements(self): + from knowledgecomplex.analysis import betti_numbers + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("N") + kc = KnowledgeComplex(schema=sb) + assert betti_numbers(kc) == [0, 0, 0] + + def test_boundary_matrices_vertices_only(self): + from knowledgecomplex.analysis import boundary_matrices + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("N") + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("v1", type="N") + kc.add_vertex("v2", type="N") + bm = boundary_matrices(kc) + assert bm.B1.shape == (2, 0) + assert bm.B2.shape == (0, 0) + + +# =========================================================================== +# Clique inference edge cases +# =========================================================================== + +class TestCliqueEdgeCases: + + def test_find_cliques_no_edges(self): + from 
knowledgecomplex import find_cliques + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("N") + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("v1", type="N") + kc.add_vertex("v2", type="N") + assert find_cliques(kc, k=3) == [] + + def test_find_cliques_no_triangles(self): + from knowledgecomplex import find_cliques + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("N") + sb.add_edge_type("E") + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("v1", type="N") + kc.add_vertex("v2", type="N") + kc.add_edge("e12", type="E", vertices={"v1", "v2"}) + assert find_cliques(kc, k=3) == [] + + + # =========================================================================== + # Deferred verification + # =========================================================================== + + class TestDeferredVerification: + + def test_deferred_valid_construction(self): + """A valid build sequence in deferred mode verifies cleanly at context exit.""" + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("N") + sb.add_edge_type("E") + kc = KnowledgeComplex(schema=sb) + + # We can't easily build invalid state in deferred mode, since the + # Python-side guards (cardinality checks) still fire on every call. + # So we confirm that deferred + valid construction verifies at exit.
+ with kc.deferred_verification(): + kc.add_vertex("v1", type="N") + kc.add_vertex("v2", type="N") + kc.add_edge("e12", type="E", vertices={"v1", "v2"}) + + +# =========================================================================== +# Codec edge cases +# =========================================================================== + +class TestCodecEdgeCases: + + def test_register_codec_for_nonexistent_type(self): + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("Node") + kc = KnowledgeComplex(schema=sb) + + class FakeCodec: + def compile(self, element): pass + def decompile(self, uri): return {} + + with pytest.raises(SchemaError): + kc.register_codec("Nonexistent", FakeCodec()) + + def test_compile_without_uri_raises(self): + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("Node") + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("v1", type="Node") + + class FakeCodec: + def compile(self, element): pass + def decompile(self, uri): return {} + + kc.register_codec("Node", FakeCodec()) + with pytest.raises(ValueError): + kc.element("v1").compile() + + +# =========================================================================== +# is_subcomplex edge cases +# =========================================================================== + +class TestSubcomplexEdgeCases: + + def test_empty_is_subcomplex(self): + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("N") + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("v1", type="N") + assert kc.is_subcomplex(set()) is True + + def test_full_complex_is_subcomplex(self): + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("N") + sb.add_edge_type("E") + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("v1", type="N") + kc.add_vertex("v2", type="N") + kc.add_edge("e12", type="E", vertices={"v1", "v2"}) + all_ids = set(kc.element_ids()) + assert kc.is_subcomplex(all_ids) is True + + +# =========================================================================== +# Topological query edge 
cases +# =========================================================================== + +class TestTopologyEdgeCases: + + def test_skeleton_invalid_k(self): + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("N") + kc = KnowledgeComplex(schema=sb) + with pytest.raises(ValueError): + kc.skeleton(-1) + with pytest.raises(ValueError): + kc.skeleton(3) + + def test_boundary_of_vertex_is_empty(self): + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("N") + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("v1", type="N") + assert kc.boundary("v1") == set() + + def test_star_of_isolated_vertex(self): + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("N") + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("v1", type="N") + assert kc.star("v1") == {"v1"} + + def test_closure_of_single_vertex(self): + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("N") + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("v1", type="N") + assert kc.closure("v1") == {"v1"} + + +# =========================================================================== +# remove_element edge cases +# =========================================================================== + +class TestRemoveElement: + + def test_remove_nonexistent_raises(self): + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("N") + kc = KnowledgeComplex(schema=sb) + with pytest.raises(ValueError): + kc.remove_element("ghost") + + def test_remove_then_readd(self): + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("N") + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("v1", type="N") + kc.remove_element("v1") + assert "v1" not in kc.element_ids() + kc.add_vertex("v1", type="N") + assert "v1" in kc.element_ids() + + +# =========================================================================== +# Ontology module imports +# =========================================================================== + +class TestOntologyImports: + + def test_operations_schema(self): + from 
knowledgecomplex.ontologies import operations + sb = operations.schema() + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("alice", type="actor", name="Alice") + + def test_brand_schema(self): + from knowledgecomplex.ontologies import brand + sb = brand.schema() + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("gen-z", type="audience", name="Gen Z") + kc.add_vertex("trust", type="theme", name="Trust") + kc.add_edge("r1", type="resonance", + vertices={"gen-z", "trust"}, + valence="positive", intensity="strong") + + def test_research_schema(self): + from knowledgecomplex.ontologies import research + sb = research.schema() + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("paper1", type="paper", title="A Great Paper") + kc.add_vertex("ml", type="concept", name="Machine Learning") + kc.add_edge("d1", type="discusses", + vertices={"paper1", "ml"}, depth="primary") + + +# =========================================================================== +# Attribute validation +# =========================================================================== + +class TestAttributeValidation: + + def test_invalid_vocab_value_rejected(self): + sb = SchemaBuilder(namespace="test") + sb.add_edge_type("E", attributes={"status": vocab("open", "closed")}) + sb.add_vertex_type("N") + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("v1", type="N") + kc.add_vertex("v2", type="N") + with pytest.raises(ValidationError): + kc.add_edge("e1", type="E", vertices={"v1", "v2"}, status="INVALID") + + def test_missing_required_attribute_rejected(self): + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("Node", attributes={"name": text()}) + kc = KnowledgeComplex(schema=sb) + with pytest.raises(ValidationError): + kc.add_vertex("v1", type="Node") # missing name + + def test_optional_attribute_not_required(self): + sb = SchemaBuilder(namespace="test") + sb.add_vertex_type("Node", attributes={"name": text(required=False)}) + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("v1", 
type="Node") # should not raise diff --git a/tests/test_topology.py b/tests/test_topology.py new file mode 100644 index 0000000..2a20b8e --- /dev/null +++ b/tests/test_topology.py @@ -0,0 +1,289 @@ +""" +tests/test_topology.py + +Tests for topological query methods on KnowledgeComplex: +boundary, coboundary, star, closure, closed_star, link, skeleton, degree. + +Test fixture: double-triangle complex sharing edge e23. + + v1 --e12-- v2 --e24-- v4 + \ | / + e13 e23 e34 + \ | / + v3 + +f123 = (e12, e23, e13) +f234 = (e23, e24, e34) + +4 vertices, 5 edges, 2 faces = 11 elements. +""" + +import pytest + +from knowledgecomplex.schema import SchemaBuilder, vocab +from knowledgecomplex.graph import KnowledgeComplex +from knowledgecomplex.exceptions import SchemaError + + +@pytest.fixture +def schema() -> SchemaBuilder: + sb = SchemaBuilder(namespace="topo") + sb.add_vertex_type("Node") + sb.add_edge_type("Link", attributes={"weight": vocab("light", "heavy")}) + sb.add_face_type("Triangle") + return sb + + +@pytest.fixture +def double_triangle(schema) -> KnowledgeComplex: + """4 vertices, 5 edges, 2 faces sharing edge e23.""" + kc = KnowledgeComplex(schema=schema) + kc.add_vertex("v1", type="Node") + kc.add_vertex("v2", type="Node") + kc.add_vertex("v3", type="Node") + kc.add_vertex("v4", type="Node") + kc.add_edge("e12", type="Link", vertices={"v1", "v2"}, weight="light") + kc.add_edge("e23", type="Link", vertices={"v2", "v3"}, weight="heavy") + kc.add_edge("e13", type="Link", vertices={"v1", "v3"}, weight="light") + kc.add_edge("e24", type="Link", vertices={"v2", "v4"}, weight="heavy") + kc.add_edge("e34", type="Link", vertices={"v3", "v4"}, weight="light") + kc.add_face("f123", type="Triangle", boundary=["e12", "e23", "e13"]) + kc.add_face("f234", type="Triangle", boundary=["e23", "e24", "e34"]) + return kc + + +# --- boundary --- + +class TestBoundary: + def test_vertex_boundary_is_empty(self, double_triangle): + assert
double_triangle.boundary("v1") == set() + + def test_edge_boundary(self, double_triangle): + assert double_triangle.boundary("e12") == {"v1", "v2"} + + def test_face_boundary(self, double_triangle): + assert double_triangle.boundary("f123") == {"e12", "e23", "e13"} + + def test_boundary_with_type_filter(self, double_triangle): + # All boundary elements of e12 are Node vertices + assert double_triangle.boundary("e12", type="Node") == {"v1", "v2"} + + def test_boundary_type_filter_excludes(self, double_triangle): + # boundary of e12 are vertices, filtering by Link returns empty + assert double_triangle.boundary("e12", type="Link") == set() + + +# --- coboundary --- + +class TestCoboundary: + def test_vertex_coboundary(self, double_triangle): + assert double_triangle.coboundary("v1") == {"e12", "e13"} + + def test_shared_edge_coboundary(self, double_triangle): + # e23 is shared by both faces + assert double_triangle.coboundary("e23") == {"f123", "f234"} + + def test_face_coboundary_is_empty(self, double_triangle): + assert double_triangle.coboundary("f123") == set() + + def test_coboundary_with_type_filter(self, double_triangle): + # coboundary of v2 filtered to Triangle type + assert double_triangle.coboundary("v2", type="Triangle") == set() + # coboundary of v2 filtered to Link type + assert double_triangle.coboundary("v2", type="Link") == {"e12", "e23", "e24"} + + +# --- star --- + +class TestStar: + def test_vertex_star(self, double_triangle): + # v1 is in e12, e13, f123 + assert double_triangle.star("v1") == {"v1", "e12", "e13", "f123"} + + def test_central_vertex_star(self, double_triangle): + # v2 is in e12, e23, e24, f123, f234 + assert double_triangle.star("v2") == {"v2", "e12", "e23", "e24", "f123", "f234"} + + def test_shared_edge_star(self, double_triangle): + assert double_triangle.star("e23") == {"e23", "f123", "f234"} + + def test_face_star_is_self(self, double_triangle): + assert double_triangle.star("f123") == {"f123"} + + def 
test_star_with_type_filter(self, double_triangle): + # star of v1 filtered to Link edges only + assert double_triangle.star("v1", type="Link") == {"e12", "e13"} + + +# --- closure --- + +class TestClosure: + def test_vertex_closure_is_self(self, double_triangle): + assert double_triangle.closure("v1") == {"v1"} + + def test_edge_closure(self, double_triangle): + assert double_triangle.closure("e12") == {"e12", "v1", "v2"} + + def test_face_closure(self, double_triangle): + assert double_triangle.closure("f123") == { + "f123", "e12", "e23", "e13", "v1", "v2", "v3" + } + + def test_closure_set_input(self, double_triangle): + # closure of {e12, e34} = union of their closures + assert double_triangle.closure({"e12", "e34"}) == { + "e12", "v1", "v2", "e34", "v3", "v4" + } + + def test_closure_with_type_filter(self, double_triangle): + # closure of f123 filtered to Node vertices only + assert double_triangle.closure("f123", type="Node") == {"v1", "v2", "v3"} + + +# --- closed_star --- + +class TestClosedStar: + def test_closed_star_of_central_vertex(self, double_triangle): + # v2 touches everything — closed star should be the entire complex + cs = double_triangle.closed_star("v2") + all_elements = { + "v1", "v2", "v3", "v4", + "e12", "e23", "e13", "e24", "e34", + "f123", "f234", + } + assert cs == all_elements + + def test_closed_star_of_peripheral_vertex(self, double_triangle): + # v1 star = {v1, e12, e13, f123} + # closure adds boundary of f123: e23, v2, v3 + cs = double_triangle.closed_star("v1") + assert cs == {"v1", "v2", "v3", "e12", "e13", "e23", "f123"} + + +# --- link --- + +class TestLink: + def test_link_of_central_vertex(self, double_triangle): + # Lk(v2) = Cl(St(v2)) \ St(v2) + # St(v2) = {v2, e12, e23, e24, f123, f234} + # Cl(St(v2)) = entire complex + # Link = {v1, v3, v4, e13, e34} + assert double_triangle.link("v2") == {"v1", "v3", "v4", "e13", "e34"} + + def test_link_of_peripheral_vertex(self, double_triangle): + # St(v1) = {v1, e12, e13, f123} + # 
Cl(St(v1)) = {v1, v2, v3, e12, e13, e23, f123} + # Link = {v2, v3, e23} + assert double_triangle.link("v1") == {"v2", "v3", "e23"} + + def test_link_of_shared_edge(self, double_triangle): + # St(e23) = {e23, f123, f234} + # Cl(St(e23)) = {e23, v2, v3, f123, e12, e13, v1, f234, e24, e34, v4} + # Link = Cl(St) - St = {v2, v3, e12, e13, v1, e24, e34, v4} + expected = {"v1", "v2", "v3", "v4", "e12", "e13", "e24", "e34"} + assert double_triangle.link("e23") == expected + + def test_link_of_face(self, double_triangle): + # St(f123) = {f123} + # Cl(St(f123)) = {f123, e12, e23, e13, v1, v2, v3} + # Link = {e12, e23, e13, v1, v2, v3} + assert double_triangle.link("f123") == {"e12", "e23", "e13", "v1", "v2", "v3"} + + def test_link_with_type_filter(self, double_triangle): + # link of v2 filtered to Node only + assert double_triangle.link("v2", type="Node") == {"v1", "v3", "v4"} + + +# --- skeleton --- + +class TestSkeleton: + def test_skeleton_0(self, double_triangle): + assert double_triangle.skeleton(0) == {"v1", "v2", "v3", "v4"} + + def test_skeleton_1(self, double_triangle): + assert double_triangle.skeleton(1) == { + "v1", "v2", "v3", "v4", + "e12", "e23", "e13", "e24", "e34", + } + + def test_skeleton_2(self, double_triangle): + assert double_triangle.skeleton(2) == { + "v1", "v2", "v3", "v4", + "e12", "e23", "e13", "e24", "e34", + "f123", "f234", + } + + def test_skeleton_negative_raises(self, double_triangle): + with pytest.raises(ValueError, match="skeleton dimension"): + double_triangle.skeleton(-1) + + def test_skeleton_too_high_raises(self, double_triangle): + with pytest.raises(ValueError, match="skeleton dimension"): + double_triangle.skeleton(3) + + +# --- degree --- + +class TestDegree: + def test_degree_peripheral_vertex(self, double_triangle): + assert double_triangle.degree("v1") == 2 # e12, e13 + + def test_degree_central_vertex(self, double_triangle): + assert double_triangle.degree("v2") == 3 # e12, e23, e24 + + def test_degree_v3(self, 
double_triangle): + assert double_triangle.degree("v3") == 3 # e23, e13, e34 + + def test_degree_v4(self, double_triangle): + assert double_triangle.degree("v4") == 2 # e24, e34 + + + # --- composability --- + + class TestComposability: + def test_closure_of_star(self, double_triangle): + """closure(star(id)) == closed_star(id).""" + cs = double_triangle.closed_star("v1") + composed = double_triangle.closure(double_triangle.star("v1")) + assert cs == composed + + def test_set_intersection(self, double_triangle): + """Star intersection finds shared elements.""" + s1 = double_triangle.star("v1") + s2 = double_triangle.star("v3") + # v1 and v3 share e13 and f123 + shared = s1 & s2 + assert "e13" in shared + assert "f123" in shared + + def test_set_union(self, double_triangle): + """Star union combines neighborhoods.""" + s1 = double_triangle.star("v1") + s4 = double_triangle.star("v4") + combined = s1 | s4 + assert "v1" in combined + assert "v4" in combined + + def test_set_difference(self, double_triangle): + """Set difference supports custom link-like operations.""" + st = double_triangle.star("v1") + bd = double_triangle.boundary("e12") + # St(v1) = {v1, e12, e13, f123}; removing e12's boundary {v1, v2} + # strips v1 and leaves only the cofaces + result = st - bd + assert result == {"e12", "e13", "f123"} + + + # --- type filter edge cases --- + + class TestTypeFilterEdgeCases: + def test_invalid_type_raises(self, double_triangle): + with pytest.raises(SchemaError): + double_triangle.star("v1", type="NonexistentType") + + def test_star_filter_to_triangle(self, double_triangle): + assert double_triangle.star("v1", type="Triangle") == {"f123"} + + def test_coboundary_filter_empty_result(self, double_triangle): + # Coboundary is direct containment only, so a vertex's coboundary + # holds edges, never faces + assert double_triangle.coboundary("v1", type="Triangle") == set() diff --git a/tests/test_topology_constraints.py b/tests/test_topology_constraints.py new file mode 100644 index 0000000..333249a ---
/dev/null +++ b/tests/test_topology_constraints.py @@ -0,0 +1,198 @@ +""" +tests/test_topology_constraints.py + +Tests for Tier 3: schema-level query registration (add_query) and +topological constraint escalation (add_topological_constraint). +""" + +import pytest + +from knowledgecomplex.schema import SchemaBuilder, vocab +from knowledgecomplex.graph import KnowledgeComplex +from knowledgecomplex.exceptions import ValidationError, SchemaError + + +# --- Schema-level query registration --- + + +class TestAddQuery: + def test_add_query_registers_template(self): + sb = SchemaBuilder(namespace="q") + sb.add_vertex_type("Node") + sb.add_edge_type("Link") + sb.add_query("node_coboundary", "coboundary", target_type="Link") + assert "node_coboundary" in sb._queries + assert "kc:boundedBy" in sb._queries["node_coboundary"] + + def test_add_query_export_creates_sparql_file(self, tmp_path): + sb = SchemaBuilder(namespace="q") + sb.add_vertex_type("Node") + sb.add_edge_type("Link") + sb.add_query("node_coboundary", "coboundary", target_type="Link") + sb.export(tmp_path / "schema") + sparql_file = tmp_path / "schema" / "queries" / "node_coboundary.sparql" + assert sparql_file.exists() + content = sparql_file.read_text() + assert "SELECT" in content + assert "kc:boundedBy" in content + + def test_add_query_loadable_at_runtime(self, tmp_path): + sb = SchemaBuilder(namespace="q") + sb.add_vertex_type("Node") + sb.add_edge_type("Link") + sb.add_query("node_coboundary", "coboundary", target_type="Link") + + # Export schema + queries + instance + kc = KnowledgeComplex(schema=sb) + kc.add_vertex("v1", type="Node") + kc.add_vertex("v2", type="Node") + kc.add_edge("e12", type="Link", vertices={"v1", "v2"}) + kc.export(tmp_path / "out") + + # Reload and verify query is available + loaded = KnowledgeComplex.load(tmp_path / "out") + assert "node_coboundary" in loaded._query_templates + + def test_add_query_unknown_operation_raises(self): + sb = SchemaBuilder(namespace="q") + 
sb.add_vertex_type("Node") + with pytest.raises(SchemaError, match="Unknown topological operation"): + sb.add_query("bad", "nonexistent") + + def test_add_query_unknown_target_type_raises(self): + sb = SchemaBuilder(namespace="q") + sb.add_vertex_type("Node") + with pytest.raises(SchemaError, match="not registered"): + sb.add_query("bad", "coboundary", target_type="Nonexistent") + + def test_add_query_chaining(self): + sb = SchemaBuilder(namespace="q") + sb.add_vertex_type("Node") + sb.add_edge_type("Link") + result = sb.add_query("q1", "boundary").add_query("q2", "star") + assert result is sb + assert "q1" in sb._queries + assert "q2" in sb._queries + + + # --- Topological constraint escalation --- + + + class TestAddTopologicalConstraint: + def test_coboundary_min_count_isolated_vertex_fails(self): + """Isolated vertex violates min_count=1 coboundary constraint.""" + sb = SchemaBuilder(namespace="tc") + sb.add_vertex_type("Node") + sb.add_edge_type("Link") + sb.add_topological_constraint("Node", "coboundary", min_count=1) + + kc = KnowledgeComplex(schema=sb) + # Adding a vertex with no edges should fail validation + with pytest.raises(ValidationError): + kc.add_vertex("lonely", type="Node") + + def test_coboundary_min_count_conflicts_with_slice_rule(self): + """Under immediate validation, even a soon-to-be-connected vertex fails min_count=1.""" + sb = SchemaBuilder(namespace="tc") + sb.add_vertex_type("Node") + sb.add_edge_type("Link") + sb.add_topological_constraint("Node", "coboundary", min_count=1) + + kc = KnowledgeComplex(schema=sb) + # We would need to add both vertices and the edge together, but the + # slice rule means vertices must be added before edges. With this + # constraint, even the first vertex fails because it has no edges yet. + # This demonstrates that topological constraints interact with the + # slice rule: they should typically be used with deferred validation.
+        with pytest.raises(ValidationError):
+            kc.add_vertex("v1", type="Node")
+
+    def test_coboundary_with_target_type(self):
+        """Constraint with target_type filters to specific edge type."""
+        sb = SchemaBuilder(namespace="tc")
+        sb.add_vertex_type("Node")
+        sb.add_edge_type("Link")
+        sb.add_edge_type("Special")
+        sb.add_topological_constraint(
+            "Node", "coboundary",
+            target_type="Special",
+            min_count=1,
+            message="Every Node needs at least one Special edge",
+        )
+
+        kc = KnowledgeComplex(schema=sb)
+        # Even with a Link edge, should fail without Special
+        with pytest.raises(ValidationError):
+            kc.add_vertex("v1", type="Node")
+
+    def test_constraint_unknown_operation_raises(self):
+        sb = SchemaBuilder(namespace="tc")
+        sb.add_vertex_type("Node")
+        with pytest.raises(SchemaError, match="Unknown topological operation"):
+            sb.add_topological_constraint("Node", "nonexistent")
+
+    def test_constraint_unknown_type_raises(self):
+        sb = SchemaBuilder(namespace="tc")
+        sb.add_vertex_type("Node")
+        with pytest.raises(SchemaError, match="not registered"):
+            sb.add_topological_constraint("NonExistent", "coboundary")
+
+    def test_constraint_unknown_target_type_raises(self):
+        sb = SchemaBuilder(namespace="tc")
+        sb.add_vertex_type("Node")
+        with pytest.raises(SchemaError, match="not registered"):
+            sb.add_topological_constraint(
+                "Node", "coboundary", target_type="NonExistent"
+            )
+
+    def test_constraint_unknown_predicate_raises(self):
+        sb = SchemaBuilder(namespace="tc")
+        sb.add_vertex_type("Node")
+        with pytest.raises(SchemaError, match="Unknown predicate"):
+            sb.add_topological_constraint("Node", "coboundary", predicate="invalid")
+
+    def test_max_count_predicate_requires_max_count(self):
+        sb = SchemaBuilder(namespace="tc")
+        sb.add_vertex_type("Node")
+        with pytest.raises(SchemaError, match="max_count"):
+            sb.add_topological_constraint(
+                "Node", "coboundary", predicate="max_count"
+            )
+
+    def test_max_count_constraint(self):
+        """max_count constraint limits coboundary cardinality."""
+        sb = SchemaBuilder(namespace="tc")
+        sb.add_vertex_type("Node")
+        sb.add_edge_type("Link")
+        sb.add_topological_constraint(
+            "Node", "coboundary",
+            predicate="max_count", max_count=1,
+        )
+
+        kc = KnowledgeComplex(schema=sb)
+        kc.add_vertex("v1", type="Node")
+        kc.add_vertex("v2", type="Node")
+        kc.add_vertex("v3", type="Node")
+        kc.add_edge("e12", type="Link", vertices={"v1", "v2"})
+        # v2 now has 1 edge (ok, max_count=1)
+        # Adding a second edge to v2 should fail
+        with pytest.raises(ValidationError):
+            kc.add_edge("e23", type="Link", vertices={"v2", "v3"})
+
+    def test_chaining(self):
+        sb = SchemaBuilder(namespace="tc")
+        sb.add_vertex_type("Node")
+        sb.add_edge_type("Link")
+        result = sb.add_topological_constraint("Node", "boundary", min_count=0)
+        assert result is sb
+
+    def test_auto_generated_message(self):
+        """Constraint without explicit message gets auto-generated one."""
+        sb = SchemaBuilder(namespace="tc")
+        sb.add_vertex_type("Node")
+        sb.add_edge_type("Link")
+        # Should not raise — message is auto-generated
+        sb.add_topological_constraint("Node", "coboundary", min_count=2)
+        # Verify shape was added by checking SHACL graph has the constraint
+        shacl = sb.dump_shacl()
+        assert "Topological constraint violated" in shacl
diff --git a/tests/test_viz.py b/tests/test_viz.py
new file mode 100644
index 0000000..804f655
--- /dev/null
+++ b/tests/test_viz.py
@@ -0,0 +1,341 @@
+"""
+tests/test_viz.py
+
+Tests for knowledgecomplex.viz — NetworkX export (DiGraph), Hasse diagram
+plots, geometric realization, and verify_networkx.
+
+Skipped if networkx or matplotlib are not installed.
+"""
+
+import warnings
+
+import pytest
+
+nx = pytest.importorskip("networkx")
+mpl = pytest.importorskip("matplotlib")
+mpl.use("Agg")  # non-interactive backend for CI
+
+from knowledgecomplex.schema import SchemaBuilder, vocab
+from knowledgecomplex.graph import KnowledgeComplex
+from knowledgecomplex.viz import (
+    to_networkx,
+    verify_networkx,
+    type_color_map,
+    plot_hasse,
+    plot_hasse_star,
+    plot_hasse_skeleton,
+    plot_geometric,
+    plot_complex,
+    plot_star,
+    plot_skeleton,
+)
+
+
+@pytest.fixture
+def schema() -> SchemaBuilder:
+    sb = SchemaBuilder(namespace="viz")
+    sb.add_vertex_type("Node")
+    sb.add_edge_type("Link", attributes={"weight": vocab("light", "heavy")})
+    sb.add_face_type("Triangle")
+    return sb
+
+
+@pytest.fixture
+def kc(schema) -> KnowledgeComplex:
+    """3 vertices, 3 edges, 1 face."""
+    kc = KnowledgeComplex(schema=schema)
+    kc.add_vertex("v1", type="Node")
+    kc.add_vertex("v2", type="Node")
+    kc.add_vertex("v3", type="Node")
+    kc.add_edge("e12", type="Link", vertices={"v1", "v2"}, weight="light")
+    kc.add_edge("e23", type="Link", vertices={"v2", "v3"}, weight="heavy")
+    kc.add_edge("e13", type="Link", vertices={"v1", "v3"}, weight="light")
+    kc.add_face("f123", type="Triangle", boundary=["e12", "e23", "e13"])
+    return kc
+
+
+@pytest.fixture
+def empty_kc(schema) -> KnowledgeComplex:
+    return KnowledgeComplex(schema=schema)
+
+
+# --- to_networkx ---
+
+
+class TestToNetworkx:
+    def test_is_digraph(self, kc):
+        G = to_networkx(kc)
+        assert isinstance(G, nx.DiGraph)
+
+    def test_node_count(self, kc):
+        G = to_networkx(kc)
+        assert len(G.nodes) == 7  # 3 vertices + 3 edges + 1 face
+
+    def test_edge_count(self, kc):
+        G = to_networkx(kc)
+        # boundedBy: 3 edges × 2 vertices + 1 face × 3 edges = 9
+        assert len(G.edges) == 9
+
+    def test_edge_direction_high_to_low(self, kc):
+        """All edges point from higher dim to lower dim."""
+        G = to_networkx(kc)
+        for u, v in G.edges():
+            assert G.nodes[u]["dim"] > G.nodes[v]["dim"]
+
+    def test_vertex_out_degree_zero(self, kc):
+        G = to_networkx(kc)
+        for n in G.nodes:
+            if G.nodes[n]["dim"] == 0:
+                assert G.out_degree(n) == 0, f"Vertex {n} has out-degree {G.out_degree(n)}"
+
+    def test_edge_out_degree_two(self, kc):
+        G = to_networkx(kc)
+        for n in G.nodes:
+            if G.nodes[n]["dim"] == 1:
+                assert G.out_degree(n) == 2, f"Edge {n} has out-degree {G.out_degree(n)}"
+
+    def test_face_out_degree_three(self, kc):
+        G = to_networkx(kc)
+        for n in G.nodes:
+            if G.nodes[n]["dim"] == 2:
+                assert G.out_degree(n) == 3, f"Face {n} has out-degree {G.out_degree(n)}"
+
+    def test_face_in_degree_zero(self, kc):
+        G = to_networkx(kc)
+        for n in G.nodes:
+            if G.nodes[n]["dim"] == 2:
+                assert G.in_degree(n) == 0, f"Face {n} has in-degree {G.in_degree(n)}"
+
+    def test_node_has_type_kind_dim(self, kc):
+        G = to_networkx(kc)
+        for n in G.nodes:
+            assert "type" in G.nodes[n]
+            assert "kind" in G.nodes[n]
+            assert "dim" in G.nodes[n]
+
+    def test_model_attributes(self, kc):
+        G = to_networkx(kc)
+        assert G.nodes["e12"]["weight"] == "light"
+
+    def test_graph_name(self, kc):
+        G = to_networkx(kc)
+        assert G.graph["name"] == "viz"
+
+    def test_empty_kc(self, empty_kc):
+        G = to_networkx(empty_kc)
+        assert len(G.nodes) == 0
+        assert len(G.edges) == 0
+
+
+# --- verify_networkx ---
+
+
+class TestVerifyNetworkx:
+    def test_valid_complex(self, kc):
+        G = to_networkx(kc)
+        assert verify_networkx(G) is True
+
+    def test_not_digraph_raises(self):
+        G = nx.Graph()
+        with pytest.raises(TypeError, match="DiGraph"):
+            verify_networkx(G)
+
+    def test_missing_attributes_raises(self):
+        G = nx.DiGraph()
+        G.add_node("x")
+        with pytest.raises(ValueError, match="missing"):
+            verify_networkx(G)
+
+    def test_vertex_with_outgoing_edge_raises(self):
+        G = nx.DiGraph()
+        G.add_node("v1", kind="vertex", dim=0, type="V", uri=None)
+        G.add_node("v2", kind="vertex", dim=0, type="V", uri=None)
+        G.add_edge("v1", "v2")
+        with pytest.raises(ValueError, match="out-degree"):
+            verify_networkx(G)
+
+    def test_edge_wrong_out_degree_raises(self):
+        G = nx.DiGraph()
+        G.add_node("e1", kind="edge", dim=1, type="E", uri=None)
+        G.add_node("v1", kind="vertex", dim=0, type="V", uri=None)
+        G.add_edge("e1", "v1")
+        # out-degree 1 instead of 2
+        with pytest.raises(ValueError, match="out-degree 1"):
+            verify_networkx(G)
+
+    def test_edge_target_not_vertex_raises(self):
+        G = nx.DiGraph()
+        G.add_node("e1", kind="edge", dim=1, type="E", uri=None)
+        G.add_node("e2", kind="edge", dim=1, type="E", uri=None)
+        G.add_node("v1", kind="vertex", dim=0, type="V", uri=None)
+        G.add_edge("e1", "v1")
+        G.add_edge("e1", "e2")
+        with pytest.raises(ValueError, match="not a vertex"):
+            verify_networkx(G)
+
+    def test_closed_triangle_invariant(self, kc):
+        """The face's 3 boundary edges share exactly 3 distinct vertices."""
+        G = to_networkx(kc)
+        # This is implicitly tested by verify_networkx succeeding,
+        # but let's also check explicitly
+        face_nodes = [n for n in G if G.nodes[n]["dim"] == 2]
+        for face in face_nodes:
+            edges = list(G.successors(face))
+            assert len(edges) == 3
+            verts = set()
+            for e in edges:
+                verts |= set(G.successors(e))
+            assert len(verts) == 3
+
+    def test_open_triangle_detected(self):
+        """A face whose boundary edges don't form a closed triangle fails."""
+        G = nx.DiGraph()
+        for v in ["v1", "v2", "v3", "v4"]:
+            G.add_node(v, kind="vertex", dim=0, type="V", uri=None)
+        # e1: v1-v2, e2: v2-v3, e3: v1-v4 (open — v4 instead of v3)
+        G.add_node("e1", kind="edge", dim=1, type="E", uri=None)
+        G.add_node("e2", kind="edge", dim=1, type="E", uri=None)
+        G.add_node("e3", kind="edge", dim=1, type="E", uri=None)
+        G.add_edge("e1", "v1"); G.add_edge("e1", "v2")
+        G.add_edge("e2", "v2"); G.add_edge("e2", "v3")
+        G.add_edge("e3", "v1"); G.add_edge("e3", "v4")
+        G.add_node("f", kind="face", dim=2, type="F", uri=None)
+        G.add_edge("f", "e1"); G.add_edge("f", "e2"); G.add_edge("f", "e3")
+        with pytest.raises(ValueError, match="4 distinct vertices"):
+            verify_networkx(G)
+
+
+# --- type_color_map ---
+
+
+class TestTypeColorMap:
+    def test_covers_all_types(self, kc):
+        colors = type_color_map(kc)
+        for type_name in kc._schema._types:
+            assert type_name in colors
+
+    def test_returns_hex_strings(self, kc):
+        colors = type_color_map(kc)
+        for color in colors.values():
+            assert color.startswith("#")
+            assert len(color) == 7
+
+    def test_distinct_colors(self, kc):
+        colors = type_color_map(kc)
+        assert len(set(colors.values())) == len(colors)
+
+
+# --- plot_hasse ---
+
+
+class TestPlotHasse:
+    def test_returns_fig_ax(self, kc):
+        import matplotlib.pyplot as plt
+        fig, ax = plot_hasse(kc)
+        assert isinstance(fig, plt.Figure)
+        assert isinstance(ax, plt.Axes)
+        plt.close(fig)
+
+    def test_empty_kc(self, empty_kc):
+        import matplotlib.pyplot as plt
+        fig, ax = plot_hasse(empty_kc)
+        assert isinstance(fig, plt.Figure)
+        plt.close(fig)
+
+    def test_title_contains_hasse(self, kc):
+        import matplotlib.pyplot as plt
+        fig, ax = plot_hasse(kc)
+        assert "Hasse" in ax.get_title()
+        plt.close(fig)
+
+    def test_star_returns_fig_ax(self, kc):
+        import matplotlib.pyplot as plt
+        fig, ax = plot_hasse_star(kc, "v1")
+        assert isinstance(fig, plt.Figure)
+        assert "Star" in ax.get_title()
+        plt.close(fig)
+
+    def test_skeleton_returns_fig_ax(self, kc):
+        import matplotlib.pyplot as plt
+        fig, ax = plot_hasse_skeleton(kc, 1)
+        assert isinstance(fig, plt.Figure)
+        assert "Skeleton" in ax.get_title()
+        plt.close(fig)
+
+
+# --- plot_geometric ---
+
+
+class TestPlotGeometric:
+    def test_returns_fig_and_3d_ax(self, kc):
+        import matplotlib.pyplot as plt
+        from mpl_toolkits.mplot3d import Axes3D
+        fig, ax = plot_geometric(kc)
+        assert isinstance(fig, plt.Figure)
+        # Axes3D is a subclass of Axes
+        assert hasattr(ax, "zaxis") or isinstance(ax, Axes3D)
+        plt.close(fig)
+
+    def test_empty_kc(self, empty_kc):
+        import matplotlib.pyplot as plt
+        fig, ax = plot_geometric(empty_kc)
+        assert isinstance(fig, plt.Figure)
+        plt.close(fig)
+
+    def test_title_contains_geometric(self, kc):
+        import matplotlib.pyplot as plt
+        fig, ax = plot_geometric(kc)
+        assert "Geometric" in ax.get_title()
+        plt.close(fig)
+
+
+# --- plot_geometric_interactive ---
+
+
+class TestPlotGeometricInteractive:
+    def test_returns_plotly_figure(self, kc):
+        plotly = pytest.importorskip("plotly")
+        from knowledgecomplex.viz import plot_geometric_interactive
+        fig = plot_geometric_interactive(kc)
+        assert isinstance(fig, plotly.graph_objects.Figure)
+
+    def test_empty_kc(self, empty_kc):
+        plotly = pytest.importorskip("plotly")
+        from knowledgecomplex.viz import plot_geometric_interactive
+        fig = plot_geometric_interactive(empty_kc)
+        assert isinstance(fig, plotly.graph_objects.Figure)
+
+
+# --- deprecated aliases ---
+
+
+class TestDeprecatedAliases:
+    def test_plot_complex_warns(self, kc):
+        import matplotlib.pyplot as plt
+        with warnings.catch_warnings(record=True) as w:
+            warnings.simplefilter("always")
+            fig, ax = plot_complex(kc)
+            assert len(w) == 1
+            assert issubclass(w[0].category, DeprecationWarning)
+            assert "plot_hasse" in str(w[0].message)
+        plt.close(fig)
+
+    def test_plot_star_warns(self, kc):
+        import matplotlib.pyplot as plt
+        with warnings.catch_warnings(record=True) as w:
+            warnings.simplefilter("always")
+            fig, ax = plot_star(kc, "v1")
+            assert len(w) == 1
+            assert issubclass(w[0].category, DeprecationWarning)
+            assert "plot_hasse_star" in str(w[0].message)
+        plt.close(fig)
+
+    def test_plot_skeleton_warns(self, kc):
+        import matplotlib.pyplot as plt
+        with warnings.catch_warnings(record=True) as w:
+            warnings.simplefilter("always")
+            fig, ax = plot_skeleton(kc, 1)
+            assert len(w) == 1
+            assert issubclass(w[0].category, DeprecationWarning)
+            assert "plot_hasse_skeleton" in str(w[0].message)
+        plt.close(fig)
diff --git a/uv.lock b/uv.lock
index 7bf92f4..3df45d5 100644
--- a/uv.lock
+++ b/uv.lock
@@ -155,6 +155,88 @@
 wheels = [
     { url =
"https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" }, ] +[[package]] +name = "contourpy" +version = "1.3.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "numpy" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/58/01/1253e6698a07380cd31a736d248a3f2a50a7c88779a1813da27503cadc2a/contourpy-1.3.3.tar.gz", hash = "sha256:083e12155b210502d0bca491432bb04d56dc3432f95a979b429f2848c3dbe880", size = 13466174, upload-time = "2025-07-26T12:03:12.549Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/91/2e/c4390a31919d8a78b90e8ecf87cd4b4c4f05a5b48d05ec17db8e5404c6f4/contourpy-1.3.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:709a48ef9a690e1343202916450bc48b9e51c049b089c7f79a267b46cffcdaa1", size = 288773, upload-time = "2025-07-26T12:01:02.277Z" }, + { url = "https://files.pythonhosted.org/packages/0d/44/c4b0b6095fef4dc9c420e041799591e3b63e9619e3044f7f4f6c21c0ab24/contourpy-1.3.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:23416f38bfd74d5d28ab8429cc4d63fa67d5068bd711a85edb1c3fb0c3e2f381", size = 270149, upload-time = "2025-07-26T12:01:04.072Z" }, + { url = "https://files.pythonhosted.org/packages/30/2e/dd4ced42fefac8470661d7cb7e264808425e6c5d56d175291e93890cce09/contourpy-1.3.3-cp311-cp311-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:929ddf8c4c7f348e4c0a5a3a714b5c8542ffaa8c22954862a46ca1813b667ee7", size = 329222, upload-time = "2025-07-26T12:01:05.688Z" }, + { url = "https://files.pythonhosted.org/packages/f2/74/cc6ec2548e3d276c71389ea4802a774b7aa3558223b7bade3f25787fafc2/contourpy-1.3.3-cp311-cp311-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:9e999574eddae35f1312c2b4b717b7885d4edd6cb46700e04f7f02db454e67c1", size = 377234, 
upload-time = "2025-07-26T12:01:07.054Z" }, + { url = "https://files.pythonhosted.org/packages/03/b3/64ef723029f917410f75c09da54254c5f9ea90ef89b143ccadb09df14c15/contourpy-1.3.3-cp311-cp311-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:0bf67e0e3f482cb69779dd3061b534eb35ac9b17f163d851e2a547d56dba0a3a", size = 380555, upload-time = "2025-07-26T12:01:08.801Z" }, + { url = "https://files.pythonhosted.org/packages/5f/4b/6157f24ca425b89fe2eb7e7be642375711ab671135be21e6faa100f7448c/contourpy-1.3.3-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:51e79c1f7470158e838808d4a996fa9bac72c498e93d8ebe5119bc1e6becb0db", size = 355238, upload-time = "2025-07-26T12:01:10.319Z" }, + { url = "https://files.pythonhosted.org/packages/98/56/f914f0dd678480708a04cfd2206e7c382533249bc5001eb9f58aa693e200/contourpy-1.3.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:598c3aaece21c503615fd59c92a3598b428b2f01bfb4b8ca9c4edeecc2438620", size = 1326218, upload-time = "2025-07-26T12:01:12.659Z" }, + { url = "https://files.pythonhosted.org/packages/fb/d7/4a972334a0c971acd5172389671113ae82aa7527073980c38d5868ff1161/contourpy-1.3.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:322ab1c99b008dad206d406bb61d014cf0174df491ae9d9d0fac6a6fda4f977f", size = 1392867, upload-time = "2025-07-26T12:01:15.533Z" }, + { url = "https://files.pythonhosted.org/packages/75/3e/f2cc6cd56dc8cff46b1a56232eabc6feea52720083ea71ab15523daab796/contourpy-1.3.3-cp311-cp311-win32.whl", hash = "sha256:fd907ae12cd483cd83e414b12941c632a969171bf90fc937d0c9f268a31cafff", size = 183677, upload-time = "2025-07-26T12:01:17.088Z" }, + { url = "https://files.pythonhosted.org/packages/98/4b/9bd370b004b5c9d8045c6c33cf65bae018b27aca550a3f657cdc99acdbd8/contourpy-1.3.3-cp311-cp311-win_amd64.whl", hash = "sha256:3519428f6be58431c56581f1694ba8e50626f2dd550af225f82fb5f5814d2a42", size = 225234, upload-time = "2025-07-26T12:01:18.256Z" }, + { url = 
"https://files.pythonhosted.org/packages/d9/b6/71771e02c2e004450c12b1120a5f488cad2e4d5b590b1af8bad060360fe4/contourpy-1.3.3-cp311-cp311-win_arm64.whl", hash = "sha256:15ff10bfada4bf92ec8b31c62bf7c1834c244019b4a33095a68000d7075df470", size = 193123, upload-time = "2025-07-26T12:01:19.848Z" }, + { url = "https://files.pythonhosted.org/packages/be/45/adfee365d9ea3d853550b2e735f9d66366701c65db7855cd07621732ccfc/contourpy-1.3.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:b08a32ea2f8e42cf1d4be3169a98dd4be32bafe4f22b6c4cb4ba810fa9e5d2cb", size = 293419, upload-time = "2025-07-26T12:01:21.16Z" }, + { url = "https://files.pythonhosted.org/packages/53/3e/405b59cfa13021a56bba395a6b3aca8cec012b45bf177b0eaf7a202cde2c/contourpy-1.3.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:556dba8fb6f5d8742f2923fe9457dbdd51e1049c4a43fd3986a0b14a1d815fc6", size = 273979, upload-time = "2025-07-26T12:01:22.448Z" }, + { url = "https://files.pythonhosted.org/packages/d4/1c/a12359b9b2ca3a845e8f7f9ac08bdf776114eb931392fcad91743e2ea17b/contourpy-1.3.3-cp312-cp312-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:92d9abc807cf7d0e047b95ca5d957cf4792fcd04e920ca70d48add15c1a90ea7", size = 332653, upload-time = "2025-07-26T12:01:24.155Z" }, + { url = "https://files.pythonhosted.org/packages/63/12/897aeebfb475b7748ea67b61e045accdfcf0d971f8a588b67108ed7f5512/contourpy-1.3.3-cp312-cp312-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:b2e8faa0ed68cb29af51edd8e24798bb661eac3bd9f65420c1887b6ca89987c8", size = 379536, upload-time = "2025-07-26T12:01:25.91Z" }, + { url = "https://files.pythonhosted.org/packages/43/8a/a8c584b82deb248930ce069e71576fc09bd7174bbd35183b7943fb1064fd/contourpy-1.3.3-cp312-cp312-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:626d60935cf668e70a5ce6ff184fd713e9683fb458898e4249b63be9e28286ea", size = 384397, upload-time = "2025-07-26T12:01:27.152Z" }, + { url = 
"https://files.pythonhosted.org/packages/cc/8f/ec6289987824b29529d0dfda0d74a07cec60e54b9c92f3c9da4c0ac732de/contourpy-1.3.3-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4d00e655fcef08aba35ec9610536bfe90267d7ab5ba944f7032549c55a146da1", size = 362601, upload-time = "2025-07-26T12:01:28.808Z" }, + { url = "https://files.pythonhosted.org/packages/05/0a/a3fe3be3ee2dceb3e615ebb4df97ae6f3828aa915d3e10549ce016302bd1/contourpy-1.3.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:451e71b5a7d597379ef572de31eeb909a87246974d960049a9848c3bc6c41bf7", size = 1331288, upload-time = "2025-07-26T12:01:31.198Z" }, + { url = "https://files.pythonhosted.org/packages/33/1d/acad9bd4e97f13f3e2b18a3977fe1b4a37ecf3d38d815333980c6c72e963/contourpy-1.3.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:459c1f020cd59fcfe6650180678a9993932d80d44ccde1fa1868977438f0b411", size = 1403386, upload-time = "2025-07-26T12:01:33.947Z" }, + { url = "https://files.pythonhosted.org/packages/cf/8f/5847f44a7fddf859704217a99a23a4f6417b10e5ab1256a179264561540e/contourpy-1.3.3-cp312-cp312-win32.whl", hash = "sha256:023b44101dfe49d7d53932be418477dba359649246075c996866106da069af69", size = 185018, upload-time = "2025-07-26T12:01:35.64Z" }, + { url = "https://files.pythonhosted.org/packages/19/e8/6026ed58a64563186a9ee3f29f41261fd1828f527dd93d33b60feca63352/contourpy-1.3.3-cp312-cp312-win_amd64.whl", hash = "sha256:8153b8bfc11e1e4d75bcb0bff1db232f9e10b274e0929de9d608027e0d34ff8b", size = 226567, upload-time = "2025-07-26T12:01:36.804Z" }, + { url = "https://files.pythonhosted.org/packages/d1/e2/f05240d2c39a1ed228d8328a78b6f44cd695f7ef47beb3e684cf93604f86/contourpy-1.3.3-cp312-cp312-win_arm64.whl", hash = "sha256:07ce5ed73ecdc4a03ffe3e1b3e3c1166db35ae7584be76f65dbbe28a7791b0cc", size = 193655, upload-time = "2025-07-26T12:01:37.999Z" }, + { url = 
"https://files.pythonhosted.org/packages/68/35/0167aad910bbdb9599272bd96d01a9ec6852f36b9455cf2ca67bd4cc2d23/contourpy-1.3.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:177fb367556747a686509d6fef71d221a4b198a3905fe824430e5ea0fda54eb5", size = 293257, upload-time = "2025-07-26T12:01:39.367Z" }, + { url = "https://files.pythonhosted.org/packages/96/e4/7adcd9c8362745b2210728f209bfbcf7d91ba868a2c5f40d8b58f54c509b/contourpy-1.3.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:d002b6f00d73d69333dac9d0b8d5e84d9724ff9ef044fd63c5986e62b7c9e1b1", size = 274034, upload-time = "2025-07-26T12:01:40.645Z" }, + { url = "https://files.pythonhosted.org/packages/73/23/90e31ceeed1de63058a02cb04b12f2de4b40e3bef5e082a7c18d9c8ae281/contourpy-1.3.3-cp313-cp313-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:348ac1f5d4f1d66d3322420f01d42e43122f43616e0f194fc1c9f5d830c5b286", size = 334672, upload-time = "2025-07-26T12:01:41.942Z" }, + { url = "https://files.pythonhosted.org/packages/ed/93/b43d8acbe67392e659e1d984700e79eb67e2acb2bd7f62012b583a7f1b55/contourpy-1.3.3-cp313-cp313-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:655456777ff65c2c548b7c454af9c6f33f16c8884f11083244b5819cc214f1b5", size = 381234, upload-time = "2025-07-26T12:01:43.499Z" }, + { url = "https://files.pythonhosted.org/packages/46/3b/bec82a3ea06f66711520f75a40c8fc0b113b2a75edb36aa633eb11c4f50f/contourpy-1.3.3-cp313-cp313-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:644a6853d15b2512d67881586bd03f462c7ab755db95f16f14d7e238f2852c67", size = 385169, upload-time = "2025-07-26T12:01:45.219Z" }, + { url = "https://files.pythonhosted.org/packages/4b/32/e0f13a1c5b0f8572d0ec6ae2f6c677b7991fafd95da523159c19eff0696a/contourpy-1.3.3-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4debd64f124ca62069f313a9cb86656ff087786016d76927ae2cf37846b006c9", size = 362859, upload-time = "2025-07-26T12:01:46.519Z" }, + { url = 
"https://files.pythonhosted.org/packages/33/71/e2a7945b7de4e58af42d708a219f3b2f4cff7386e6b6ab0a0fa0033c49a9/contourpy-1.3.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a15459b0f4615b00bbd1e91f1b9e19b7e63aea7483d03d804186f278c0af2659", size = 1332062, upload-time = "2025-07-26T12:01:48.964Z" }, + { url = "https://files.pythonhosted.org/packages/12/fc/4e87ac754220ccc0e807284f88e943d6d43b43843614f0a8afa469801db0/contourpy-1.3.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ca0fdcd73925568ca027e0b17ab07aad764be4706d0a925b89227e447d9737b7", size = 1403932, upload-time = "2025-07-26T12:01:51.979Z" }, + { url = "https://files.pythonhosted.org/packages/a6/2e/adc197a37443f934594112222ac1aa7dc9a98faf9c3842884df9a9d8751d/contourpy-1.3.3-cp313-cp313-win32.whl", hash = "sha256:b20c7c9a3bf701366556e1b1984ed2d0cedf999903c51311417cf5f591d8c78d", size = 185024, upload-time = "2025-07-26T12:01:53.245Z" }, + { url = "https://files.pythonhosted.org/packages/18/0b/0098c214843213759692cc638fce7de5c289200a830e5035d1791d7a2338/contourpy-1.3.3-cp313-cp313-win_amd64.whl", hash = "sha256:1cadd8b8969f060ba45ed7c1b714fe69185812ab43bd6b86a9123fe8f99c3263", size = 226578, upload-time = "2025-07-26T12:01:54.422Z" }, + { url = "https://files.pythonhosted.org/packages/8a/9a/2f6024a0c5995243cd63afdeb3651c984f0d2bc727fd98066d40e141ad73/contourpy-1.3.3-cp313-cp313-win_arm64.whl", hash = "sha256:fd914713266421b7536de2bfa8181aa8c699432b6763a0ea64195ebe28bff6a9", size = 193524, upload-time = "2025-07-26T12:01:55.73Z" }, + { url = "https://files.pythonhosted.org/packages/c0/b3/f8a1a86bd3298513f500e5b1f5fd92b69896449f6cab6a146a5d52715479/contourpy-1.3.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:88df9880d507169449d434c293467418b9f6cbe82edd19284aa0409e7fdb933d", size = 306730, upload-time = "2025-07-26T12:01:57.051Z" }, + { url = 
"https://files.pythonhosted.org/packages/3f/11/4780db94ae62fc0c2053909b65dc3246bd7cecfc4f8a20d957ad43aa4ad8/contourpy-1.3.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:d06bb1f751ba5d417047db62bca3c8fde202b8c11fb50742ab3ab962c81e8216", size = 287897, upload-time = "2025-07-26T12:01:58.663Z" }, + { url = "https://files.pythonhosted.org/packages/ae/15/e59f5f3ffdd6f3d4daa3e47114c53daabcb18574a26c21f03dc9e4e42ff0/contourpy-1.3.3-cp313-cp313t-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e4e6b05a45525357e382909a4c1600444e2a45b4795163d3b22669285591c1ae", size = 326751, upload-time = "2025-07-26T12:02:00.343Z" }, + { url = "https://files.pythonhosted.org/packages/0f/81/03b45cfad088e4770b1dcf72ea78d3802d04200009fb364d18a493857210/contourpy-1.3.3-cp313-cp313t-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:ab3074b48c4e2cf1a960e6bbeb7f04566bf36b1861d5c9d4d8ac04b82e38ba20", size = 375486, upload-time = "2025-07-26T12:02:02.128Z" }, + { url = "https://files.pythonhosted.org/packages/0c/ba/49923366492ffbdd4486e970d421b289a670ae8cf539c1ea9a09822b371a/contourpy-1.3.3-cp313-cp313t-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:6c3d53c796f8647d6deb1abe867daeb66dcc8a97e8455efa729516b997b8ed99", size = 388106, upload-time = "2025-07-26T12:02:03.615Z" }, + { url = "https://files.pythonhosted.org/packages/9f/52/5b00ea89525f8f143651f9f03a0df371d3cbd2fccd21ca9b768c7a6500c2/contourpy-1.3.3-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:50ed930df7289ff2a8d7afeb9603f8289e5704755c7e5c3bbd929c90c817164b", size = 352548, upload-time = "2025-07-26T12:02:05.165Z" }, + { url = "https://files.pythonhosted.org/packages/32/1d/a209ec1a3a3452d490f6b14dd92e72280c99ae3d1e73da74f8277d4ee08f/contourpy-1.3.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:4feffb6537d64b84877da813a5c30f1422ea5739566abf0bd18065ac040e120a", size = 1322297, upload-time = "2025-07-26T12:02:07.379Z" }, + { url = 
"https://files.pythonhosted.org/packages/bc/9e/46f0e8ebdd884ca0e8877e46a3f4e633f6c9c8c4f3f6e72be3fe075994aa/contourpy-1.3.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:2b7e9480ffe2b0cd2e787e4df64270e3a0440d9db8dc823312e2c940c167df7e", size = 1391023, upload-time = "2025-07-26T12:02:10.171Z" }, + { url = "https://files.pythonhosted.org/packages/b9/70/f308384a3ae9cd2209e0849f33c913f658d3326900d0ff5d378d6a1422d2/contourpy-1.3.3-cp313-cp313t-win32.whl", hash = "sha256:283edd842a01e3dcd435b1c5116798d661378d83d36d337b8dde1d16a5fc9ba3", size = 196157, upload-time = "2025-07-26T12:02:11.488Z" }, + { url = "https://files.pythonhosted.org/packages/b2/dd/880f890a6663b84d9e34a6f88cded89d78f0091e0045a284427cb6b18521/contourpy-1.3.3-cp313-cp313t-win_amd64.whl", hash = "sha256:87acf5963fc2b34825e5b6b048f40e3635dd547f590b04d2ab317c2619ef7ae8", size = 240570, upload-time = "2025-07-26T12:02:12.754Z" }, + { url = "https://files.pythonhosted.org/packages/80/99/2adc7d8ffead633234817ef8e9a87115c8a11927a94478f6bb3d3f4d4f7d/contourpy-1.3.3-cp313-cp313t-win_arm64.whl", hash = "sha256:3c30273eb2a55024ff31ba7d052dde990d7d8e5450f4bbb6e913558b3d6c2301", size = 199713, upload-time = "2025-07-26T12:02:14.4Z" }, + { url = "https://files.pythonhosted.org/packages/72/8b/4546f3ab60f78c514ffb7d01a0bd743f90de36f0019d1be84d0a708a580a/contourpy-1.3.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fde6c716d51c04b1c25d0b90364d0be954624a0ee9d60e23e850e8d48353d07a", size = 292189, upload-time = "2025-07-26T12:02:16.095Z" }, + { url = "https://files.pythonhosted.org/packages/fd/e1/3542a9cb596cadd76fcef413f19c79216e002623158befe6daa03dbfa88c/contourpy-1.3.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:cbedb772ed74ff5be440fa8eee9bd49f64f6e3fc09436d9c7d8f1c287b121d77", size = 273251, upload-time = "2025-07-26T12:02:17.524Z" }, + { url = 
"https://files.pythonhosted.org/packages/b1/71/f93e1e9471d189f79d0ce2497007731c1e6bf9ef6d1d61b911430c3db4e5/contourpy-1.3.3-cp314-cp314-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:22e9b1bd7a9b1d652cd77388465dc358dafcd2e217d35552424aa4f996f524f5", size = 335810, upload-time = "2025-07-26T12:02:18.9Z" }, + { url = "https://files.pythonhosted.org/packages/91/f9/e35f4c1c93f9275d4e38681a80506b5510e9327350c51f8d4a5a724d178c/contourpy-1.3.3-cp314-cp314-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a22738912262aa3e254e4f3cb079a95a67132fc5a063890e224393596902f5a4", size = 382871, upload-time = "2025-07-26T12:02:20.418Z" }, + { url = "https://files.pythonhosted.org/packages/b5/71/47b512f936f66a0a900d81c396a7e60d73419868fba959c61efed7a8ab46/contourpy-1.3.3-cp314-cp314-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:afe5a512f31ee6bd7d0dda52ec9864c984ca3d66664444f2d72e0dc4eb832e36", size = 386264, upload-time = "2025-07-26T12:02:21.916Z" }, + { url = "https://files.pythonhosted.org/packages/04/5f/9ff93450ba96b09c7c2b3f81c94de31c89f92292f1380261bd7195bea4ea/contourpy-1.3.3-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f64836de09927cba6f79dcd00fdd7d5329f3fccc633468507079c829ca4db4e3", size = 363819, upload-time = "2025-07-26T12:02:23.759Z" }, + { url = "https://files.pythonhosted.org/packages/3e/a6/0b185d4cc480ee494945cde102cb0149ae830b5fa17bf855b95f2e70ad13/contourpy-1.3.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:1fd43c3be4c8e5fd6e4f2baeae35ae18176cf2e5cced681cca908addf1cdd53b", size = 1333650, upload-time = "2025-07-26T12:02:26.181Z" }, + { url = "https://files.pythonhosted.org/packages/43/d7/afdc95580ca56f30fbcd3060250f66cedbde69b4547028863abd8aa3b47e/contourpy-1.3.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:6afc576f7b33cf00996e5c1102dc2a8f7cc89e39c0b55df93a0b78c1bd992b36", size = 1404833, upload-time = "2025-07-26T12:02:28.782Z" }, + { url = 
"https://files.pythonhosted.org/packages/e2/e2/366af18a6d386f41132a48f033cbd2102e9b0cf6345d35ff0826cd984566/contourpy-1.3.3-cp314-cp314-win32.whl", hash = "sha256:66c8a43a4f7b8df8b71ee1840e4211a3c8d93b214b213f590e18a1beca458f7d", size = 189692, upload-time = "2025-07-26T12:02:30.128Z" }, + { url = "https://files.pythonhosted.org/packages/7d/c2/57f54b03d0f22d4044b8afb9ca0e184f8b1afd57b4f735c2fa70883dc601/contourpy-1.3.3-cp314-cp314-win_amd64.whl", hash = "sha256:cf9022ef053f2694e31d630feaacb21ea24224be1c3ad0520b13d844274614fd", size = 232424, upload-time = "2025-07-26T12:02:31.395Z" }, + { url = "https://files.pythonhosted.org/packages/18/79/a9416650df9b525737ab521aa181ccc42d56016d2123ddcb7b58e926a42c/contourpy-1.3.3-cp314-cp314-win_arm64.whl", hash = "sha256:95b181891b4c71de4bb404c6621e7e2390745f887f2a026b2d99e92c17892339", size = 198300, upload-time = "2025-07-26T12:02:32.956Z" }, + { url = "https://files.pythonhosted.org/packages/1f/42/38c159a7d0f2b7b9c04c64ab317042bb6952b713ba875c1681529a2932fe/contourpy-1.3.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:33c82d0138c0a062380332c861387650c82e4cf1747aaa6938b9b6516762e772", size = 306769, upload-time = "2025-07-26T12:02:34.2Z" }, + { url = "https://files.pythonhosted.org/packages/c3/6c/26a8205f24bca10974e77460de68d3d7c63e282e23782f1239f226fcae6f/contourpy-1.3.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:ea37e7b45949df430fe649e5de8351c423430046a2af20b1c1961cae3afcda77", size = 287892, upload-time = "2025-07-26T12:02:35.807Z" }, + { url = "https://files.pythonhosted.org/packages/66/06/8a475c8ab718ebfd7925661747dbb3c3ee9c82ac834ccb3570be49d129f4/contourpy-1.3.3-cp314-cp314t-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d304906ecc71672e9c89e87c4675dc5c2645e1f4269a5063b99b0bb29f232d13", size = 326748, upload-time = "2025-07-26T12:02:37.193Z" }, + { url = 
"https://files.pythonhosted.org/packages/b4/a3/c5ca9f010a44c223f098fccd8b158bb1cb287378a31ac141f04730dc49be/contourpy-1.3.3-cp314-cp314t-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:ca658cd1a680a5c9ea96dc61cdbae1e85c8f25849843aa799dfd3cb370ad4fbe", size = 375554, upload-time = "2025-07-26T12:02:38.894Z" }, + { url = "https://files.pythonhosted.org/packages/80/5b/68bd33ae63fac658a4145088c1e894405e07584a316738710b636c6d0333/contourpy-1.3.3-cp314-cp314t-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:ab2fd90904c503739a75b7c8c5c01160130ba67944a7b77bbf36ef8054576e7f", size = 388118, upload-time = "2025-07-26T12:02:40.642Z" }, + { url = "https://files.pythonhosted.org/packages/40/52/4c285a6435940ae25d7410a6c36bda5145839bc3f0beb20c707cda18b9d2/contourpy-1.3.3-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b7301b89040075c30e5768810bc96a8e8d78085b47d8be6e4c3f5a0b4ed478a0", size = 352555, upload-time = "2025-07-26T12:02:42.25Z" }, + { url = "https://files.pythonhosted.org/packages/24/ee/3e81e1dd174f5c7fefe50e85d0892de05ca4e26ef1c9a59c2a57e43b865a/contourpy-1.3.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:2a2a8b627d5cc6b7c41a4beff6c5ad5eb848c88255fda4a8745f7e901b32d8e4", size = 1322295, upload-time = "2025-07-26T12:02:44.668Z" }, + { url = "https://files.pythonhosted.org/packages/3c/b2/6d913d4d04e14379de429057cd169e5e00f6c2af3bb13e1710bcbdb5da12/contourpy-1.3.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:fd6ec6be509c787f1caf6b247f0b1ca598bef13f4ddeaa126b7658215529ba0f", size = 1391027, upload-time = "2025-07-26T12:02:47.09Z" }, + { url = "https://files.pythonhosted.org/packages/93/8a/68a4ec5c55a2971213d29a9374913f7e9f18581945a7a31d1a39b5d2dfe5/contourpy-1.3.3-cp314-cp314t-win32.whl", hash = "sha256:e74a9a0f5e3fff48fb5a7f2fd2b9b70a3fe014a67522f79b7cca4c0c7e43c9ae", size = 202428, upload-time = "2025-07-26T12:02:48.691Z" }, + { url = 
"https://files.pythonhosted.org/packages/fa/96/fd9f641ffedc4fa3ace923af73b9d07e869496c9cc7a459103e6e978992f/contourpy-1.3.3-cp314-cp314t-win_amd64.whl", hash = "sha256:13b68d6a62db8eafaebb8039218921399baf6e47bf85006fd8529f2a08ef33fc", size = 250331, upload-time = "2025-07-26T12:02:50.137Z" }, + { url = "https://files.pythonhosted.org/packages/ae/8c/469afb6465b853afff216f9528ffda78a915ff880ed58813ba4faf4ba0b6/contourpy-1.3.3-cp314-cp314t-win_arm64.whl", hash = "sha256:b7448cb5a725bb1e35ce88771b86fba35ef418952474492cf7c764059933ff8b", size = 203831, upload-time = "2025-07-26T12:02:51.449Z" }, + { url = "https://files.pythonhosted.org/packages/a5/29/8dcfe16f0107943fa92388c23f6e05cff0ba58058c4c95b00280d4c75a14/contourpy-1.3.3-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:cd5dfcaeb10f7b7f9dc8941717c6c2ade08f587be2226222c12b25f0483ed497", size = 278809, upload-time = "2025-07-26T12:02:52.74Z" }, + { url = "https://files.pythonhosted.org/packages/85/a9/8b37ef4f7dafeb335daee3c8254645ef5725be4d9c6aa70b50ec46ef2f7e/contourpy-1.3.3-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:0c1fc238306b35f246d61a1d416a627348b5cf0648648a031e14bb8705fcdfe8", size = 261593, upload-time = "2025-07-26T12:02:54.037Z" }, + { url = "https://files.pythonhosted.org/packages/0a/59/ebfb8c677c75605cc27f7122c90313fd2f375ff3c8d19a1694bda74aaa63/contourpy-1.3.3-pp311-pypy311_pp73-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:70f9aad7de812d6541d29d2bbf8feb22ff7e1c299523db288004e3157ff4674e", size = 302202, upload-time = "2025-07-26T12:02:55.947Z" }, + { url = "https://files.pythonhosted.org/packages/3c/37/21972a15834d90bfbfb009b9d004779bd5a07a0ec0234e5ba8f64d5736f4/contourpy-1.3.3-pp311-pypy311_pp73-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5ed3657edf08512fc3fe81b510e35c2012fbd3081d2e26160f27ca28affec989", size = 329207, upload-time = "2025-07-26T12:02:57.468Z" }, + { url = 
"https://files.pythonhosted.org/packages/0c/58/bd257695f39d05594ca4ad60df5bcb7e32247f9951fd09a9b8edb82d1daa/contourpy-1.3.3-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:3d1a3799d62d45c18bafd41c5fa05120b96a28079f2393af559b843d1a966a77", size = 225315, upload-time = "2025-07-26T12:02:58.801Z" }, +] + [[package]] name = "coverage" version = "7.13.5" @@ -259,6 +341,64 @@ toml = [ { name = "tomli", marker = "python_full_version <= '3.11'" }, ] +[[package]] +name = "cycler" +version = "0.12.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a9/95/a3dbbb5028f35eafb79008e7522a75244477d2838f38cbb722248dabc2a8/cycler-0.12.1.tar.gz", hash = "sha256:88bb128f02ba341da8ef447245a9e138fae777f6a23943da4540077d3601eb1c", size = 7615, upload-time = "2023-10-07T05:32:18.335Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e7/05/c19819d5e3d95294a6f5947fb9b9629efb316b96de511b418c53d245aae6/cycler-0.12.1-py3-none-any.whl", hash = "sha256:85cef7cff222d8644161529808465972e51340599459b8ac3ccbac5a854e0d30", size = 8321, upload-time = "2023-10-07T05:32:16.783Z" }, +] + +[[package]] +name = "fonttools" +version = "4.62.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/9a/08/7012b00a9a5874311b639c3920270c36ee0c445b69d9989a85e5c92ebcb0/fonttools-4.62.1.tar.gz", hash = "sha256:e54c75fd6041f1122476776880f7c3c3295ffa31962dc6ebe2543c00dca58b5d", size = 3580737, upload-time = "2026-03-13T13:54:25.52Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/88/39/23ff32561ec8d45a4d48578b4d241369d9270dc50926c017570e60893701/fonttools-4.62.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:40975849bac44fb0b9253d77420c6d8b523ac4dcdcefeff6e4d706838a5b80f7", size = 2871039, upload-time = "2026-03-13T13:52:33.127Z" }, + { url = 
"https://files.pythonhosted.org/packages/24/7f/66d3f8a9338a9b67fe6e1739f47e1cd5cee78bd3bc1206ef9b0b982289a5/fonttools-4.62.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:9dde91633f77fa576879a0c76b1d89de373cae751a98ddf0109d54e173b40f14", size = 2416346, upload-time = "2026-03-13T13:52:35.676Z" }, + { url = "https://files.pythonhosted.org/packages/aa/53/5276ceba7bff95da7793a07c5284e1da901cf00341ce5e2f3273056c0cca/fonttools-4.62.1-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6acb4109f8bee00fec985c8c7afb02299e35e9c94b57287f3ea542f28bd0b0a7", size = 5100897, upload-time = "2026-03-13T13:52:38.102Z" }, + { url = "https://files.pythonhosted.org/packages/cc/a1/40a5c4d8e28b0851d53a8eeeb46fbd73c325a2a9a165f290a5ed90e6c597/fonttools-4.62.1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:1c5c25671ce8805e0d080e2ffdeca7f1e86778c5cbfbeae86d7f866d8830517b", size = 5071078, upload-time = "2026-03-13T13:52:41.305Z" }, + { url = "https://files.pythonhosted.org/packages/e3/be/d378fca4c65ea1956fee6d90ace6e861776809cbbc5af22388a090c3c092/fonttools-4.62.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:a5d8825e1140f04e6c99bb7d37a9e31c172f3bc208afbe02175339e699c710e1", size = 5076908, upload-time = "2026-03-13T13:52:44.122Z" }, + { url = "https://files.pythonhosted.org/packages/f8/d9/ae6a1d0693a4185a84605679c8a1f719a55df87b9c6e8e817bfdd9ef5936/fonttools-4.62.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:268abb1cb221e66c014acc234e872b7870d8b5d4657a83a8f4205094c32d2416", size = 5202275, upload-time = "2026-03-13T13:52:46.591Z" }, + { url = "https://files.pythonhosted.org/packages/54/6c/af95d9c4efb15cabff22642b608342f2bd67137eea6107202d91b5b03184/fonttools-4.62.1-cp311-cp311-win32.whl", hash = "sha256:942b03094d7edbb99bdf1ae7e9090898cad7bf9030b3d21f33d7072dbcb51a53", size = 2293075, upload-time = "2026-03-13T13:52:48.711Z" }, + { url = 
"https://files.pythonhosted.org/packages/d3/97/bf54c5b3f2be34e1f143e6db838dfdc54f2ffa3e68c738934c82f3b2a08d/fonttools-4.62.1-cp311-cp311-win_amd64.whl", hash = "sha256:e8514f4924375f77084e81467e63238b095abda5107620f49421c368a6017ed2", size = 2344593, upload-time = "2026-03-13T13:52:50.725Z" }, + { url = "https://files.pythonhosted.org/packages/47/d4/dbacced3953544b9a93088cc10ef2b596d348c983d5c67a404fa41ec51ba/fonttools-4.62.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:90365821debbd7db678809c7491ca4acd1e0779b9624cdc6ddaf1f31992bf974", size = 2870219, upload-time = "2026-03-13T13:52:53.664Z" }, + { url = "https://files.pythonhosted.org/packages/66/9e/a769c8e99b81e5a87ab7e5e7236684de4e96246aae17274e5347d11ebd78/fonttools-4.62.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:12859ff0b47dd20f110804c3e0d0970f7b832f561630cd879969011541a464a9", size = 2414891, upload-time = "2026-03-13T13:52:56.493Z" }, + { url = "https://files.pythonhosted.org/packages/69/64/f19a9e3911968c37e1e620e14dfc5778299e1474f72f4e57c5ec771d9489/fonttools-4.62.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9c125ffa00c3d9003cdaaf7f2c79e6e535628093e14b5de1dccb08859b680936", size = 5033197, upload-time = "2026-03-13T13:52:59.179Z" }, + { url = "https://files.pythonhosted.org/packages/9b/8a/99c8b3c3888c5c474c08dbfd7c8899786de9604b727fcefb055b42c84bba/fonttools-4.62.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:149f7d84afca659d1a97e39a4778794a2f83bf344c5ee5134e09995086cc2392", size = 4988768, upload-time = "2026-03-13T13:53:02.761Z" }, + { url = "https://files.pythonhosted.org/packages/d1/c6/0f904540d3e6ab463c1243a0d803504826a11604c72dd58c2949796a1762/fonttools-4.62.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:0aa72c43a601cfa9273bb1ae0518f1acadc01ee181a6fc60cd758d7fdadffc04", size = 4971512, upload-time = "2026-03-13T13:53:05.678Z" }, + { url = 
"https://files.pythonhosted.org/packages/29/0b/5cbef6588dc9bd6b5c9ad6a4d5a8ca384d0cea089da31711bbeb4f9654a6/fonttools-4.62.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:19177c8d96c7c36359266e571c5173bcee9157b59cfc8cb0153c5673dc5a3a7d", size = 5122723, upload-time = "2026-03-13T13:53:08.662Z" }, + { url = "https://files.pythonhosted.org/packages/4a/47/b3a5342d381595ef439adec67848bed561ab7fdb1019fa522e82101b7d9c/fonttools-4.62.1-cp312-cp312-win32.whl", hash = "sha256:a24decd24d60744ee8b4679d38e88b8303d86772053afc29b19d23bb8207803c", size = 2281278, upload-time = "2026-03-13T13:53:10.998Z" }, + { url = "https://files.pythonhosted.org/packages/28/b1/0c2ab56a16f409c6c8a68816e6af707827ad5d629634691ff60a52879792/fonttools-4.62.1-cp312-cp312-win_amd64.whl", hash = "sha256:9e7863e10b3de72376280b515d35b14f5eeed639d1aa7824f4cf06779ec65e42", size = 2331414, upload-time = "2026-03-13T13:53:13.992Z" }, + { url = "https://files.pythonhosted.org/packages/3b/56/6f389de21c49555553d6a5aeed5ac9767631497ac836c4f076273d15bd72/fonttools-4.62.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:c22b1014017111c401469e3acc5433e6acf6ebcc6aa9efb538a533c800971c79", size = 2865155, upload-time = "2026-03-13T13:53:16.132Z" }, + { url = "https://files.pythonhosted.org/packages/03/c5/0e3966edd5ec668d41dfe418787726752bc07e2f5fd8c8f208615e61fa89/fonttools-4.62.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:68959f5fc58ed4599b44aad161c2837477d7f35f5f79402d97439974faebfebe", size = 2412802, upload-time = "2026-03-13T13:53:18.878Z" }, + { url = "https://files.pythonhosted.org/packages/52/94/e6ac4b44026de7786fe46e3bfa0c87e51d5d70a841054065d49cd62bb909/fonttools-4.62.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ef46db46c9447103b8f3ff91e8ba009d5fe181b1920a83757a5762551e32bb68", size = 5013926, upload-time = "2026-03-13T13:53:21.379Z" }, + { url = 
"https://files.pythonhosted.org/packages/e2/98/8b1e801939839d405f1f122e7d175cebe9aeb4e114f95bfc45e3152af9a7/fonttools-4.62.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:6706d1cb1d5e6251a97ad3c1b9347505c5615c112e66047abbef0f8545fa30d1", size = 4964575, upload-time = "2026-03-13T13:53:23.857Z" }, + { url = "https://files.pythonhosted.org/packages/46/76/7d051671e938b1881670528fec69cc4044315edd71a229c7fd712eaa5119/fonttools-4.62.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:2e7abd2b1e11736f58c1de27819e1955a53267c21732e78243fa2fa2e5c1e069", size = 4953693, upload-time = "2026-03-13T13:53:26.569Z" }, + { url = "https://files.pythonhosted.org/packages/1f/ae/b41f8628ec0be3c1b934fc12b84f4576a5c646119db4d3bdd76a217c90b5/fonttools-4.62.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:403d28ce06ebfc547fbcb0cb8b7f7cc2f7a2d3e1a67ba9a34b14632df9e080f9", size = 5094920, upload-time = "2026-03-13T13:53:29.329Z" }, + { url = "https://files.pythonhosted.org/packages/f2/f6/53a1e9469331a23dcc400970a27a4caa3d9f6edbf5baab0260285238b884/fonttools-4.62.1-cp313-cp313-win32.whl", hash = "sha256:93c316e0f5301b2adbe6a5f658634307c096fd5aae60a5b3412e4f3e1728ab24", size = 2279928, upload-time = "2026-03-13T13:53:32.352Z" }, + { url = "https://files.pythonhosted.org/packages/38/60/35186529de1db3c01f5ad625bde07c1f576305eab6d86bbda4c58445f721/fonttools-4.62.1-cp313-cp313-win_amd64.whl", hash = "sha256:7aa21ff53e28a9c2157acbc44e5b401149d3c9178107130e82d74ceb500e5056", size = 2330514, upload-time = "2026-03-13T13:53:34.991Z" }, + { url = "https://files.pythonhosted.org/packages/36/f0/2888cdac391807d68d90dcb16ef858ddc1b5309bfc6966195a459dd326e2/fonttools-4.62.1-cp314-cp314-macosx_10_15_universal2.whl", hash = "sha256:fa1d16210b6b10a826d71bed68dd9ec24a9e218d5a5e2797f37c573e7ec215ca", size = 2864442, upload-time = "2026-03-13T13:53:37.509Z" }, + { url = 
"https://files.pythonhosted.org/packages/4b/b2/e521803081f8dc35990816b82da6360fa668a21b44da4b53fc9e77efcd62/fonttools-4.62.1-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:aa69d10ed420d8121118e628ad47d86e4caa79ba37f968597b958f6cceab7eca", size = 2410901, upload-time = "2026-03-13T13:53:40.55Z" }, + { url = "https://files.pythonhosted.org/packages/00/a4/8c3511ff06e53110039358dbbdc1a65d72157a054638387aa2ada300a8b8/fonttools-4.62.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:bd13b7999d59c5eb1c2b442eb2d0c427cb517a0b7a1f5798fc5c9e003f5ff782", size = 4999608, upload-time = "2026-03-13T13:53:42.798Z" }, + { url = "https://files.pythonhosted.org/packages/28/63/cd0c3b26afe60995a5295f37c246a93d454023726c3261cfbb3559969bb9/fonttools-4.62.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:8d337fdd49a79b0d51c4da87bc38169d21c3abbf0c1aa9367eff5c6656fb6dae", size = 4912726, upload-time = "2026-03-13T13:53:45.405Z" }, + { url = "https://files.pythonhosted.org/packages/70/b9/ac677cb07c24c685cf34f64e140617d58789d67a3dd524164b63648c6114/fonttools-4.62.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:d241cdc4a67b5431c6d7f115fdf63335222414995e3a1df1a41e1182acd4bcc7", size = 4951422, upload-time = "2026-03-13T13:53:48.326Z" }, + { url = "https://files.pythonhosted.org/packages/e6/10/11c08419a14b85b7ca9a9faca321accccc8842dd9e0b1c8a72908de05945/fonttools-4.62.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:c05557a78f8fa514da0f869556eeda40887a8abc77c76ee3f74cf241778afd5a", size = 5060979, upload-time = "2026-03-13T13:53:51.366Z" }, + { url = "https://files.pythonhosted.org/packages/4e/3c/12eea4a4cf054e7ab058ed5ceada43b46809fce2bf319017c4d63ae55bb4/fonttools-4.62.1-cp314-cp314-win32.whl", hash = "sha256:49a445d2f544ce4a69338694cad575ba97b9a75fff02720da0882d1a73f12800", size = 2283733, upload-time = "2026-03-13T13:53:53.606Z" }, + { url = 
"https://files.pythonhosted.org/packages/6b/67/74b070029043186b5dd13462c958cb7c7f811be0d2e634309d9a1ffb1505/fonttools-4.62.1-cp314-cp314-win_amd64.whl", hash = "sha256:1eecc128c86c552fb963fe846ca4e011b1be053728f798185a1687502f6d398e", size = 2335663, upload-time = "2026-03-13T13:53:56.23Z" }, + { url = "https://files.pythonhosted.org/packages/42/c5/4d2ed3ca6e33617fc5624467da353337f06e7f637707478903c785bd8e20/fonttools-4.62.1-cp314-cp314t-macosx_10_15_universal2.whl", hash = "sha256:1596aeaddf7f78e21e68293c011316a25267b3effdaccaf4d59bc9159d681b82", size = 2947288, upload-time = "2026-03-13T13:53:59.397Z" }, + { url = "https://files.pythonhosted.org/packages/1f/e9/7ab11ddfda48ed0f89b13380e5595ba572619c27077be0b2c447a63ff351/fonttools-4.62.1-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:8f8fca95d3bb3208f59626a4b0ea6e526ee51f5a8ad5d91821c165903e8d9260", size = 2449023, upload-time = "2026-03-13T13:54:01.642Z" }, + { url = "https://files.pythonhosted.org/packages/b2/10/a800fa090b5e8819942e54e19b55fc7c21fe14a08757c3aa3ca8db358939/fonttools-4.62.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ee91628c08e76f77b533d65feb3fbe6d9dad699f95be51cf0d022db94089cdc4", size = 5137599, upload-time = "2026-03-13T13:54:04.495Z" }, + { url = "https://files.pythonhosted.org/packages/37/dc/8ccd45033fffd74deb6912fa1ca524643f584b94c87a16036855b498a1ed/fonttools-4.62.1-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:5f37df1cac61d906e7b836abe356bc2f34c99d4477467755c216b72aa3dc748b", size = 4920933, upload-time = "2026-03-13T13:54:07.557Z" }, + { url = "https://files.pythonhosted.org/packages/99/eb/e618adefb839598d25ac8136cd577925d6c513dc0d931d93b8af956210f0/fonttools-4.62.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:92bb00a947e666169c99b43753c4305fc95a890a60ef3aeb2a6963e07902cc87", size = 5016232, upload-time = "2026-03-13T13:54:10.611Z" }, + { url = 
"https://files.pythonhosted.org/packages/d9/5f/9b5c9bfaa8ec82def8d8168c4f13615990d6ce5996fe52bd49bfb5e05134/fonttools-4.62.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:bdfe592802ef939a0e33106ea4a318eeb17822c7ee168c290273cbd5fabd746c", size = 5042987, upload-time = "2026-03-13T13:54:13.569Z" }, + { url = "https://files.pythonhosted.org/packages/90/aa/dfbbe24c6a6afc5c203d90cc0343e24bcbb09e76d67c4d6eef8c2558d7ba/fonttools-4.62.1-cp314-cp314t-win32.whl", hash = "sha256:b820fcb92d4655513d8402d5b219f94481c4443d825b4372c75a2072aa4b357a", size = 2348021, upload-time = "2026-03-13T13:54:16.98Z" }, + { url = "https://files.pythonhosted.org/packages/13/6f/ae9c4e4dd417948407b680855c2c7790efb52add6009aaecff1e3bc50e8e/fonttools-4.62.1-cp314-cp314t-win_amd64.whl", hash = "sha256:59b372b4f0e113d3746b88985f1c796e7bf830dd54b28374cd85c2b8acd7583e", size = 2414147, upload-time = "2026-03-13T13:54:19.416Z" }, + { url = "https://files.pythonhosted.org/packages/fd/ba/56147c165442cc5ba7e82ecf301c9a68353cede498185869e6e02b4c264f/fonttools-4.62.1-py3-none-any.whl", hash = "sha256:7487782e2113861f4ddcc07c3436450659e3caa5e470b27dc2177cade2d8e7fd", size = 1152647, upload-time = "2026-03-13T13:54:22.735Z" }, +] + [[package]] name = "ghp-import" version = "2.1.0" @@ -330,6 +470,112 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/62/a1/3d680cbfd5f4b8f15abc1d571870c5fc3e594bb582bc3b64ea099db13e56/jinja2-3.1.6-py3-none-any.whl", hash = "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67", size = 134899, upload-time = "2025-03-05T20:05:00.369Z" }, ] +[[package]] +name = "kiwisolver" +version = "1.5.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d0/67/9c61eccb13f0bdca9307614e782fec49ffdde0f7a2314935d489fa93cd9c/kiwisolver-1.5.0.tar.gz", hash = "sha256:d4193f3d9dc3f6f79aaed0e5637f45d98850ebf01f7ca20e69457f3e8946b66a", size = 103482, upload-time = "2026-03-09T13:15:53.382Z" } +wheels = [ + { 
url = "https://files.pythonhosted.org/packages/12/dd/a495a9c104be1c476f0386e714252caf2b7eca883915422a64c50b88c6f5/kiwisolver-1.5.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9eed0f7edbb274413b6ee781cca50541c8c0facd3d6fd289779e494340a2b85c", size = 122798, upload-time = "2026-03-09T13:12:58.963Z" }, + { url = "https://files.pythonhosted.org/packages/11/60/37b4047a2af0cf5ef6d8b4b26e91829ae6fc6a2d1f74524bcb0e7cd28a32/kiwisolver-1.5.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c4923e404d6bcd91b6779c009542e5647fef32e4a5d75e115e3bbac6f2335eb", size = 66216, upload-time = "2026-03-09T13:13:00.155Z" }, + { url = "https://files.pythonhosted.org/packages/0a/aa/510dc933d87767584abfe03efa445889996c70c2990f6f87c3ebaa0a18c5/kiwisolver-1.5.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:0df54df7e686afa55e6f21fb86195224a6d9beb71d637e8d7920c95cf0f89aac", size = 63911, upload-time = "2026-03-09T13:13:01.671Z" }, + { url = "https://files.pythonhosted.org/packages/80/46/bddc13df6c2a40741e0cc7865bb1c9ed4796b6760bd04ce5fae3928ef917/kiwisolver-1.5.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:2517e24d7315eb51c10664cdb865195df38ab74456c677df67bb47f12d088a27", size = 1438209, upload-time = "2026-03-09T13:13:03.385Z" }, + { url = "https://files.pythonhosted.org/packages/fd/d6/76621246f5165e5372f02f5e6f3f48ea336a8f9e96e43997d45b240ed8cd/kiwisolver-1.5.0-cp311-cp311-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ff710414307fefa903e0d9bdf300972f892c23477829f49504e59834f4195398", size = 1248888, upload-time = "2026-03-09T13:13:05.231Z" }, + { url = "https://files.pythonhosted.org/packages/b2/c1/31559ec6fb39a5b48035ce29bb63ade628f321785f38c384dee3e2c08bc1/kiwisolver-1.5.0-cp311-cp311-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:6176c1811d9d5a04fa391c490cc44f451e240697a16977f11c6f722efb9041db", size = 1266304, upload-time = "2026-03-09T13:13:06.743Z" }, + { url = 
"https://files.pythonhosted.org/packages/5e/ef/1cb8276f2d29cc6a41e0a042f27946ca347d3a4a75acf85d0a16aa6dcc82/kiwisolver-1.5.0-cp311-cp311-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:50847dca5d197fcbd389c805aa1a1cf32f25d2e7273dc47ab181a517666b68cc", size = 1319650, upload-time = "2026-03-09T13:13:08.607Z" }, + { url = "https://files.pythonhosted.org/packages/4c/e4/5ba3cecd7ce6236ae4a80f67e5d5531287337d0e1f076ca87a5abe4cd5d0/kiwisolver-1.5.0-cp311-cp311-manylinux_2_39_riscv64.whl", hash = "sha256:01808c6d15f4c3e8559595d6d1fe6411c68e4a3822b4b9972b44473b24f4e679", size = 970949, upload-time = "2026-03-09T13:13:10.299Z" }, + { url = "https://files.pythonhosted.org/packages/5a/69/dc61f7ae9a2f071f26004ced87f078235b5507ab6e5acd78f40365655034/kiwisolver-1.5.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:f1f9f4121ec58628c96baa3de1a55a4e3a333c5102c8e94b64e23bf7b2083309", size = 2199125, upload-time = "2026-03-09T13:13:11.841Z" }, + { url = "https://files.pythonhosted.org/packages/e5/7b/abbe0f1b5afa85f8d084b73e90e5f801c0939eba16ac2e49af7c61a6c28d/kiwisolver-1.5.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:b7d335370ae48a780c6e6a6bbfa97342f563744c39c35562f3f367665f5c1de2", size = 2293783, upload-time = "2026-03-09T13:13:14.399Z" }, + { url = "https://files.pythonhosted.org/packages/8a/80/5908ae149d96d81580d604c7f8aefd0e98f4fd728cf172f477e9f2a81744/kiwisolver-1.5.0-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:800ee55980c18545af444d93fdd60c56b580db5cc54867d8cbf8a1dc0829938c", size = 1960726, upload-time = "2026-03-09T13:13:16.047Z" }, + { url = "https://files.pythonhosted.org/packages/84/08/a78cb776f8c085b7143142ce479859cfec086bd09ee638a317040b6ef420/kiwisolver-1.5.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:c438f6ca858697c9ab67eb28246c92508af972e114cac34e57a6d4ba17a3ac08", size = 2464738, upload-time = "2026-03-09T13:13:17.897Z" }, + { url = 
"https://files.pythonhosted.org/packages/b1/e1/65584da5356ed6cb12c63791a10b208860ac40a83de165cb6a6751a686e3/kiwisolver-1.5.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:8c63c91f95173f9c2a67c7c526b2cea976828a0e7fced9cdcead2802dc10f8a4", size = 2270718, upload-time = "2026-03-09T13:13:19.421Z" }, + { url = "https://files.pythonhosted.org/packages/be/6c/28f17390b62b8f2f520e2915095b3c94d88681ecf0041e75389d9667f202/kiwisolver-1.5.0-cp311-cp311-win_amd64.whl", hash = "sha256:beb7f344487cdcb9e1efe4b7a29681b74d34c08f0043a327a74da852a6749e7b", size = 73480, upload-time = "2026-03-09T13:13:20.818Z" }, + { url = "https://files.pythonhosted.org/packages/d8/0e/2ee5debc4f77a625778fec5501ff3e8036fe361b7ee28ae402a485bb9694/kiwisolver-1.5.0-cp311-cp311-win_arm64.whl", hash = "sha256:ad4ae4ffd1ee9cd11357b4c66b612da9888f4f4daf2f36995eda64bd45370cac", size = 64930, upload-time = "2026-03-09T13:13:21.997Z" }, + { url = "https://files.pythonhosted.org/packages/4d/b2/818b74ebea34dabe6d0c51cb1c572e046730e64844da6ed646d5298c40ce/kiwisolver-1.5.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:4e9750bc21b886308024f8a54ccb9a2cc38ac9fa813bf4348434e3d54f337ff9", size = 123158, upload-time = "2026-03-09T13:13:23.127Z" }, + { url = "https://files.pythonhosted.org/packages/bf/d9/405320f8077e8e1c5c4bd6adc45e1e6edf6d727b6da7f2e2533cf58bff71/kiwisolver-1.5.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:72ec46b7eba5b395e0a7b63025490d3214c11013f4aacb4f5e8d6c3041829588", size = 66388, upload-time = "2026-03-09T13:13:24.765Z" }, + { url = "https://files.pythonhosted.org/packages/99/9f/795fedf35634f746151ca8839d05681ceb6287fbed6cc1c9bf235f7887c2/kiwisolver-1.5.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ed3a984b31da7481b103f68776f7128a89ef26ed40f4dc41a2223cda7fb24819", size = 64068, upload-time = "2026-03-09T13:13:25.878Z" }, + { url = 
"https://files.pythonhosted.org/packages/c4/13/680c54afe3e65767bed7ec1a15571e1a2f1257128733851ade24abcefbcc/kiwisolver-1.5.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:bb5136fb5352d3f422df33f0c879a1b0c204004324150cc3b5e3c4f310c9049f", size = 1477934, upload-time = "2026-03-09T13:13:27.166Z" }, + { url = "https://files.pythonhosted.org/packages/c8/2f/cebfcdb60fd6a9b0f6b47a9337198bcbad6fbe15e68189b7011fd914911f/kiwisolver-1.5.0-cp312-cp312-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b2af221f268f5af85e776a73d62b0845fc8baf8ef0abfae79d29c77d0e776aaf", size = 1278537, upload-time = "2026-03-09T13:13:28.707Z" }, + { url = "https://files.pythonhosted.org/packages/f2/0d/9b782923aada3fafb1d6b84e13121954515c669b18af0c26e7d21f579855/kiwisolver-1.5.0-cp312-cp312-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:b0f172dc8ffaccb8522d7c5d899de00133f2f1ca7b0a49b7da98e901de87bf2d", size = 1296685, upload-time = "2026-03-09T13:13:30.528Z" }, + { url = "https://files.pythonhosted.org/packages/27/70/83241b6634b04fe44e892688d5208332bde130f38e610c0418f9ede47ded/kiwisolver-1.5.0-cp312-cp312-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:6ab8ba9152203feec73758dad83af9a0bbe05001eb4639e547207c40cfb52083", size = 1346024, upload-time = "2026-03-09T13:13:32.818Z" }, + { url = "https://files.pythonhosted.org/packages/e4/db/30ed226fb271ae1a6431fc0fe0edffb2efe23cadb01e798caeb9f2ceae8f/kiwisolver-1.5.0-cp312-cp312-manylinux_2_39_riscv64.whl", hash = "sha256:cdee07c4d7f6d72008d3f73b9bf027f4e11550224c7c50d8df1ae4a37c1402a6", size = 987241, upload-time = "2026-03-09T13:13:34.435Z" }, + { url = "https://files.pythonhosted.org/packages/ec/bd/c314595208e4c9587652d50959ead9e461995389664e490f4dce7ff0f782/kiwisolver-1.5.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:7c60d3c9b06fb23bd9c6139281ccbdc384297579ae037f08ae90c69f6845c0b1", size = 2227742, upload-time = "2026-03-09T13:13:36.4Z" }, + { url = 
"https://files.pythonhosted.org/packages/c1/43/0499cec932d935229b5543d073c2b87c9c22846aab48881e9d8d6e742a2d/kiwisolver-1.5.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:e315e5ec90d88e140f57696ff85b484ff68bb311e36f2c414aa4286293e6dee0", size = 2323966, upload-time = "2026-03-09T13:13:38.204Z" }, + { url = "https://files.pythonhosted.org/packages/3d/6f/79b0d760907965acfd9d61826a3d41f8f093c538f55cd2633d3f0db269f6/kiwisolver-1.5.0-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:1465387ac63576c3e125e5337a6892b9e99e0627d52317f3ca79e6930d889d15", size = 1977417, upload-time = "2026-03-09T13:13:39.966Z" }, + { url = "https://files.pythonhosted.org/packages/ab/31/01d0537c41cb75a551a438c3c7a80d0c60d60b81f694dac83dd436aec0d0/kiwisolver-1.5.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:530a3fd64c87cffa844d4b6b9768774763d9caa299e9b75d8eca6a4423b31314", size = 2491238, upload-time = "2026-03-09T13:13:41.698Z" }, + { url = "https://files.pythonhosted.org/packages/e4/34/8aefdd0be9cfd00a44509251ba864f5caf2991e36772e61c408007e7f417/kiwisolver-1.5.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:1d9daea4ea6b9be74fe2f01f7fbade8d6ffab263e781274cffca0dba9be9eec9", size = 2294947, upload-time = "2026-03-09T13:13:43.343Z" }, + { url = "https://files.pythonhosted.org/packages/ad/cf/0348374369ca588f8fe9c338fae49fa4e16eeb10ffb3d012f23a54578a9e/kiwisolver-1.5.0-cp312-cp312-win_amd64.whl", hash = "sha256:f18c2d9782259a6dc132fdc7a63c168cbc74b35284b6d75c673958982a378384", size = 73569, upload-time = "2026-03-09T13:13:45.792Z" }, + { url = "https://files.pythonhosted.org/packages/28/26/192b26196e2316e2bd29deef67e37cdf9870d9af8e085e521afff0fed526/kiwisolver-1.5.0-cp312-cp312-win_arm64.whl", hash = "sha256:f7c7553b13f69c1b29a5bde08ddc6d9d0c8bfb84f9ed01c30db25944aeb852a7", size = 64997, upload-time = "2026-03-09T13:13:46.878Z" }, + { url = 
"https://files.pythonhosted.org/packages/9d/69/024d6711d5ba575aa65d5538042e99964104e97fa153a9f10bc369182bc2/kiwisolver-1.5.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:fd40bb9cd0891c4c3cb1ddf83f8bbfa15731a248fdc8162669405451e2724b09", size = 123166, upload-time = "2026-03-09T13:13:48.032Z" }, + { url = "https://files.pythonhosted.org/packages/ce/48/adbb40df306f587054a348831220812b9b1d787aff714cfbc8556e38fccd/kiwisolver-1.5.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:c0e1403fd7c26d77c1f03e096dc58a5c726503fa0db0456678b8668f76f521e3", size = 66395, upload-time = "2026-03-09T13:13:49.365Z" }, + { url = "https://files.pythonhosted.org/packages/a8/3a/d0a972b34e1c63e2409413104216cd1caa02c5a37cb668d1687d466c1c45/kiwisolver-1.5.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:dda366d548e89a90d88a86c692377d18d8bd64b39c1fb2b92cb31370e2896bbd", size = 64065, upload-time = "2026-03-09T13:13:50.562Z" }, + { url = "https://files.pythonhosted.org/packages/2b/0a/7b98e1e119878a27ba8618ca1e18b14f992ff1eda40f47bccccf4de44121/kiwisolver-1.5.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:332b4f0145c30b5f5ad9374881133e5aa64320428a57c2c2b61e9d891a51c2f3", size = 1477903, upload-time = "2026-03-09T13:13:52.084Z" }, + { url = "https://files.pythonhosted.org/packages/18/d8/55638d89ffd27799d5cc3d8aa28e12f4ce7a64d67b285114dbedc8ea4136/kiwisolver-1.5.0-cp313-cp313-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0c50b89ffd3e1a911c69a1dd3de7173c0cd10b130f56222e57898683841e4f96", size = 1278751, upload-time = "2026-03-09T13:13:54.673Z" }, + { url = "https://files.pythonhosted.org/packages/b8/97/b4c8d0d18421ecceba20ad8701358453b88e32414e6f6950b5a4bad54e65/kiwisolver-1.5.0-cp313-cp313-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:4db576bb8c3ef9365f8b40fe0f671644de6736ae2c27a2c62d7d8a1b4329f099", size = 1296793, upload-time = "2026-03-09T13:13:56.287Z" }, + { url = 
"https://files.pythonhosted.org/packages/c4/10/f862f94b6389d8957448ec9df59450b81bec4abb318805375c401a1e6892/kiwisolver-1.5.0-cp313-cp313-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:0b85aad90cea8ac6797a53b5d5f2e967334fa4d1149f031c4537569972596cb8", size = 1346041, upload-time = "2026-03-09T13:13:58.269Z" }, + { url = "https://files.pythonhosted.org/packages/a3/6a/f1650af35821eaf09de398ec0bc2aefc8f211f0cda50204c9f1673741ba9/kiwisolver-1.5.0-cp313-cp313-manylinux_2_39_riscv64.whl", hash = "sha256:d36ca54cb4c6c4686f7cbb7b817f66f5911c12ddb519450bbe86707155028f87", size = 987292, upload-time = "2026-03-09T13:13:59.871Z" }, + { url = "https://files.pythonhosted.org/packages/de/19/d7fb82984b9238115fe629c915007be608ebd23dc8629703d917dbfaffd4/kiwisolver-1.5.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:38f4a703656f493b0ad185211ccfca7f0386120f022066b018eb5296d8613e23", size = 2227865, upload-time = "2026-03-09T13:14:01.401Z" }, + { url = "https://files.pythonhosted.org/packages/7f/b9/46b7f386589fd222dac9e9de9c956ce5bcefe2ee73b4e79891381dda8654/kiwisolver-1.5.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:3ac2360e93cb41be81121755c6462cff3beaa9967188c866e5fce5cf13170859", size = 2324369, upload-time = "2026-03-09T13:14:02.972Z" }, + { url = "https://files.pythonhosted.org/packages/92/8b/95e237cf3d9c642960153c769ddcbe278f182c8affb20cecc1cc983e7cc5/kiwisolver-1.5.0-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:c95cab08d1965db3d84a121f1c7ce7479bdd4072c9b3dafd8fecce48a2e6b902", size = 1977989, upload-time = "2026-03-09T13:14:04.503Z" }, + { url = "https://files.pythonhosted.org/packages/1b/95/980c9df53501892784997820136c01f62bc1865e31b82b9560f980c0e649/kiwisolver-1.5.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:fc20894c3d21194d8041a28b65622d5b86db786da6e3cfe73f0c762951a61167", size = 2491645, upload-time = "2026-03-09T13:14:06.106Z" }, + { url = 
"https://files.pythonhosted.org/packages/cb/32/900647fd0840abebe1561792c6b31e6a7c0e278fc3973d30572a965ca14c/kiwisolver-1.5.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:7a32f72973f0f950c1920475d5c5ea3d971b81b6f0ec53b8d0a956cc965f22e0", size = 2295237, upload-time = "2026-03-09T13:14:08.891Z" }, + { url = "https://files.pythonhosted.org/packages/be/8a/be60e3bbcf513cc5a50f4a3e88e1dcecebb79c1ad607a7222877becaa101/kiwisolver-1.5.0-cp313-cp313-win_amd64.whl", hash = "sha256:0bf3acf1419fa93064a4c2189ac0b58e3be7872bf6ee6177b0d4c63dc4cea276", size = 73573, upload-time = "2026-03-09T13:14:12.327Z" }, + { url = "https://files.pythonhosted.org/packages/4d/d2/64be2e429eb4fca7f7e1c52a91b12663aeaf25de3895e5cca0f47ef2a8d0/kiwisolver-1.5.0-cp313-cp313-win_arm64.whl", hash = "sha256:fa8eb9ecdb7efb0b226acec134e0d709e87a909fa4971a54c0c4f6e88635484c", size = 64998, upload-time = "2026-03-09T13:14:13.469Z" }, + { url = "https://files.pythonhosted.org/packages/b0/69/ce68dd0c85755ae2de490bf015b62f2cea5f6b14ff00a463f9d0774449ff/kiwisolver-1.5.0-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:db485b3847d182b908b483b2ed133c66d88d49cacf98fd278fadafe11b4478d1", size = 125700, upload-time = "2026-03-09T13:14:14.636Z" }, + { url = "https://files.pythonhosted.org/packages/74/aa/937aac021cf9d4349990d47eb319309a51355ed1dbdc9c077cdc9224cb11/kiwisolver-1.5.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:be12f931839a3bdfe28b584db0e640a65a8bcbc24560ae3fdb025a449b3d754e", size = 67537, upload-time = "2026-03-09T13:14:15.808Z" }, + { url = "https://files.pythonhosted.org/packages/ee/20/3a87fbece2c40ad0f6f0aefa93542559159c5f99831d596050e8afae7a9f/kiwisolver-1.5.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:16b85d37c2cbb3253226d26e64663f755d88a03439a9c47df6246b35defbdfb7", size = 65514, upload-time = "2026-03-09T13:14:18.035Z" }, + { url = 
"https://files.pythonhosted.org/packages/f0/7f/f943879cda9007c45e1f7dba216d705c3a18d6b35830e488b6c6a4e7cdf0/kiwisolver-1.5.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:4432b835675f0ea7414aab3d37d119f7226d24869b7a829caeab49ebda407b0c", size = 1584848, upload-time = "2026-03-09T13:14:19.745Z" }, + { url = "https://files.pythonhosted.org/packages/37/f8/4d4f85cc1870c127c88d950913370dd76138482161cd07eabbc450deff01/kiwisolver-1.5.0-cp313-cp313t-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1b0feb50971481a2cc44d94e88bdb02cdd497618252ae226b8eb1201b957e368", size = 1391542, upload-time = "2026-03-09T13:14:21.54Z" }, + { url = "https://files.pythonhosted.org/packages/04/0b/65dd2916c84d252b244bd405303220f729e7c17c9d7d33dca6feeff9ffc4/kiwisolver-1.5.0-cp313-cp313t-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:56fa888f10d0f367155e76ce849fa1166fc9730d13bd2d65a2aa13b6f5424489", size = 1404447, upload-time = "2026-03-09T13:14:23.205Z" }, + { url = "https://files.pythonhosted.org/packages/39/5c/2606a373247babce9b1d056c03a04b65f3cf5290a8eac5d7bdead0a17e21/kiwisolver-1.5.0-cp313-cp313t-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:940dda65d5e764406b9fb92761cbf462e4e63f712ab60ed98f70552e496f3bf1", size = 1455918, upload-time = "2026-03-09T13:14:24.74Z" }, + { url = "https://files.pythonhosted.org/packages/d5/d1/c6078b5756670658e9192a2ef11e939c92918833d2745f85cd14a6004bdf/kiwisolver-1.5.0-cp313-cp313t-manylinux_2_39_riscv64.whl", hash = "sha256:89fc958c702ee9a745e4700378f5d23fddbc46ff89e8fdbf5395c24d5c1452a3", size = 1072856, upload-time = "2026-03-09T13:14:26.597Z" }, + { url = "https://files.pythonhosted.org/packages/cb/c8/7def6ddf16eb2b3741d8b172bdaa9af882b03c78e9b0772975408801fa63/kiwisolver-1.5.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:9027d773c4ff81487181a925945743413f6069634d0b122d0b37684ccf4f1e18", size = 2333580, upload-time = "2026-03-09T13:14:28.237Z" }, + { url = 
"https://files.pythonhosted.org/packages/9e/87/2ac1fce0eb1e616fcd3c35caa23e665e9b1948bb984f4764790924594128/kiwisolver-1.5.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:5b233ea3e165e43e35dba1d2b8ecc21cf070b45b65ae17dd2747d2713d942021", size = 2423018, upload-time = "2026-03-09T13:14:30.018Z" }, + { url = "https://files.pythonhosted.org/packages/67/13/c6700ccc6cc218716bfcda4935e4b2997039869b4ad8a94f364c5a3b8e63/kiwisolver-1.5.0-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:ce9bf03dad3b46408c08649c6fbd6ca28a9fce0eb32fdfffa6775a13103b5310", size = 2062804, upload-time = "2026-03-09T13:14:32.888Z" }, + { url = "https://files.pythonhosted.org/packages/1b/bd/877056304626943ff0f1f44c08f584300c199b887cb3176cd7e34f1515f1/kiwisolver-1.5.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:fc4d3f1fb9ca0ae9f97b095963bc6326f1dbfd3779d6679a1e016b9baaa153d3", size = 2597482, upload-time = "2026-03-09T13:14:34.971Z" }, + { url = "https://files.pythonhosted.org/packages/75/19/c60626c47bf0f8ac5dcf72c6c98e266d714f2fbbfd50cf6dab5ede3aaa50/kiwisolver-1.5.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:f443b4825c50a51ee68585522ab4a1d1257fac65896f282b4c6763337ac9f5d2", size = 2394328, upload-time = "2026-03-09T13:14:36.816Z" }, + { url = "https://files.pythonhosted.org/packages/47/84/6a6d5e5bb8273756c27b7d810d47f7ef2f1f9b9fd23c9ee9a3f8c75c9cef/kiwisolver-1.5.0-cp313-cp313t-win_arm64.whl", hash = "sha256:893ff3a711d1b515ba9da14ee090519bad4610ed1962fbe298a434e8c5f8db53", size = 68410, upload-time = "2026-03-09T13:14:38.695Z" }, + { url = "https://files.pythonhosted.org/packages/e4/d7/060f45052f2a01ad5762c8fdecd6d7a752b43400dc29ff75cd47225a40fd/kiwisolver-1.5.0-cp314-cp314-macosx_10_15_universal2.whl", hash = "sha256:8df31fe574b8b3993cc61764f40941111b25c2d9fea13d3ce24a49907cd2d615", size = 123231, upload-time = "2026-03-09T13:14:41.323Z" }, + { url = 
"https://files.pythonhosted.org/packages/c2/a7/78da680eadd06ff35edef6ef68a1ad273bad3e2a0936c9a885103230aece/kiwisolver-1.5.0-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:1d49a49ac4cbfb7c1375301cd1ec90169dfeae55ff84710d782260ce77a75a02", size = 66489, upload-time = "2026-03-09T13:14:42.534Z" }, + { url = "https://files.pythonhosted.org/packages/49/b2/97980f3ad4fae37dd7fe31626e2bf75fbf8bdf5d303950ec1fab39a12da8/kiwisolver-1.5.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:0cbe94b69b819209a62cb27bdfa5dc2a8977d8de2f89dfd97ba4f53ed3af754e", size = 64063, upload-time = "2026-03-09T13:14:44.759Z" }, + { url = "https://files.pythonhosted.org/packages/e7/f9/b06c934a6aa8bc91f566bd2a214fd04c30506c2d9e2b6b171953216a65b6/kiwisolver-1.5.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:80aa065ffd378ff784822a6d7c3212f2d5f5e9c3589614b5c228b311fd3063ac", size = 1475913, upload-time = "2026-03-09T13:14:46.247Z" }, + { url = "https://files.pythonhosted.org/packages/6b/f0/f768ae564a710135630672981231320bc403cf9152b5596ec5289de0f106/kiwisolver-1.5.0-cp314-cp314-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4e7f886f47ab881692f278ae901039a234e4025a68e6dfab514263a0b1c4ae05", size = 1282782, upload-time = "2026-03-09T13:14:48.458Z" }, + { url = "https://files.pythonhosted.org/packages/e2/9f/1de7aad00697325f05238a5f2eafbd487fb637cc27a558b5367a5f37fb7f/kiwisolver-1.5.0-cp314-cp314-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:5060731cc3ed12ca3a8b57acd4aeca5bbc2f49216dd0bec1650a1acd89486bcd", size = 1300815, upload-time = "2026-03-09T13:14:50.721Z" }, + { url = "https://files.pythonhosted.org/packages/5a/c2/297f25141d2e468e0ce7f7a7b92e0cf8918143a0cbd3422c1ad627e85a06/kiwisolver-1.5.0-cp314-cp314-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:7a4aa69609f40fce3cbc3f87b2061f042eee32f94b8f11db707b66a26461591a", size = 1347925, upload-time = "2026-03-09T13:14:52.304Z" }, + { url = 
"https://files.pythonhosted.org/packages/b9/d3/f4c73a02eb41520c47610207b21afa8cdd18fdbf64ffd94674ae21c4812d/kiwisolver-1.5.0-cp314-cp314-manylinux_2_39_riscv64.whl", hash = "sha256:d168fda2dbff7b9b5f38e693182d792a938c31db4dac3a80a4888de603c99554", size = 991322, upload-time = "2026-03-09T13:14:54.637Z" }, + { url = "https://files.pythonhosted.org/packages/7b/46/d3f2efef7732fcda98d22bf4ad5d3d71d545167a852ca710a494f4c15343/kiwisolver-1.5.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:413b820229730d358efd838ecbab79902fe97094565fdc80ddb6b0a18c18a581", size = 2232857, upload-time = "2026-03-09T13:14:56.471Z" }, + { url = "https://files.pythonhosted.org/packages/3f/ec/2d9756bf2b6d26ae4349b8d3662fb3993f16d80c1f971c179ce862b9dbae/kiwisolver-1.5.0-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:5124d1ea754509b09e53738ec185584cc609aae4a3b510aaf4ed6aa047ef9303", size = 2329376, upload-time = "2026-03-09T13:14:58.072Z" }, + { url = "https://files.pythonhosted.org/packages/8f/9f/876a0a0f2260f1bde92e002b3019a5fabc35e0939c7d945e0fa66185eb20/kiwisolver-1.5.0-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:e4415a8db000bf49a6dd1c478bf70062eaacff0f462b92b0ba68791a905861f9", size = 1982549, upload-time = "2026-03-09T13:14:59.668Z" }, + { url = "https://files.pythonhosted.org/packages/6c/4f/ba3624dfac23a64d54ac4179832860cb537c1b0af06024936e82ca4154a0/kiwisolver-1.5.0-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:d618fd27420381a4f6044faa71f46d8bfd911bd077c555f7138ed88729bfbe79", size = 2494680, upload-time = "2026-03-09T13:15:01.364Z" }, + { url = "https://files.pythonhosted.org/packages/39/b7/97716b190ab98911b20d10bf92eca469121ec483b8ce0edd314f51bc85af/kiwisolver-1.5.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:5092eb5b1172947f57d6ea7d89b2f29650414e4293c47707eb499ec07a0ac796", size = 2297905, upload-time = "2026-03-09T13:15:03.925Z" }, + { url = 
"https://files.pythonhosted.org/packages/a3/36/4e551e8aa55c9188bca9abb5096805edbf7431072b76e2298e34fd3a3008/kiwisolver-1.5.0-cp314-cp314-win_amd64.whl", hash = "sha256:d76e2d8c75051d58177e762164d2e9ab92886534e3a12e795f103524f221dd8e", size = 75086, upload-time = "2026-03-09T13:15:07.775Z" }, + { url = "https://files.pythonhosted.org/packages/70/15/9b90f7df0e31a003c71649cf66ef61c3c1b862f48c81007fa2383c8bd8d7/kiwisolver-1.5.0-cp314-cp314-win_arm64.whl", hash = "sha256:fa6248cd194edff41d7ea9425ced8ca3a6f838bfb295f6f1d6e6bb694a8518df", size = 66577, upload-time = "2026-03-09T13:15:09.139Z" }, + { url = "https://files.pythonhosted.org/packages/17/01/7dc8c5443ff42b38e72731643ed7cf1ed9bf01691ae5cdca98501999ed83/kiwisolver-1.5.0-cp314-cp314t-macosx_10_15_universal2.whl", hash = "sha256:d1ffeb80b5676463d7a7d56acbe8e37a20ce725570e09549fe738e02ca6b7e1e", size = 125794, upload-time = "2026-03-09T13:15:10.525Z" }, + { url = "https://files.pythonhosted.org/packages/46/8a/b4ebe46ebaac6a303417fab10c2e165c557ddaff558f9699d302b256bc53/kiwisolver-1.5.0-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:bc4d8e252f532ab46a1de9349e2d27b91fce46736a9eedaa37beaca66f574ed4", size = 67646, upload-time = "2026-03-09T13:15:12.016Z" }, + { url = "https://files.pythonhosted.org/packages/60/35/10a844afc5f19d6f567359bf4789e26661755a2f36200d5d1ed8ad0126e5/kiwisolver-1.5.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:6783e069732715ad0c3ce96dbf21dbc2235ab0593f2baf6338101f70371f4028", size = 65511, upload-time = "2026-03-09T13:15:13.311Z" }, + { url = "https://files.pythonhosted.org/packages/f8/8a/685b297052dd041dcebce8e8787b58923b6e78acc6115a0dc9189011c44b/kiwisolver-1.5.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:e7c4c09a490dc4d4a7f8cbee56c606a320f9dc28cf92a7157a39d1ce7676a657", size = 1584858, upload-time = "2026-03-09T13:15:15.103Z" }, + { url = 
"https://files.pythonhosted.org/packages/9e/80/04865e3d4638ac5bddec28908916df4a3075b8c6cc101786a96803188b96/kiwisolver-1.5.0-cp314-cp314t-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:2a075bd7bd19c70cf67c8badfa36cf7c5d8de3c9ddb8420c51e10d9c50e94920", size = 1392539, upload-time = "2026-03-09T13:15:16.661Z" }, + { url = "https://files.pythonhosted.org/packages/ba/01/77a19cacc0893fa13fafa46d1bba06fb4dc2360b3292baf4b56d8e067b24/kiwisolver-1.5.0-cp314-cp314t-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:bdd3e53429ff02aa319ba59dfe4ceeec345bf46cf180ec2cf6fd5b942e7975e9", size = 1405310, upload-time = "2026-03-09T13:15:18.229Z" }, + { url = "https://files.pythonhosted.org/packages/53/39/bcaf5d0cca50e604cfa9b4e3ae1d64b50ca1ae5b754122396084599ef903/kiwisolver-1.5.0-cp314-cp314t-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:3cdcb35dc9d807259c981a85531048ede628eabcffb3239adf3d17463518992d", size = 1456244, upload-time = "2026-03-09T13:15:20.444Z" }, + { url = "https://files.pythonhosted.org/packages/d0/7a/72c187abc6975f6978c3e39b7cf67aeb8b3c0a8f9790aa7fd412855e9e1f/kiwisolver-1.5.0-cp314-cp314t-manylinux_2_39_riscv64.whl", hash = "sha256:70d593af6a6ca332d1df73d519fddb5148edb15cd90d5f0155e3746a6d4fcc65", size = 1073154, upload-time = "2026-03-09T13:15:22.039Z" }, + { url = "https://files.pythonhosted.org/packages/c7/ca/cf5b25783ebbd59143b4371ed0c8428a278abe68d6d0104b01865b1bbd0f/kiwisolver-1.5.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:377815a8616074cabbf3f53354e1d040c35815a134e01d7614b7692e4bf8acfa", size = 2334377, upload-time = "2026-03-09T13:15:23.741Z" }, + { url = "https://files.pythonhosted.org/packages/4a/e5/b1f492adc516796e88751282276745340e2a72dcd0d36cf7173e0daf3210/kiwisolver-1.5.0-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:0255a027391d52944eae1dbb5d4cc5903f57092f3674e8e544cdd2622826b3f0", size = 2425288, upload-time = "2026-03-09T13:15:25.789Z" }, + { url = 
"https://files.pythonhosted.org/packages/e6/e5/9b21fbe91a61b8f409d74a26498706e97a48008bfcd1864373d32a6ba31c/kiwisolver-1.5.0-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:012b1eb16e28718fa782b5e61dc6f2da1f0792ca73bd05d54de6cb9561665fc9", size = 2063158, upload-time = "2026-03-09T13:15:27.63Z" }, + { url = "https://files.pythonhosted.org/packages/b1/02/83f47986138310f95ea95531f851b2a62227c11cbc3e690ae1374fe49f0f/kiwisolver-1.5.0-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:0e3aafb33aed7479377e5e9a82e9d4bf87063741fc99fc7ae48b0f16e32bdd6f", size = 2597260, upload-time = "2026-03-09T13:15:29.421Z" }, + { url = "https://files.pythonhosted.org/packages/07/18/43a5f24608d8c313dd189cf838c8e68d75b115567c6279de7796197cfb6a/kiwisolver-1.5.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:e7a116ae737f0000343218c4edf5bd45893bfeaff0993c0b215d7124c9f77646", size = 2394403, upload-time = "2026-03-09T13:15:31.517Z" }, + { url = "https://files.pythonhosted.org/packages/3b/b5/98222136d839b8afabcaa943b09bd05888c2d36355b7e448550211d1fca4/kiwisolver-1.5.0-cp314-cp314t-win_amd64.whl", hash = "sha256:1dd9b0b119a350976a6d781e7278ec7aca0b201e1a9e2d23d9804afecb6ca681", size = 79687, upload-time = "2026-03-09T13:15:33.204Z" }, + { url = "https://files.pythonhosted.org/packages/99/a2/ca7dc962848040befed12732dff6acae7fb3c4f6fc4272b3f6c9a30b8713/kiwisolver-1.5.0-cp314-cp314t-win_arm64.whl", hash = "sha256:58f812017cd2985c21fbffb4864d59174d4903dd66fa23815e74bbc7a0e2dd57", size = 70032, upload-time = "2026-03-09T13:15:34.411Z" }, + { url = "https://files.pythonhosted.org/packages/1c/fa/2910df836372d8761bb6eff7d8bdcb1613b5c2e03f260efe7abe34d388a7/kiwisolver-1.5.0-graalpy312-graalpy250_312_native-macosx_10_13_x86_64.whl", hash = "sha256:5ae8e62c147495b01a0f4765c878e9bfdf843412446a247e28df59936e99e797", size = 130262, upload-time = "2026-03-09T13:15:35.629Z" }, + { url = 
"https://files.pythonhosted.org/packages/0f/41/c5f71f9f00aabcc71fee8b7475e3f64747282580c2fe748961ba29b18385/kiwisolver-1.5.0-graalpy312-graalpy250_312_native-macosx_11_0_arm64.whl", hash = "sha256:f6764a4ccab3078db14a632420930f6186058750df066b8ea2a7106df91d3203", size = 138036, upload-time = "2026-03-09T13:15:36.894Z" }, + { url = "https://files.pythonhosted.org/packages/fa/06/7399a607f434119c6e1fdc8ec89a8d51ccccadf3341dee4ead6bd14caaf5/kiwisolver-1.5.0-graalpy312-graalpy250_312_native-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c31c13da98624f957b0fb1b5bae5383b2333c2c3f6793d9825dd5ce79b525cb7", size = 194295, upload-time = "2026-03-09T13:15:38.22Z" }, + { url = "https://files.pythonhosted.org/packages/b5/91/53255615acd2a1eaca307ede3c90eb550bae9c94581f8c00081b6b1c8f44/kiwisolver-1.5.0-graalpy312-graalpy250_312_native-win_amd64.whl", hash = "sha256:1f1489f769582498610e015a8ef2d36f28f505ab3096d0e16b4858a9ec214f57", size = 75987, upload-time = "2026-03-09T13:15:39.65Z" }, + { url = "https://files.pythonhosted.org/packages/e9/eb/5fcbbbf9a0e2c3a35effb88831a483345326bbc3a030a3b5b69aee647f84/kiwisolver-1.5.0-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:ec4c85dc4b687c7f7f15f553ff26a98bfe8c58f5f7f0ac8905f0ba4c7be60232", size = 59532, upload-time = "2026-03-09T13:15:47.047Z" }, + { url = "https://files.pythonhosted.org/packages/c3/9b/e17104555bb4db148fd52327feea1e96be4b88e8e008b029002c281a21ab/kiwisolver-1.5.0-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:12e91c215a96e39f57989c8912ae761286ac5a9584d04030ceb3368a357f017a", size = 57420, upload-time = "2026-03-09T13:15:48.199Z" }, + { url = "https://files.pythonhosted.org/packages/48/44/2b5b95b7aa39fb2d8d9d956e0f3d5d45aef2ae1d942d4c3ffac2f9cfed1a/kiwisolver-1.5.0-pp311-pypy311_pp73-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:be4a51a55833dc29ab5d7503e7bcb3b3af3402d266018137127450005cdfe737", size = 79892, upload-time = "2026-03-09T13:15:49.694Z" }, + { url = 
"https://files.pythonhosted.org/packages/52/7d/7157f9bba6b455cfb4632ed411e199fc8b8977642c2b12082e1bd9e6d173/kiwisolver-1.5.0-pp311-pypy311_pp73-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:daae526907e262de627d8f70058a0f64acc9e2641c164c99c8f594b34a799a16", size = 77603, upload-time = "2026-03-09T13:15:50.945Z" }, + { url = "https://files.pythonhosted.org/packages/0a/dd/8050c947d435c8d4bc94e3252f4d8bb8a76cfb424f043a8680be637a57f1/kiwisolver-1.5.0-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:59cd8683f575d96df5bb48f6add94afc055012c29e28124fcae2b63661b9efb1", size = 73558, upload-time = "2026-03-09T13:15:52.112Z" }, +] + [[package]] name = "knowledgecomplex" version = "0.1.0" @@ -342,6 +588,10 @@ dependencies = [ ] [package.optional-dependencies] +analysis = [ + { name = "numpy" }, + { name = "scipy" }, +] dev = [ { name = "mypy" }, { name = "pytest" }, @@ -352,21 +602,37 @@ docs = [ { name = "mkdocs-material" }, { name = "mkdocstrings", extra = ["python"] }, ] +viz = [ + { name = "matplotlib" }, + { name = "networkx" }, +] +viz-interactive = [ + { name = "matplotlib" }, + { name = "networkx" }, + { name = "plotly" }, +] [package.metadata] requires-dist = [ + { name = "matplotlib", marker = "extra == 'viz'", specifier = ">=3.7" }, + { name = "matplotlib", marker = "extra == 'viz-interactive'", specifier = ">=3.7" }, { name = "mkdocs-material", marker = "extra == 'docs'" }, { name = "mkdocstrings", extras = ["python"], marker = "extra == 'docs'" }, { name = "mypy", marker = "extra == 'dev'" }, + { name = "networkx", marker = "extra == 'viz'", specifier = ">=3.0" }, + { name = "networkx", marker = "extra == 'viz-interactive'", specifier = ">=3.0" }, + { name = "numpy", marker = "extra == 'analysis'", specifier = ">=1.24" }, { name = "owlrl", specifier = ">=6.0" }, { name = "pandas", specifier = ">=2.0" }, + { name = "plotly", marker = "extra == 'viz-interactive'", specifier = ">=5.0" }, { name = "pyshacl", specifier = ">=0.25" }, { name = 
"pytest", marker = "extra == 'dev'", specifier = ">=8.0" }, { name = "pytest-cov", marker = "extra == 'dev'" }, { name = "rdflib", specifier = ">=7.0" }, { name = "ruff", marker = "extra == 'dev'" }, + { name = "scipy", marker = "extra == 'analysis'", specifier = ">=1.10" }, ] -provides-extras = ["dev", "docs"] +provides-extras = ["dev", "docs", "viz", "viz-interactive", "analysis"] [[package]] name = "librt" @@ -524,6 +790,70 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/70/bc/6f1c2f612465f5fa89b95bead1f44dcb607670fd42891d8fdcd5d039f4f4/markupsafe-3.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:32001d6a8fc98c8cb5c947787c5d08b0a50663d139f1305bac5885d98d9b40fa", size = 14146, upload-time = "2025-09-27T18:37:28.327Z" }, ] +[[package]] +name = "matplotlib" +version = "3.10.8" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "contourpy" }, + { name = "cycler" }, + { name = "fonttools" }, + { name = "kiwisolver" }, + { name = "numpy" }, + { name = "packaging" }, + { name = "pillow" }, + { name = "pyparsing" }, + { name = "python-dateutil" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/8a/76/d3c6e3a13fe484ebe7718d14e269c9569c4eb0020a968a327acb3b9a8fe6/matplotlib-3.10.8.tar.gz", hash = "sha256:2299372c19d56bcd35cf05a2738308758d32b9eaed2371898d8f5bd33f084aa3", size = 34806269, upload-time = "2025-12-10T22:56:51.155Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/f8/86/de7e3a1cdcfc941483af70609edc06b83e7c8a0e0dc9ac325200a3f4d220/matplotlib-3.10.8-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:6be43b667360fef5c754dda5d25a32e6307a03c204f3c0fc5468b78fa87b4160", size = 8251215, upload-time = "2025-12-10T22:55:16.175Z" }, + { url = "https://files.pythonhosted.org/packages/fd/14/baad3222f424b19ce6ad243c71de1ad9ec6b2e4eb1e458a48fdc6d120401/matplotlib-3.10.8-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a2b336e2d91a3d7006864e0990c83b216fcdca64b5a6484912902cef87313d78", size = 8139625, 
upload-time = "2025-12-10T22:55:17.712Z" }, + { url = "https://files.pythonhosted.org/packages/8f/a0/7024215e95d456de5883e6732e708d8187d9753a21d32f8ddb3befc0c445/matplotlib-3.10.8-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:efb30e3baaea72ce5928e32bab719ab4770099079d66726a62b11b1ef7273be4", size = 8712614, upload-time = "2025-12-10T22:55:20.8Z" }, + { url = "https://files.pythonhosted.org/packages/5a/f4/b8347351da9a5b3f41e26cf547252d861f685c6867d179a7c9d60ad50189/matplotlib-3.10.8-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d56a1efd5bfd61486c8bc968fa18734464556f0fb8e51690f4ac25d85cbbbbc2", size = 9540997, upload-time = "2025-12-10T22:55:23.258Z" }, + { url = "https://files.pythonhosted.org/packages/9e/c0/c7b914e297efe0bc36917bf216b2acb91044b91e930e878ae12981e461e5/matplotlib-3.10.8-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:238b7ce5717600615c895050239ec955d91f321c209dd110db988500558e70d6", size = 9596825, upload-time = "2025-12-10T22:55:25.217Z" }, + { url = "https://files.pythonhosted.org/packages/6f/d3/a4bbc01c237ab710a1f22b4da72f4ff6d77eb4c7735ea9811a94ae239067/matplotlib-3.10.8-cp311-cp311-win_amd64.whl", hash = "sha256:18821ace09c763ec93aef5eeff087ee493a24051936d7b9ebcad9662f66501f9", size = 8135090, upload-time = "2025-12-10T22:55:27.162Z" }, + { url = "https://files.pythonhosted.org/packages/89/dd/a0b6588f102beab33ca6f5218b31725216577b2a24172f327eaf6417d5c9/matplotlib-3.10.8-cp311-cp311-win_arm64.whl", hash = "sha256:bab485bcf8b1c7d2060b4fcb6fc368a9e6f4cd754c9c2fea281f4be21df394a2", size = 8012377, upload-time = "2025-12-10T22:55:29.185Z" }, + { url = "https://files.pythonhosted.org/packages/9e/67/f997cdcbb514012eb0d10cd2b4b332667997fb5ebe26b8d41d04962fa0e6/matplotlib-3.10.8-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:64fcc24778ca0404ce0cb7b6b77ae1f4c7231cdd60e6778f999ee05cbd581b9a", size = 8260453, upload-time = "2025-12-10T22:55:30.709Z" }, + { url = 
"https://files.pythonhosted.org/packages/7e/65/07d5f5c7f7c994f12c768708bd2e17a4f01a2b0f44a1c9eccad872433e2e/matplotlib-3.10.8-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:b9a5ca4ac220a0cdd1ba6bcba3608547117d30468fefce49bb26f55c1a3d5c58", size = 8148321, upload-time = "2025-12-10T22:55:33.265Z" }, + { url = "https://files.pythonhosted.org/packages/3e/f3/c5195b1ae57ef85339fd7285dfb603b22c8b4e79114bae5f4f0fcf688677/matplotlib-3.10.8-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3ab4aabc72de4ff77b3ec33a6d78a68227bf1123465887f9905ba79184a1cc04", size = 8716944, upload-time = "2025-12-10T22:55:34.922Z" }, + { url = "https://files.pythonhosted.org/packages/00/f9/7638f5cc82ec8a7aa005de48622eecc3ed7c9854b96ba15bd76b7fd27574/matplotlib-3.10.8-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:24d50994d8c5816ddc35411e50a86ab05f575e2530c02752e02538122613371f", size = 9550099, upload-time = "2025-12-10T22:55:36.789Z" }, + { url = "https://files.pythonhosted.org/packages/57/61/78cd5920d35b29fd2a0fe894de8adf672ff52939d2e9b43cb83cd5ce1bc7/matplotlib-3.10.8-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:99eefd13c0dc3b3c1b4d561c1169e65fe47aab7b8158754d7c084088e2329466", size = 9613040, upload-time = "2025-12-10T22:55:38.715Z" }, + { url = "https://files.pythonhosted.org/packages/30/4e/c10f171b6e2f44d9e3a2b96efa38b1677439d79c99357600a62cc1e9594e/matplotlib-3.10.8-cp312-cp312-win_amd64.whl", hash = "sha256:dd80ecb295460a5d9d260df63c43f4afbdd832d725a531f008dad1664f458adf", size = 8142717, upload-time = "2025-12-10T22:55:41.103Z" }, + { url = "https://files.pythonhosted.org/packages/f1/76/934db220026b5fef85f45d51a738b91dea7d70207581063cd9bd8fafcf74/matplotlib-3.10.8-cp312-cp312-win_arm64.whl", hash = "sha256:3c624e43ed56313651bc18a47f838b60d7b8032ed348911c54906b130b20071b", size = 8012751, upload-time = "2025-12-10T22:55:42.684Z" }, + { url = 
"https://files.pythonhosted.org/packages/3d/b9/15fd5541ef4f5b9a17eefd379356cf12175fe577424e7b1d80676516031a/matplotlib-3.10.8-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:3f2e409836d7f5ac2f1c013110a4d50b9f7edc26328c108915f9075d7d7a91b6", size = 8261076, upload-time = "2025-12-10T22:55:44.648Z" }, + { url = "https://files.pythonhosted.org/packages/8d/a0/2ba3473c1b66b9c74dc7107c67e9008cb1782edbe896d4c899d39ae9cf78/matplotlib-3.10.8-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:56271f3dac49a88d7fca5060f004d9d22b865f743a12a23b1e937a0be4818ee1", size = 8148794, upload-time = "2025-12-10T22:55:46.252Z" }, + { url = "https://files.pythonhosted.org/packages/75/97/a471f1c3eb1fd6f6c24a31a5858f443891d5127e63a7788678d14e249aea/matplotlib-3.10.8-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:a0a7f52498f72f13d4a25ea70f35f4cb60642b466cbb0a9be951b5bc3f45a486", size = 8718474, upload-time = "2025-12-10T22:55:47.864Z" }, + { url = "https://files.pythonhosted.org/packages/01/be/cd478f4b66f48256f42927d0acbcd63a26a893136456cd079c0cc24fbabf/matplotlib-3.10.8-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:646d95230efb9ca614a7a594d4fcacde0ac61d25e37dd51710b36477594963ce", size = 9549637, upload-time = "2025-12-10T22:55:50.048Z" }, + { url = "https://files.pythonhosted.org/packages/5d/7c/8dc289776eae5109e268c4fb92baf870678dc048a25d4ac903683b86d5bf/matplotlib-3.10.8-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:f89c151aab2e2e23cb3fe0acad1e8b82841fd265379c4cecd0f3fcb34c15e0f6", size = 9613678, upload-time = "2025-12-10T22:55:52.21Z" }, + { url = "https://files.pythonhosted.org/packages/64/40/37612487cc8a437d4dd261b32ca21fe2d79510fe74af74e1f42becb1bdb8/matplotlib-3.10.8-cp313-cp313-win_amd64.whl", hash = "sha256:e8ea3e2d4066083e264e75c829078f9e149fa119d27e19acd503de65e0b13149", size = 8142686, upload-time = "2025-12-10T22:55:54.253Z" }, + { url = 
"https://files.pythonhosted.org/packages/66/52/8d8a8730e968185514680c2a6625943f70269509c3dcfc0dcf7d75928cb8/matplotlib-3.10.8-cp313-cp313-win_arm64.whl", hash = "sha256:c108a1d6fa78a50646029cb6d49808ff0fc1330fda87fa6f6250c6b5369b6645", size = 8012917, upload-time = "2025-12-10T22:55:56.268Z" }, + { url = "https://files.pythonhosted.org/packages/b5/27/51fe26e1062f298af5ef66343d8ef460e090a27fea73036c76c35821df04/matplotlib-3.10.8-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:ad3d9833a64cf48cc4300f2b406c3d0f4f4724a91c0bd5640678a6ba7c102077", size = 8305679, upload-time = "2025-12-10T22:55:57.856Z" }, + { url = "https://files.pythonhosted.org/packages/2c/1e/4de865bc591ac8e3062e835f42dd7fe7a93168d519557837f0e37513f629/matplotlib-3.10.8-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:eb3823f11823deade26ce3b9f40dcb4a213da7a670013929f31d5f5ed1055b22", size = 8198336, upload-time = "2025-12-10T22:55:59.371Z" }, + { url = "https://files.pythonhosted.org/packages/c6/cb/2f7b6e75fb4dce87ef91f60cac4f6e34f4c145ab036a22318ec837971300/matplotlib-3.10.8-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:d9050fee89a89ed57b4fb2c1bfac9a3d0c57a0d55aed95949eedbc42070fea39", size = 8731653, upload-time = "2025-12-10T22:56:01.032Z" }, + { url = "https://files.pythonhosted.org/packages/46/b3/bd9c57d6ba670a37ab31fb87ec3e8691b947134b201f881665b28cc039ff/matplotlib-3.10.8-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b44d07310e404ba95f8c25aa5536f154c0a8ec473303535949e52eb71d0a1565", size = 9561356, upload-time = "2025-12-10T22:56:02.95Z" }, + { url = "https://files.pythonhosted.org/packages/c0/3d/8b94a481456dfc9dfe6e39e93b5ab376e50998cddfd23f4ae3b431708f16/matplotlib-3.10.8-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:0a33deb84c15ede243aead39f77e990469fff93ad1521163305095b77b72ce4a", size = 9614000, upload-time = "2025-12-10T22:56:05.411Z" }, + { url = 
"https://files.pythonhosted.org/packages/bd/cd/bc06149fe5585ba800b189a6a654a75f1f127e8aab02fd2be10df7fa500c/matplotlib-3.10.8-cp313-cp313t-win_amd64.whl", hash = "sha256:3a48a78d2786784cc2413e57397981fb45c79e968d99656706018d6e62e57958", size = 8220043, upload-time = "2025-12-10T22:56:07.551Z" }, + { url = "https://files.pythonhosted.org/packages/e3/de/b22cf255abec916562cc04eef457c13e58a1990048de0c0c3604d082355e/matplotlib-3.10.8-cp313-cp313t-win_arm64.whl", hash = "sha256:15d30132718972c2c074cd14638c7f4592bd98719e2308bccea40e0538bc0cb5", size = 8062075, upload-time = "2025-12-10T22:56:09.178Z" }, + { url = "https://files.pythonhosted.org/packages/3c/43/9c0ff7a2f11615e516c3b058e1e6e8f9614ddeca53faca06da267c48345d/matplotlib-3.10.8-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:b53285e65d4fa4c86399979e956235deb900be5baa7fc1218ea67fbfaeaadd6f", size = 8262481, upload-time = "2025-12-10T22:56:10.885Z" }, + { url = "https://files.pythonhosted.org/packages/6f/ca/e8ae28649fcdf039fda5ef554b40a95f50592a3c47e6f7270c9561c12b07/matplotlib-3.10.8-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:32f8dce744be5569bebe789e46727946041199030db8aeb2954d26013a0eb26b", size = 8151473, upload-time = "2025-12-10T22:56:12.377Z" }, + { url = "https://files.pythonhosted.org/packages/f1/6f/009d129ae70b75e88cbe7e503a12a4c0670e08ed748a902c2568909e9eb5/matplotlib-3.10.8-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4cf267add95b1c88300d96ca837833d4112756045364f5c734a2276038dae27d", size = 9553896, upload-time = "2025-12-10T22:56:14.432Z" }, + { url = "https://files.pythonhosted.org/packages/f5/26/4221a741eb97967bc1fd5e4c52b9aa5a91b2f4ec05b59f6def4d820f9df9/matplotlib-3.10.8-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2cf5bd12cecf46908f286d7838b2abc6c91cda506c0445b8223a7c19a00df008", size = 9824193, upload-time = "2025-12-10T22:56:16.29Z" }, + { url = 
"https://files.pythonhosted.org/packages/1f/f3/3abf75f38605772cf48a9daf5821cd4f563472f38b4b828c6fba6fa6d06e/matplotlib-3.10.8-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:41703cc95688f2516b480f7f339d8851a6035f18e100ee6a32bc0b8536a12a9c", size = 9615444, upload-time = "2025-12-10T22:56:18.155Z" }, + { url = "https://files.pythonhosted.org/packages/93/a5/de89ac80f10b8dc615807ee1133cd99ac74082581196d4d9590bea10690d/matplotlib-3.10.8-cp314-cp314-win_amd64.whl", hash = "sha256:83d282364ea9f3e52363da262ce32a09dfe241e4080dcedda3c0db059d3c1f11", size = 8272719, upload-time = "2025-12-10T22:56:20.366Z" }, + { url = "https://files.pythonhosted.org/packages/69/ce/b006495c19ccc0a137b48083168a37bd056392dee02f87dba0472f2797fe/matplotlib-3.10.8-cp314-cp314-win_arm64.whl", hash = "sha256:2c1998e92cd5999e295a731bcb2911c75f597d937341f3030cc24ef2733d78a8", size = 8144205, upload-time = "2025-12-10T22:56:22.239Z" }, + { url = "https://files.pythonhosted.org/packages/68/d9/b31116a3a855bd313c6fcdb7226926d59b041f26061c6c5b1be66a08c826/matplotlib-3.10.8-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:b5a2b97dbdc7d4f353ebf343744f1d1f1cca8aa8bfddb4262fcf4306c3761d50", size = 8305785, upload-time = "2025-12-10T22:56:24.218Z" }, + { url = "https://files.pythonhosted.org/packages/1e/90/6effe8103f0272685767ba5f094f453784057072f49b393e3ea178fe70a5/matplotlib-3.10.8-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:3f5c3e4da343bba819f0234186b9004faba952cc420fbc522dc4e103c1985908", size = 8198361, upload-time = "2025-12-10T22:56:26.787Z" }, + { url = "https://files.pythonhosted.org/packages/d7/65/a73188711bea603615fc0baecca1061429ac16940e2385433cc778a9d8e7/matplotlib-3.10.8-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5f62550b9a30afde8c1c3ae450e5eb547d579dd69b25c2fc7a1c67f934c1717a", size = 9561357, upload-time = "2025-12-10T22:56:28.953Z" }, + { url = 
"https://files.pythonhosted.org/packages/f4/3d/b5c5d5d5be8ce63292567f0e2c43dde9953d3ed86ac2de0a72e93c8f07a1/matplotlib-3.10.8-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:495672de149445ec1b772ff2c9ede9b769e3cb4f0d0aa7fa730d7f59e2d4e1c1", size = 9823610, upload-time = "2025-12-10T22:56:31.455Z" }, + { url = "https://files.pythonhosted.org/packages/4d/4b/e7beb6bbd49f6bae727a12b270a2654d13c397576d25bd6786e47033300f/matplotlib-3.10.8-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:595ba4d8fe983b88f0eec8c26a241e16d6376fe1979086232f481f8f3f67494c", size = 9614011, upload-time = "2025-12-10T22:56:33.85Z" }, + { url = "https://files.pythonhosted.org/packages/7c/e6/76f2813d31f032e65f6f797e3f2f6e4aab95b65015924b1c51370395c28a/matplotlib-3.10.8-cp314-cp314t-win_amd64.whl", hash = "sha256:25d380fe8b1dc32cf8f0b1b448470a77afb195438bafdf1d858bfb876f3edf7b", size = 8362801, upload-time = "2025-12-10T22:56:36.107Z" }, + { url = "https://files.pythonhosted.org/packages/5d/49/d651878698a0b67f23aa28e17f45a6d6dd3d3f933fa29087fa4ce5947b5a/matplotlib-3.10.8-cp314-cp314t-win_arm64.whl", hash = "sha256:113bb52413ea508ce954a02c10ffd0d565f9c3bc7f2eddc27dfe1731e71c7b5f", size = 8192560, upload-time = "2025-12-10T22:56:38.008Z" }, + { url = "https://files.pythonhosted.org/packages/04/30/3afaa31c757f34b7725ab9d2ba8b48b5e89c2019c003e7d0ead143aabc5a/matplotlib-3.10.8-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:6da7c2ce169267d0d066adcf63758f0604aa6c3eebf67458930f9d9b79ad1db1", size = 8249198, upload-time = "2025-12-10T22:56:45.584Z" }, + { url = "https://files.pythonhosted.org/packages/48/2f/6334aec331f57485a642a7c8be03cb286f29111ae71c46c38b363230063c/matplotlib-3.10.8-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:9153c3292705be9f9c64498a8872118540c3f4123d1a1c840172edf262c8be4a", size = 8136817, upload-time = "2025-12-10T22:56:47.339Z" }, + { url = 
"https://files.pythonhosted.org/packages/73/e4/6d6f14b2a759c622f191b2d67e9075a3f56aaccb3be4bb9bb6890030d0a0/matplotlib-3.10.8-pp311-pypy311_pp73-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:1ae029229a57cd1e8fe542485f27e7ca7b23aa9e8944ddb4985d0bc444f1eca2", size = 8713867, upload-time = "2025-12-10T22:56:48.954Z" }, +] + [[package]] name = "mergedeep" version = "1.3.4" @@ -700,6 +1030,24 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/79/7b/2c79738432f5c924bef5071f933bcc9efd0473bac3b4aa584a6f7c1c8df8/mypy_extensions-1.1.0-py3-none-any.whl", hash = "sha256:1be4cccdb0f2482337c4743e60421de3a356cd97508abadd57d47403e94f5505", size = 4963, upload-time = "2025-04-22T14:54:22.983Z" }, ] +[[package]] +name = "narwhals" +version = "2.18.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/47/b4/02a8add181b8d2cd5da3b667cd102ae536e8c9572ab1a130816d70a89edb/narwhals-2.18.0.tar.gz", hash = "sha256:1de5cee338bc17c338c6278df2c38c0dd4290499fcf70d75e0a51d5f22a6e960", size = 620222, upload-time = "2026-03-10T15:51:27.14Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/fe/75/0b4a10da17a44cf13567d08a9c7632a285297e46253263f1ae119129d10a/narwhals-2.18.0-py3-none-any.whl", hash = "sha256:68378155ee706ac9c5b25868ef62ecddd62947b6df7801a0a156bc0a615d2d0d", size = 444865, upload-time = "2026-03-10T15:51:24.085Z" }, +] + +[[package]] +name = "networkx" +version = "3.6.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/6a/51/63fe664f3908c97be9d2e4f1158eb633317598cfa6e1fc14af5383f17512/networkx-3.6.1.tar.gz", hash = "sha256:26b7c357accc0c8cde558ad486283728b65b6a95d85ee1cd66bafab4c8168509", size = 2517025, upload-time = "2025-12-08T17:02:39.908Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/9e/c9/b2622292ea83fbb4ec318f5b9ab867d0a28ab43c5717bb85b0a5f6b3b0a4/networkx-3.6.1-py3-none-any.whl", hash = 
"sha256:d47fbf302e7d9cbbb9e2555a0d267983d2aa476bac30e90dfbe5669bd57f3762", size = 2068504, upload-time = "2025-12-08T17:02:38.159Z" }, +] + [[package]] name = "numpy" version = "2.4.3" @@ -878,6 +1226,93 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/ef/3c/2c197d226f9ea224a9ab8d197933f9da0ae0aac5b6e0f884e2b8d9c8e9f7/pathspec-1.0.4-py3-none-any.whl", hash = "sha256:fb6ae2fd4e7c921a165808a552060e722767cfa526f99ca5156ed2ce45a5c723", size = 55206, upload-time = "2026-01-27T03:59:45.137Z" }, ] +[[package]] +name = "pillow" +version = "12.1.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/1f/42/5c74462b4fd957fcd7b13b04fb3205ff8349236ea74c7c375766d6c82288/pillow-12.1.1.tar.gz", hash = "sha256:9ad8fa5937ab05218e2b6a4cff30295ad35afd2f83ac592e68c0d871bb0fdbc4", size = 46980264, upload-time = "2026-02-11T04:23:07.146Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2b/46/5da1ec4a5171ee7bf1a0efa064aba70ba3d6e0788ce3f5acd1375d23c8c0/pillow-12.1.1-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:e879bb6cd5c73848ef3b2b48b8af9ff08c5b71ecda8048b7dd22d8a33f60be32", size = 5304084, upload-time = "2026-02-11T04:20:27.501Z" }, + { url = "https://files.pythonhosted.org/packages/78/93/a29e9bc02d1cf557a834da780ceccd54e02421627200696fcf805ebdc3fb/pillow-12.1.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:365b10bb9417dd4498c0e3b128018c4a624dc11c7b97d8cc54effe3b096f4c38", size = 4657866, upload-time = "2026-02-11T04:20:29.827Z" }, + { url = "https://files.pythonhosted.org/packages/13/84/583a4558d492a179d31e4aae32eadce94b9acf49c0337c4ce0b70e0a01f2/pillow-12.1.1-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d4ce8e329c93845720cd2014659ca67eac35f6433fd3050393d85f3ecef0dad5", size = 6232148, upload-time = "2026-02-11T04:20:31.329Z" }, + { url = 
"https://files.pythonhosted.org/packages/d5/e2/53c43334bbbb2d3b938978532fbda8e62bb6e0b23a26ce8592f36bcc4987/pillow-12.1.1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fc354a04072b765eccf2204f588a7a532c9511e8b9c7f900e1b64e3e33487090", size = 8038007, upload-time = "2026-02-11T04:20:34.225Z" }, + { url = "https://files.pythonhosted.org/packages/b8/a6/3d0e79c8a9d58150dd98e199d7c1c56861027f3829a3a60b3c2784190180/pillow-12.1.1-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7e7976bf1910a8116b523b9f9f58bf410f3e8aa330cd9a2bb2953f9266ab49af", size = 6345418, upload-time = "2026-02-11T04:20:35.858Z" }, + { url = "https://files.pythonhosted.org/packages/a2/c8/46dfeac5825e600579157eea177be43e2f7ff4a99da9d0d0a49533509ac5/pillow-12.1.1-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:597bd9c8419bc7c6af5604e55847789b69123bbe25d65cc6ad3012b4f3c98d8b", size = 7034590, upload-time = "2026-02-11T04:20:37.91Z" }, + { url = "https://files.pythonhosted.org/packages/af/bf/e6f65d3db8a8bbfeaf9e13cc0417813f6319863a73de934f14b2229ada18/pillow-12.1.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:2c1fc0f2ca5f96a3c8407e41cca26a16e46b21060fe6d5b099d2cb01412222f5", size = 6458655, upload-time = "2026-02-11T04:20:39.496Z" }, + { url = "https://files.pythonhosted.org/packages/f9/c2/66091f3f34a25894ca129362e510b956ef26f8fb67a0e6417bc5744e56f1/pillow-12.1.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:578510d88c6229d735855e1f278aa305270438d36a05031dfaae5067cc8eb04d", size = 7159286, upload-time = "2026-02-11T04:20:41.139Z" }, + { url = "https://files.pythonhosted.org/packages/7b/5a/24bc8eb526a22f957d0cec6243146744966d40857e3d8deb68f7902ca6c1/pillow-12.1.1-cp311-cp311-win32.whl", hash = "sha256:7311c0a0dcadb89b36b7025dfd8326ecfa36964e29913074d47382706e516a7c", size = 6328663, upload-time = "2026-02-11T04:20:43.184Z" }, + { url = 
"https://files.pythonhosted.org/packages/31/03/bef822e4f2d8f9d7448c133d0a18185d3cce3e70472774fffefe8b0ed562/pillow-12.1.1-cp311-cp311-win_amd64.whl", hash = "sha256:fbfa2a7c10cc2623f412753cddf391c7f971c52ca40a3f65dc5039b2939e8563", size = 7031448, upload-time = "2026-02-11T04:20:44.696Z" }, + { url = "https://files.pythonhosted.org/packages/49/70/f76296f53610bd17b2e7d31728b8b7825e3ac3b5b3688b51f52eab7c0818/pillow-12.1.1-cp311-cp311-win_arm64.whl", hash = "sha256:b81b5e3511211631b3f672a595e3221252c90af017e399056d0faabb9538aa80", size = 2453651, upload-time = "2026-02-11T04:20:46.243Z" }, + { url = "https://files.pythonhosted.org/packages/07/d3/8df65da0d4df36b094351dce696f2989bec731d4f10e743b1c5f4da4d3bf/pillow-12.1.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:ab323b787d6e18b3d91a72fc99b1a2c28651e4358749842b8f8dfacd28ef2052", size = 5262803, upload-time = "2026-02-11T04:20:47.653Z" }, + { url = "https://files.pythonhosted.org/packages/d6/71/5026395b290ff404b836e636f51d7297e6c83beceaa87c592718747e670f/pillow-12.1.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:adebb5bee0f0af4909c30db0d890c773d1a92ffe83da908e2e9e720f8edf3984", size = 4657601, upload-time = "2026-02-11T04:20:49.328Z" }, + { url = "https://files.pythonhosted.org/packages/b1/2e/1001613d941c67442f745aff0f7cc66dd8df9a9c084eb497e6a543ee6f7e/pillow-12.1.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:bb66b7cc26f50977108790e2456b7921e773f23db5630261102233eb355a3b79", size = 6234995, upload-time = "2026-02-11T04:20:51.032Z" }, + { url = "https://files.pythonhosted.org/packages/07/26/246ab11455b2549b9233dbd44d358d033a2f780fa9007b61a913c5b2d24e/pillow-12.1.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:aee2810642b2898bb187ced9b349e95d2a7272930796e022efaf12e99dccd293", size = 8045012, upload-time = "2026-02-11T04:20:52.882Z" }, + { url = 
"https://files.pythonhosted.org/packages/b2/8b/07587069c27be7535ac1fe33874e32de118fbd34e2a73b7f83436a88368c/pillow-12.1.1-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a0b1cd6232e2b618adcc54d9882e4e662a089d5768cd188f7c245b4c8c44a397", size = 6349638, upload-time = "2026-02-11T04:20:54.444Z" }, + { url = "https://files.pythonhosted.org/packages/ff/79/6df7b2ee763d619cda2fb4fea498e5f79d984dae304d45a8999b80d6cf5c/pillow-12.1.1-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7aac39bcf8d4770d089588a2e1dd111cbaa42df5a94be3114222057d68336bd0", size = 7041540, upload-time = "2026-02-11T04:20:55.97Z" }, + { url = "https://files.pythonhosted.org/packages/2c/5e/2ba19e7e7236d7529f4d873bdaf317a318896bac289abebd4bb00ef247f0/pillow-12.1.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:ab174cd7d29a62dd139c44bf74b698039328f45cb03b4596c43473a46656b2f3", size = 6462613, upload-time = "2026-02-11T04:20:57.542Z" }, + { url = "https://files.pythonhosted.org/packages/03/03/31216ec124bb5c3dacd74ce8efff4cc7f52643653bad4825f8f08c697743/pillow-12.1.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:339ffdcb7cbeaa08221cd401d517d4b1fe7a9ed5d400e4a8039719238620ca35", size = 7166745, upload-time = "2026-02-11T04:20:59.196Z" }, + { url = "https://files.pythonhosted.org/packages/1f/e7/7c4552d80052337eb28653b617eafdef39adfb137c49dd7e831b8dc13bc5/pillow-12.1.1-cp312-cp312-win32.whl", hash = "sha256:5d1f9575a12bed9e9eedd9a4972834b08c97a352bd17955ccdebfeca5913fa0a", size = 6328823, upload-time = "2026-02-11T04:21:01.385Z" }, + { url = "https://files.pythonhosted.org/packages/3d/17/688626d192d7261bbbf98846fc98995726bddc2c945344b65bec3a29d731/pillow-12.1.1-cp312-cp312-win_amd64.whl", hash = "sha256:21329ec8c96c6e979cd0dfd29406c40c1d52521a90544463057d2aaa937d66a6", size = 7033367, upload-time = "2026-02-11T04:21:03.536Z" }, + { url = 
"https://files.pythonhosted.org/packages/ed/fe/a0ef1f73f939b0eca03ee2c108d0043a87468664770612602c63266a43c4/pillow-12.1.1-cp312-cp312-win_arm64.whl", hash = "sha256:af9a332e572978f0218686636610555ae3defd1633597be015ed50289a03c523", size = 2453811, upload-time = "2026-02-11T04:21:05.116Z" }, + { url = "https://files.pythonhosted.org/packages/d5/11/6db24d4bd7685583caeae54b7009584e38da3c3d4488ed4cd25b439de486/pillow-12.1.1-cp313-cp313-ios_13_0_arm64_iphoneos.whl", hash = "sha256:d242e8ac078781f1de88bf823d70c1a9b3c7950a44cdf4b7c012e22ccbcd8e4e", size = 4062689, upload-time = "2026-02-11T04:21:06.804Z" }, + { url = "https://files.pythonhosted.org/packages/33/c0/ce6d3b1fe190f0021203e0d9b5b99e57843e345f15f9ef22fcd43842fd21/pillow-12.1.1-cp313-cp313-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:02f84dfad02693676692746df05b89cf25597560db2857363a208e393429f5e9", size = 4138535, upload-time = "2026-02-11T04:21:08.452Z" }, + { url = "https://files.pythonhosted.org/packages/a0/c6/d5eb6a4fb32a3f9c21a8c7613ec706534ea1cf9f4b3663e99f0d83f6fca8/pillow-12.1.1-cp313-cp313-ios_13_0_x86_64_iphonesimulator.whl", hash = "sha256:e65498daf4b583091ccbb2556c7000abf0f3349fcd57ef7adc9a84a394ed29f6", size = 3601364, upload-time = "2026-02-11T04:21:10.194Z" }, + { url = "https://files.pythonhosted.org/packages/14/a1/16c4b823838ba4c9c52c0e6bbda903a3fe5a1bdbf1b8eb4fff7156f3e318/pillow-12.1.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:6c6db3b84c87d48d0088943bf33440e0c42370b99b1c2a7989216f7b42eede60", size = 5262561, upload-time = "2026-02-11T04:21:11.742Z" }, + { url = "https://files.pythonhosted.org/packages/bb/ad/ad9dc98ff24f485008aa5cdedaf1a219876f6f6c42a4626c08bc4e80b120/pillow-12.1.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:8b7e5304e34942bf62e15184219a7b5ad4ff7f3bb5cca4d984f37df1a0e1aee2", size = 4657460, upload-time = "2026-02-11T04:21:13.786Z" }, + { url = 
"https://files.pythonhosted.org/packages/9e/1b/f1a4ea9a895b5732152789326202a82464d5254759fbacae4deea3069334/pillow-12.1.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:18e5bddd742a44b7e6b1e773ab5db102bd7a94c32555ba656e76d319d19c3850", size = 6232698, upload-time = "2026-02-11T04:21:15.949Z" }, + { url = "https://files.pythonhosted.org/packages/95/f4/86f51b8745070daf21fd2e5b1fe0eb35d4db9ca26e6d58366562fb56a743/pillow-12.1.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fc44ef1f3de4f45b50ccf9136999d71abb99dca7706bc75d222ed350b9fd2289", size = 8041706, upload-time = "2026-02-11T04:21:17.723Z" }, + { url = "https://files.pythonhosted.org/packages/29/9b/d6ecd956bb1266dd1045e995cce9b8d77759e740953a1c9aad9502a0461e/pillow-12.1.1-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5a8eb7ed8d4198bccbd07058416eeec51686b498e784eda166395a23eb99138e", size = 6346621, upload-time = "2026-02-11T04:21:19.547Z" }, + { url = "https://files.pythonhosted.org/packages/71/24/538bff45bde96535d7d998c6fed1a751c75ac7c53c37c90dc2601b243893/pillow-12.1.1-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:47b94983da0c642de92ced1702c5b6c292a84bd3a8e1d1702ff923f183594717", size = 7038069, upload-time = "2026-02-11T04:21:21.378Z" }, + { url = "https://files.pythonhosted.org/packages/94/0e/58cb1a6bc48f746bc4cb3adb8cabff73e2742c92b3bf7a220b7cf69b9177/pillow-12.1.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:518a48c2aab7ce596d3bf79d0e275661b846e86e4d0e7dec34712c30fe07f02a", size = 6460040, upload-time = "2026-02-11T04:21:23.148Z" }, + { url = "https://files.pythonhosted.org/packages/6c/57/9045cb3ff11eeb6c1adce3b2d60d7d299d7b273a2e6c8381a524abfdc474/pillow-12.1.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a550ae29b95c6dc13cf69e2c9dc5747f814c54eeb2e32d683e5e93af56caa029", size = 7164523, upload-time = "2026-02-11T04:21:25.01Z" }, + { url = 
"https://files.pythonhosted.org/packages/73/f2/9be9cb99f2175f0d4dbadd6616ce1bf068ee54a28277ea1bf1fbf729c250/pillow-12.1.1-cp313-cp313-win32.whl", hash = "sha256:a003d7422449f6d1e3a34e3dd4110c22148336918ddbfc6a32581cd54b2e0b2b", size = 6332552, upload-time = "2026-02-11T04:21:27.238Z" }, + { url = "https://files.pythonhosted.org/packages/3f/eb/b0834ad8b583d7d9d42b80becff092082a1c3c156bb582590fcc973f1c7c/pillow-12.1.1-cp313-cp313-win_amd64.whl", hash = "sha256:344cf1e3dab3be4b1fa08e449323d98a2a3f819ad20f4b22e77a0ede31f0faa1", size = 7040108, upload-time = "2026-02-11T04:21:29.462Z" }, + { url = "https://files.pythonhosted.org/packages/d5/7d/fc09634e2aabdd0feabaff4a32f4a7d97789223e7c2042fd805ea4b4d2c2/pillow-12.1.1-cp313-cp313-win_arm64.whl", hash = "sha256:5c0dd1636633e7e6a0afe7bf6a51a14992b7f8e60de5789018ebbdfae55b040a", size = 2453712, upload-time = "2026-02-11T04:21:31.072Z" }, + { url = "https://files.pythonhosted.org/packages/19/2a/b9d62794fc8a0dd14c1943df68347badbd5511103e0d04c035ffe5cf2255/pillow-12.1.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:0330d233c1a0ead844fc097a7d16c0abff4c12e856c0b325f231820fee1f39da", size = 5264880, upload-time = "2026-02-11T04:21:32.865Z" }, + { url = "https://files.pythonhosted.org/packages/26/9d/e03d857d1347fa5ed9247e123fcd2a97b6220e15e9cb73ca0a8d91702c6e/pillow-12.1.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:5dae5f21afb91322f2ff791895ddd8889e5e947ff59f71b46041c8ce6db790bc", size = 4660616, upload-time = "2026-02-11T04:21:34.97Z" }, + { url = "https://files.pythonhosted.org/packages/f7/ec/8a6d22afd02570d30954e043f09c32772bfe143ba9285e2fdb11284952cd/pillow-12.1.1-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:2e0c664be47252947d870ac0d327fea7e63985a08794758aa8af5b6cb6ec0c9c", size = 6269008, upload-time = "2026-02-11T04:21:36.623Z" }, + { url = 
"https://files.pythonhosted.org/packages/3d/1d/6d875422c9f28a4a361f495a5f68d9de4a66941dc2c619103ca335fa6446/pillow-12.1.1-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:691ab2ac363b8217f7d31b3497108fb1f50faab2f75dfb03284ec2f217e87bf8", size = 8073226, upload-time = "2026-02-11T04:21:38.585Z" }, + { url = "https://files.pythonhosted.org/packages/a1/cd/134b0b6ee5eda6dc09e25e24b40fdafe11a520bc725c1d0bbaa5e00bf95b/pillow-12.1.1-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e9e8064fb1cc019296958595f6db671fba95209e3ceb0c4734c9baf97de04b20", size = 6380136, upload-time = "2026-02-11T04:21:40.562Z" }, + { url = "https://files.pythonhosted.org/packages/7a/a9/7628f013f18f001c1b98d8fffe3452f306a70dc6aba7d931019e0492f45e/pillow-12.1.1-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:472a8d7ded663e6162dafdf20015c486a7009483ca671cece7a9279b512fcb13", size = 7067129, upload-time = "2026-02-11T04:21:42.521Z" }, + { url = "https://files.pythonhosted.org/packages/1e/f8/66ab30a2193b277785601e82ee2d49f68ea575d9637e5e234faaa98efa4c/pillow-12.1.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:89b54027a766529136a06cfebeecb3a04900397a3590fd252160b888479517bf", size = 6491807, upload-time = "2026-02-11T04:21:44.22Z" }, + { url = "https://files.pythonhosted.org/packages/da/0b/a877a6627dc8318fdb84e357c5e1a758c0941ab1ddffdafd231983788579/pillow-12.1.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:86172b0831b82ce4f7877f280055892b31179e1576aa00d0df3bb1bbf8c3e524", size = 7190954, upload-time = "2026-02-11T04:21:46.114Z" }, + { url = "https://files.pythonhosted.org/packages/83/43/6f732ff85743cf746b1361b91665d9f5155e1483817f693f8d57ea93147f/pillow-12.1.1-cp313-cp313t-win32.whl", hash = "sha256:44ce27545b6efcf0fdbdceb31c9a5bdea9333e664cda58a7e674bb74608b3986", size = 6336441, upload-time = "2026-02-11T04:21:48.22Z" }, + { url = 
"https://files.pythonhosted.org/packages/3b/44/e865ef3986611bb75bfabdf94a590016ea327833f434558801122979cd0e/pillow-12.1.1-cp313-cp313t-win_amd64.whl", hash = "sha256:a285e3eb7a5a45a2ff504e31f4a8d1b12ef62e84e5411c6804a42197c1cf586c", size = 7045383, upload-time = "2026-02-11T04:21:50.015Z" }, + { url = "https://files.pythonhosted.org/packages/a8/c6/f4fb24268d0c6908b9f04143697ea18b0379490cb74ba9e8d41b898bd005/pillow-12.1.1-cp313-cp313t-win_arm64.whl", hash = "sha256:cc7d296b5ea4d29e6570dabeaed58d31c3fea35a633a69679fb03d7664f43fb3", size = 2456104, upload-time = "2026-02-11T04:21:51.633Z" }, + { url = "https://files.pythonhosted.org/packages/03/d0/bebb3ffbf31c5a8e97241476c4cf8b9828954693ce6744b4a2326af3e16b/pillow-12.1.1-cp314-cp314-ios_13_0_arm64_iphoneos.whl", hash = "sha256:417423db963cb4be8bac3fc1204fe61610f6abeed1580a7a2cbb2fbda20f12af", size = 4062652, upload-time = "2026-02-11T04:21:53.19Z" }, + { url = "https://files.pythonhosted.org/packages/2d/c0/0e16fb0addda4851445c28f8350d8c512f09de27bbb0d6d0bbf8b6709605/pillow-12.1.1-cp314-cp314-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:b957b71c6b2387610f556a7eb0828afbe40b4a98036fc0d2acfa5a44a0c2036f", size = 4138823, upload-time = "2026-02-11T04:22:03.088Z" }, + { url = "https://files.pythonhosted.org/packages/6b/fb/6170ec655d6f6bb6630a013dd7cf7bc218423d7b5fa9071bf63dc32175ae/pillow-12.1.1-cp314-cp314-ios_13_0_x86_64_iphonesimulator.whl", hash = "sha256:097690ba1f2efdeb165a20469d59d8bb03c55fb6621eb2041a060ae8ea3e9642", size = 3601143, upload-time = "2026-02-11T04:22:04.909Z" }, + { url = "https://files.pythonhosted.org/packages/59/04/dc5c3f297510ba9a6837cbb318b87dd2b8f73eb41a43cc63767f65cb599c/pillow-12.1.1-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:2815a87ab27848db0321fb78c7f0b2c8649dee134b7f2b80c6a45c6831d75ccd", size = 5266254, upload-time = "2026-02-11T04:22:07.656Z" }, + { url = 
"https://files.pythonhosted.org/packages/05/30/5db1236b0d6313f03ebf97f5e17cda9ca060f524b2fcc875149a8360b21c/pillow-12.1.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:f7ed2c6543bad5a7d5530eb9e78c53132f93dfa44a28492db88b41cdab885202", size = 4657499, upload-time = "2026-02-11T04:22:09.613Z" }, + { url = "https://files.pythonhosted.org/packages/6f/18/008d2ca0eb612e81968e8be0bbae5051efba24d52debf930126d7eaacbba/pillow-12.1.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:652a2c9ccfb556235b2b501a3a7cf3742148cd22e04b5625c5fe057ea3e3191f", size = 6232137, upload-time = "2026-02-11T04:22:11.434Z" }, + { url = "https://files.pythonhosted.org/packages/70/f1/f14d5b8eeb4b2cd62b9f9f847eb6605f103df89ef619ac68f92f748614ea/pillow-12.1.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:d6e4571eedf43af33d0fc233a382a76e849badbccdf1ac438841308652a08e1f", size = 8042721, upload-time = "2026-02-11T04:22:13.321Z" }, + { url = "https://files.pythonhosted.org/packages/5a/d6/17824509146e4babbdabf04d8171491fa9d776f7061ff6e727522df9bd03/pillow-12.1.1-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b574c51cf7d5d62e9be37ba446224b59a2da26dc4c1bb2ecbe936a4fb1a7cb7f", size = 6347798, upload-time = "2026-02-11T04:22:15.449Z" }, + { url = "https://files.pythonhosted.org/packages/d1/ee/c85a38a9ab92037a75615aba572c85ea51e605265036e00c5b67dfafbfe2/pillow-12.1.1-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a37691702ed687799de29a518d63d4682d9016932db66d4e90c345831b02fb4e", size = 7039315, upload-time = "2026-02-11T04:22:17.24Z" }, + { url = "https://files.pythonhosted.org/packages/ec/f3/bc8ccc6e08a148290d7523bde4d9a0d6c981db34631390dc6e6ec34cacf6/pillow-12.1.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:f95c00d5d6700b2b890479664a06e754974848afaae5e21beb4d83c106923fd0", size = 6462360, upload-time = "2026-02-11T04:22:19.111Z" }, + { url = 
"https://files.pythonhosted.org/packages/f6/ab/69a42656adb1d0665ab051eec58a41f169ad295cf81ad45406963105408f/pillow-12.1.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:559b38da23606e68681337ad74622c4dbba02254fc9cb4488a305dd5975c7eeb", size = 7165438, upload-time = "2026-02-11T04:22:21.041Z" }, + { url = "https://files.pythonhosted.org/packages/02/46/81f7aa8941873f0f01d4b55cc543b0a3d03ec2ee30d617a0448bf6bd6dec/pillow-12.1.1-cp314-cp314-win32.whl", hash = "sha256:03edcc34d688572014ff223c125a3f77fb08091e4607e7745002fc214070b35f", size = 6431503, upload-time = "2026-02-11T04:22:22.833Z" }, + { url = "https://files.pythonhosted.org/packages/40/72/4c245f7d1044b67affc7f134a09ea619d4895333d35322b775b928180044/pillow-12.1.1-cp314-cp314-win_amd64.whl", hash = "sha256:50480dcd74fa63b8e78235957d302d98d98d82ccbfac4c7e12108ba9ecbdba15", size = 7176748, upload-time = "2026-02-11T04:22:24.64Z" }, + { url = "https://files.pythonhosted.org/packages/e4/ad/8a87bdbe038c5c698736e3348af5c2194ffb872ea52f11894c95f9305435/pillow-12.1.1-cp314-cp314-win_arm64.whl", hash = "sha256:5cb1785d97b0c3d1d1a16bc1d710c4a0049daefc4935f3a8f31f827f4d3d2e7f", size = 2544314, upload-time = "2026-02-11T04:22:26.685Z" }, + { url = "https://files.pythonhosted.org/packages/6c/9d/efd18493f9de13b87ede7c47e69184b9e859e4427225ea962e32e56a49bc/pillow-12.1.1-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:1f90cff8aa76835cba5769f0b3121a22bd4eb9e6884cfe338216e557a9a548b8", size = 5268612, upload-time = "2026-02-11T04:22:29.884Z" }, + { url = "https://files.pythonhosted.org/packages/f8/f1/4f42eb2b388eb2ffc660dcb7f7b556c1015c53ebd5f7f754965ef997585b/pillow-12.1.1-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:1f1be78ce9466a7ee64bfda57bdba0f7cc499d9794d518b854816c41bf0aa4e9", size = 4660567, upload-time = "2026-02-11T04:22:31.799Z" }, + { url = 
"https://files.pythonhosted.org/packages/01/54/df6ef130fa43e4b82e32624a7b821a2be1c5653a5fdad8469687a7db4e00/pillow-12.1.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:42fc1f4677106188ad9a55562bbade416f8b55456f522430fadab3cef7cd4e60", size = 6269951, upload-time = "2026-02-11T04:22:33.921Z" }, + { url = "https://files.pythonhosted.org/packages/a9/48/618752d06cc44bb4aae8ce0cd4e6426871929ed7b46215638088270d9b34/pillow-12.1.1-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:98edb152429ab62a1818039744d8fbb3ccab98a7c29fc3d5fcef158f3f1f68b7", size = 8074769, upload-time = "2026-02-11T04:22:35.877Z" }, + { url = "https://files.pythonhosted.org/packages/c3/bd/f1d71eb39a72fa088d938655afba3e00b38018d052752f435838961127d8/pillow-12.1.1-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d470ab1178551dd17fdba0fef463359c41aaa613cdcd7ff8373f54be629f9f8f", size = 6381358, upload-time = "2026-02-11T04:22:37.698Z" }, + { url = "https://files.pythonhosted.org/packages/64/ef/c784e20b96674ed36a5af839305f55616f8b4f8aa8eeccf8531a6e312243/pillow-12.1.1-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6408a7b064595afcab0a49393a413732a35788f2a5092fdc6266952ed67de586", size = 7068558, upload-time = "2026-02-11T04:22:39.597Z" }, + { url = "https://files.pythonhosted.org/packages/73/cb/8059688b74422ae61278202c4e1ad992e8a2e7375227be0a21c6b87ca8d5/pillow-12.1.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:5d8c41325b382c07799a3682c1c258469ea2ff97103c53717b7893862d0c98ce", size = 6493028, upload-time = "2026-02-11T04:22:42.73Z" }, + { url = "https://files.pythonhosted.org/packages/c6/da/e3c008ed7d2dd1f905b15949325934510b9d1931e5df999bb15972756818/pillow-12.1.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:c7697918b5be27424e9ce568193efd13d925c4481dd364e43f5dff72d33e10f8", size = 7191940, upload-time = "2026-02-11T04:22:44.543Z" }, + { url = 
"https://files.pythonhosted.org/packages/01/4a/9202e8d11714c1fc5951f2e1ef362f2d7fbc595e1f6717971d5dd750e969/pillow-12.1.1-cp314-cp314t-win32.whl", hash = "sha256:d2912fd8114fc5545aa3a4b5576512f64c55a03f3ebcca4c10194d593d43ea36", size = 6438736, upload-time = "2026-02-11T04:22:46.347Z" }, + { url = "https://files.pythonhosted.org/packages/f3/ca/cbce2327eb9885476b3957b2e82eb12c866a8b16ad77392864ad601022ce/pillow-12.1.1-cp314-cp314t-win_amd64.whl", hash = "sha256:4ceb838d4bd9dab43e06c363cab2eebf63846d6a4aeaea283bbdfd8f1a8ed58b", size = 7182894, upload-time = "2026-02-11T04:22:48.114Z" }, + { url = "https://files.pythonhosted.org/packages/ec/d2/de599c95ba0a973b94410477f8bf0b6f0b5e67360eb89bcb1ad365258beb/pillow-12.1.1-cp314-cp314t-win_arm64.whl", hash = "sha256:7b03048319bfc6170e93bd60728a1af51d3dd7704935feb228c4d4faab35d334", size = 2546446, upload-time = "2026-02-11T04:22:50.342Z" }, + { url = "https://files.pythonhosted.org/packages/56/11/5d43209aa4cb58e0cc80127956ff1796a68b928e6324bbf06ef4db34367b/pillow-12.1.1-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:600fd103672b925fe62ed08e0d874ea34d692474df6f4bf7ebe148b30f89f39f", size = 5228606, upload-time = "2026-02-11T04:22:52.106Z" }, + { url = "https://files.pythonhosted.org/packages/5f/d5/3b005b4e4fda6698b371fa6c21b097d4707585d7db99e98d9b0b87ac612a/pillow-12.1.1-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:665e1b916b043cef294bc54d47bf02d87e13f769bc4bc5fa225a24b3a6c5aca9", size = 4622321, upload-time = "2026-02-11T04:22:53.827Z" }, + { url = "https://files.pythonhosted.org/packages/df/36/ed3ea2d594356fd8037e5a01f6156c74bc8d92dbb0fa60746cc96cabb6e8/pillow-12.1.1-pp311-pypy311_pp73-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:495c302af3aad1ca67420ddd5c7bd480c8867ad173528767d906428057a11f0e", size = 5247579, upload-time = "2026-02-11T04:22:56.094Z" }, + { url = 
"https://files.pythonhosted.org/packages/54/9a/9cc3e029683cf6d20ae5085da0dafc63148e3252c2f13328e553aaa13cfb/pillow-12.1.1-pp311-pypy311_pp73-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:8fd420ef0c52c88b5a035a0886f367748c72147b2b8f384c9d12656678dfdfa9", size = 6989094, upload-time = "2026-02-11T04:22:58.288Z" }, + { url = "https://files.pythonhosted.org/packages/00/98/fc53ab36da80b88df0967896b6c4b4cd948a0dc5aa40a754266aa3ae48b3/pillow-12.1.1-pp311-pypy311_pp73-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f975aa7ef9684ce7e2c18a3aa8f8e2106ce1e46b94ab713d156b2898811651d3", size = 5313850, upload-time = "2026-02-11T04:23:00.554Z" }, + { url = "https://files.pythonhosted.org/packages/30/02/00fa585abfd9fe9d73e5f6e554dc36cc2b842898cbfc46d70353dae227f8/pillow-12.1.1-pp311-pypy311_pp73-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8089c852a56c2966cf18835db62d9b34fef7ba74c726ad943928d494fa7f4735", size = 5963343, upload-time = "2026-02-11T04:23:02.934Z" }, + { url = "https://files.pythonhosted.org/packages/f2/26/c56ce33ca856e358d27fda9676c055395abddb82c35ac0f593877ed4562e/pillow-12.1.1-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:cb9bb857b2d057c6dfc72ac5f3b44836924ba15721882ef103cecb40d002d80e", size = 7029880, upload-time = "2026-02-11T04:23:04.783Z" }, +] + [[package]] name = "platformdirs" version = "4.9.4" @@ -887,6 +1322,19 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/63/d7/97f7e3a6abb67d8080dd406fd4df842c2be0efaf712d1c899c32a075027c/platformdirs-4.9.4-py3-none-any.whl", hash = "sha256:68a9a4619a666ea6439f2ff250c12a853cd1cbd5158d258bd824a7df6be2f868", size = 21216, upload-time = "2026-03-05T18:34:12.172Z" }, ] +[[package]] +name = "plotly" +version = "6.6.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "narwhals" }, + { name = "packaging" }, +] +sdist = { url = 
"https://files.pythonhosted.org/packages/24/fb/41efe84970cfddefd4ccf025e2cbfafe780004555f583e93dba3dac2cdef/plotly-6.6.0.tar.gz", hash = "sha256:b897f15f3b02028d69f755f236be890ba950d0a42d7dfc619b44e2d8cea8748c", size = 7027956, upload-time = "2026-03-02T21:10:25.321Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/52/d2/c6e44dba74f17c6216ce1b56044a9b93a929f1c2d5bdaff892512b260f5e/plotly-6.6.0-py3-none-any.whl", hash = "sha256:8d6daf0f87412e0c0bfe72e809d615217ab57cc715899a1e5145135a7800d1d0", size = 9910315, upload-time = "2026-03-02T21:10:18.131Z" }, +] + [[package]] name = "pluggy" version = "1.6.0" @@ -1121,6 +1569,77 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/8f/e8/726643a3ea68c727da31570bde48c7a10f1aa60eddd628d94078fec586ff/ruff-0.15.7-py3-none-win_arm64.whl", hash = "sha256:18e8d73f1c3fdf27931497972250340f92e8c861722161a9caeb89a58ead6ed2", size = 11023304, upload-time = "2026-03-19T16:26:51.669Z" }, ] +[[package]] +name = "scipy" +version = "1.17.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "numpy" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/7a/97/5a3609c4f8d58b039179648e62dd220f89864f56f7357f5d4f45c29eb2cc/scipy-1.17.1.tar.gz", hash = "sha256:95d8e012d8cb8816c226aef832200b1d45109ed4464303e997c5b13122b297c0", size = 30573822, upload-time = "2026-02-23T00:26:24.851Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/df/75/b4ce781849931fef6fd529afa6b63711d5a733065722d0c3e2724af9e40a/scipy-1.17.1-cp311-cp311-macosx_10_14_x86_64.whl", hash = "sha256:1f95b894f13729334fb990162e911c9e5dc1ab390c58aa6cbecb389c5b5e28ec", size = 31613675, upload-time = "2026-02-23T00:16:00.13Z" }, + { url = "https://files.pythonhosted.org/packages/f7/58/bccc2861b305abdd1b8663d6130c0b3d7cc22e8d86663edbc8401bfd40d4/scipy-1.17.1-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:e18f12c6b0bc5a592ed23d3f7b891f68fd7f8241d69b7883769eb5d5dfb52696", size = 28162057, upload-time = 
"2026-02-23T00:16:09.456Z" }, + { url = "https://files.pythonhosted.org/packages/6d/ee/18146b7757ed4976276b9c9819108adbc73c5aad636e5353e20746b73069/scipy-1.17.1-cp311-cp311-macosx_14_0_arm64.whl", hash = "sha256:a3472cfbca0a54177d0faa68f697d8ba4c80bbdc19908c3465556d9f7efce9ee", size = 20334032, upload-time = "2026-02-23T00:16:17.358Z" }, + { url = "https://files.pythonhosted.org/packages/ec/e6/cef1cf3557f0c54954198554a10016b6a03b2ec9e22a4e1df734936bd99c/scipy-1.17.1-cp311-cp311-macosx_14_0_x86_64.whl", hash = "sha256:766e0dc5a616d026a3a1cffa379af959671729083882f50307e18175797b3dfd", size = 22709533, upload-time = "2026-02-23T00:16:25.791Z" }, + { url = "https://files.pythonhosted.org/packages/4d/60/8804678875fc59362b0fb759ab3ecce1f09c10a735680318ac30da8cd76b/scipy-1.17.1-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:744b2bf3640d907b79f3fd7874efe432d1cf171ee721243e350f55234b4cec4c", size = 33062057, upload-time = "2026-02-23T00:16:36.931Z" }, + { url = "https://files.pythonhosted.org/packages/09/7d/af933f0f6e0767995b4e2d705a0665e454d1c19402aa7e895de3951ebb04/scipy-1.17.1-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:43af8d1f3bea642559019edfe64e9b11192a8978efbd1539d7bc2aaa23d92de4", size = 35349300, upload-time = "2026-02-23T00:16:49.108Z" }, + { url = "https://files.pythonhosted.org/packages/b4/3d/7ccbbdcbb54c8fdc20d3b6930137c782a163fa626f0aef920349873421ba/scipy-1.17.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:cd96a1898c0a47be4520327e01f874acfd61fb48a9420f8aa9f6483412ffa444", size = 35127333, upload-time = "2026-02-23T00:17:01.293Z" }, + { url = "https://files.pythonhosted.org/packages/e8/19/f926cb11c42b15ba08e3a71e376d816ac08614f769b4f47e06c3580c836a/scipy-1.17.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:4eb6c25dd62ee8d5edf68a8e1c171dd71c292fdae95d8aeb3dd7d7de4c364082", size = 37741314, upload-time = "2026-02-23T00:17:12.576Z" }, + { url = 
"https://files.pythonhosted.org/packages/95/da/0d1df507cf574b3f224ccc3d45244c9a1d732c81dcb26b1e8a766ae271a8/scipy-1.17.1-cp311-cp311-win_amd64.whl", hash = "sha256:d30e57c72013c2a4fe441c2fcb8e77b14e152ad48b5464858e07e2ad9fbfceff", size = 36607512, upload-time = "2026-02-23T00:17:23.424Z" }, + { url = "https://files.pythonhosted.org/packages/68/7f/bdd79ceaad24b671543ffe0ef61ed8e659440eb683b66f033454dcee90eb/scipy-1.17.1-cp311-cp311-win_arm64.whl", hash = "sha256:9ecb4efb1cd6e8c4afea0daa91a87fbddbce1b99d2895d151596716c0b2e859d", size = 24599248, upload-time = "2026-02-23T00:17:34.561Z" }, + { url = "https://files.pythonhosted.org/packages/35/48/b992b488d6f299dbe3f11a20b24d3dda3d46f1a635ede1c46b5b17a7b163/scipy-1.17.1-cp312-cp312-macosx_10_14_x86_64.whl", hash = "sha256:35c3a56d2ef83efc372eaec584314bd0ef2e2f0d2adb21c55e6ad5b344c0dcb8", size = 31610954, upload-time = "2026-02-23T00:17:49.855Z" }, + { url = "https://files.pythonhosted.org/packages/b2/02/cf107b01494c19dc100f1d0b7ac3cc08666e96ba2d64db7626066cee895e/scipy-1.17.1-cp312-cp312-macosx_12_0_arm64.whl", hash = "sha256:fcb310ddb270a06114bb64bbe53c94926b943f5b7f0842194d585c65eb4edd76", size = 28172662, upload-time = "2026-02-23T00:18:01.64Z" }, + { url = "https://files.pythonhosted.org/packages/cf/a9/599c28631bad314d219cf9ffd40e985b24d603fc8a2f4ccc5ae8419a535b/scipy-1.17.1-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:cc90d2e9c7e5c7f1a482c9875007c095c3194b1cfedca3c2f3291cdc2bc7c086", size = 20344366, upload-time = "2026-02-23T00:18:12.015Z" }, + { url = "https://files.pythonhosted.org/packages/35/f5/906eda513271c8deb5af284e5ef0206d17a96239af79f9fa0aebfe0e36b4/scipy-1.17.1-cp312-cp312-macosx_14_0_x86_64.whl", hash = "sha256:c80be5ede8f3f8eded4eff73cc99a25c388ce98e555b17d31da05287015ffa5b", size = 22704017, upload-time = "2026-02-23T00:18:21.502Z" }, + { url = 
"https://files.pythonhosted.org/packages/da/34/16f10e3042d2f1d6b66e0428308ab52224b6a23049cb2f5c1756f713815f/scipy-1.17.1-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e19ebea31758fac5893a2ac360fedd00116cbb7628e650842a6691ba7ca28a21", size = 32927842, upload-time = "2026-02-23T00:18:35.367Z" }, + { url = "https://files.pythonhosted.org/packages/01/8e/1e35281b8ab6d5d72ebe9911edcdffa3f36b04ed9d51dec6dd140396e220/scipy-1.17.1-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:02ae3b274fde71c5e92ac4d54bc06c42d80e399fec704383dcd99b301df37458", size = 35235890, upload-time = "2026-02-23T00:18:49.188Z" }, + { url = "https://files.pythonhosted.org/packages/c5/5c/9d7f4c88bea6e0d5a4f1bc0506a53a00e9fcb198de372bfe4d3652cef482/scipy-1.17.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:8a604bae87c6195d8b1045eddece0514d041604b14f2727bbc2b3020172045eb", size = 35003557, upload-time = "2026-02-23T00:18:54.74Z" }, + { url = "https://files.pythonhosted.org/packages/65/94/7698add8f276dbab7a9de9fb6b0e02fc13ee61d51c7c3f85ac28b65e1239/scipy-1.17.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:f590cd684941912d10becc07325a3eeb77886fe981415660d9265c4c418d0bea", size = 37625856, upload-time = "2026-02-23T00:19:00.307Z" }, + { url = "https://files.pythonhosted.org/packages/a2/84/dc08d77fbf3d87d3ee27f6a0c6dcce1de5829a64f2eae85a0ecc1f0daa73/scipy-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:41b71f4a3a4cab9d366cd9065b288efc4d4f3c0b37a91a8e0947fb5bd7f31d87", size = 36549682, upload-time = "2026-02-23T00:19:07.67Z" }, + { url = "https://files.pythonhosted.org/packages/bc/98/fe9ae9ffb3b54b62559f52dedaebe204b408db8109a8c66fdd04869e6424/scipy-1.17.1-cp312-cp312-win_arm64.whl", hash = "sha256:f4115102802df98b2b0db3cce5cb9b92572633a1197c77b7553e5203f284a5b3", size = 24547340, upload-time = "2026-02-23T00:19:12.024Z" }, + { url = 
"https://files.pythonhosted.org/packages/76/27/07ee1b57b65e92645f219b37148a7e7928b82e2b5dbeccecb4dff7c64f0b/scipy-1.17.1-cp313-cp313-macosx_10_14_x86_64.whl", hash = "sha256:5e3c5c011904115f88a39308379c17f91546f77c1667cea98739fe0fccea804c", size = 31590199, upload-time = "2026-02-23T00:19:17.192Z" }, + { url = "https://files.pythonhosted.org/packages/ec/ae/db19f8ab842e9b724bf5dbb7db29302a91f1e55bc4d04b1025d6d605a2c5/scipy-1.17.1-cp313-cp313-macosx_12_0_arm64.whl", hash = "sha256:6fac755ca3d2c3edcb22f479fceaa241704111414831ddd3bc6056e18516892f", size = 28154001, upload-time = "2026-02-23T00:19:22.241Z" }, + { url = "https://files.pythonhosted.org/packages/5b/58/3ce96251560107b381cbd6e8413c483bbb1228a6b919fa8652b0d4090e7f/scipy-1.17.1-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:7ff200bf9d24f2e4d5dc6ee8c3ac64d739d3a89e2326ba68aaf6c4a2b838fd7d", size = 20325719, upload-time = "2026-02-23T00:19:26.329Z" }, + { url = "https://files.pythonhosted.org/packages/b2/83/15087d945e0e4d48ce2377498abf5ad171ae013232ae31d06f336e64c999/scipy-1.17.1-cp313-cp313-macosx_14_0_x86_64.whl", hash = "sha256:4b400bdc6f79fa02a4d86640310dde87a21fba0c979efff5248908c6f15fad1b", size = 22683595, upload-time = "2026-02-23T00:19:30.304Z" }, + { url = "https://files.pythonhosted.org/packages/b4/e0/e58fbde4a1a594c8be8114eb4aac1a55bcd6587047efc18a61eb1f5c0d30/scipy-1.17.1-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:2b64ca7d4aee0102a97f3ba22124052b4bd2152522355073580bf4845e2550b6", size = 32896429, upload-time = "2026-02-23T00:19:35.536Z" }, + { url = "https://files.pythonhosted.org/packages/f5/5f/f17563f28ff03c7b6799c50d01d5d856a1d55f2676f537ca8d28c7f627cd/scipy-1.17.1-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:581b2264fc0aa555f3f435a5944da7504ea3a065d7029ad60e7c3d1ae09c5464", size = 35203952, upload-time = "2026-02-23T00:19:42.259Z" }, + { url = 
"https://files.pythonhosted.org/packages/8d/a5/9afd17de24f657fdfe4df9a3f1ea049b39aef7c06000c13db1530d81ccca/scipy-1.17.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:beeda3d4ae615106d7094f7e7cef6218392e4465cc95d25f900bebabfded0950", size = 34979063, upload-time = "2026-02-23T00:19:47.547Z" }, + { url = "https://files.pythonhosted.org/packages/8b/13/88b1d2384b424bf7c924f2038c1c409f8d88bb2a8d49d097861dd64a57b2/scipy-1.17.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6609bc224e9568f65064cfa72edc0f24ee6655b47575954ec6339534b2798369", size = 37598449, upload-time = "2026-02-23T00:19:53.238Z" }, + { url = "https://files.pythonhosted.org/packages/35/e5/d6d0e51fc888f692a35134336866341c08655d92614f492c6860dc45bb2c/scipy-1.17.1-cp313-cp313-win_amd64.whl", hash = "sha256:37425bc9175607b0268f493d79a292c39f9d001a357bebb6b88fdfaff13f6448", size = 36510943, upload-time = "2026-02-23T00:20:50.89Z" }, + { url = "https://files.pythonhosted.org/packages/2a/fd/3be73c564e2a01e690e19cc618811540ba5354c67c8680dce3281123fb79/scipy-1.17.1-cp313-cp313-win_arm64.whl", hash = "sha256:5cf36e801231b6a2059bf354720274b7558746f3b1a4efb43fcf557ccd484a87", size = 24545621, upload-time = "2026-02-23T00:20:55.871Z" }, + { url = "https://files.pythonhosted.org/packages/6f/6b/17787db8b8114933a66f9dcc479a8272e4b4da75fe03b0c282f7b0ade8cd/scipy-1.17.1-cp313-cp313t-macosx_10_14_x86_64.whl", hash = "sha256:d59c30000a16d8edc7e64152e30220bfbd724c9bbb08368c054e24c651314f0a", size = 31936708, upload-time = "2026-02-23T00:19:58.694Z" }, + { url = "https://files.pythonhosted.org/packages/38/2e/524405c2b6392765ab1e2b722a41d5da33dc5c7b7278184a8ad29b6cb206/scipy-1.17.1-cp313-cp313t-macosx_12_0_arm64.whl", hash = "sha256:010f4333c96c9bb1a4516269e33cb5917b08ef2166d5556ca2fd9f082a9e6ea0", size = 28570135, upload-time = "2026-02-23T00:20:03.934Z" }, + { url = 
"https://files.pythonhosted.org/packages/fd/c3/5bd7199f4ea8556c0c8e39f04ccb014ac37d1468e6cfa6a95c6b3562b76e/scipy-1.17.1-cp313-cp313t-macosx_14_0_arm64.whl", hash = "sha256:2ceb2d3e01c5f1d83c4189737a42d9cb2fc38a6eeed225e7515eef71ad301dce", size = 20741977, upload-time = "2026-02-23T00:20:07.935Z" }, + { url = "https://files.pythonhosted.org/packages/d9/b8/8ccd9b766ad14c78386599708eb745f6b44f08400a5fd0ade7cf89b6fc93/scipy-1.17.1-cp313-cp313t-macosx_14_0_x86_64.whl", hash = "sha256:844e165636711ef41f80b4103ed234181646b98a53c8f05da12ca5ca289134f6", size = 23029601, upload-time = "2026-02-23T00:20:12.161Z" }, + { url = "https://files.pythonhosted.org/packages/6d/a0/3cb6f4d2fb3e17428ad2880333cac878909ad1a89f678527b5328b93c1d4/scipy-1.17.1-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:158dd96d2207e21c966063e1635b1063cd7787b627b6f07305315dd73d9c679e", size = 33019667, upload-time = "2026-02-23T00:20:17.208Z" }, + { url = "https://files.pythonhosted.org/packages/f3/c3/2d834a5ac7bf3a0c806ad1508efc02dda3c8c61472a56132d7894c312dea/scipy-1.17.1-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:74cbb80d93260fe2ffa334efa24cb8f2f0f622a9b9febf8b483c0b865bfb3475", size = 35264159, upload-time = "2026-02-23T00:20:23.087Z" }, + { url = "https://files.pythonhosted.org/packages/4d/77/d3ed4becfdbd217c52062fafe35a72388d1bd82c2d0ba5ca19d6fcc93e11/scipy-1.17.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:dbc12c9f3d185f5c737d801da555fb74b3dcfa1a50b66a1a93e09190f41fab50", size = 35102771, upload-time = "2026-02-23T00:20:28.636Z" }, + { url = "https://files.pythonhosted.org/packages/bd/12/d19da97efde68ca1ee5538bb261d5d2c062f0c055575128f11a2730e3ac1/scipy-1.17.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:94055a11dfebe37c656e70317e1996dc197e1a15bbcc351bcdd4610e128fe1ca", size = 37665910, upload-time = "2026-02-23T00:20:34.743Z" }, + { url = 
"https://files.pythonhosted.org/packages/06/1c/1172a88d507a4baaf72c5a09bb6c018fe2ae0ab622e5830b703a46cc9e44/scipy-1.17.1-cp313-cp313t-win_amd64.whl", hash = "sha256:e30bdeaa5deed6bc27b4cc490823cd0347d7dae09119b8803ae576ea0ce52e4c", size = 36562980, upload-time = "2026-02-23T00:20:40.575Z" }, + { url = "https://files.pythonhosted.org/packages/70/b0/eb757336e5a76dfa7911f63252e3b7d1de00935d7705cf772db5b45ec238/scipy-1.17.1-cp313-cp313t-win_arm64.whl", hash = "sha256:a720477885a9d2411f94a93d16f9d89bad0f28ca23c3f8daa521e2dcc3f44d49", size = 24856543, upload-time = "2026-02-23T00:20:45.313Z" }, + { url = "https://files.pythonhosted.org/packages/cf/83/333afb452af6f0fd70414dc04f898647ee1423979ce02efa75c3b0f2c28e/scipy-1.17.1-cp314-cp314-macosx_10_14_x86_64.whl", hash = "sha256:a48a72c77a310327f6a3a920092fa2b8fd03d7deaa60f093038f22d98e096717", size = 31584510, upload-time = "2026-02-23T00:21:01.015Z" }, + { url = "https://files.pythonhosted.org/packages/ed/a6/d05a85fd51daeb2e4ea71d102f15b34fedca8e931af02594193ae4fd25f7/scipy-1.17.1-cp314-cp314-macosx_12_0_arm64.whl", hash = "sha256:45abad819184f07240d8a696117a7aacd39787af9e0b719d00285549ed19a1e9", size = 28170131, upload-time = "2026-02-23T00:21:05.888Z" }, + { url = "https://files.pythonhosted.org/packages/db/7b/8624a203326675d7746a254083a187398090a179335b2e4a20e2ddc46e83/scipy-1.17.1-cp314-cp314-macosx_14_0_arm64.whl", hash = "sha256:3fd1fcdab3ea951b610dc4cef356d416d5802991e7e32b5254828d342f7b7e0b", size = 20342032, upload-time = "2026-02-23T00:21:09.904Z" }, + { url = "https://files.pythonhosted.org/packages/c9/35/2c342897c00775d688d8ff3987aced3426858fd89d5a0e26e020b660b301/scipy-1.17.1-cp314-cp314-macosx_14_0_x86_64.whl", hash = "sha256:7bdf2da170b67fdf10bca777614b1c7d96ae3ca5794fd9587dce41eb2966e866", size = 22678766, upload-time = "2026-02-23T00:21:14.313Z" }, + { url = 
"https://files.pythonhosted.org/packages/ef/f2/7cdb8eb308a1a6ae1e19f945913c82c23c0c442a462a46480ce487fdc0ac/scipy-1.17.1-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:adb2642e060a6549c343603a3851ba76ef0b74cc8c079a9a58121c7ec9fe2350", size = 32957007, upload-time = "2026-02-23T00:21:19.663Z" }, + { url = "https://files.pythonhosted.org/packages/0b/2e/7eea398450457ecb54e18e9d10110993fa65561c4f3add5e8eccd2b9cd41/scipy-1.17.1-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:eee2cfda04c00a857206a4330f0c5e3e56535494e30ca445eb19ec624ae75118", size = 35221333, upload-time = "2026-02-23T00:21:25.278Z" }, + { url = "https://files.pythonhosted.org/packages/d9/77/5b8509d03b77f093a0d52e606d3c4f79e8b06d1d38c441dacb1e26cacf46/scipy-1.17.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:d2650c1fb97e184d12d8ba010493ee7b322864f7d3d00d3f9bb97d9c21de4068", size = 35042066, upload-time = "2026-02-23T00:21:31.358Z" }, + { url = "https://files.pythonhosted.org/packages/f9/df/18f80fb99df40b4070328d5ae5c596f2f00fffb50167e31439e932f29e7d/scipy-1.17.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:08b900519463543aa604a06bec02461558a6e1cef8fdbb8098f77a48a83c8118", size = 37612763, upload-time = "2026-02-23T00:21:37.247Z" }, + { url = "https://files.pythonhosted.org/packages/4b/39/f0e8ea762a764a9dc52aa7dabcfad51a354819de1f0d4652b6a1122424d6/scipy-1.17.1-cp314-cp314-win_amd64.whl", hash = "sha256:3877ac408e14da24a6196de0ddcace62092bfc12a83823e92e49e40747e52c19", size = 37290984, upload-time = "2026-02-23T00:22:35.023Z" }, + { url = "https://files.pythonhosted.org/packages/7c/56/fe201e3b0f93d1a8bcf75d3379affd228a63d7e2d80ab45467a74b494947/scipy-1.17.1-cp314-cp314-win_arm64.whl", hash = "sha256:f8885db0bc2bffa59d5c1b72fad7a6a92d3e80e7257f967dd81abb553a90d293", size = 25192877, upload-time = "2026-02-23T00:22:39.798Z" }, + { url = 
"https://files.pythonhosted.org/packages/96/ad/f8c414e121f82e02d76f310f16db9899c4fcde36710329502a6b2a3c0392/scipy-1.17.1-cp314-cp314t-macosx_10_14_x86_64.whl", hash = "sha256:1cc682cea2ae55524432f3cdff9e9a3be743d52a7443d0cba9017c23c87ae2f6", size = 31949750, upload-time = "2026-02-23T00:21:42.289Z" }, + { url = "https://files.pythonhosted.org/packages/7c/b0/c741e8865d61b67c81e255f4f0a832846c064e426636cd7de84e74d209be/scipy-1.17.1-cp314-cp314t-macosx_12_0_arm64.whl", hash = "sha256:2040ad4d1795a0ae89bfc7e8429677f365d45aa9fd5e4587cf1ea737f927b4a1", size = 28585858, upload-time = "2026-02-23T00:21:47.706Z" }, + { url = "https://files.pythonhosted.org/packages/ed/1b/3985219c6177866628fa7c2595bfd23f193ceebbe472c98a08824b9466ff/scipy-1.17.1-cp314-cp314t-macosx_14_0_arm64.whl", hash = "sha256:131f5aaea57602008f9822e2115029b55d4b5f7c070287699fe45c661d051e39", size = 20757723, upload-time = "2026-02-23T00:21:52.039Z" }, + { url = "https://files.pythonhosted.org/packages/c0/19/2a04aa25050d656d6f7b9e7b685cc83d6957fb101665bfd9369ca6534563/scipy-1.17.1-cp314-cp314t-macosx_14_0_x86_64.whl", hash = "sha256:9cdc1a2fcfd5c52cfb3045feb399f7b3ce822abdde3a193a6b9a60b3cb5854ca", size = 23043098, upload-time = "2026-02-23T00:21:56.185Z" }, + { url = "https://files.pythonhosted.org/packages/86/f1/3383beb9b5d0dbddd030335bf8a8b32d4317185efe495374f134d8be6cce/scipy-1.17.1-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6e3dcd57ab780c741fde8dc68619de988b966db759a3c3152e8e9142c26295ad", size = 33030397, upload-time = "2026-02-23T00:22:01.404Z" }, + { url = "https://files.pythonhosted.org/packages/41/68/8f21e8a65a5a03f25a79165ec9d2b28c00e66dc80546cf5eb803aeeff35b/scipy-1.17.1-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a9956e4d4f4a301ebf6cde39850333a6b6110799d470dbbb1e25326ac447f52a", size = 35281163, upload-time = "2026-02-23T00:22:07.024Z" }, + { url = 
"https://files.pythonhosted.org/packages/84/8d/c8a5e19479554007a5632ed7529e665c315ae7492b4f946b0deb39870e39/scipy-1.17.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:a4328d245944d09fd639771de275701ccadf5f781ba0ff092ad141e017eccda4", size = 35116291, upload-time = "2026-02-23T00:22:12.585Z" }, + { url = "https://files.pythonhosted.org/packages/52/52/e57eceff0e342a1f50e274264ed47497b59e6a4e3118808ee58ddda7b74a/scipy-1.17.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:a77cbd07b940d326d39a1d1b37817e2ee4d79cb30e7338f3d0cddffae70fcaa2", size = 37682317, upload-time = "2026-02-23T00:22:18.513Z" }, + { url = "https://files.pythonhosted.org/packages/11/2f/b29eafe4a3fbc3d6de9662b36e028d5f039e72d345e05c250e121a230dd4/scipy-1.17.1-cp314-cp314t-win_amd64.whl", hash = "sha256:eb092099205ef62cd1782b006658db09e2fed75bffcae7cc0d44052d8aa0f484", size = 37345327, upload-time = "2026-02-23T00:22:24.442Z" }, + { url = "https://files.pythonhosted.org/packages/07/39/338d9219c4e87f3e708f18857ecd24d22a0c3094752393319553096b98af/scipy-1.17.1-cp314-cp314t-win_arm64.whl", hash = "sha256:200e1050faffacc162be6a486a984a0497866ec54149a01270adc8a59b7c7d21", size = 25489165, upload-time = "2026-02-23T00:22:29.563Z" }, +] + [[package]] name = "six" version = "1.17.0"