Releases: ThHanke/ontosphere

v1.2.0

27 Apr 17:07

New features

  • Manchester Pizza Tutorial demo — full OWL pizza ontology built step-by-step via MCP tools: classes, disjointWith axioms, subclass hierarchies, object properties with domain/range, inverseOf, FunctionalProperty, ABox individuals, and OWL-RL reasoning. Available as both a seed-driven doc (docs/mcp-demo/pizza-tutorial.md) and a side-by-side AI tutor chat video (pizza-tutorial-chat).
  • Side-by-side chat demo format — new demo-stage.html layout with mock AI chat on the left and live Ontosphere canvas on the right. Scripted via addChatMessage() / callToolOnStage() — no relay bookmarklet needed.
  • Demo video pipeline — npm run demo:video records all demos at 1920×1080 (H.264 High/CRF 20) using xvfb-run; npm run demo:all regenerates all seed-driven markdown docs and auto-updates the README demo table with the latest SVG snapshot.
  • MCP server renamed — visgraphMcpServer → ontosphereMcpServer; all internal and external references updated.
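The scripted side-by-side format can be sketched roughly as follows. addChatMessage() and callToolOnStage() are real names from demo-stage.html, but their exact signatures are not documented here, so the stubs below (and the tool name "addNode") are assumptions made to keep the sketch runnable:

```javascript
// Hedged sketch of one scripted demo turn. The real addChatMessage() and
// callToolOnStage() live in demo-stage.html; the signatures here are
// assumptions, and both are stubbed so the flow runs standalone.
const chatLog = [];     // stands in for the left-hand mock chat pane
const canvasCalls = []; // stands in for MCP calls hitting the live canvas

function addChatMessage(role, text) {
  chatLog.push({ role, text });
}

function callToolOnStage(name, args) {
  canvasCalls.push({ name, args });
  return { ok: true };
}

// One "turn": a user prompt, an assistant reply, then a tool call that
// mutates the canvas. The tool name "addNode" is also an assumption.
addChatMessage("user", "Add a Pizza class, please.");
addChatMessage("assistant", "Adding owl:Class Pizza to the graph.");
callToolOnStage("addNode", { id: "Pizza", type: "owl:Class" });
```

No relay bookmarklet is involved: the script drives both panes directly in the same page.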

Improvements

  • Caption overlay in demo recordings now appears after reasoning results are visible on canvas (captionAfter option in runSeedTurn).
  • Chat stream now scrolls on every word during AI message streaming, so the last message is no longer hidden behind the input bar.
  • Anchor tag color now uses the --primary token instead of browser-default blue, consistent across light and dark modes.
  • demo:all auto-patches README demo table with the last SVG produced by each seed run.
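As a rough illustration of the captionAfter ordering, here is a minimal stub; only the runSeedTurn name and the captionAfter option come from the release notes, everything else is hypothetical:

```javascript
// Minimal stub illustrating the captionAfter ordering: with the option set,
// the caption overlay is emitted only after reasoning results are visible.
// Only runSeedTurn and captionAfter are real names; the rest is hypothetical.
const events = [];

function runSeedTurn(turn, opts = {}) {
  if (turn.caption && !opts.captionAfter) events.push("caption");
  events.push("tool-call");           // the seed's MCP tool runs
  events.push("reasoning-visible");   // results rendered on the canvas
  if (turn.caption && opts.captionAfter) events.push("caption");
}

runSeedTurn({ caption: "OWL-RL inference" }, { captionAfter: true });
console.log(events); // the caption now comes last
```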

Fixes

  • Video recording resolution was 800×450 (the Playwright default); now set explicitly to 1920×1080.
  • demo-bootstrap.mjs updated to point at the renamed MCP server file.
  • All VisGraph/visgraph branding references replaced with Ontosphere across source, scripts, and docs.

Metadata

  • CITATION.cff updated: title, version, repository URL, and concept DOI (10.5281/zenodo.19605270).
  • package.json name updated to ontosphere.

Full Changelog: v1.1.0...v1.2.0

AI Integration

24 Apr 10:34

What's new in v1.1.0

MCP Tool Surface

VisGraph now exposes a full Model Context Protocol tool surface, letting AI agents control the graph directly — add/remove nodes and edges, run SPARQL queries, validate with SHACL, trigger OWL-RL reasoning, cluster and layout nodes, and inspect the canvas state. No backend required.
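Under MCP, an agent invokes such tools via JSON-RPC tools/call requests. The sketch below builds one; the tool name "runSparqlQuery" and its argument shape are assumptions, not the server's documented tool list — query the server's tool listing for the actual surface:

```javascript
// Build an MCP "tools/call" JSON-RPC 2.0 request envelope.
// The tool name "runSparqlQuery" and argument shape are assumptions.
function makeToolCall(id, name, args) {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

const req = makeToolCall(1, "runSparqlQuery", {
  query: "SELECT ?s WHERE { ?s a <http://xmlns.com/foaf/0.1/Person> }",
});
console.log(JSON.stringify(req));
```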

AI Relay Bridge

A new bookmarklet-based relay bridge connects any chat UI (ChatGPT, Open WebUI, Gemini, …) to VisGraph via JSON-RPC 2.0 over BroadcastChannel. The AI issues tool calls as JSON-RPC messages; results are injected back into the chat automatically.
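The relay pattern can be sketched in one script, with both tabs shown side by side for illustration (in practice the two sides run in separate browser tabs). The channel name "ontosphere-relay" and the tool name are assumptions, not the bookmarklet's actual configuration:

```javascript
// Sketch of the relay: the bookmarklet in the chat tab broadcasts a
// JSON-RPC 2.0 request; the VisGraph tab answers on the same channel.
// Channel name "ontosphere-relay" and tool name "addNode" are assumptions.
const CHANNEL = "ontosphere-relay";

// Graph side: answer incoming tool calls.
const graphSide = new BroadcastChannel(CHANNEL);
graphSide.onmessage = ({ data }) => {
  if (data.method === "tools/call") {
    graphSide.postMessage({ jsonrpc: "2.0", id: data.id, result: { ok: true } });
  }
};

// Chat side: send one call, inject the reply back into the chat, clean up.
const chatSide = new BroadcastChannel(CHANNEL);
chatSide.onmessage = ({ data }) => {
  console.log("tool result for request", data.id, data.result);
  chatSide.close();
  graphSide.close();
};

const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: { name: "addNode", arguments: { id: "Alice" } },
};
chatSide.postMessage(request);
```

BroadcastChannel never delivers a message back to the instance that posted it, which is what lets request and response share one channel name without echo.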

Demo Seeds

New built-in demo seeds showcase real-world usage: FOAF social graph, employment reasoning, and scene-ontology with BFO/RO.

Docs

  • README rewritten with role-based entry points, live demo previews, and dynamic sidebar TOC
  • AGENTS.md for AI integrators and MCP clients
  • Public mcp.json manifest at /.well-known/mcp.json

Full Changelog: v1.0.0...v1.1.0

first release

16 Apr 09:03

What's Changed

New Contributors

Full Changelog: https://github.com/ThHanke/visgraph/commits/v1.0.0