
docs(graphiti): note beta status and OpenAI-only provider limitation#305

Open
mason5052 wants to merge 2 commits into vxcontrol:feature/next-release from mason5052:codex/issue-187-graphiti-limitations-docs

Conversation

@mason5052
Contributor

Summary

Documents that the Graphiti knowledge graph integration is currently a beta feature, and that the bundled container today only consumes OpenAI-compatible LLM credentials. Adds the note in two places (README and backend/docs/config.md) without changing any runtime behavior.

Problem

Issue #187 raised that Graphiti's current provider story is not obvious to users. PentAGI itself is configured against many LLM providers (Anthropic, Google AI / Gemini, AWS Bedrock, DeepSeek, GLM, Kimi, Qwen, OpenAI, Ollama), but the bundled vxcontrol/graphiti container in docker-compose-graphiti.yml only reads OPENAI_API_KEY and a single OPEN_AI_SERVER_URL (default https://api.openai.com/v1) for entity extraction, with a fixed model name from GRAPHITI_MODEL_NAME (default gpt-5-mini).

Today neither the README's Graphiti section nor backend/docs/config.md's Graphiti settings call this out, and they do not flag the integration as beta. Users have to read docker-compose-graphiti.yml to understand the constraint.
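For context, the constraint described above corresponds to a service block along these lines. This is an illustrative sketch assembled from the variable names quoted in this PR, not the exact contents of docker-compose-graphiti.yml:

```yaml
# Hypothetical sketch of the Graphiti service wiring; names are taken
# from this PR's description, and the real compose file may differ.
services:
  graphiti:
    image: vxcontrol/graphiti
    environment:
      # Only OpenAI-compatible credentials are consumed:
      OPENAI_API_KEY: ${OPEN_AI_KEY}
      OPENAI_BASE_URL: ${OPEN_AI_SERVER_URL:-https://api.openai.com/v1}
      # Single fixed model per deployment:
      MODEL_NAME: ${GRAPHITI_MODEL_NAME:-gpt-5-mini}
```

Credentials for the other PentAGI providers never reach this container, which is exactly the gap the documentation change calls out.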

Solution

Add a small, accurate, English-only note in two surfaces:

  • README.md — Adds an [!IMPORTANT] beta callout at the top of the Knowledge Graph Integration (Graphiti) section and a new Current Limitations subsection covering:
    • OpenAI-compatible LLM only (with the relevant env vars and the list of unaffected PentAGI providers),
    • single fixed model per deployment,
    • independent billing from the main flow LLM,
    • no in-app graph explorer yet (Neo4j Browser / Swagger UI are the supported surfaces),
    • explicit fallback: leave GRAPHITI_ENABLED=false if the deployment cannot reach an OpenAI-compatible endpoint.
  • backend/docs/config.md — Mirrors the same beta callout at the top of Graphiti Knowledge Graph Settings and adds a Current Limitations (Beta) subsection so config-focused readers see the same message.
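As a rough illustration, the README callout described above might look like the following. The wording here is a sketch, not the exact text the PR adds:

```markdown
> [!IMPORTANT]
> The Graphiti knowledge graph integration is currently in **beta**.
> The bundled `vxcontrol/graphiti` container only supports
> OpenAI-compatible LLM endpoints for entity extraction. See
> "Current Limitations" below before enabling it in production.
```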

The commit is documentation only; no runtime, schema, or env-var behavior changes.

User Impact

  • Operators evaluating Graphiti can see the OpenAI-compatible-LLM constraint and the beta status before they enable it in production.
  • Users running PentAGI on a non-OpenAI provider learn the documented fallback (GRAPHITI_ENABLED=false) and that the rest of the stack still works.
  • The note is honest about scope: it does not promise multi-provider Graphiti support; it documents what ships today and points users to the fallback they can choose.
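For operators, the enable path and the documented fallback come down to a few .env lines. This fragment is illustrative, using the variable names quoted in this PR:

```shell
# Enable Graphiti only if an OpenAI-compatible endpoint is reachable:
GRAPHITI_ENABLED=true
OPEN_AI_KEY=sk-...                            # placeholder key
OPEN_AI_SERVER_URL=https://api.openai.com/v1  # any OpenAI-compatible URL
GRAPHITI_MODEL_NAME=gpt-5-mini                # single fixed model

# Fallback on non-OpenAI providers (the rest of the stack still works):
# GRAPHITI_ENABLED=false
```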

Test Plan

  • git diff --stat shows only README.md and backend/docs/config.md changed.
  • Markdown renders without broken anchor, callout, or code-fence syntax.
  • Constraints quoted in the note match docker-compose-graphiti.yml on feature/next-release (env vars OPENAI_API_KEY, OPENAI_BASE_URL from OPEN_AI_SERVER_URL, default https://api.openai.com/v1, MODEL_NAME from GRAPHITI_MODEL_NAME, default gpt-5-mini).
  • No code or tests changed.

Refs #187

The Graphiti container shipped with docker-compose-graphiti.yml only
takes OPENAI_API_KEY and OPEN_AI_SERVER_URL for entity extraction.
PentAGI configures many other LLM providers (Anthropic, Google AI,
AWS Bedrock, DeepSeek, GLM, Kimi, Qwen) for the main flow, but those
credentials are not consumed by Graphiti today. Until that changes,
operators need to plan around an OpenAI-compatible endpoint just for
the knowledge graph.

This commit makes the limitation visible in two surfaces without
changing runtime behavior:

- README.md: Adds a beta callout at the top of the Knowledge Graph
  Integration section and a new 'Current Limitations' subsection
  covering provider scope, fixed model, independent billing, and the
  lack of an in-app graph explorer.
- backend/docs/config.md: Mirrors the beta callout under Graphiti
  Knowledge Graph Settings and adds a 'Current Limitations (Beta)'
  subsection with the same constraints, so config-focused readers
  see the same message.

Both notes explicitly point at the simple fallback: leave
GRAPHITI_ENABLED=false if the deployment cannot reach an
OpenAI-compatible endpoint.

Refs vxcontrol#187

Signed-off-by: mason5052 <ehehwnwjs5052@gmail.com>
Copilot AI review requested due to automatic review settings May 7, 2026 16:23

Copilot AI left a comment


Pull request overview

This PR updates PentAGI documentation to clearly flag the Graphiti knowledge graph integration as a beta feature and to document its current provider limitations (notably that the bundled Graphiti container uses OpenAI-compatible credentials/config only), without changing runtime behavior.

Changes:

  • Added a beta/limitations callout to the Graphiti section in README.md.
  • Added a “Current Limitations” subsection in README.md describing provider/model/UI/billing constraints.
  • Mirrored the same beta/limitations messaging in backend/docs/config.md.

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.

| File | Description |
| --- | --- |
| README.md | Adds Graphiti beta callout + a “Current Limitations” subsection for operators. |
| backend/docs/config.md | Adds the same beta/limitations note to the Graphiti configuration documentation. |


Comment thread README.md Outdated
Comment thread backend/docs/config.md Outdated
Address Copilot review feedback on PR vxcontrol#305: the 'Current Limitations' bullet for the Graphiti integration mixed PentAGI's user-facing .env variables with the container env vars defined in docker-compose-graphiti.yml.

Reword both README.md and backend/docs/config.md to lead with the user-facing OPEN_AI_KEY and OPEN_AI_SERVER_URL .env variables and explicitly note that docker-compose-graphiti.yml maps them into the bundled vxcontrol/graphiti container as OPENAI_API_KEY and OPENAI_BASE_URL. Operators set the .env variables; the container variables are an implementation detail.

Signed-off-by: mason5052 <ehehwnwjs5052@gmail.com>
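An illustrative sketch of how such a reworded limitation bullet might read after this follow-up commit (not the exact diff text):

```markdown
- **OpenAI-compatible LLM only.** Set `OPEN_AI_KEY` and `OPEN_AI_SERVER_URL`
  in your `.env`; `docker-compose-graphiti.yml` maps them into the bundled
  `vxcontrol/graphiti` container as `OPENAI_API_KEY` and `OPENAI_BASE_URL`.
```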
