docs(graphiti): note beta status and OpenAI-only provider limitation #305
Open
mason5052 wants to merge 2 commits into
Conversation
The Graphiti container shipped with docker-compose-graphiti.yml only takes OPENAI_API_KEY and OPEN_AI_SERVER_URL for entity extraction. PentAGI configures many other LLM providers (Anthropic, Google AI, AWS Bedrock, DeepSeek, GLM, Kimi, Qwen) for the main flow, but those credentials are not consumed by Graphiti today. Until that changes, operators need to plan around an OpenAI-compatible endpoint just for the knowledge graph.

This commit makes the limitation visible in two surfaces without changing runtime behavior:

- README.md: Adds a beta callout at the top of the Knowledge Graph Integration section and a new 'Current Limitations' subsection covering provider scope, fixed model, independent billing, and the lack of an in-app graph explorer.
- backend/docs/config.md: Mirrors the beta callout under Graphiti Knowledge Graph Settings and adds a 'Current Limitations (Beta)' subsection with the same constraints, so config-focused readers see the same message.

Both notes explicitly point at the simple fallback: leave GRAPHITI_ENABLED=false if the deployment cannot reach an OpenAI-compatible endpoint.

Refs vxcontrol#187

Signed-off-by: mason5052 <ehehwnwjs5052@gmail.com>
Pull request overview
This PR updates PentAGI documentation to clearly flag the Graphiti knowledge graph integration as a beta feature and to document its current provider limitations (notably that the bundled Graphiti container uses OpenAI-compatible credentials/config only), without changing runtime behavior.
Changes:
- Added a beta/limitations callout to the Graphiti section in `README.md`.
- Added a "Current Limitations" subsection in `README.md` describing provider/model/UI/billing constraints.
- Mirrored the same beta/limitations messaging in `backend/docs/config.md`.
Reviewed changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| README.md | Adds Graphiti beta callout + a “Current Limitations” subsection for operators. |
| backend/docs/config.md | Adds the same beta/limitations note to the Graphiti configuration documentation. |
Address Copilot review feedback on PR vxcontrol#305: the 'Current Limitations' bullet for the Graphiti integration mixed PentAGI's user-facing .env variables with the container env vars defined in docker-compose-graphiti.yml.

Reword both README.md and backend/docs/config.md to lead with the user-facing OPEN_AI_KEY and OPEN_AI_SERVER_URL .env variables and explicitly note that docker-compose-graphiti.yml maps them into the bundled vxcontrol/graphiti container as OPENAI_API_KEY and OPENAI_BASE_URL. Operators set the .env variables; the container variables are an implementation detail.

Signed-off-by: mason5052 <ehehwnwjs5052@gmail.com>
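The `.env`-to-container mapping this commit documents can be sketched as a compose fragment. This is an illustrative sketch, not the actual contents of `docker-compose-graphiti.yml`: the service name and image tag are assumptions; only the variable names and defaults come from this PR.

```yaml
# Hypothetical fragment showing how PentAGI's user-facing .env variables
# could be renamed into what the bundled Graphiti container reads.
# Service name and image tag are assumptions, not taken from the repo.
services:
  graphiti:
    image: vxcontrol/graphiti:latest
    environment:
      # Operators set OPEN_AI_KEY / OPEN_AI_SERVER_URL in PentAGI's .env;
      # the compose file maps them to the container's own variable names.
      - OPENAI_API_KEY=${OPEN_AI_KEY}
      - OPENAI_BASE_URL=${OPEN_AI_SERVER_URL:-https://api.openai.com/v1}
      - MODEL_NAME=${GRAPHITI_MODEL_NAME:-gpt-5-mini}
```

The `${VAR:-default}` form supplies the documented defaults when the operator leaves the `.env` entries unset.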
Summary
Documents that the Graphiti knowledge graph integration is currently a beta feature, and that the bundled container today only consumes OpenAI-compatible LLM credentials. Adds the note in two places (README and `backend/docs/config.md`) without changing any runtime behavior.

Problem
Issue #187 raised that Graphiti's current provider story is not obvious to users. PentAGI itself is configured against many LLM providers (Anthropic, Google AI / Gemini, AWS Bedrock, DeepSeek, GLM, Kimi, Qwen, OpenAI, Ollama), but the bundled `vxcontrol/graphiti` container in `docker-compose-graphiti.yml` only reads `OPENAI_API_KEY` and a single `OPEN_AI_SERVER_URL` (default `https://api.openai.com/v1`) for entity extraction, with a fixed model name from `GRAPHITI_MODEL_NAME` (default `gpt-5-mini`).

Today neither the README's Graphiti section nor `backend/docs/config.md`'s Graphiti settings call this out, and they do not flag the integration as beta. Users have to read `docker-compose-graphiti.yml` to understand the constraint.

Solution
Add a small, accurate, English-only note in two surfaces:
- `README.md`: a `[!IMPORTANT]` beta callout at the top of the Knowledge Graph Integration (Graphiti) section and a new Current Limitations subsection covering provider scope, fixed model, independent billing, and the lack of an in-app graph explorer.
- `backend/docs/config.md`: the same beta callout and limitations under the Graphiti settings.
- Both notes point at the simple fallback: leave `GRAPHITI_ENABLED=false` if the deployment cannot reach an OpenAI-compatible endpoint.

The commit is documentation only; no runtime, schema, or env-var behavior changes.
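The operator-facing choice described above can be sketched as a `.env` fragment. Variable names and defaults are the ones documented in this PR; the key value is a placeholder.

```ini
# Option A: Graphiti enabled, pointed at an OpenAI-compatible endpoint.
GRAPHITI_ENABLED=true
OPEN_AI_KEY=sk-...                              # placeholder; your API key
OPEN_AI_SERVER_URL=https://api.openai.com/v1    # documented default
GRAPHITI_MODEL_NAME=gpt-5-mini                  # documented default

# Option B: no OpenAI-compatible endpoint available — keep the graph disabled.
GRAPHITI_ENABLED=false
```

Only one option applies per deployment; the rest of the stack works either way.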
User Impact
Operators learn up front that the knowledge graph currently requires an OpenAI-compatible endpoint (or can simply stay disabled via `GRAPHITI_ENABLED=false`) and that the rest of the stack still works.

Test Plan
- `git diff --stat` shows only `README.md` and `backend/docs/config.md` changed.
- Cross-checked the documented variables against `docker-compose-graphiti.yml` on `feature/next-release` (env vars `OPENAI_API_KEY`, `OPENAI_BASE_URL` from `OPEN_AI_SERVER_URL`, default `https://api.openai.com/v1`, `MODEL_NAME` from `GRAPHITI_MODEL_NAME`, default `gpt-5-mini`).

Refs #187