fix: 🔧 default llm config being used for graph store instead of root llm config provided by the user #3426
Description
If the user provides an llm config in the main (root) config, it should be used for the graph store; instead, the default OpenAI config is used. The user-provided llm config only takes effect for the graph store when it is also specified inside the graphStore config.
Fixes #3425
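For context, a minimal sketch of the scenario (assuming the mem0 OSS TypeScript `Memory` constructor and a Neo4j-backed graph store; exact field names are illustrative, not taken from this PR):

```ts
import { Memory } from 'mem0ai/oss';

const memory = new Memory({
  // Root-level llm config: this is what the graph store should use.
  llm: {
    provider: 'anthropic',
    config: { model: 'claude-3-5-sonnet-20240620', apiKey: process.env.ANTHROPIC_API_KEY },
  },
  graphStore: {
    provider: 'neo4j',
    config: {
      url: process.env.NEO4J_URL,
      username: process.env.NEO4J_USERNAME,
      password: process.env.NEO4J_PASSWORD,
    },
    // Before this fix, the graph store ignored the root llm above and
    // fell back to the default OpenAI config unless an llm block was
    // duplicated here, e.g.:
    // llm: { provider: 'anthropic', config: { ... } },
  },
});
```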
This also includes a small change in anthropic.ts, because the build fails without it.
Type of change
How Has This Been Tested?
Tested locally using `npm link`.
Checklist:
Maintainer Checklist