Conversation

ranjithkumar8352 (Contributor)

Description

If the user provides an `llm` config in the main config, it should be used for the graph store; instead, the default OpenAI config is used.

Currently, the user-provided `llm` config is applied to the graph store only when it is also specified inside the `graphStore` config.
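The intended fallback order can be sketched as follows. This is a minimal illustration of the precedence the PR describes, not the actual mem0 SDK internals; the names `MemoryConfig`, `resolveGraphStoreLlm`, and `DEFAULT_LLM` are hypothetical:

```typescript
// Hypothetical types, assumed for illustration only.
interface LlmConfig {
  provider: string;
  config?: Record<string, unknown>;
}

interface MemoryConfig {
  llm?: LlmConfig;
  graphStore?: { llm?: LlmConfig };
}

// Stand-in for the SDK's built-in default.
const DEFAULT_LLM: LlmConfig = { provider: "openai" };

// Desired behavior: prefer graphStore.llm, then the root llm,
// and only then fall back to the built-in OpenAI default.
// The reported bug is equivalent to skipping the middle step.
function resolveGraphStoreLlm(cfg: MemoryConfig): LlmConfig {
  return cfg.graphStore?.llm ?? cfg.llm ?? DEFAULT_LLM;
}
```

With this precedence, a root-level `llm` config reaches the graph store without being duplicated under `graphStore`.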

Fixes #3425

This PR also includes a small change in `anthropic.ts`, because the build fails without it.

Type of change

  • Bug fix (non-breaking change which fixes an issue)

How Has This Been Tested?

Tested locally using `npm link`.

Checklist:

  • My code follows the style guidelines of this project
  • I have performed a self-review of my own code
  • My changes generate no new warnings
  • I have checked my code and corrected any misspellings

Maintainer Checklist

  • closes #xxxx (Replace xxxx with the GitHub issue number)
  • Made sure Checks passed

@whysosaket (Member)

Hey @ranjithkumar8352, have you tested whether this works? As far as I know, we defaulted the config for the Graph Store because of the structured output schema and the Zod schema used there.

Successfully merging this pull request may close these issues.

Nodejs SDK: Default llm config is used for graph store instead of user's root llm config