Give PR #756 a try, and let us know whether it resolves the issue with the embedding model.
First off, I am not certain whether the AI models declared on the configurator page run locally or fetch outputs from somewhere else. Can anybody clarify this for me?
My other question: can I use models from OpenRouter? My configuration is as follows:

I set OPENAI_API_KEY as an environment variable, using the key that OpenRouter provided. Can I use models that way? If yes, could you help me with the errors I get when the configuration is validated at the last stage?
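For context, OpenRouter exposes an OpenAI-compatible API, so a setup along these lines is typical. This is only a sketch: the exact variable names the tool actually reads (in particular the base-URL override) are an assumption on my part, not something confirmed by this project.

```shell
# Sketch, assuming the tool honors standard OpenAI-style environment variables.
# OpenRouter's OpenAI-compatible endpoint:
export OPENAI_API_BASE="https://openrouter.ai/api/v1"   # some tools read OPENAI_BASE_URL instead
export OPENAI_API_KEY="sk-or-..."                       # key issued by OpenRouter
```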
My final configuration:
The error I got:
I do not understand why I get an error ONLY on the embeddings model, while the complex model works.