-
What feature would you like to see?
Can it support the qwen or GLM API? How do I configure it?

Additional information
No response
Replies: 4 comments
-
Supported.
-
How do I do it specifically? Is there an SOP?
-
All OpenAI-compatible platforms work now; you can take a look at
-
kimi-cli supports APIs that are compatible with OpenAI's. To use those APIs, you can edit your config file (located at ~/.kimi) as follows:

{
"default_model": "",
"models": {
"gpt4.5": {
"provider": "openai",
"model": "gpt-4.5",
"max_context_size": 160000
}
},
"providers": {
"openai": {
"type": "openai_legacy",
"base_url": "base_url",
"api_key": "your-secret-key"
}
},
"loop_control": {
"max_steps_per_run": 100,
"max_retries_per_step": 3
},
"services": {}
}

and then run kimi with:

kimi -m gpt4.5
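To answer the original question about qwen and GLM specifically: both platforms expose OpenAI-compatible endpoints, so the same config shape should work once the provider entry points at the right endpoint. The following is only a sketch; the base_url values, model names, context sizes, and API-key placeholders are assumptions about those platforms' OpenAI-compatible modes, so check the providers' documentation for the exact values:

{
  "default_model": "qwen-plus",
  "models": {
    "qwen-plus": {
      "provider": "dashscope",
      "model": "qwen-plus",
      "max_context_size": 128000
    },
    "glm-4": {
      "provider": "zhipu",
      "model": "glm-4",
      "max_context_size": 128000
    }
  },
  "providers": {
    "dashscope": {
      "type": "openai_legacy",
      "base_url": "https://dashscope.aliyuncs.com/compatible-mode/v1",
      "api_key": "your-dashscope-key"
    },
    "zhipu": {
      "type": "openai_legacy",
      "base_url": "https://open.bigmodel.cn/api/paas/v4",
      "api_key": "your-zhipu-key"
    }
  },
  "loop_control": {
    "max_steps_per_run": 100,
    "max_retries_per_step": 3
  },
  "services": {}
}

and then run kimi with, for example, kimi -m qwen-plus or kimi -m glm-4.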