Author: arthur
Version: 0.0.4
Type: model
The Canopy Wave plugin provides access to state-of-the-art large language models hosted on Canopy Wave's high-performance inference platform. It lets Dify users leverage these models through a simple API integration.
| Model | Description |
|---|---|
| deepseek/deepseek-chat-v3.1 | DeepSeek V3.1 chat model |
| deepseek/deepseek-chat-v3.2 | DeepSeek V3.2 chat model |
| deepseek/deepseek-r1-distill | DeepSeek R1 distilled reasoning model |
| deepseek/deepseek-math-v2 | DeepSeek Math specialized model |
| zai/glm-4.7 | GLM-4.7 chat model |
| Qwen/Qwen3-30B-A3B-Instruct | Qwen3 30B instruction model |
| Qwen/Qwen3-Coder-30B-A3B-Instruct | Qwen3 Coder 30B instruction model |
| minimax/minimax-m2.1 | MiniMax M2.1 chat model |
| moonshot/kimi-k2-thinking | Kimi K2 thinking model |
| xiaomimimo/mimo-v2-flash | MIMO V2 Flash model |
| openai/gpt-oss-120b | GPT OSS 120B model |
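Requests to any of the models above follow a chat-completions shape. The sketch below builds such a request with Python's standard library; note that the base URL `https://api.canopywave.io/v1` and the `/chat/completions` path are illustrative assumptions, not confirmed parts of the Canopy Wave API — check the official docs for the real endpoint.

```python
import json
import urllib.request

# Hypothetical base URL -- replace with the endpoint from Canopy Wave's docs.
BASE_URL = "https://api.canopywave.io/v1"

def build_chat_request(api_key: str, model: str, messages: list) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request (assumed wire format)."""
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request(
    api_key="YOUR_API_KEY",
    model="deepseek/deepseek-chat-v3.1",
    messages=[{"role": "user", "content": "Hello"}],
)
# urllib.request.urlopen(req) would send the request once a real key is set.
```

The same helper works for every model ID in the table — only the `model` string changes.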
- Get API Key: Obtain an API key from Canopy Wave Cloud
- Install Plugin: Install this plugin in your Dify workspace
- Configure: Add your API key in the model provider settings
- Use: Select any Canopy Wave model for your Dify applications
- ✅ Streaming and non-streaming responses
- ✅ Multi-turn conversation support
- ✅ System prompts
- ✅ Temperature and Top-P control
- ✅ Presence and frequency penalty
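The features above map onto request parameters and the response stream. The sketch below assumes OpenAI-style parameter names (`temperature`, `top_p`, `presence_penalty`, `frequency_penalty`, `stream`) and Server-Sent Events streaming (`data: {...}` lines terminated by `data: [DONE]`); treat both as assumptions until verified against Canopy Wave's documentation.

```python
import json

def build_payload(model, messages, *, temperature=1.0, top_p=1.0,
                  presence_penalty=0.0, frequency_penalty=0.0, stream=False):
    """Assemble a chat payload; parameter names assume an OpenAI-style API."""
    return {
        "model": model,
        "messages": messages,
        "temperature": temperature,
        "top_p": top_p,
        "presence_penalty": presence_penalty,
        "frequency_penalty": frequency_penalty,
        "stream": stream,
    }

def iter_stream_text(lines):
    """Extract text deltas from Server-Sent Events lines ('data: {...}')."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue
        data = line[len("data:"):].strip()
        if data == "[DONE]":
            break
        chunk = json.loads(data)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

# Multi-turn conversation with a system prompt and sampling controls.
payload = build_payload(
    "minimax/minimax-m2.1",
    [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Hi"},
        {"role": "assistant", "content": "Hello! How can I help?"},
        {"role": "user", "content": "Summarize SSE in one line."},
    ],
    temperature=0.7,
    top_p=0.9,
    presence_penalty=0.1,
    frequency_penalty=0.2,
    stream=True,
)
```

Setting `stream=False` yields a single non-streaming response instead, covering both delivery modes listed above.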