Create AI-powered automation workflows in Release!
This integration provides Release tasks that interact with LLMs such as OpenAI and Gemini, and with any MCP server.
- Create AI agents that perform complex tasks using MCP tools
- Call services directly using MCP, without writing any code
- Embed AI prompts in your Release workflows
- Start interactive chats with the LLM of your choice
- Connect to Gemini and OpenAI-compatible providers
- Connect to any MCP server
- Mix and match MCP servers and reasoning models per task, to get the right tool for the job
Creates an AI agent that can use MCP tools to accomplish tasks.
Connect to an AI server and invoke a single prompt. Supports Gemini and OpenAI-compatible LLM providers.
Interactive chat with an LLM, inside the task's Activity section.
Connect to any MCP server and invoke its tools. No LLM reasoning, just direct tool calls.
Lists available tools on an MCP server.
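Under the hood, MCP communication is JSON-RPC 2.0. As a rough illustration (not the plugin's actual implementation), the direct tool-call and tool-listing tasks boil down to messages like the following; the tool name and arguments shown are hypothetical:

```python
import json

# Sketch of the JSON-RPC 2.0 messages an MCP client exchanges with a server.
# "tools/list" enumerates the available tools; "tools/call" invokes one.
list_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "get_weather",            # hypothetical tool name
        "arguments": {"city": "Boston"},  # tool-specific arguments
    },
}

print(json.dumps(call_request, indent=2))
```

Because these are plain JSON-RPC calls, no LLM reasoning is needed to use them, which is exactly what the MCP Tool task exposes.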
This repo comes with example templates showing how to use the tasks.
- MCP Examples -- Demonstrates the MCP tool tasks that let you interact with third-party servers without writing an integration plugin (no LLM involved).
- Prompt examples -- Demonstrates the AI Prompt task with different model providers (Gemini, OpenAI, etc).
- MCP + Prompt examples -- Demonstrates how to combine MCP and Prompt tasks to build AI-powered workflows.
- Agent examples -- Demonstrates the AI Agent task that combines LLM prompting with multiple MCP servers.
- Interactive Chat example -- Demonstrates the interactive AI Chat task.
See the Build & Run section for instructions on installing the examples.
This section explains how to set up a local Release instance with the plugin and upload the example templates.
You need to have the following installed in order to develop Python-based container tasks for Release using this project:
- Python 3
- Docker
Add the following to /etc/hosts or C:\Windows\System32\drivers\etc\hosts (sudo / administrator permissions required):
127.0.0.1 host.docker.internal
127.0.0.1 container-registry
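To verify that the entries took effect, you can run a quick resolution check. This is a convenience sketch, not part of the official setup:

```python
import socket

# Check that the hosts-file entries from the setup instructions resolve.
results = {}
for host in ("host.docker.internal", "container-registry"):
    try:
        results[host] = socket.gethostbyname(host)
    except socket.gaierror:
        results[host] = None

for host, addr in results.items():
    print(host, "->", addr if addr else "NOT resolvable - check your hosts file")
```

Both names should resolve to 127.0.0.1 once the hosts file is saved.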
This project comes with a Docker environment that sets up a local Release instance with all the required services.
Running the environment with this plugin is a three-step process:
- Start the Release Docker environment
- Build and publish the plugin
- Upload the demo templates
When developing the plugin, you typically only need to repeat step 2 after making code changes. The new plugin version is picked up automatically, without restarting the server.
The Release environment is defined in dev-environment/docker-compose.yaml.
Have Docker running and launch the environment with:
docker compose -f dev-environment/docker-compose.yaml up -d --build
It takes a while to start up. The Release server is running once the digitalai-release-setup container has terminated.
Check if you can log in with admin/admin at http://localhost:5516.
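If you prefer scripting the wait, here is a hedged sketch of a readiness probe against the login URL above. The URL comes from this README; the function name is made up, and Release does not necessarily expose a dedicated health endpoint, so this simply polls the login page:

```python
import urllib.error
import urllib.request

# Convenience sketch: return True once the Release login page answers.
def release_is_up(url="http://localhost:5516", timeout=2):
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status < 500
    except (urllib.error.URLError, OSError):
        return False

if __name__ == "__main__":
    print("Release is up" if release_is_up() else "Release not reachable yet")
```

Call this in a loop with a short sleep if you want to block until the server is ready.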
After the demo, you can stop the environment with:
docker compose -f dev-environment/docker-compose.yaml down
The build.sh script builds the plugin container, publishes it to the local registry, and installs it to the local Release instance.
Run the build script:
Unix / macOS
sh build.sh --upload
Windows
build.bat --upload
The sample templates include examples that connect to various MCP servers and LLM providers.
Put your API keys in setup/secrets.xlvals.
Use the example file as a base:
cp setup/secrets.xlvals.example setup/secrets.xlvals
Then edit the file and add your keys.
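A filled-in secrets.xlvals looks something like the following. The property names here are hypothetical placeholders; use the ones that actually appear in setup/secrets.xlvals.example:

```
# setup/secrets.xlvals -- property names are illustrative only;
# copy the real ones from secrets.xlvals.example
geminiApiKey = <your Gemini API key>
openaiApiKey = <your OpenAI API key>
```

Keep this file out of version control, since it contains live credentials.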
Run the following command to upload the demo templates to the local Release instance:
./xlw apply -f setup/mcp-demo.yaml
The templates will be uploaded to a new AI Demo folder.
- Log in to http://localhost:5516 with admin/admin
- Go to the AI Demo folder
- Go to the Templates section and run the examples
👉 Add your favorite MCP Server or LLM provider under Connections and build your own example!
The tests are integration tests and need API keys to run. Put your API keys in a .env file in the root of the project:
GEMINI_API_KEY=<key>
OPENAI_API_KEY=<key>
DAI_LLM_API_KEY=<key>
Run the tests with the command