Avi NLU API is a high-performance server dedicated to Natural Language Understanding (NLU) for the Avi ecosystem. It handles intent recognition for voice commands and structured inputs, serving as the computational core for Avi’s voice assistant.
This API is designed to integrate seamlessly with AECP-enabled devices such as microphones, smart buttons, and other triggers, providing fast, reliable parsing of user intents.
- Intent Recognition Only: Focused on parsing user queries into structured intents and slots.
- Multi-language Support: Supports English and Portuguese out of the box.
- FastAPI Backend: Lightweight, modern Python web framework for high-performance endpoints.
- Containerized Deployment: Docker-ready for easy setup and distribution.
- Low Latency: Optimized for real-time voice command processing.
- Docker Engine (v19.03.0+ recommended)
- 2GB RAM minimum (4GB recommended for training models)
- Network access to the Avi Core Node for full integration
Clone the repository:

```bash
git clone https://github.com/Apoll011/avi-nlu.git
cd Avi-NLU-API
```

Install dependencies:

```bash
pip install -r requirements.txt
```

Download language models:

```bash
python -m snips_nlu download-language-entities pt_pt
python -m snips_nlu download-language-entities en
```

Build the Docker image:

```bash
docker build -t avi-nlu-api .
```

Run the server:

```bash
docker run -p 1178:1178 avi-nlu-api
```

This will:

- Map port `1178` on the host to port `1178` in the container.
- Start the Avi NLU API server using uvicorn.
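Once the server is up, a trigger or client can send text to it for parsing. The standard-library sketch below assumes a JSON `POST /parse` endpoint accepting `{"text": ...}` — the actual route and payload shape live in `main.py` and may differ.

```python
import json
import urllib.request


def parse_text(text: str, host: str = "localhost", port: int = 1178,
               endpoint: str = "/parse") -> dict:
    """POST an utterance to the NLU server and return the decoded JSON reply.

    The endpoint path and request body are illustrative assumptions; check
    main.py for the server's real routes.
    """
    payload = json.dumps({"text": text}).encode("utf-8")
    req = urllib.request.Request(
        f"http://{host}:{port}{endpoint}",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

The helper returns whatever JSON the server produces, so it stays valid even if the response schema changes.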
The server configuration is stored in `config.py`:

- Default host: `0.0.0.0`
- Default port: `1178`
- Model storage paths for intent recognition
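A minimal sketch of what `config.py` might contain, based on the defaults listed above; the model-path constant and its value are assumptions for illustration, not the repository's actual names.

```python
# Hypothetical sketch of config.py — only HOST and PORT reflect documented
# defaults; the model directory below is an assumed, illustrative path.
HOST = "0.0.0.0"  # default bind address
PORT = 1178       # default port

# Where trained intent-recognition models are stored (illustrative):
INTENT_MODEL_DIR = "features/intent_recognition/snips"
```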
The IntentKit handles:
- Parsing text or audio input into structured intents
- Slot extraction for multiple parameters
- Support for multiple languages
- Integration with AECP to receive raw audio or pre-parsed text from triggers
- Persistent storage of trained intent models
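Snips NLU represents a parse result as a dictionary with `input`, `intent`, and `slots` keys. The helper below (a hypothetical name, not part of IntentKit) shows one way a consumer might flatten the slot list, assuming the API relays the Snips output format unchanged.

```python
def extract_slots(parse_result: dict) -> dict:
    """Flatten a Snips-style parse result into {slot_name: raw_value}."""
    return {slot["slotName"]: slot["rawValue"]
            for slot in parse_result.get("slots", [])}


# A simplified example in the Snips NLU output format (values abridged):
example = {
    "input": "set a timer for ten minutes",
    "intent": {"intentName": "SetTimer", "probability": 0.87},
    "slots": [
        {
            "rawValue": "ten minutes",
            "slotName": "duration",
            "entity": "snips/duration",
            "value": {"kind": "Duration", "minutes": 10},
        },
    ],
}
```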
```
├── features/
│   └── intent_recognition/    # Intent models and datasets
│       └── snips/             # Snips NLU models
├── main.py                    # FastAPI application
├── kit.py                     # Core NLU components
├── config.py                  # Server configuration
├── Dockerfile                 # Docker configuration
└── requirements.txt           # Python dependencies
```
The server exposes FastAPI endpoints for:
- Receiving text or audio input and returning structured intents and slots
- Checking server health and status
The server operates in Atlantic/Cape Verde timezone (UTC-1).
- Based on Python 3.8-slim
- Pre-loaded with Snips NLU utilities
- Includes models for English and Portuguese
This project is licensed under the MIT License – see the LICENSE file for details.
- GitHub: @Apoll011
- Related Project: Avi Core Node