🦍 The Cloud-Native Gateway for APIs & AI
Updated Dec 18, 2025 · Lua
Run any open-source LLM, such as DeepSeek or Llama, as an OpenAI-compatible API endpoint in the cloud.
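"OpenAI-compatible" above means clients send the standard chat-completions request shape to the gateway instead of to api.openai.com. A minimal sketch of that request body, using only the standard library; the URL and model name are illustrative placeholders, not taken from any listed project:

```python
import json

# Hypothetical gateway endpoint serving an open-source model behind
# the OpenAI-style /v1/chat/completions route.
BASE_URL = "http://localhost:8000/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> str:
    """Serialize a standard OpenAI-style chat-completions request body."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.0,
    }
    return json.dumps(payload)

body = build_chat_request("deepseek-r1", "Hello!")
print(body)
```

Because the request shape is identical, existing OpenAI client code can usually be pointed at such a gateway just by changing the base URL and model name.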
Deploy serverless AI workflows at scale. Firebase for AI agents.
AutoRAG: An Open-Source Framework for Retrieval-Augmented Generation (RAG) Evaluation & Optimization with AutoML-Style Automation
RAG (Retrieval-Augmented Generation) framework for building modular, open-source applications for production, by TrueFoundry
The open LLM Ops platform - Traces, Analytics, Evaluations, Datasets and Prompt Optimization ✨
The collaborative spreadsheet for AI. Chain cells into powerful pipelines, experiment with prompts and models, and evaluate LLM responses in real-time. Work together seamlessly to build and iterate on AI applications.
AIConfig is a config-based framework to build generative AI applications.
Python SDK for running evaluations on LLM generated responses
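An evaluation SDK like the one above typically scores generated responses against references and aggregates the results. A minimal sketch of that pattern; the function names are hypothetical and do not reflect any listed SDK's actual API:

```python
# Sketch of response evaluation: score each (response, reference) pair
# with a metric, then average. Exact match is the simplest such metric.

def exact_match(response: str, reference: str) -> float:
    """Return 1.0 if the normalized response equals the reference, else 0.0."""
    return 1.0 if response.strip().lower() == reference.strip().lower() else 0.0

def evaluate(pairs: list[tuple[str, str]]) -> float:
    """Average exact-match score over (response, reference) pairs."""
    scores = [exact_match(resp, ref) for resp, ref in pairs]
    return sum(scores) / len(scores) if scores else 0.0

score = evaluate([("Paris", "paris"), ("Lyon", "Paris")])
print(score)  # → 0.5
```

Real evaluation SDKs swap in richer metrics (semantic similarity, LLM-as-judge) but keep this same score-then-aggregate shape.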
An end-to-end LLM reference implementation providing a Q&A interface for Airflow and Astronomer
[⛔️ DEPRECATED] Friendli: the fastest serving engine for generative AI
Quality Control for AI Artifact Management
Miscellaneous codes and writings for MLOps
Deployment of RAG + LLM model serving on multiple K8s cloud clusters
The prompt engineering, prompt management, and prompt evaluation tool for TypeScript, JavaScript, and Node.js.
Unofficial Go SDK for Langfuse - trace, monitor, and evaluate your LLM applications with async batch processing and specialized observation types
Lightweight Agent Framework for building AI apps with any LLM