This tool gives security researchers an AI chat interface that drives Ghidra through MCP, letting them ask high-level questions about a binary instead of digging through disassembly manually. The agentic workflow automatically performs the required reverse-engineering steps inside Ghidra to produce answers.
Run the Docker image:

```shell
docker run --rm -p 9090:9090 -v $(pwd)/data:/data/ghidra_projects biniamfd/ghidra-headless-rest:latest
```

Headless Ghidra endpoints (at `GHIDRA_API_BASE = http://localhost:9090`):
| Endpoint | Description |
|---|---|
| /tools/analyze | Upload a base64-encoded binary and start headless Ghidra analysis. |
| /tools/status | Get status for an existing analysis job. |
| /tools/list_functions | Retrieve the list of discovered functions for a job. |
| /tools/decompile_function | Get decompiled pseudocode for a function at a given address. |
| /tools/get_xrefs | Get callers and callees for a function (cross-references). |
| /tools/list_imports | List imported libraries and symbols for the binary. |
| /tools/list_strings | Return printable strings extracted from the binary. |
| /tools/query_artifacts | Simple natural-language-like query over artifacts (function names, decompiled snippets). |
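As a sketch of how a client might drive these endpoints, the snippet below base64-encodes a binary and builds a request for `/tools/analyze`, with a small helper for POSTing JSON to the service. The JSON field names (`filename`, `binary_b64`, `job_id`) are illustrative assumptions — this README does not document the request/response schema, so verify them against the service itself.

```python
import base64
import json
import urllib.request

GHIDRA_API_BASE = "http://localhost:9090"

def build_analyze_payload(path: str) -> dict:
    """Base64-encode a binary for /tools/analyze.

    The field names used here are assumptions for illustration;
    check the service's actual schema.
    """
    with open(path, "rb") as f:
        encoded = base64.b64encode(f.read()).decode("ascii")
    return {"filename": path.rsplit("/", 1)[-1], "binary_b64": encoded}

def post(endpoint: str, payload: dict) -> dict:
    """POST a JSON payload to the headless Ghidra REST service."""
    req = urllib.request.Request(
        GHIDRA_API_BASE + endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Example (requires the Docker container to be running;
# the "job_id" key is an assumed response field):
# job = post("/tools/analyze", build_analyze_payload("./samples/target.bin"))
# status = post("/tools/status", {"job_id": job["job_id"]})
```

A typical session would submit the binary once, poll `/tools/status` until analysis completes, and then call the read-only endpoints (`/tools/list_functions`, `/tools/decompile_function`, and so on) against the finished job.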
To set up the chat interface:

- Pull the Docker image and run it (see the `docker run` command above)
- Set your OpenAI-compatible API base URL
- Set your API key
- Set your model name
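In a shell, the configuration steps above might look like the following. The environment variable names are an assumption based on common OpenAI-compatible conventions — check the project's own configuration docs for the names it actually reads:

```shell
# Hypothetical variable names -- verify against the project's configuration
export OPENAI_API_BASE="https://api.openai.com/v1"  # OpenAI-compatible base URL
export OPENAI_API_KEY="sk-..."                      # your API key
export OPENAI_MODEL="gpt-4o"                        # model name
```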
Start the web UI:

```shell
python webui/app.py
```

Then access the service at http://localhost:5000.