
Ghost

   ▄████  ██░ ██  ▒█████   ██████  ████████
  ██▒ ▀█▒▓██░ ██▒▒██▒  ██▒ ██    ▒    ██
 ▒██░▄▄▄░▒██▀▀██░▒██░  ██▒ ▓██▄       ██
 ░▓█  ██▓░▓█ ░██ ▒██   ██░  ▒   ██▒   ██
 ░▒▓███▀▒░▓█▒░██▓░ ████▓▒░▒██████▒▒   ██
  ░▒   ▒   ▒ ░░▒░▒░ ▒░▒░▒░ ▒ ▒▓▒ ▒ ░   ░
   ░   ░   ▒ ░▒░ ░  ░ ▒ ▒░ ░ ░▒  ░ ░
 ░ ░   ░   ░  ░░ ░░ ░ ░ ▒  ░  ░  ░
       ░   ░  ░  ░    ░ ░        ░


Your personal AI companion in the terminal. Always on. Always local. No corp surveillance.

Ghost is a command-line AI assistant powered by Ollama, bringing the spirit of cyberpunk AI companions from Shadowrun, Cyberpunk 2077, and The Matrix into your daily workflow.

Jack In

Prerequisites:

  • Ollama installed and running
  • At least one model pulled (e.g., ollama pull llama3)
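Before installing, you can confirm the Ollama server is reachable; its /api/tags endpoint (part of Ollama's standard API) returns the models you have pulled:

```shell
# Verify Ollama is up and list locally available models.
# Expects the default Ollama address; adjust if yours differs.
curl -s http://localhost:11434/api/tags
```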

Install:

go install github.com/theantichris/ghost@latest

Or grab prebuilt binaries from the releases page.

Run your first query:

ghost "Explain recursion in simple terms"

Core Capabilities

  • Intelligence on demand: Ask questions, get explanations, analyze data
  • Data stream analysis: Pipe logs, files, or any text directly into Ghost
  • Visual recon: Feed images to vision models for analysis and description
  • Format flexibility: Output as plain text, JSON, or styled Markdown

Usage Examples

# Query the net for intel
ghost "how do I crack open an encrypted data stream?"

# Scan your logs for anomalies
cat system.log | ghost "what's lurking in here?"

# Extract structured data
ghost "give me a list of common netrunner tools" -f json | jq .

# Generate formatted dossiers
ghost "write a guide to bypassing corp firewalls" -f markdown > intel.md

# Visual recon (requires vision model)
ghost "analyze this security feed" -i camera-feed.png

# Compare surveillance data
ghost "what changed in the facility?" -i before-raid.png -i after-raid.png

System Configuration

Configure Ghost via command-line flags, environment variables, or a config file.

Command Flags

  • -m, --model: Model to use (e.g., llama3)
  • -V, --vision-model: Vision model for images (defaults to main model)
  • -i, --image: Image file path (can be used multiple times)
  • -f, --format: Output format: text, json, or markdown
  • -u, --url: Ollama API URL (default: http://localhost:11434/api)
  • -c, --config: Config file path (default: ~/.config/ghost/config.toml)
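The flags compose in a single invocation. As a sketch (the query and model name here are illustrative, not defaults), you can pin the model, output format, and Ollama endpoint at once:

```shell
# Hypothetical combined invocation: explicit model, JSON output,
# and an explicit Ollama API URL.
ghost "list three common Go testing tools" -m llama3 -f json -u http://localhost:11434/api
```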

Environment Variables

export GHOST_MODEL=llama3
export GHOST_VISION_MODEL=llama3.2-vision
export GHOST_URL=http://localhost:11434/api

Config File

Create ~/.config/ghost/config.toml:

model = "llama3"
url = "http://localhost:11434/api"

[vision]
model = "llama3.2-vision"
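The config above can be bootstrapped from the shell. This sketch creates the default config directory (the path the -c flag documents) and writes the same TOML shown here:

```shell
# Create Ghost's default config directory and write the config file.
mkdir -p ~/.config/ghost
cat > ~/.config/ghost/config.toml <<'EOF'
model = "llama3"
url = "http://localhost:11434/api"

[vision]
model = "llama3.2-vision"
EOF
```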

License

MIT
