meshtastic-juniper

This simple project puts the power of freely available Large Language Models on-mesh.

Just connect any Meshtastic hardware to your (decently-powered) PC and you will have a completely off-grid, on-mesh LLM chatbot that works without any Internet connection.

The default model prompt is written in Italian, because this project was made primarily to give my local mesh network a fun and useful tool for propagation testing and general experimentation. Feel free to change it as you wish, but please don't remove the instructions about keeping messages short. The mesh doesn't need to be recklessly flooded with AI slop!
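Because replies must stay short, a bot like this generally has to trim or split the model's output before transmitting. Below is a minimal sketch of such a splitter; the ~200-byte per-message budget is an assumption (the real Meshtastic text payload limit depends on firmware and header overhead), and `split_reply` is a hypothetical helper name, not part of this repository:

```python
def split_reply(text: str, limit: int = 200) -> list[str]:
    """Split text into chunks of at most `limit` UTF-8 bytes,
    preferring to break on whitespace.

    Note: a single word longer than `limit` is emitted as-is;
    a production version would need to split it further.
    """
    chunks: list[str] = []
    current = ""
    for word in text.split():
        candidate = f"{current} {word}".strip()
        if len(candidate.encode("utf-8")) <= limit:
            current = candidate
        else:
            if current:
                chunks.append(current)
            current = word
    if current:
        chunks.append(current)
    return chunks
```

Each chunk can then be sent as a separate mesh message, ideally with a small delay between sends to be kind to the channel.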

Usage

Prepare a virtual environment and install the required packages:

```shell
python3 -m venv .venv
source .venv/bin/activate
pip install ollama meshtastic
```

Then, install Ollama. After installation, customize the system prompt in the Modelfile, then create the "juniper" model:

```shell
ollama create juniper -f Modelfile.juniper
```
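For orientation, here is a minimal sketch of what `Modelfile.juniper` might contain. The base model (`llama3.2`) and the English system prompt are illustrative assumptions, not the repository's actual Italian prompt:

```
FROM llama3.2
PARAMETER temperature 0.7
SYSTEM """You are Juniper, a chatbot reachable over a Meshtastic mesh network.
Always answer in one or two short sentences: mesh messages must stay small."""
```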

juniper.service is a systemd unit template you can adapt; it also preloads the model to avoid a delay on the first message.
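As a rough sketch of what such a unit might look like — every path, user name, and the preload command below are assumptions to adapt, not the repository's actual `juniper.service`:

```ini
[Unit]
Description=Juniper on-mesh LLM chatbot
After=network.target ollama.service

[Service]
# Paths and user are illustrative; adjust to your setup.
User=juniper
WorkingDirectory=/opt/meshtastic-juniper
# Preload the model so the first message is not delayed
# (running the model once with an empty prompt warms it up).
ExecStartPre=/usr/local/bin/ollama run juniper ""
ExecStart=/opt/meshtastic-juniper/.venv/bin/python juniper.py
Restart=on-failure

[Install]
WantedBy=multi-user.target
```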

Tunnelling (optional)

If you also want IP tunnelling, which gives your node an on-mesh IP address for other uses:

  • pip install pytap2
  • add --tunnel to the command line when you run juniper.py

Have fun!
