Background
vLLM pins an older version of `transformers` as a hard dependency, which
blocks us from upgrading to `transformers` v5. Since vLLM is the only
dependency holding us back, the cleanest path forward is to remove it from
the project entirely rather than wait for vLLM to catch up.
What needs to change
- Delete `LocalVLLMBackend` and the `mellea/backends/vllm` module
- Remove the `[vllm]` optional dependency and drop it from `[all]` / `[backends]`
- Replace the subprocess-based vLLM integration test with a lightweight mock
  OpenAI-compatible server fixture (the test was validating the generic
  OpenAI-compatible HTTP path anyway, not vLLM internals)
- Remove all vLLM references from docs, README, and the nav config
- Regenerate `uv.lock`; this should free up the `transformers` version constraint
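The mock-server bullet above can be sketched with only the standard library. This is a hedged illustration, not the project's actual fixture: the handler and helper names (`MockOpenAIHandler`, `start_mock_server`), the endpoint path, and the response fields are assumptions modeled on the OpenAI chat-completions wire format.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen


class MockOpenAIHandler(BaseHTTPRequestHandler):
    """Minimal stand-in for an OpenAI-compatible /v1/chat/completions endpoint."""

    def do_POST(self):
        # Read the request body and echo back a canned chat completion.
        length = int(self.headers.get("Content-Length", 0))
        request = json.loads(self.rfile.read(length) or b"{}")
        reply = {
            "id": "chatcmpl-mock",
            "object": "chat.completion",
            "model": request.get("model", "mock-model"),
            "choices": [{
                "index": 0,
                "message": {"role": "assistant", "content": "mock response"},
                "finish_reason": "stop",
            }],
        }
        payload = json.dumps(reply).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        # Keep test output quiet.
        pass


def start_mock_server() -> tuple:
    """Start the mock server on an ephemeral port; return (server, base_url)."""
    server = HTTPServer(("127.0.0.1", 0), MockOpenAIHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server, f"http://127.0.0.1:{server.server_port}/v1"
```

A pytest fixture would wrap `start_mock_server()` in setup/teardown (`server.shutdown()` on exit) and point the generic OpenAI-compatible backend at the returned `base_url`.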
Out of scope
The follow-on transformers v5 upgrade is a separate PR (#418) that depends on
this one. This issue covers only the removal.
Acceptance criteria
- `from mellea.backends.vllm import LocalVLLMBackend` raises `ModuleNotFoundError`
- `pip install mellea[vllm]` fails with "no such extra"
- `uv.lock` no longer pins a `transformers` version required by vLLM
- No `vllm` string remains in source, docs, or CI config
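The first criterion can be checked programmatically. This is a sketch only: `module_removed` is a hypothetical helper, not part of mellea, and it simply asks the import machinery whether a module can still be located.

```python
import importlib.util


def module_removed(name: str) -> bool:
    """True when `name` (or any parent package) cannot be located,
    i.e. importing it would raise ModuleNotFoundError."""
    try:
        # find_spec returns None when the module itself is gone...
        return importlib.util.find_spec(name) is None
    except ModuleNotFoundError:
        # ...and raises when a parent package is missing entirely.
        return True
```

In a test suite this would be `assert module_removed("mellea.backends.vllm")`; the remaining criteria are easiest to verify with `pip install`'s error output and a repo-wide grep for `vllm`.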