RapidQ


For those who want a bare-minimum task queue: just run and discard, no headaches.

A lightweight 🪶 and fast 🚀 background task processing library for Python, developed with simplicity in mind.
There is nothing fancy here; RapidQ is a simple, easy-to-use package that works on any OS.
Only a Redis broker is currently available, and there is no result backend (one could be added later).

Inspired by Celery and Dramatiq, but lightweight and easy to use for small projects.

Installation

pip install rapidq

It has:

  • Only Redis as a broker, with flexible serialization options (you can plug in your own too)
  • Process-based workers, which keep it fast
  • No result backend
  • No retry behavior yet (it will be added)
  • No monitoring, as of now

It requires:

  • No configuration for a local setup, although some broker properties can be configured

Motivation

Simply put, I just wanted to see if I could do it.
This was part of a hobby project that somehow became a package.
Understanding how packages like Celery and Dramatiq work internally was a challenge I faced, so I wanted a package that is understandable and simple.


This project is under development, so expect breaking changes when you upgrade.


A working example

The code below is available in example/minimal.py.

# my_task.py
from rapidq import RapidQ

app = RapidQ()

@app.task(name="simple-task")
def test_func(msg):
    # of course this function could do more than just print.
    print("simple task is running")
    print(msg)


if __name__ == "__main__":
    test_func.enqueue(msg="Hello, I'm running in background")
    # The line below will be printed directly and will not go to the worker.
    test_func(msg="Hello, I'm running in the same process!")

Copy-paste the above into a Python file, say my_task.py.

Run the rapidq worker first:
rapidq my_task

Then, in another terminal, run my_task.py:
python my_task.py


Customizing broker properties

Want to switch the serialization to JSON (pickle by default) or change the broker URL?
This can be done with a small configuration in a plain Python file; check out example/config_example.py.
A Python module is used for configuration because it lets you run arbitrary code to read settings from other sources, such as a .env file. See example/minimal_custom.py and example/config_example.py for a similar example.
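
For illustration, such a config module might look like the sketch below. The setting names used here (BROKER_URL and SERIALIZER) are assumptions made for this sketch, not necessarily RapidQ's actual configuration keys; the real names are in example/config_example.py.

# config_sketch.py -- illustrative only; see example/config_example.py for the actual keys.
import os

# Hypothetical setting names, shown to illustrate the idea of a Python config module.
BROKER_URL = os.environ.get("RAPIDQ_BROKER_URL", "redis://localhost:6379/0")
SERIALIZER = "json"  # instead of the default pickle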

# my_custom_task.py
from rapidq import RapidQ

app = RapidQ()

# Load the custom configuration. The line below can be omitted if no configuration is needed.
app.config_from_module("example.config_example")


@app.task(name="simple-task")
def test_func(msg):
    print("simple task is running")
    print(msg)


if __name__ == "__main__":
    test_func.enqueue(msg="Hello, I'm running")

You can run the worker as before:
rapidq my_custom_task

Then, in another terminal, run my_custom_task.py:
python my_custom_task.py


Number of workers

By default, RapidQ uses 4 worker processes or the number of CPUs available on your system, whichever is smaller. You can control the number of workers with the -w argument, e.g. rapidq my_task -w 6, which will start 6 worker processes.
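
In other words, the default worker count follows this rule (a minimal illustration of the statement above, not RapidQ's actual code):

import multiprocessing

# Default: 4 workers, capped at the number of CPUs on the machine.
default_workers = min(4, multiprocessing.cpu_count())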


Flushing the broker

Maybe you tested a lot and flooded your broker with messages.
You can flush the broker by running the rapidq-flush command.

Integrating with web frameworks

RapidQ can be easily integrated with Flask and FastAPI applications. Simple Flask, FastAPI, and Django examples are in the example directory. For frameworks like Flask and FastAPI, if configured right, rapidq can be run by specifying your main Python module name or a dotted path to the module.

rapidq main -w 4

OR

rapidq application.main -w 4
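
For example, a Flask application could be wired up roughly as in the sketch below. This is a minimal sketch using only the RapidQ calls shown earlier in this README; the module name, route, and task are illustrative, so check the Flask example in the example directory for the real setup.

# main.py
from flask import Flask
from rapidq import RapidQ

flask_app = Flask(__name__)
rapidq_app = RapidQ()


@rapidq_app.task(name="send-welcome-email")
def send_welcome_email(email):
    # Pretend this sends an email; when enqueued, it runs inside a rapidq worker process.
    print(f"Sending welcome email to {email}")


@flask_app.route("/signup")
def signup():
    # Enqueue the task and return immediately; the worker does the slow part.
    send_welcome_email.enqueue(email="user@example.com")
    return {"status": "queued"}

With a layout like this, rapidq main -w 4 starts the workers, and the Flask app is served separately in another process.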

Setting up with Django

Currently, RapidQ has experimental support for Django. An example project is available in the example directory.

Create a file in your Django project's directory: project/rapidq.py

import os
from rapidq import RapidQ

# Configure Django settings module in env.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "project.settings")

# app instance.
app = RapidQ()

Then you need to import this app in your project/__init__.py. This ensures the app is loaded when Django starts.

project/__init__.py:

from .rapidq import app

__all__ = ('app', )

Now run rapidq by specifying your project name.

rapidq your_project -w 4

By default, RapidQ automatically discovers tasks modules from all INSTALLED_APPS.
If you want to use a different module name, set the variable below in your Django settings module.

project/settings.py:

RAPIDQ_TASK_DISCOVER_MODULES = ("tasks",)
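
A task module inside one of your apps could then look like the sketch below. The app name, import path, and task body are illustrative assumptions; only the RapidQ calls (@app.task and .enqueue) are taken from the examples above.

# myapp/tasks.py
from project import app  # the RapidQ instance exported from project/__init__.py


@app.task(name="count-active-users")
def count_active_users(label):
    # Assuming RapidQ's Django support has set Django up in the worker, the ORM can be used here.
    from django.contrib.auth import get_user_model

    User = get_user_model()
    print(f"[{label}] active users: {User.objects.filter(is_active=True).count()}")

From a view or a management command you would then call count_active_users.enqueue(label="nightly"), just like in the earlier examples.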

If you like this project, drop a ⭐. Help is always welcome via issues or pull requests.

