arq
z4j-arq v1.0 · Upstream project: arq
Async Redis queue for FastAPI-era Python.
arq has no native re-enqueue-by-id primitive, so retry_task is polyfilled brain-side via submit_task using captured args. Event capture chains onto arq's WorkerSettings on_job_start / on_job_end hooks.
Install
pip install z4j-arq

Native actions
- submit_task
- cancel_task (Job.abort)
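The two native actions lower directly onto arq's client API. A minimal sketch, assuming `pool` is an arq Redis pool from `arq.create_pool` and `job` is an `arq.jobs.Job` handle (the wrapper names are z4j's, not arq's):

```python
async def submit_task(pool, function_name, *args, **kwargs):
    """Native submit: lowers to arq's pool.enqueue_job and returns
    the arq Job handle for the enqueued task."""
    return await pool.enqueue_job(function_name, *args, **kwargs)

async def cancel_task(job):
    """Native cancel: Job.abort() asks the running worker to cancel
    the job (available in recent arq releases)."""
    return await job.abort()
```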
Brain-polyfilled actions
- retry_task
- bulk_retry
- requeue_dead_letter
The brain fetches the original arguments from its tasks table and lowers the call to submit_task transparently. The dashboard shows the same button regardless.
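The lowering described above can be sketched as follows. `brain_db.get_task` is a hypothetical accessor over the brain's tasks table (not part of arq); the enqueue call is arq's real `pool.enqueue_job`:

```python
async def retry_task(task_id, brain_db, pool):
    """Brain-side polyfill: re-enqueue a task from its captured arguments.

    brain_db.get_task is a hypothetical tasks-table accessor;
    pool is an arq Redis pool (arq.create_pool)."""
    row = await brain_db.get_task(task_id)
    # Lower retry_task to submit_task: same function, same captured args.
    return await pool.enqueue_job(row["function"], *row["args"], **row["kwargs"])
```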
Not supported
- purge_queue
- restart_worker (graceful self-exit)
- rate_limit
Engine-level limitations, not a roadmap gap. These actions require primitives the engine does not expose.
How z4j plugs in
Event capture
attach_to_worker_settings() chains onto on_job_start / on_job_end. It is optional; lifecycle tracking still works without it.
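The chaining can be sketched like this, assuming arq's documented hook signature (`async def on_job_start(ctx)` / `on_job_end(ctx)` on the settings class); `capture(event_name, ctx)` is a hypothetical z4j event sink:

```python
def attach_to_worker_settings(settings, capture):
    """Chain an event-capture callback onto arq's WorkerSettings hooks.

    Wraps any hook the user already defined and preserves it, so
    event capture is additive rather than replacing user code."""
    for hook_name in ("on_job_start", "on_job_end"):
        original = getattr(settings, hook_name, None)

        def _make(name, orig):
            async def chained(ctx):
                capture(name, ctx)       # emit the z4j event first
                if orig is not None:
                    await orig(ctx)      # then run the user's own hook
            return chained

        setattr(settings, hook_name, staticmethod(_make(hook_name, original)))
    return settings
```

A user hook defined on the class keeps firing after attachment; a hook that was never defined is simply captured and returns.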
Reconciliation
arq.jobs.Job.status() plus result_info() give a clean mapping across deferred / queued / in_progress / complete.
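A sketch of that mapping. The left-hand strings are arq's JobStatus values; the z4j-side state names are illustrative assumptions, and `success` stands in for `result_info().success` on completed jobs:

```python
# z4j-side names here are assumptions; arq's JobStatus values are the keys.
ARQ_TO_Z4J = {
    "deferred": "scheduled",    # enqueued to run at a future time
    "queued": "pending",
    "in_progress": "running",
    "not_found": "unknown",
}

def reconcile(status, success=None):
    """Map an arq Job.status() value, plus result_info().success for
    completed jobs, onto a dashboard state."""
    if status == "complete":
        return "succeeded" if success else "failed"
    return ARQ_TO_Z4J.get(status, "unknown")
```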
arq cron_jobs
Read-only surface for arq's WorkerSettings.cron_jobs.
z4j-arqcron
arq cron jobs are statically configured in WorkerSettings.cron_jobs. Same read-only posture as Huey. This adapter exists for visibility, last-fire tracking, and per-job drill-down.
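A read-only snapshot of that surface might look like the sketch below. Field access is deliberately duck-typed because the exact CronJob attribute names surfaced here (name, hour, minute) are assumptions about arq's cron objects:

```python
def list_cron_jobs(worker_settings):
    """Read-only snapshot of WorkerSettings.cron_jobs for display.

    Never mutates the schedule; missing attributes come back as None
    so the dashboard can render partial information."""
    snapshot = []
    for job in getattr(worker_settings, "cron_jobs", None) or []:
        snapshot.append({
            "name": getattr(job, "name", None),
            "hour": getattr(job, "hour", None),
            "minute": getattr(job, "minute", None),
        })
    return snapshot
```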