
Commit df78a61
moving docs examples into their own files
1 parent d6b9454

7 files changed: +54 -51 lines

docs/Makefile (+2)

@@ -1,3 +1,5 @@
+.DEFAULT_GOAL := html
+
 # You can set these variables from the command line.
 SPHINXOPTS = -W
 SPHINXBUILD = sphinx-build

docs/examples/direct_enqueuing.py (new file, +12)

@@ -0,0 +1,12 @@
+from arq import Actor
+
+
+class FooBar(Actor):
+    async def foo(self, a, b, c):
+        print(a + b + c)
+
+
+async def main():
+    foobar = FooBar()
+    await foobar.enqueue_job('foo', 1, 2, c=48, queue=Actor.LOW_QUEUE)
+    await foobar.enqueue_job('foo', 1, 2, c=48)  # this will be queued in DEFAULT_QUEUE
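As docs/usage.rst notes below, this new file is almost complete except for actually running ``main()``. A minimal runner sketch, assuming the FooBar/main definitions above are importable; the event-loop boilerplate and import path are illustrations, not part of this commit:

# sketch only -- drive docs/examples/direct_enqueuing.py by running main() on an event loop
import asyncio

from direct_enqueuing import main  # assumes the file above is on the import path

loop = asyncio.get_event_loop()
loop.run_until_complete(main())  # enqueues 'foo' twice: once on LOW_QUEUE, once on DEFAULT_QUEUE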
docs/demo.py → docs/examples/main_demo.py

File renamed without changes.

docs/examples/multiple_queues.py (new file, +11)

@@ -0,0 +1,11 @@
+from arq import Actor, concurrent
+
+
+class RegistrationEmail(Actor):
+    @concurrent
+    async def email_standard_user(self, user_id):
+        send_user_email(user_id)
+
+    @concurrent(Actor.HIGH_QUEUE)
+    async def email_premium_user(self, user_id):
+        send_user_email(user_id)
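A hedged usage sketch for the new actor: in this version of arq, awaiting a ``@concurrent``-decorated method enqueues the job rather than running it inline, so the two calls below should land on DEFAULT_QUEUE and HIGH_QUEUE respectively. The import path and user ids are illustrative assumptions:

# sketch only -- enqueue jobs on different queues via the RegistrationEmail actor above
import asyncio

from multiple_queues import RegistrationEmail  # assumes the file above is importable


async def enqueue_emails():
    mailer = RegistrationEmail()
    await mailer.email_standard_user(123)  # plain @concurrent -> DEFAULT_QUEUE
    await mailer.email_premium_user(456)   # @concurrent(Actor.HIGH_QUEUE) -> HIGH_QUEUE


loop = asyncio.get_event_loop()
loop.run_until_complete(enqueue_emails())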

docs/examples/worker_customisation.py (new file, +24)

@@ -0,0 +1,24 @@
+from arq import BaseWorker
+
+
+class Worker(BaseWorker):
+    # execute jobs from both Downloader and FooBar above
+    shadows = [Downloader, FooBar]
+
+    # allow lots and lots of jobs to run simultaneously, default 50
+    max_concurrent_tasks = 500
+
+    # force the worker to close quickly after a termination signal is received, default 6
+    shutdown_delay = 2
+
+    # jobs may not take more than 10 seconds, default 60
+    timeout_seconds = 10
+
+    # number of seconds between health checks, default 60
+    health_check_interval = 30
+
+    def logging_config(self, verbose):
+        conf = super().logging_config(verbose)
+        # alter logging setup to set arq.jobs level to WARNING
+        conf['loggers']['arq.jobs']['level'] = 'WARNING'
+        return conf
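For completeness, a sketch of running this customised worker in burst mode, mirroring the construction used in tests/test_doc_example.py below; that BaseWorker exposes a ``run()`` coroutine is an assumption about this version of arq, and the import path is illustrative:

# sketch only -- run the Worker above until the queues are empty (burst mode)
import asyncio

from worker_customisation import Worker  # assumes the file above is importable

loop = asyncio.get_event_loop()
worker = Worker(burst=True, loop=loop)  # same construction as in tests/test_doc_example.py
loop.run_until_complete(worker.run())   # run() assumed to process queued jobs, then return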

docs/usage.rst (+4 -50)

@@ -6,7 +6,7 @@ Usage is best described by example.
 Simple Usage
 ............
 
-.. literalinclude:: demo.py
+.. literalinclude:: examples/main_demo.py
 
 (This script is complete, it should run "as is" both to enqueue jobs and run them)
 
@@ -67,18 +67,7 @@ Multiple Queues
 Functions can be assigned to different queues; by default arq defines three queues:
 ``HIGH_QUEUE``, ``DEFAULT_QUEUE`` and ``LOW_QUEUE`` which are prioritised by the worker in that order.
 
-.. code:: python
-
-    from arq import Actor, concurrent
-
-    class RegistrationEmail(Actor):
-        @concurrent
-        async def email_standard_user(self, user_id):
-            send_user_email(user_id)
-
-        @concurrent(Actor.HIGH_QUEUE)
-        async def email_premium_user(self, user_id):
-            send_user_email(user_id)
+.. literalinclude:: examples/multiple_queues.py
 
 (Just a snippet, won't run "as is")
 
@@ -87,18 +76,7 @@ Direct Enqueuing
 
 Functions can be enqueued directly whether or not they're decorated with ``@concurrent``.
 
-.. code:: python
-
-    from arq import Actor, concurrent
-
-    class FooBar(Actor):
-        async def foo(self, a, b, c):
-            print(a + b + c)
-
-    async def main():
-        foobar = FooBar()
-        await foobar.enqueue_job('foo', 1, 2, c=48, queue=Actor.LOW_QUEUE)
-        await foobar.enqueue_job('foo', 1, 2, c=48)  # this will be queued in DEFAULT_QUEUE
+.. literalinclude:: examples/direct_enqueuing.py
 
 
 (This script is almost complete except for ``loop.run_until_complete(main())`` as above to run ``main``,
@@ -112,31 +90,7 @@ Worker Customisation
 Workers can be customised in numerous ways; this is preferred to command line arguments as it's easier to
 document and record.
 
-.. code:: python
-
-    from arq import BaseWorker
-
-    class Worker(BaseWorker):
-        # execute jobs from both Downloader and FooBar above
-        shadows = [Downloader, FooBar]
-
-        # allow lots and lots of jobs to run simultaneously, default 50
-        max_concurrent_tasks = 500
-
-        # force the worker to close quickly after a termination signal is received, default 6
-        shutdown_delay = 2
-
-        # jobs may not take more than 10 seconds, default 60
-        timeout_seconds = 10
-
-        # number of seconds between health checks, default 60
-        health_check_interval = 30
-
-        def logging_config(self, verbose):
-            conf = super().logging_config(verbose)
-            # alter logging setup to set arq.jobs level to WARNING
-            conf['loggers']['arq.jobs']['level'] = 'WARNING'
-            return conf
+.. literalinclude:: examples/worker_customisation.py
 
 (This script is more-or-less complete,
 provided ``Downloader`` and ``FooBar`` are defined and imported it should run "as is")

tests/test_doc_example.py (+1 -1)

@@ -5,7 +5,7 @@
 
 
 async def test_run_job_burst(redis_conn, loop, caplog):
-    demo = SourceFileLoader('demo', str(THIS_DIR / '../docs/demo.py')).load_module()
+    demo = SourceFileLoader('demo', str(THIS_DIR / '../docs/examples/main_demo.py')).load_module()
     worker = demo.Worker(burst=True, loop=loop)
 
     downloader = demo.Downloader(loop=loop)
