If you make any changes to the async code (in `src/confluent_kafka/schema_registry/_async` and `tests/integration/schema_registry/_async`), you **must** run this script to generate the sync counterparts (in `src/confluent_kafka/schema_registry/_sync` and `tests/integration/schema_registry/_sync`). If you don't, CI will run this script with the `--check` flag and fail the build.
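The script itself is not shown in this diff, but as a hedged illustration of how `unasync`-based generation typically works, here is a minimal sketch that calls the `unasync` library directly. The rule set, replacement table, and file list are hypothetical; only the directory names come from the paragraph above.

```python
# Minimal sketch of unasync-based code generation (hypothetical rules/paths;
# only the directory names are taken from the paragraph above).
import unasync

rules = [
    unasync.Rule(
        fromdir="src/confluent_kafka/schema_registry/_async/",
        todir="src/confluent_kafka/schema_registry/_sync/",
        # Hypothetical token replacements applied on top of async/await stripping.
        additional_replacements={"AsyncSchemaRegistryClient": "SchemaRegistryClient"},
    ),
    unasync.Rule(
        fromdir="tests/integration/schema_registry/_async/",
        todir="tests/integration/schema_registry/_sync/",
    ),
]

# unasync_files() rewrites each listed async source into its sync counterpart,
# removing async/await keywords and applying the replacement tables.
unasync.unasync_files(
    ["src/confluent_kafka/schema_registry/_async/avro.py"],  # example input file
    rules=rules,
)
```
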
-Note: The AsyncIO Producer/Consumer under `src/confluent_kafka/aio/` are first-class asyncio implementations and are not generated using `unasync`.
+Note: The AsyncIO Producer/Consumer under `src/confluent_kafka/experimental/aio/` are first-class asyncio implementations and are not generated using `unasync`.

- **High Performance & Reliability**: Built on [`librdkafka`](https://github.com/confluentinc/librdkafka), the battle-tested C client for Apache Kafka, ensuring maximum throughput, low latency, and stability. The client is supported by Confluent and is trusted in mission-critical production environments.
- **Comprehensive Kafka Support**: Full support for the Kafka protocol, transactions, and administration APIs.
-- **AsyncIO Producer**: A fully asynchronous producer (`AIOProducer`) for seamless integration with modern Python applications using `asyncio`.
+- **Experimental AsyncIO Producer**: An experimental, fully asynchronous producer (`AIOProducer`) for seamless integration with modern Python applications using `asyncio` (see the sketch after this list).
- **Seamless Schema Registry Integration**: Synchronous and asynchronous clients for Confluent Schema Registry to handle schema management and serialization (Avro, Protobuf, JSON Schema).
- **Improved Error Handling**: Detailed, context-aware error messages and exceptions to speed up debugging and troubleshooting.
- **[Confluent Cloud] Automatic Zone Detection**: Producers automatically connect to brokers in the same availability zone, reducing latency and data transfer costs without requiring manual configuration.
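As a rough companion to the experimental producer feature above, here is a minimal sketch of driving it from an `asyncio` application. It assumes, based on the examples later in this diff, that `AIOProducer` takes the usual librdkafka configuration dict and exposes awaitable `produce()` and `flush()` methods; treat it as a sketch rather than the definitive API.

```python
import asyncio

from confluent_kafka.experimental.aio import AIOProducer


async def main():
    # Same configuration dict as the synchronous Producer; "mybroker" is a placeholder.
    producer = AIOProducer({"bootstrap.servers": "mybroker"})
    try:
        # Assumption: produce() is awaitable and completes when delivery is reported.
        await producer.produce("my-topic", key=b"key", value=b"hello from asyncio")
    finally:
        # Assumption: flush() is awaitable and drains any outstanding messages.
        await producer.flush()


if __name__ == "__main__":
    asyncio.run(main())
```
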
@@ -60,7 +60,7 @@ Use the AsyncIO `Producer` inside async applications to avoid blocking the event

```python
import asyncio
-from confluent_kafka.aio import AIOProducer
+from confluent_kafka.experimental.aio import AIOProducer

async def main():
    p = AIOProducer({"bootstrap.servers": "mybroker"})
@@ -97,7 +97,6 @@ For a more detailed example that includes both an async producer and consumer, s

The AsyncIO producer and consumer integrate seamlessly with async Schema Registry serializers. See the [Schema Registry Integration](#schema-registry-integration) section below for full details.

-**Migration Note:** If you previously used custom AsyncIO wrappers, you can now migrate to the official `AIOProducer` which handles thread pool management, callback scheduling, and cleanup automatically. See the [blog post](https://www.confluent.io/blog/kafka-python-asyncio-integration/) for migration guidance.
### Basic Producer example

```python
@@ -178,7 +177,7 @@ producer.flush()
Use the `AsyncSchemaRegistryClient` and `Async` serializers with `AIOProducer` and `AIOConsumer`. The configuration is the same as the synchronous client.

```python
-from confluent_kafka.aio import AIOProducer
+from confluent_kafka.experimental.aio import AIOProducer
from confluent_kafka.schema_registry import AsyncSchemaRegistryClient
from confluent_kafka.schema_registry._async.avro import AsyncAvroSerializer

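# --- Editorial sketch, not part of the hunk above ---
# One way these imports might be wired together end to end. The async serializer
# call signature and the awaitable produce()/flush() methods are assumptions
# about the experimental API, not its documented contract.
from confluent_kafka.serialization import MessageField, SerializationContext

USER_SCHEMA = """
{"type": "record", "name": "User",
 "fields": [{"name": "name", "type": "string"}]}
"""


async def produce_user(user: dict) -> None:
    sr_client = AsyncSchemaRegistryClient({"url": "https://my-schema-registry"})  # placeholder URL
    serializer = AsyncAvroSerializer(sr_client, schema_str=USER_SCHEMA)
    producer = AIOProducer({"bootstrap.servers": "mybroker"})  # placeholder broker

    # Async serializers are awaited; the context says which topic/field the
    # payload belongs to (assumed to mirror the synchronous serializer API).
    value = await serializer(user, SerializationContext("users", MessageField.VALUE))
    await producer.produce("users", value=value)
    await producer.flush()
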
@@ -316,7 +315,7 @@ For source install, see the *Install from source* section in [INSTALL.md](INSTAL
## Broker compatibility

The Python client (as well as the underlying C library librdkafka) supports
-all broker versions >= 0.8.
+all broker versions >= 0.8.
But due to the nature of the Kafka protocol in broker versions 0.8 and 0.9 it
is not safe for a client to assume what protocol version is actually supported
by the broker, thus you will need to hint the Python client what protocol

examples/README.md: 6 additions & 6 deletions

@@ -11,7 +11,7 @@ The scripts in this directory provide various examples of using the Confluent Py

## AsyncIO Examples

-- [asyncio_example.py](asyncio_example.py): Comprehensive AsyncIO example demonstrating both AIOProducer and AIOConsumer with transactional operations, batched async produce, proper event loop integration, signal handling, and async callback patterns.
+- [asyncio_example.py](asyncio_example.py): Experimental, comprehensive AsyncIO example demonstrating both AIOProducer and AIOConsumer with transactional operations, batched async produce, proper event loop integration, signal handling, and async callback patterns.
- [asyncio_avro_producer.py](asyncio_avro_producer.py): Minimal AsyncIO Avro producer using `AsyncSchemaRegistryClient` and `AsyncAvroSerializer` (supports Confluent Cloud using `--sr-api-key`/`--sr-api-secret`).

**Architecture:** For implementation details and component design, see the [AIOProducer Architecture Overview](../aio_producer_simple_diagram.md).
@@ -24,7 +24,7 @@ The AsyncIO producer works seamlessly with popular Python web frameworks:

```python
from fastapi import FastAPI
-from confluent_kafka.aio import AIOProducer
+from confluent_kafka.experimental.aio import AIOProducer
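# --- Editorial sketch, not part of the hunk above ---
# One plausible way to wire AIOProducer into a FastAPI app: create the producer
# at startup, reuse it per request, and flush it on shutdown. The lifespan hook
# is standard FastAPI; the awaitable produce()/flush() calls are assumptions
# about the experimental AIOProducer API, not its documented contract.
from contextlib import asynccontextmanager


@asynccontextmanager
async def lifespan(app: FastAPI):
    # "mybroker" is a placeholder bootstrap server.
    app.state.producer = AIOProducer({"bootstrap.servers": "mybroker"})
    try:
        yield
    finally:
        await app.state.producer.flush()


app = FastAPI(lifespan=lifespan)


@app.post("/events")
async def publish_event(payload: dict):
    # Assumption: produce() is awaitable and resolves once delivery is reported.
    await app.state.producer.produce("events", value=str(payload).encode("utf-8"))
    return {"status": "queued"}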