The following code snippet shows how to use the previous
configuration settings to stream data to MongoDB:
* - ``writeStream.trigger()``

  - Specifies how often the {+connector-short+} writes results
    to the streaming sink. Call this method on the ``DataStreamWriter`` object
    you create from the ``DataStreamReader`` you configure.

    To use continuous processing, pass the function a time value
    using the ``continuous`` parameter.

    To view a list of all supported processing policies, see
    the `pyspark trigger documentation <https://spark.apache.org/docs/latest/api/python/reference/pyspark.ss/api/pyspark.sql.streaming.DataStreamWriter.trigger.html>`__.

The following code snippet shows how to use the previous
configuration settings to stream data to MongoDB:
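The snippet itself did not survive extraction here. As a minimal pyspark sketch of the configuration described above, assuming placeholder values throughout: the connection URI, database name, collection name, checkpoint path, and connector package version are all illustrative, and the built-in ``rate`` source stands in for a real streaming source.

```python
from pyspark.sql import SparkSession

# Placeholder app name, connector package version, URI, database,
# collection, and checkpoint path -- adjust for your deployment.
spark = (
    SparkSession.builder
    .appName("streamToMongoDB")
    .config("spark.jars.packages",
            "org.mongodb.spark:mongo-spark-connector_2.12:10.2.1")
    .getOrCreate()
)

# Configure a DataStreamReader; the rate source stands in for a real stream.
stream_df = spark.readStream.format("rate").load()

# Calling writeStream on the streaming DataFrame returns the
# DataStreamWriter that trigger() is invoked on.
query = (
    stream_df.writeStream
    .format("mongodb")
    .option("checkpointLocation", "/tmp/pyspark/")
    .option("spark.mongodb.connection.uri", "mongodb://localhost:27017")
    .option("spark.mongodb.database", "example_db")
    .option("spark.mongodb.collection", "example_coll")
    .outputMode("append")
    .trigger(continuous="1 second")  # continuous processing via the continuous parameter
    .start()
)

# query.awaitTermination()  # blocks until the stream stops; stop with query.stop()
```

The same pipeline can use micro-batch processing instead by passing a different trigger, for example ``.trigger(processingTime="10 seconds")``.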
* - ``writeStream.trigger()``

  - Specifies how often the {+connector-short+} writes results
    to the streaming sink. Call this method on the ``DataStreamWriter`` object
    you create from the ``DataStreamReader`` you configure.

    To use continuous processing, pass ``Trigger.Continuous(<time value>)``
    as an argument, where ``<time value>`` is how often you want the Spark

    To view a list of all
    supported processing policies, see the `Scala trigger documentation <https://spark.apache.org/docs/latest/api/scala/org/apache/spark/sql/streaming/DataStreamWriter.html#trigger(trigger:org.apache.spark.sql.streaming.Trigger):org.apache.spark.sql.streaming.DataStreamWriter[T]>`__.

The following code snippet shows how to use the previous
configuration settings to stream data to MongoDB: