Overview
In batch mode, you can use the Spark Dataset and DataFrame APIs to process data at a specified time interval.
The following sections show you how to use the Spark Connector to read data from MongoDB and write data to MongoDB in batch mode.
Tip
To learn more about using Spark to process batches of data, see the Spark Programming Guide in the Apache Spark documentation.
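The basic batch read-and-write pattern looks like the following minimal sketch. It assumes Spark Connector 10.x-style configuration keys (`spark.mongodb.read.connection.uri` and `spark.mongodb.write.connection.uri`), a local deployment at `mongodb://127.0.0.1`, and hypothetical database, collection, and field names; adjust these for your environment and connector version.

```python
from pyspark.sql import SparkSession

# Hypothetical connection URI and names; substitute values for your
# own deployment. The Spark Connector package must be on the classpath.
spark = (
    SparkSession.builder
    .appName("mongodb-batch-sketch")
    .config("spark.mongodb.read.connection.uri", "mongodb://127.0.0.1")
    .config("spark.mongodb.write.connection.uri", "mongodb://127.0.0.1")
    .getOrCreate()
)

# Batch read: load an entire collection into a DataFrame.
df = (
    spark.read.format("mongodb")
    .option("database", "people")
    .option("collection", "contacts")
    .load()
)

# Process the batch with ordinary DataFrame operations
# ("age" is a hypothetical field in the source documents).
adults = df.filter(df["age"] >= 18)

# Batch write: append the results to another collection.
(
    adults.write.format("mongodb")
    .mode("append")
    .option("database", "people")
    .option("collection", "adult_contacts")
    .save()
)

spark.stop()
```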