DOCSP-31186 Add sections for connections #243

Merged

Conversation

@shuangela (Contributor) commented Mar 5, 2025

Pull Request Info

PR Reviewing Guidelines

JIRA - https://jira.mongodb.org/browse/DOCSP-31186

Staging Links

  • getting-started
  Self-Review Checklist

    • Is this free of any warnings or errors in the RST?
    • Did you run a spell-check?
    • Did you run a grammar-check?
    • Are all the links working?
    • Are the facets and meta keywords accurate?


    netlify bot commented Mar 5, 2025

    Deploy Preview for docs-spark-connector ready!

    Latest commit: 333242c
    Latest deploy log: https://app.netlify.com/sites/docs-spark-connector/deploys/67c9c00b74eca500099622e8
    Deploy Preview: https://deploy-preview-243--docs-spark-connector.netlify.app

    @mongoKart (Contributor) left a comment


    nice work! just a few small changes.

    Integrations
    ------------

    You can integrate Spark with third-party platforms to use the {+connector-long+} in various external platforms.

    i: repetitive use of 'platforms'

    Suggested change
    You can integrate Spark with third-party platforms to use the {+connector-long+} in various external platforms.
    The following sections describe some of the popular third-party platforms that you can
    integrate Spark and the {+connector-long+} with.

    Amazon EMR
    ~~~~~~~~~~

    Amazon EMR is a managed cluster platform that you can run big data frameworks such as Spark on. To install Spark on an EMR cluster, see

    i: such as

    Suggested change
    Amazon EMR is a managed cluster platform that you can run big data frameworks such as Spark on. To install Spark on an EMR cluster, see
    Amazon EMR is a managed cluster platform that you can use to run big data frameworks like Spark. To install Spark on an EMR cluster, see
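
    For readers following along, here is a minimal PySpark sketch of what using the connector can look like once Spark is available on an EMR cluster. It is not part of this PR; the connector Maven coordinates, version number, and connection string below are assumptions to adjust for your environment.

        # Minimal sketch: configure a Spark session to pull the MongoDB Spark Connector
        # from Maven and read a collection. The artifact version and URI are placeholders.
        from pyspark.sql import SparkSession

        spark = (
            SparkSession.builder
            .appName("mongodb-spark-emr-example")
            # Assumed connector coordinates; pin the version that matches your cluster.
            .config("spark.jars.packages",
                    "org.mongodb.spark:mongo-spark-connector_2.12:10.4.1")
            # Placeholder connection string; replace with your deployment's URI.
            .config("spark.mongodb.read.connection.uri",
                    "mongodb+srv://<user>:<password>@<cluster-url>")
            .getOrCreate()
        )

        # Read a collection into a DataFrame using the connector's "mongodb" source.
        df = (
            spark.read.format("mongodb")
            .option("database", "test")
            .option("collection", "myCollection")
            .load()
        )
        df.printSchema()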

    Comment on lines 68 to 73
    Docker is an open-source platform that helps developers build, share, and run applications in containers. The following steps guide you through the process of connecting to a Docker container
    and integrating the {+connector-long+} in Docker.

    1. To start Spark in a Docker container, see `Apache Spark <https://hub.docker.com/r/apache/spark#!>`__ in the Docker documentation and follow the steps provided.
    #. See `Create a Local Atlas Deployment with Docker <https://www.mongodb.com/docs/atlas/cli/current/atlas-cli-deploy-docker/>`__ to deploy Atlas on Docker.
    #. Select the appropriate language in the tabs under :ref:`Getting Started <pyspark-shell>` and follow the steps provided.

    i: 'The following steps' suggest this is going to be a procedure, which feels out of place with the rest of these sections. I would probably just break it into separate links (and the last step is hopefully already understood, so I would leave it out):

    Suggested change
    Docker is an open-source platform that helps developers build, share, and run applications in containers. The following steps guide you through the process of connecting to a Docker container
    and integrating the {+connector-long+} in Docker.
    1. To start Spark in a Docker container, see `Apache Spark <https://hub.docker.com/r/apache/spark#!>`__ in the Docker documentation and follow the steps provided.
    #. See `Create a Local Atlas Deployment with Docker <https://www.mongodb.com/docs/atlas/cli/current/atlas-cli-deploy-docker/>`__ to deploy Atlas on Docker.
    #. Select the appropriate language in the tabs under :ref:`Getting Started <pyspark-shell>` and follow the steps provided.
    Docker is an open-source platform that helps developers build, share, and run applications in containers.
    - To learn how to start Spark in a Docker container, see `Apache Spark <https://hub.docker.com/r/apache/spark#!>`__ in the Docker documentation.
    - To learn how to deploy Atlas on Docker, see `Create a Local Atlas Deployment with Docker <https://www.mongodb.com/docs/atlas/cli/current/atlas-cli-deploy-docker/>`__.

    @shuangela (Author) replied:

    That's a great suggestion! I agree steps felt odd but I wasn't sure how to break out the instructions without having a long paragraph. The bullet points are a good idea 😄
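
    To make the Docker workflow above concrete, here is a rough PySpark sketch that writes to and then reads from a local Atlas deployment running in Docker. It is illustrative only: the localhost URI, port, and connector version are assumptions (the Atlas CLI prints the actual connection string when it creates the deployment).

        from pyspark.sql import SparkSession

        # Placeholder URI for a local Atlas deployment created with the Atlas CLI;
        # substitute the host and port that the CLI reports for your deployment.
        local_uri = "mongodb://localhost:27017/?directConnection=true"

        spark = (
            SparkSession.builder
            .appName("docker-local-atlas-example")
            # Assumed connector coordinates and version.
            .config("spark.jars.packages",
                    "org.mongodb.spark:mongo-spark-connector_2.12:10.4.1")
            .config("spark.mongodb.write.connection.uri", local_uri)
            .config("spark.mongodb.read.connection.uri", local_uri)
            .getOrCreate()
        )

        # Write a small DataFrame, then read it back through the connector.
        people = spark.createDataFrame([("Ada", 36), ("Grace", 45)], ["name", "age"])
        (
            people.write.format("mongodb")
            .mode("append")
            .option("database", "test")
            .option("collection", "people")
            .save()
        )

        df = (
            spark.read.format("mongodb")
            .option("database", "test")
            .option("collection", "people")
            .load()
        )
        df.show()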

    @shuangela requested a review from mongoKart March 6, 2025 15:14
    @mongoKart (Contributor) left a comment


    1 SG fix, then LGTM!

    #. See `Create a Local Atlas Deployment with Docker <https://www.mongodb.com/docs/atlas/cli/current/atlas-cli-deploy-docker/>`__ to deploy Atlas on Docker.
    #. Select the appropriate language in the tabs under :ref:`Getting Started <pyspark-shell>` and follow the steps provided.
    - To start Spark in a Docker container, see `Apache Spark <https://hub.docker.com/r/apache/spark#!>`__ in the Docker documentation and follow the steps provided.
    - See `Create a Local Atlas Deployment with Docker <https://www.mongodb.com/docs/atlas/cli/current/atlas-cli-deploy-docker/>`__ to deploy Atlas on Docker.

    SG: 'Link text should begin with the purpose of the cross-reference'

    Suggested change
    - See `Create a Local Atlas Deployment with Docker <https://www.mongodb.com/docs/atlas/cli/current/atlas-cli-deploy-docker/>`__ to deploy Atlas on Docker.
    - To learn how to deploy Atlas on Docker, see `Create a Local Atlas Deployment with Docker <https://www.mongodb.com/docs/atlas/cli/current/atlas-cli-deploy-docker/>`__.

    @shuangela merged commit 75df8c4 into mongodb:master Mar 6, 2025
    6 checks passed
    shuangela added a commit to shuangela/docs-spark-connector that referenced this pull request Mar 6, 2025
    * add sections for connections
    
    * fix vale
    
    * fix vale flag
    
    * fix ref
    
    * fix typos
    
    * change steps formatting
    
    * change based on mikes feedback
    
    * fix vale error
    
    * change based on feedback
    
    (cherry picked from commit 75df8c4)
    @shuangela (Author) commented:

    💔 Some backports could not be created

    Status Branch Result
    v10.4
    v10.3
    v10.2 Conflict resolution was aborted by the user
    v10.1 Conflict resolution was aborted by the user
    v10.0 Conflict resolution was aborted by the user

    Note: Successful backport PRs will be merged automatically after passing CI.

    Manual backport

    To create the backport manually, run:

    backport --pr 243
    

    Questions?

    Please refer to the Backport tool documentation

    shuangela added a commit that referenced this pull request Mar 6, 2025
    * add sections for connections
    
    * fix vale
    
    * fix vale flag
    
    * fix ref
    
    * fix typos
    
    * change steps formatting
    
    * change based on mikes feedback
    
    * fix vale error
    
    * change based on feedback
    
    (cherry picked from commit 75df8c4)
    shuangela added a commit that referenced this pull request Mar 6, 2025
    * DOCSP-29861: Cleanup unused files (#200)
    
    (cherry picked from commit 05f3125)
    
    * DOCSP-40130 - Note on Sharded Partitioner (#201)
    
    Co-authored-by: Nora Reidy <[email protected]>
    (cherry picked from commit e25e13f)
    
    * Add Netlify config files via upload
    
    * DOCSP-42969 - remove nested admonitions (#204) (#208)
    
    (cherry picked from commit b37226e)
    
    Co-authored-by: Mike Woofter <[email protected]>
    
    * DOCSP-44953 TOC Relabel (#214) (#217)
    
    * DOCSP-44953 TOC Relabel
    
    * edit
    
    * Mike's Suggestions
    
    * keep configure tls
    
    (cherry picked from commit 987747d)
    
    Co-authored-by: lindseymoore <[email protected]>
    
    * (DOCSP-45749) Denests last phase 1 nested component for spark connector (#220) (#222)
    
    * Add dependency installation to vale rule
    
    (cherry picked from commit 989f378)
    
    * [Spark] Remove autobuilder (#227) (#230)
    
    (cherry picked from commit ca7f722)
    
    Co-authored-by: Rachel Mackintosh <[email protected]>
    
    * DOCSP-46719 Spark Guide 404 (#235) (#239)
    
    (cherry picked from commit 671e355)
    
    Co-authored-by: lindseymoore <[email protected]>
    
    * DOCSP-31186 Add sections for connections (#243) (#248)
    
    * add sections for connections
    
    * fix vale
    
    * fix vale flag
    
    * fix ref
    
    * fix typos
    
    * change steps formatting
    
    * change based on mikes feedback
    
    * fix vale error
    
    * change based on feedback
    
    (cherry picked from commit 75df8c4)
    
    ---------
    
    Co-authored-by: Michael Morisi <[email protected]>
    Co-authored-by: Mike Woofter <[email protected]>
    Co-authored-by: anabellabuckvar <[email protected]>
    Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
    Co-authored-by: lindseymoore <[email protected]>
    Co-authored-by: Sarah Simpers <[email protected]>
    Co-authored-by: Rea Rustagi <[email protected]>
    Co-authored-by: Rachel Mackintosh <[email protected]>