This repository has been archived by the owner on Apr 27, 2022. It is now read-only.

configure shared secrets by default #138

Open
willb opened this issue Nov 19, 2018 · 0 comments

willb (Member) commented Nov 19, 2018

There is a remote code execution vulnerability in the Spark master (CVE-2018-17190). This is not urgent for our deployments (if you can run arbitrary code in an OpenShift project, you don't need to exploit the Spark master to do it), but the workaround (enabling authentication) is fairly simple.

We should generate a shared secret on cluster creation and set spark.authenticate to true.
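A minimal sketch of what cluster creation could do (the two property names are Spark's documented authentication settings; the helper function and its use of Python's `secrets` module are illustrative, not the project's actual implementation):

```python
import secrets

def generate_spark_auth_config():
    """Generate a shared secret and the Spark properties that enable
    RPC authentication -- the documented workaround for CVE-2018-17190."""
    # 32 random bytes rendered as 64 hex characters; cryptographically strong
    secret = secrets.token_hex(32)
    return {
        "spark.authenticate": "true",
        "spark.authenticate.secret": secret,
    }

# The resulting properties would be written into spark-defaults.conf
# (or injected as environment/config for the master and workers):
conf = generate_spark_auth_config()
for key, value in conf.items():
    print(f"{key} {value}")
```

Both the master and all workers must share the same secret, so it should be generated once per cluster and distributed to every node's configuration.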

references:
