From ebea73097dbc574a8034a5711e925860ed1055ed Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E8=B0=AD=E4=B9=9D=E9=BC=8E?= <109224573@qq.com>
Date: Thu, 6 Mar 2025 20:37:39 +0800
Subject: [PATCH] [MINOR][DOCS] IP -> HOST

---
 docs/spark-standalone.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index 5a8eb3f1e0602..41cf82ad9bc87 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -547,12 +547,12 @@ Note, the user does not need to specify a discovery script when submitting an ap
 
 # Connecting an Application to the Cluster
 
-To run an application on the Spark cluster, simply pass the `spark://IP:PORT` URL of the master as to the [`SparkContext`
+To run an application on the Spark cluster, simply pass the `spark://HOST:PORT` URL of the master to the [`SparkContext`
 constructor](rdd-programming-guide.html#initializing-spark).
 
 To run an interactive Spark shell against the cluster, run the following command:
 
-    ./bin/spark-shell --master spark://IP:PORT
+    ./bin/spark-shell --master spark://HOST:PORT
 
 You can also pass an option `--total-executor-cores <numCores>` to control the number of cores that
 spark-shell uses on the cluster.
@@ -649,7 +649,7 @@ via http://[host:port]/[version]/submissions/[action] where
 The following is a curl CLI command example with the `pi.py` and REST API.
 
 ```bash
-$ curl -XPOST http://IP:PORT/v1/submissions/create \
+$ curl -XPOST http://HOST:PORT/v1/submissions/create \
 --header "Content-Type:application/json;charset=UTF-8" \
 --data '{
   "appResource": "",
@@ -686,7 +686,7 @@ When Spark master requires HTTP Authorization header via configurations,
 curl CLI command can provide the required header like the following.
 
 ```bash
-$ curl -XPOST http://IP:PORT/v1/submissions/create \
+$ curl -XPOST http://HOST:PORT/v1/submissions/create \
 --header "Authorization: Bearer USER-PROVIDED-WEB-TOKEN-SIGNED-BY-THE-SAME-SHARED-KEY"
 ...
 ```
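
For reviewers, a minimal sketch of the usage the changed docs describe: passing the standalone master's `spark://HOST:PORT` URL to the `SparkContext` constructor from an application. The hostname `spark-master.example.com`, port `7077`, and the app name are placeholders chosen for illustration, not values taken from the patch; a PySpark application is assumed.

```python
# Sketch only: connect an application to a standalone master by passing its
# spark://HOST:PORT URL to the SparkContext constructor.
# "spark-master.example.com:7077" is a placeholder; substitute the HOST:PORT
# shown on your master's web UI.
from pyspark import SparkContext

sc = SparkContext(master="spark://spark-master.example.com:7077",
                  appName="standalone-connect-example")

print(sc.master)  # prints the master URL the application connected with
sc.stop()
```

The interactive equivalent is the `./bin/spark-shell --master spark://HOST:PORT` command already shown in the updated section.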