[Installation & Deployment] DSS 1.1.0 + LINKIS 1.1.1 and Related Component Configuration Practice #3628
This article is intended as a configuration reference for newcomers installing DSS and LINKIS; it is not a complete installation and deployment document.
1. Environment
CDH 5.16.1 with Kerberos enabled
Hive 1.1.0
Hadoop 2.6.0
Python 2.7
Apache Spark 2.4.3
2. Deployment Plan
192.168.1.125: nginx, DSS, and all LINKIS services except linkis-cg-engineconnmanager; deployed as the hadoop user.
192.168.1.129: the eureka and linkis-cg-engineconnmanager services; deployed as the hadoop user.
192.168.1.130: the MySQL database, which stores the hive metastore and the DSS repository.
3. Configuration File Download
20221012.zip
4. Configuration Files on 192.168.1.125
/home/hadoop/.bash_profile
Note: this file sets the user environment variables required by the deployment.
export JAVA_HOME=/usr/java/jdk1.8.0_181-cloudera
export SPARK_HOME=/appcom/spark-2.4.3-bin-hadoop2.6
export SPARK_CONF_DIR=/appcom/spark-2.4.3-bin-hadoop2.6/conf
export PYSPARK_ALLOW_INSECURE_GATEWAY=1
export HIVE_HOME=/opt/cloudera/parcels/CDH/lib/hive
export HIVE_CONF_DIR=/etc/hive/conf
export HADOOP_HOME=/opt/cloudera/parcels/CDH/lib/hadoop
export HADOOP_CONF_DIR=/etc/hadoop/conf
export SQOOP_HOME=/opt/cloudera/parcels/CDH-5.16.1-1.cdh5.16.1.p0.3/bin/sqoop
export SQOOP_CONF_DIR=/data/appcom/install/Exchangis/sqoop/sqoop/dist/v1.4.6/conf
export MAVEN_HOME=/data/appcom/install/apache-maven-3.8.4
export PATH=$JAVA_HOME/bin:$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin:$MAVEN_HOME/bin
export EXCHANGIS_HOME=/data/appcom/install/exchangis
export ENABLE_METADATA_QUERY=true
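As a quick sanity check after editing this file, the variables can be reloaded and spot-checked in the hadoop user's shell (a minimal sketch; the paths echoed are the ones exported above):

```bash
# Reload the profile in the current shell and verify the key variables
source ~/.bash_profile
echo "JAVA_HOME=$JAVA_HOME"
echo "SPARK_HOME=$SPARK_HOME"
echo "HADOOP_CONF_DIR=$HADOOP_CONF_DIR"
# Client binaries should now resolve from PATH
which java spark-submit
hadoop version
```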
/etc/nginx/conf.d/dss.conf
Note: this is the nginx configuration file for the DSS application; it must be included from the main nginx configuration file.
server {
    listen       8089;   # access port
    server_name  localhost;
    #charset koi8-r;
    #access_log  /var/log/nginx/host.access.log  main;
}
/etc/nginx/conf.d/exchangis.conf
Note: this is the nginx configuration file for the exchangis application; it must be included from the main nginx configuration file.
server {
    listen       8098;
    server_name  localhost;
    #charset koi8-r;
    #access_log  /var/log/nginx/host.access.log  main;
}
/usr/local/nginx/conf/nginx.conf
Note: this is the main nginx configuration file; the two application configuration files above must be included here.
http {
    include /etc/nginx/conf.d/dss.conf;
    include /etc/nginx/conf.d/exchangis.conf;
    include       mime.types;
    default_type  application/octet-stream;
    sendfile        on;
    keepalive_timeout  65;
}
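After adding the two include lines, the merged configuration can be validated and hot-reloaded without restarting nginx (standard nginx commands; adjust the binary path if your nginx lives elsewhere):

```bash
# Validate the configuration, then reload it
/usr/local/nginx/sbin/nginx -t
/usr/local/nginx/sbin/nginx -s reload
# Confirm both frontends are listening
ss -lntp | grep -E ':8089|:8098'
```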
/data/appcom/install/conf/config.sh
Note: this is the configuration file for the one-click installation of the full suite. The key settings are the service installation IPs, the Yarn address, the plugin versions, and the user working directories. WORKSPACE_USER_ROOT_PATH is the workspace root directory, mainly used for scripts; the hadoop user needs read, write, and execute permission on it.
HDFS_USER_ROOT_PATH is the HDFS user directory, mainly used for runtime files; the hadoop user needs read and write permission on it.
RESULT_SET_ROOT_PATH is the HDFS temporary directory, mainly used for job result sets; the hadoop user needs read and write permission on it.
ENGINECONN_ROOT_PATH is a local directory that stores engine execution logs; the hadoop user needs read and write permission on it.
ENTRANCE_CONFIG_LOG_PATH is an HDFS directory that stores engine configuration; the hadoop user needs read and write permission on it.
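Before running the installer, these directories should exist with the right ownership. A minimal preparation sketch, assuming the hadoop user can run HDFS commands (with Kerberos enabled, kinit as hadoop first):

```bash
# Local directories: workspace root and engine log root
mkdir -p /data/tmp/linkis /data/appcom/tmp
chown -R hadoop:hadoop /data/tmp/linkis /data/appcom/tmp

# HDFS directories: runtime files, result sets, and entrance logs all live under /tmp/linkis here
hdfs dfs -mkdir -p /tmp/linkis
hdfs dfs -chown -R hadoop:hadoop /tmp/linkis
hdfs dfs -chmod -R 775 /tmp/linkis
```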
# deploy user
deployUser=hadoop
# Linkis_VERSION
LINKIS_VERSION=1.1.1
# DSS Web
DSS_NGINX_IP=192.168.1.125
DSS_WEB_PORT=8089
# DSS VERSION
DSS_VERSION=1.1.0
# Specifies the user workspace, which is used to store the user's script files and log files.
# Generally a local directory
##file:// required
WORKSPACE_USER_ROOT_PATH=file:///data/tmp/linkis/
# User's root hdfs path
##hdfs:// required
HDFS_USER_ROOT_PATH=hdfs:///tmp/linkis
# Path to store job ResultSet: file or hdfs path
##hdfs:// required
RESULT_SET_ROOT_PATH=hdfs:///tmp/linkis
# Path to store started engines and engine logs, must be local
ENGINECONN_ROOT_PATH=/data/appcom/tmp
ENTRANCE_CONFIG_LOG_PATH=hdfs:///tmp/linkis/ ##hdfs:// required
###HADOOP CONF DIR #/appcom/config/hadoop-config
HADOOP_CONF_DIR=/etc/hadoop/conf
###HIVE CONF DIR #/appcom/config/hive-config
HIVE_CONF_DIR=/etc/hive/conf
###SPARK CONF DIR #/appcom/config/spark-config
SPARK_CONF_DIR=/appcom/spark-2.4.3-bin-hadoop2.6/conf
# for install
LINKIS_PUBLIC_MODULE=lib/linkis-commons/public-module
##YARN REST URL spark engine required
YARN_RESTFUL_URL=http://192.168.1.130:8088/
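YARN_RESTFUL_URL must point at the active ResourceManager web endpoint; this is easy to verify up front with the standard YARN REST API:

```bash
# Should return cluster info as JSON if the ResourceManager is reachable
curl -s http://192.168.1.130:8088/ws/v1/cluster/info
```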
# Engine version conf
#SPARK_VERSION
SPARK_VERSION=2.4.3
##HIVE_VERSION
HIVE_VERSION=2.3.3
PYTHON_VERSION=python2
# EUREKA install information
# You can access it in your browser at the address below: http://${EUREKA_INSTALL_IP}:${EUREKA_PORT}
# Microservices Service Registration Discovery Center
LINKIS_EUREKA_INSTALL_IP=192.168.1.125
LINKIS_EUREKA_PORT=9600
#LINKIS_EUREKA_PREFER_IP=true
# Gateway install information
LINKIS_GATEWAY_INSTALL_IP=192.168.1.125
LINKIS_GATEWAY_PORT=9001
# ApplicationManager
LINKIS_MANAGER_INSTALL_IP=192.168.1.125
LINKIS_MANAGER_PORT=9101
# EngineManager
LINKIS_ENGINECONNMANAGER_INSTALL_IP=192.168.1.125
LINKIS_ENGINECONNMANAGER_PORT=9102
# EnginePluginServer
LINKIS_ENGINECONN_PLUGIN_SERVER_INSTALL_IP=192.168.1.125
LINKIS_ENGINECONN_PLUGIN_SERVER_PORT=9103
# LinkisEntrance
LINKIS_ENTRANCE_INSTALL_IP=192.168.1.125
LINKIS_ENTRANCE_PORT=9104
# publicservice
LINKIS_PUBLICSERVICE_INSTALL_IP=192.168.1.125
LINKIS_PUBLICSERVICE_PORT=9105
# cs
LINKIS_CS_INSTALL_IP=192.168.1.125
LINKIS_CS_PORT=9108
# DSS_SERVER
# This service is used to provide dss-server capability.
# project-server
DSS_FRAMEWORK_PROJECT_SERVER_INSTALL_IP=192.168.1.125
DSS_FRAMEWORK_PROJECT_SERVER_PORT=9002
# orchestrator-server
DSS_FRAMEWORK_ORCHESTRATOR_SERVER_INSTALL_IP=192.168.1.125
DSS_FRAMEWORK_ORCHESTRATOR_SERVER_PORT=9003
# apiservice-server
DSS_APISERVICE_SERVER_INSTALL_IP=192.168.1.125
DSS_APISERVICE_SERVER_PORT=9004
# dss-workflow-server
DSS_WORKFLOW_SERVER_INSTALL_IP=192.168.1.125
DSS_WORKFLOW_SERVER_PORT=9005
# dss-flow-execution-server
DSS_FLOW_EXECUTION_SERVER_INSTALL_IP=192.168.1.125
DSS_FLOW_EXECUTION_SERVER_PORT=9006
###dss-scriptis-server
DSS_SCRIPTIS_SERVER_INSTALL_IP=192.168.1.125
DSS_SCRIPTIS_SERVER_PORT=9008
###dss-data-api-server
DSS_DATA_API_SERVER_INSTALL_IP=192.168.1.125
DSS_DATA_API_SERVER_PORT=9208
###dss-data-governance-server
DSS_DATA_GOVERNANCE_SERVER_INSTALL_IP=192.168.1.125
DSS_DATA_GOVERNANCE_SERVER_PORT=9209
###dss-guide-server
DSS_GUIDE_SERVER_INSTALL_IP=192.168.1.125
DSS_GUIDE_SERVER_PORT=9210
# java application default jvm memory
export SERVER_HEAP_SIZE="512M"
## sendemail configuration; it only affects the send-mail feature in DSS workflows
EMAIL_HOST=
EMAIL_PORT=25
EMAIL_USERNAME=
EMAIL_PASSWORD=
EMAIL_PROTOCOL=smtp
# Save the file path exported by the orchestrator service
ORCHESTRATOR_FILE_PATH=/data/appcom/tmp/dss
# Save the DSS flow execution service log path
EXECUTION_LOG_PATH=/data/appcom/tmp/dss
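With config.sh and db.sh filled in, the one-click installation is typically launched from the unpacked package directory (a hedged sketch; the script names below follow the DSS+Linkis all-in-one package layout, so confirm them against the package you downloaded):

```bash
cd /data/appcom/install
# Run the one-click installer, then bring all services up
sh bin/install.sh
sh bin/start-all.sh
```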
/data/appcom/install/conf/db.sh
Note: this file configures the database connections used by the one-click installation.
# for DSS-Server and Eventchecker APPJOINT
MYSQL_HOST=192.168.1.130
MYSQL_PORT=3306
MYSQL_DB=linkis
MYSQL_USER=linkis
MYSQL_PASSWORD=
### hive configuration
HIVE_HOST=192.168.1.130
HIVE_PORT=3306
HIVE_DB=metastore
HIVE_USER=hive
HIVE_PASSWORD=
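Connectivity to both databases can be confirmed from the deploy host before installing (the passwords are left blank above, so supply them interactively; hosts and accounts are the ones configured in this file):

```bash
# DSS/Linkis repository
mysql -h 192.168.1.130 -P 3306 -u linkis -p -e 'USE linkis; SELECT 1;'
# Hive metastore (used by DataChecker)
mysql -h 192.168.1.130 -P 3306 -u hive -p -e 'USE metastore; SELECT 1;'
```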
/data/appcom/install/dss/config/config.sh
Note: this is the DSS installation configuration file; the install script reads it and writes the settings into the repository. Pay particular attention to the INSTALL_IP settings.
# deploy user
deployUser=hadoop
# max memory for services
SERVER_HEAP_SIZE=512M
# The install home path of DSS, must be provided
DSS_INSTALL_HOME=/data/appcom/install/dss
DSS_VERSION=1.1.0
DSS_FILE_NAME=dss-1.1.0
# Linkis EUREKA information. Microservices Service Registration Discovery Center
EUREKA_INSTALL_IP=192.168.1.125
EUREKA_PORT=9600
# If EUREKA has safety verification, please fill in username and password
#EUREKA_USERNAME=
#EUREKA_PASSWORD=
# Linkis Gateway information
GATEWAY_INSTALL_IP=192.168.1.125
GATEWAY_PORT=9001
# DSS_SERVER
# This service is used to provide dss-server capability.
# project-server
DSS_FRAMEWORK_PROJECT_SERVER_INSTALL_IP=192.168.1.125
DSS_FRAMEWORK_PROJECT_SERVER_PORT=9002
# orchestrator-server
DSS_FRAMEWORK_ORCHESTRATOR_SERVER_INSTALL_IP=192.168.1.125
DSS_FRAMEWORK_ORCHESTRATOR_SERVER_PORT=9003
# apiservice-server
DSS_APISERVICE_SERVER_INSTALL_IP=192.168.1.125
DSS_APISERVICE_SERVER_PORT=9004
# dss-workflow-server
DSS_WORKFLOW_SERVER_INSTALL_IP=192.168.1.125
DSS_WORKFLOW_SERVER_PORT=9005
# dss-flow-execution-server
DSS_FLOW_EXECUTION_SERVER_INSTALL_IP=192.168.1.125
DSS_FLOW_EXECUTION_SERVER_PORT=9006
###dss-scriptis-server
DSS_SCRIPTIS_SERVER_INSTALL_IP=192.168.1.125
DSS_SCRIPTIS_SERVER_PORT=9008
###dss-data-api-server
DSS_DATA_API_SERVER_INSTALL_IP=192.168.1.125
DSS_DATA_API_SERVER_PORT=9208
###dss-data-governance-server
DSS_DATA_GOVERNANCE_SERVER_INSTALL_IP=192.168.1.125
DSS_DATA_GOVERNANCE_SERVER_PORT=9209
###dss-guide-server
DSS_GUIDE_SERVER_INSTALL_IP=192.168.1.125
DSS_GUIDE_SERVER_PORT=9210
############## ############## dss_appconn_instance configuration start ############## ##############
#### Address of the eventchecker table; generally the DSS database
EVENTCHECKER_JDBC_URL=jdbc:mysql://192.168.1.130:3306/linkis?characterEncoding=UTF-8
EVENTCHECKER_JDBC_USERNAME=linkis
EVENTCHECKER_JDBC_PASSWORD=
# hive address
DATACHECKER_JOB_JDBC_URL=jdbc:mysql://192.168.1.130:3306/metastore?useUnicode=true
DATACHECKER_JOB_JDBC_USERNAME=hive
DATACHECKER_JOB_JDBC_PASSWORD=
# Metadata database; can be configured the same as DATACHECKER_JOB
DATACHECKER_BDP_JDBC_URL=jdbc:mysql://192.168.1.130:3306/metastore?useUnicode=true
DATACHECKER_BDP_JDBC_USERNAME=hive
DATACHECKER_BDP_JDBC_PASSWORD=
EMAIL_HOST=
EMAIL_PORT=25
EMAIL_USERNAME=
EMAIL_PASSWORD=
EMAIL_PROTOCOL=smtp
/data/appcom/install/dss/config/db.sh
Note: this file configures the DSS repository database.
# for DSS-Server and Eventchecker APPCONN
MYSQL_HOST=192.168.1.130
MYSQL_PORT=3306
MYSQL_DB=linkis
MYSQL_USER=linkis
MYSQL_PASSWORD=
/data/appcom/install/dss/conf/application-dss.yml
Note: this is the DSS Spring Boot configuration; the key setting is the eureka registry address. A multi-active (multi-instance) Eureka setup is recommended.
eureka:
  client:
    serviceUrl:
      defaultZone: http://192.168.1.125:9600/eureka/,http://192.168.1.129:9600/eureka/
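Once the services are up, registration with both Eureka instances can be checked over HTTP (the /eureka/apps endpoint is standard Eureka REST; the DSS and Linkis microservices should all appear in the output):

```bash
# List the application names registered with each Eureka instance
curl -s -H 'Accept: application/json' http://192.168.1.125:9600/eureka/apps | grep -o '"name":"[^"]*"' | sort -u
curl -s -H 'Accept: application/json' http://192.168.1.129:9600/eureka/apps | grep -o '"name":"[^"]*"' | sort -u
```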
/data/appcom/install/dss/conf/dss.properties
Note: this is the main DSS configuration file.
wds.linkis.gateway.ip=192.168.1.125
wds.linkis.gateway.port=9001
wds.linkis.gateway.url=http://192.168.1.125:9001/
wds.linkis.gateway.wtss.url=http://192.168.1.125:9001/
wds.linkis.mysql.is.encrypt=false
wds.linkis.server.mybatis.datasource.url=jdbc:mysql://192.168.1.130:3306/linkis?characterEncoding=UTF-8
wds.linkis.server.mybatis.datasource.username=linkis
wds.linkis.server.mybatis.datasource.password=
wds.dss.esb.appid=
wds.dss.esb.token=
wds.dss.appconn.scheduler.job.label=dev
wds.linkis.reflect.scan.package=org.apache.linkis,com.webank.wedatasphere.dss
spring.spring.mvc.servlet.path=/api/rest_j/v1
spring.spring.servlet.multipart.max-file-size=200MB
spring.spring.servlet.multipart.max-request-size=200MB
wds.dss.project.strict.mode=true
wds.dss.appconn.email.from.default=
wds.dss.appconn.email.suffix.default=
wds.dss.appconn.email.host=
wds.dss.appconn.email.port=25
wds.dss.appconn.email.username=
wds.dss.appconn.email.password=
spring.spring.cloud.config.enabled=false
wds.linkis.keytab.enable=true
wds.linkis.keytab.file=/home/hadoop
wds.linkis.keytab.host.enabled=false
wds.linkis.keytab.host=cdhdev02
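With Kerberos enabled, wds.linkis.keytab.file points at a directory expected to hold one keytab per user, named {user}.keytab (so hadoop.keytab under /home/hadoop here; this naming convention is my reading of the Linkis behavior, so verify it against your deployment). The keytab can be checked by hand before starting the services:

```bash
# Inspect which principals the keytab contains
klist -kt /home/hadoop/hadoop.keytab
# Try to obtain a ticket (replace the principal with one from the listing above)
kinit -kt /home/hadoop/hadoop.keytab hadoop
klist
```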
/data/appcom/install/dss/conf/dss-flow-execution-server.properties
Note: this file configures the workflow execution service; pay particular attention to the execution engine versions.
wds.linkis.spark.engine.version=2.4.3
wds.linkis.hive.engine.version=1.1.0
/data/appcom/install/linkis/conf/application-eureka.yml
Note: the registry (Eureka) configuration file; in a multi-active setup each instance must also register itself with the registry.
serviceUrl:
  defaultZone: http://192.168.1.125:9600/eureka/,http://192.168.1.129:9600/eureka/
/data/appcom/install/linkis/conf/application-linkis.yml
Note: this is the LINKIS Spring Boot configuration; the key setting is the eureka registry address. A multi-active (multi-instance) Eureka setup is recommended.
eureka:
  client:
    serviceUrl:
      defaultZone: http://192.168.1.125:9600/eureka/,http://192.168.1.129:9600/eureka/
/data/appcom/install/linkis/conf/linkis.properties
Note: the Linkis configuration file. The two engine directories wds.linkis.engineconn.home and wds.linkis.engineconn.plugin.loader.store.path must be set, and the hadoop user must have read and write permission on the wds.linkis.filesystem.root.path, wds.linkis.filesystem.hdfs.root.path, and wds.linkis.bml.hdfs.prefix directories.
## enable wds.linkis.test.mode when using knife4j
#wds.linkis.test.mode=true
wds.linkis.server.version=v1
##spring conf
wds.linkis.gateway.url=http://192.168.1.125:9001/
wds.linkis.eureka.defaultZone=http://192.168.1.125:9600/eureka/,http://192.168.1.129:9600/eureka/
##mybatis
wds.linkis.server.mybatis.datasource.url=jdbc:mysql://192.168.1.130:3306/linkis?characterEncoding=UTF-8
wds.linkis.server.mybatis.datasource.username=linkis
wds.linkis.server.mybatis.datasource.password=linkis
# mysql
wds.linkis.mysql.is.encrypt=false
#hadoop/hive/spark config
hadoop.config.dir=/etc/hadoop/conf
hive.config.dir=/etc/hive/conf
spark.config.dir=/appcom/spark-2.4.3-bin-hadoop2.6/conf
##file path
wds.linkis.filesystem.root.path=file:///data/tmp/linkis/
wds.linkis.filesystem.hdfs.root.path=hdfs:///tmp/linkis
##bml path:default use hdfs
wds.linkis.bml.is.hdfs=true
wds.linkis.bml.hdfs.prefix=hdfs:///tmp/linkis
#wds.linkis.bml.local.prefix=/data/dss/bml
##engine Version
wds.linkis.spark.engine.version=2.4.3
wds.linkis.hive.engine.version=1.1.0
wds.linkis.python.engine.version=python2
#LinkisHome
wds.linkis.home=/data/appcom/install/linkis
#Linkis governance station administrators
wds.linkis.governance.station.admin=hadoop
wds.linkis.gateway.conf.publicservice.list=query,jobhistory,application,configuration,filesystem,udf,variable,microservice,errorcode,bml,datasource
spring.spring.servlet.multipart.max-file-size=500MB
spring.spring.servlet.multipart.max-request-size=500MB
# note: a value of zero means Jetty will never write to disk. See spring-projects/spring-boot#9073
spring.spring.servlet.multipart.file-size-threshold=50MB
wds.linkis.engineconn.home=/data/appcom/install/linkis/lib/linkis-engineconn-plugins
wds.linkis.engineconn.plugin.loader.store.path=/data/appcom/install/linkis/lib/linkis-engineconn-plugins
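Both properties must point at the directory where the engine plugin packages were unpacked; a quick listing confirms the expected engines are in place (subdirectory names may vary slightly by package version):

```bash
# Expect one subdirectory per installed engine, e.g. spark, hive, python, shell
ls /data/appcom/install/linkis/lib/linkis-engineconn-plugins
```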
#sendmail
wds.dss.appconn.email.from.default=
wds.dss.appconn.email.suffix.default=
wds.dss.appconn.email.host=
wds.dss.appconn.email.port=25
wds.dss.appconn.email.username=
wds.dss.appconn.email.password=
spring.spring.cloud.config.enabled=false
wds.linkis.keytab.enable=true
wds.linkis.keytab.file=/home/hadoop
wds.linkis.keytab.host.enabled=false
wds.linkis.keytab.host=
/data/appcom/install/linkis/conf/linkis-cg-engineplugin.properties
Note: the engine plugin server configuration file; the two engine directories wds.linkis.engineconn.home and wds.linkis.engineconn.plugin.loader.store.path must be set.
wds.linkis.engineconn.home=/data/appcom/install/linkis/lib/linkis-engineconn-plugins
wds.linkis.engineconn.plugin.loader.store.path=/data/appcom/install/linkis/lib/linkis-engineconn-plugins
wds.linkis.keytab.enable=true
wds.linkis.keytab.file=/home/hadoop
wds.linkis.keytab.host.enabled=false
wds.linkis.keytab.host=
/data/appcom/install/linkis/conf/linkis-ps-publicservice.properties
Note: the public service configuration file. It is recommended to set linkis.metadata.hive.permission.with-login-user-enabled to false so that components such as data quality checks and Scriptis can access the hive metastore normally.
linkis.metadata.hive.permission.with-login-user-enabled=false
wds.linkis.keytab.enable=true
wds.linkis.keytab.file=/home/hadoop
wds.linkis.keytab.host.enabled=false
wds.linkis.keytab.host=
/data/appcom/install/apache-skywalking-apm-bin/webapp/webapp.yml
Note: the skywalking web UI configuration file; only the frontend port needs to be set.
server:
  port: 8090
/data/appcom/install/apache-skywalking-apm-bin/config/application.yml
Note: the skywalking backend configuration file; set the database address here.
mysql:
  properties:
    jdbcUrl: ${SW_JDBC_URL:"jdbc:mysql://192.168.1.104:30168/skywalking?rewriteBatchedStatements=true"}
    dataSource.user: ${SW_DATA_SOURCE_USER:skywalking}
    dataSource.password: ${SW_DATA_SOURCE_PASSWORD:skywalking}
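SkyWalking creates its tables automatically but not the database itself, so the database and account referenced in the jdbcUrl need to exist first (a hedged sketch; the grant is illustrative and should be tightened for production):

```bash
mysql -h 192.168.1.104 -P 30168 -u root -p <<'SQL'
CREATE DATABASE IF NOT EXISTS skywalking DEFAULT CHARACTER SET utf8mb4;
CREATE USER IF NOT EXISTS 'skywalking'@'%' IDENTIFIED BY 'skywalking';
GRANT ALL PRIVILEGES ON skywalking.* TO 'skywalking'@'%';
FLUSH PRIVILEGES;
SQL
```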
/data/appcom/install/visualis-server/conf/application.yml
Note: the visualis-server configuration file. The settings below must use IP addresses, not hostnames.
# 1. Visualis Service configuration
server:
  protocol: http
  address: 192.168.1.125 # server ip address
  port: 9009 # server port
  url: http://192.168.1.125:8089/dss/visualis # frontend index page full path
  access:
    address: 192.168.1.125 # frontend address
    port: 8089 # frontend port
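After visualis-server starts, the backend port and the frontend path configured above can be spot-checked (simple reachability checks, not a full health probe):

```bash
# Backend should be listening on 9009; the frontend is served through nginx on 8089
ss -lntp | grep ':9009'
curl -I http://192.168.1.125:8089/dss/visualis
```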
5. Configuration Files on 192.168.1.129
/data/appcom/config/schedulis-config/host.properties
Note: the schedulis node configuration file.
dssdev02.dev.com.cn=1
/data/appcom/install/schedulis/schedulis_0.7.0_exec/azkaban.properties
Note: the main schedulis configuration file.
# Web Server
azkaban.webserver.url=http://192.168.1.129:8080