HADOOP-19262: Upgrade wildfly-openssl:1.1.3.Final to 2.1.4.Final #7026
Conversation
The test failures look to be caused by #6631, which didn't get picked up before the merge. Will see what reverting that does.
@saikatroy038 can you rebase this PR onto trunk and do a force push?
Force-pushed from a5922cc to 196ef58.
💔 -1 overall
This message was automatically generated.
LGTM
Merged. @saikatroy038 can you do a cherry-pick PR for branch-3.4?
HADOOP-19262. Upgrade wildfly-openssl:1.1.3.Final to 2.1.4.Final to support Java17+ (apache#7026). Contributed by Saikat Roy
Done: #7032
HADOOP-19262. Upgrade wildfly-openssl:1.1.3.Final to 2.1.4.Final to support Java17+ (apache#7026). Contributed by Saikat Roy
### What changes were proposed in this pull request?

This PR aims to upgrade `wildfly-openssl` to 2.2.5.Final.

### Why are the changes needed?

Like Apache Hadoop 3.4.2 (HADOOP-19262), we need to upgrade to `wildfly-openssl 2.x` to support the latest openssl in Apache Spark 4.0.0.
- apache/hadoop#7026

As of now, `2.2.5.Final` is the latest one.
- https://github.com/wildfly-security/wildfly-openssl/releases/tag/2.2.5.Final

### Does this PR introduce _any_ user-facing change?

No behavior change.

### How was this patch tested?

Pass the CIs and do the manual test on `Intel Mac`.

**BEFORE**
```
$ build/sbt package -Phadoop-cloud
$ bin/spark-shell -c spark.hadoop.fs.s3a.aws.credentials.provider=software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider -c spark.hadoop.fs.s3a.ssl.channel.mode=openssl
WARNING: Using incubator modules: jdk.incubator.vector
Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 4.1.0-SNAPSHOT
      /_/

Using Scala version 2.13.15 (OpenJDK 64-Bit Server VM, Java 17.0.1)
Type in expressions to have them evaluated.
Type :help for more information.
25/01/28 15:15:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://localhost:4040
Spark context available as 'sc' (master = local[*], app id = local-1738106104577).
Spark session available as 'spark'.

scala> spark.read.text("s3a://dongjoon/README.md")
25/01/28 15:15:09 WARN MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-s3a-file-system.properties,hadoop-metrics2.properties
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
25/01/28 15:15:09 WARN FileSystem: Failed to initialize filesystem s3a://dongjoon/README.md: java.io.IOException: java.security.NoSuchAlgorithmException: Error constructing implementation (algorithm: openssl.TLS, provider: openssl, class: org.wildfly.openssl.OpenSSLContextSPI$OpenSSLTLSContextSpi)
...
Caused by: java.lang.IllegalStateException: Could not load required symbol from libssl: SSL_get_peer_certificate
	at org.wildfly.openssl.SSLImpl.initialize0(Native Method)
	at org.wildfly.openssl.SSLImpl.initialize(SSLImpl.java:33)
```

**AFTER**
```
$ build/sbt package -Phadoop-cloud
$ bin/spark-shell -c spark.hadoop.fs.s3a.aws.credentials.provider=software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider -c spark.hadoop.fs.s3a.ssl.channel.mode=openssl
WARNING: Using incubator modules: jdk.incubator.vector
Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 4.1.0-SNAPSHOT
      /_/

Using Scala version 2.13.15 (OpenJDK 64-Bit Server VM, Java 17.0.1)
Type in expressions to have them evaluated.
Type :help for more information.
25/01/28 15:07:41 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://localhost:4040
Spark context available as 'sc' (master = local[*], app id = local-1738105662294).
Spark session available as 'spark'.

scala> spark.read.text("s3a://dongjoon/README.md")
25/01/28 15:07:47 WARN MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-s3a-file-system.properties,hadoop-metrics2.properties
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
val res0: org.apache.spark.sql.DataFrame = [value: string]

scala>
```

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #49716 from dongjoon-hyun/SPARK-51024.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
(The same SPARK-51024 commit message appears again for the branch backport: cherry picked from commit cb38cd2, signed off by Dongjoon Hyun.)
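For reference, here is a minimal plain-Hadoop sketch of the S3A switch that the Spark shell commands above pass as `spark.hadoop.fs.s3a.ssl.channel.mode=openssl`. This is not code from the PR; the bucket and object names are placeholders, and credentials are assumed to come from the usual S3A provider chain.

```java
import java.io.InputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

// Minimal sketch: route S3A TLS through the wildfly-openssl provider
// instead of the default JSSE implementation.
public class S3AOpenSslRead {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Same switch the spark-shell invocations above set with
    // -c spark.hadoop.fs.s3a.ssl.channel.mode=openssl
    conf.set("fs.s3a.ssl.channel.mode", "openssl");

    Path path = new Path("s3a://example-bucket/README.md");  // placeholder bucket/object
    try (FileSystem fs = FileSystem.newInstance(path.toUri(), conf);
         InputStream in = fs.open(path)) {
      IOUtils.copyBytes(in, System.out, 4096, false);
    }
  }
}
```

With the old wildfly-openssl 1.1.3.Final, selecting the `openssl` mode is what triggered the `SSL_get_peer_certificate` failure shown in the BEFORE log; the 2.x line is the fix this PR picks up.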
Description of PR
Upgrade wildfly-openssl:1.1.3.Final to 2.1.4.Final to support Java17+
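As a rough illustration of what the dependency bump touches (not code from this PR), the sketch below registers the wildfly-openssl JCE provider and requests its native-backed TLS context, which is essentially what Hadoop's `DelegatingSSLSocketFactory` does when the `openssl` channel mode is selected, and is the step that failed with 1.1.3.Final on newer OpenSSL builds. The class name `OpenSslSmokeCheck` is made up for illustration.

```java
import javax.net.ssl.SSLContext;
import org.wildfly.openssl.OpenSSLProvider;

// Hypothetical smoke check: with wildfly-openssl 1.1.3.Final, constructing the
// "openssl.TLS" context can fail against newer OpenSSL releases (missing
// SSL_get_peer_certificate symbol); the 2.x line is expected to initialize cleanly.
public class OpenSslSmokeCheck {
  public static void main(String[] args) throws Exception {
    // Register the "openssl" JCE provider shipped by wildfly-openssl.
    OpenSSLProvider.register();

    // Constructing the context loads libssl/libcrypto through the native bridge.
    SSLContext ctx = SSLContext.getInstance("openssl.TLS");
    System.out.println("Created " + ctx.getProtocol()
        + " context from provider " + ctx.getProvider().getName());
  }
}
```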
How was this patch tested?
Ran "mvn verify" in hadoop-azure, and hadoop-aws (against ap-south-1) on both Java8 and Java11
For code changes:
- If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?