
HADOOP-19262: Upgrade wildfly-openssl:1.1.3.Final to 2.1.4.Final #7026

Merged · 1 commit merged into apache:trunk on Sep 9, 2024

Conversation

@saikatroy038 (Contributor) commented on Sep 4, 2024:

Description of PR

Upgrade wildfly-openssl from 1.1.3.Final to 2.1.4.Final to support Java 17+.
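A quick way to confirm the bump locally is to check which wildfly-openssl version resolves after rebuilding the project POM. A minimal sketch follows; the module paths and the `org.wildfly.openssl` groupId follow the usual Hadoop layout and Maven coordinates and are assumptions, not part of this PR's diff:

```
# Install the updated project POM so its dependencyManagement is picked up,
# then list the wildfly-openssl artifact that hadoop-azure actually resolves.
mvn -q install -pl hadoop-project -DskipTests
mvn dependency:tree -pl hadoop-tools/hadoop-azure -Dincludes=org.wildfly.openssl
# Expected: org.wildfly.openssl:wildfly-openssl:jar:2.1.4.Final
```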

How was this patch tested?

Ran "mvn verify" in hadoop-azure, and hadoop-aws (against ap-south-1) on both Java8 and Java11

For code changes:

  • Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
  • Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
  • If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under ASF 2.0?
  • If applicable, have you updated the LICENSE, LICENSE-binary, NOTICE-binary files?

@hadoop-yetus posted a CI report (this comment was marked as outdated).

@steveloughran (Contributor) commented:

The test failures look to be caused by #6631, which didn't get picked up before the merge.

Will see what reverting it does.

@steveloughran (Contributor) commented:

@saikatroy038 can you rebase this PR onto trunk and do a forced push?
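For anyone following along, the requested rebase is roughly the following sketch (remote names are assumptions: `apache` for the upstream repo, `origin` for the contributor's fork):

```
# Replay the PR branch on top of the latest upstream trunk,
# then force-push the rebased branch so the PR updates.
git fetch apache
git rebase apache/trunk
git push --force origin HEAD
```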

@hadoop-yetus posted another CI report (this comment was marked as outdated).

@hadoop-yetus commented:

💔 -1 overall

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|-----------|--------:|---------|---------|
| +0 🆗 | reexec | 0m 29s | | Docker mode activated. |
| | _ Prechecks _ | | | |
| +1 💚 | dupname | 0m 0s | | No case conflicting files found. |
| +0 🆗 | codespell | 0m 1s | | codespell was not available. |
| +0 🆗 | detsecrets | 0m 1s | | detect-secrets was not available. |
| +0 🆗 | xmllint | 0m 1s | | xmllint was not available. |
| +0 🆗 | shelldocs | 0m 1s | | Shelldocs was not available. |
| +1 💚 | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 ❌ | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
| | _ trunk Compile Tests _ | | | |
| +0 🆗 | mvndep | 14m 13s | | Maven dependency ordering for branch |
| +1 💚 | mvninstall | 36m 29s | | trunk passed |
| +1 💚 | compile | 19m 2s | | trunk passed with JDK Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04 |
| +1 💚 | compile | 17m 11s | | trunk passed with JDK Private Build-1.8.0_422-8u422-b05-1~20.04-b05 |
| +1 💚 | mvnsite | 26m 32s | | trunk passed |
| +1 💚 | javadoc | 10m 6s | | trunk passed with JDK Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04 |
| +1 💚 | javadoc | 8m 11s | | trunk passed with JDK Private Build-1.8.0_422-8u422-b05-1~20.04-b05 |
| +1 💚 | shadedclient | 53m 3s | | branch has no errors when building and testing our client artifacts. |
| | _ Patch Compile Tests _ | | | |
| +0 🆗 | mvndep | 0m 36s | | Maven dependency ordering for patch |
| +1 💚 | mvninstall | 33m 31s | | the patch passed |
| +1 💚 | compile | 18m 42s | | the patch passed with JDK Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04 |
| +1 💚 | javac | 18m 42s | | the patch passed |
| +1 💚 | compile | 17m 32s | | the patch passed with JDK Private Build-1.8.0_422-8u422-b05-1~20.04-b05 |
| +1 💚 | javac | 17m 32s | | the patch passed |
| +1 💚 | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 💚 | mvnsite | 20m 34s | | the patch passed |
| +1 💚 | shellcheck | 0m 0s | | No new issues. |
| +1 💚 | javadoc | 9m 52s | | the patch passed with JDK Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04 |
| +1 💚 | javadoc | 8m 9s | | the patch passed with JDK Private Build-1.8.0_422-8u422-b05-1~20.04-b05 |
| +1 💚 | shadedclient | 57m 3s | | patch has no errors when building and testing our client artifacts. |
| | _ Other Tests _ | | | |
| +1 💚 | unit | 770m 10s | | root in the patch passed. |
| +1 💚 | asflicense | 1m 40s | | The patch does not generate ASF License warnings. |
| | | 1090m 5s | | |
| Subsystem | Report/Notes |
|-----------|--------------|
| Docker | ClientAPI=1.47 ServerAPI=1.47 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7026/3/artifact/out/Dockerfile |
| GITHUB PR | #7026 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint shellcheck shelldocs |
| uname | Linux 12eaada661c0 5.15.0-117-generic #127-Ubuntu SMP Fri Jul 5 20:13:28 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 196ef58 |
| Default Java | Private Build-1.8.0_422-8u422-b05-1~20.04-b05 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_422-8u422-b05-1~20.04-b05 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7026/3/testReport/ |
| Max. process+thread count | 3503 (vs. ulimit of 5500) |
| modules | C: hadoop-project . U: . |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7026/3/console |
| versions | git=2.25.1 maven=3.6.3 shellcheck=0.7.0 |
| Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |

This message was automatically generated.

@ayushtkn (Member) left a comment:

LGTM

@steveloughran steveloughran merged commit 6881d12 into apache:trunk Sep 9, 2024
1 of 3 checks passed
@steveloughran (Contributor) commented:

Merged. @saikatroy038 can you do a cherry-pick PR for branch-3.4?

saikatroy038 added a commit to saikatroy038/hadoop that referenced this pull request Sep 9, 2024
@saikatroy038 (Contributor, Author) commented:

> Merged. @saikatroy038 can you do a cherry-pick PR for branch-3.4?

Done: #7032
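For reference, a branch-3.4 backport of this kind is usually prepared along these lines (a sketch; the hash is the trunk merge commit noted above, while the branch name and remote names are assumptions):

```
# Start a backport branch from upstream branch-3.4, cherry-pick the
# trunk commit for this PR, and push it to the fork to open a new PR.
git fetch apache
git checkout -b HADOOP-19262-branch-3.4 apache/branch-3.4
git cherry-pick -x 6881d12
git push origin HEAD
```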

Hexiaoqiao pushed a commit to Hexiaoqiao/hadoop that referenced this pull request Sep 12, 2024
dongjoon-hyun added a commit to apache/spark that referenced this pull request Jan 29, 2025
### What changes were proposed in this pull request?

This PR aims to upgrade `wildfly-openssl` to 2.2.5.Final.

### Why are the changes needed?

Like Apache Hadoop 3.4.2 (HADOOP-19262), we need to upgrade to `wildfly-openssl 2.x` to support the latest openssl in Apache Spark 4.0.0.
- apache/hadoop#7026

As of now, `2.2.5.Final` is the latest one.
- https://github.com/wildfly-security/wildfly-openssl/releases/tag/2.2.5.Final

### Does this PR introduce _any_ user-facing change?

No behavior change.

### How was this patch tested?

Pass the CIs and do the manual test on `Intel Mac`.

**BEFORE**
```
$ build/sbt package -Phadoop-cloud

$ bin/spark-shell -c spark.hadoop.fs.s3a.aws.credentials.provider=software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider -c spark.hadoop.fs.s3a.ssl.channel.mode=openssl
WARNING: Using incubator modules: jdk.incubator.vector
Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 4.1.0-SNAPSHOT
      /_/

Using Scala version 2.13.15 (OpenJDK 64-Bit Server VM, Java 17.0.1)
Type in expressions to have them evaluated.
Type :help for more information.
25/01/28 15:15:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://localhost:4040
Spark context available as 'sc' (master = local[*], app id = local-1738106104577).
Spark session available as 'spark'.

scala> spark.read.text("s3a://dongjoon/README.md")
25/01/28 15:15:09 WARN MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-s3a-file-system.properties,hadoop-metrics2.properties
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
25/01/28 15:15:09 WARN FileSystem: Failed to initialize filesystem s3a://dongjoon/README.md: java.io.IOException: java.security.NoSuchAlgorithmException: Error constructing implementation (algorithm: openssl.TLS, provider: openssl, class: org.wildfly.openssl.OpenSSLContextSPI$OpenSSLTLSContextSpi)
...
Caused by: java.lang.IllegalStateException: Could not load required symbol from libssl: SSL_get_peer_certificate
	at org.wildfly.openssl.SSLImpl.initialize0(Native Method)
	at org.wildfly.openssl.SSLImpl.initialize(SSLImpl.java:33)
```

**AFTER**
```
$ build/sbt package -Phadoop-cloud

$ bin/spark-shell -c spark.hadoop.fs.s3a.aws.credentials.provider=software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider -c spark.hadoop.fs.s3a.ssl.channel.mode=openssl
WARNING: Using incubator modules: jdk.incubator.vector
Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 4.1.0-SNAPSHOT
      /_/

Using Scala version 2.13.15 (OpenJDK 64-Bit Server VM, Java 17.0.1)
Type in expressions to have them evaluated.
Type :help for more information.
25/01/28 15:07:41 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://localhost:4040
Spark context available as 'sc' (master = local[*], app id = local-1738105662294).
Spark session available as 'spark'.

scala> spark.read.text("s3a://dongjoon/README.md")
25/01/28 15:07:47 WARN MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-s3a-file-system.properties,hadoop-metrics2.properties
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
val res0: org.apache.spark.sql.DataFrame = [value: string]

scala>
```

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #49716 from dongjoon-hyun/SPARK-51024.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
dongjoon-hyun added a commit to apache/spark that referenced this pull request Jan 29, 2025
(Duplicate of the apache/spark commit message above; this commit was cherry picked from commit cb38cd2 for the release branch.)