
[Bug] [Spark] The execution of SecureRandomStringUtils.random may get stuck. #5196

Closed · 2 tasks done
peacewong opened this issue Nov 5, 2024 · 5 comments · Fixed by #5197

Labels
bug Something isn't working

Comments

@peacewong (Contributor)

Search before asking

  • I searched the issues and found no similar issues.

Linkis Component

linkis-engineconn-plugin

Steps to reproduce

  1. Execute a pyspark task.
  2. The task thread hangs; the Java thread dump shows:
"TaskExecution-Thread-1" #144 daemon prio=5 os_prio=0 tid=0x00007f154800f000 nid=0x6fd8 runnable [0x00007f14efd6e000]
   java.lang.Thread.State: RUNNABLE
        at java.io.FileInputStream.readBytes(Native Method)
        at java.io.FileInputStream.read(FileInputStream.java:255)
        at sun.security.provider.NativePRNG$RandomIO.readFully(NativePRNG.java:424)
        at sun.security.provider.NativePRNG$RandomIO.ensureBufferValid(NativePRNG.java:525)
        at sun.security.provider.NativePRNG$RandomIO.implNextBytes(NativePRNG.java:544)
        - locked <0x00000000c074f570> (a java.lang.Object)
        at sun.security.provider.NativePRNG$RandomIO.access$400(NativePRNG.java:331)
        at sun.security.provider.NativePRNG$Blocking.engineNextBytes(NativePRNG.java:268)
        at java.security.SecureRandom.nextBytes(SecureRandom.java:468)
        at java.security.SecureRandom.next(SecureRandom.java:491)
        at java.util.Random.nextInt(Random.java:390)
        at org.apache.linkis.engineplugin.spark.executor.SecureRandomStringUtils.random(SecureRandomStringUtils.java:182)
        at org.apache.linkis.engineplugin.spark.executor.SecureRandomStringUtils.random(SecureRandomStringUtils.java:99)
        at org.apache.linkis.engineplugin.spark.executor.SecureRandomStringUtils.random(SecureRandomStringUtils.java:76)
        at org.apache.linkis.engineplugin.spark.executor.SecureRandomStringUtils.randomAlphanumeric(SecureRandomStringUtils.java:60)
        at org.apache.linkis.engineplugin.spark.executor.SparkPythonExecutor.py4jToken$lzycompute(SparkPythonExecutor.scala:79)
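As background for diagnosing this kind of hang: on Linux JDK 8, the no-arg `new SecureRandom()` typically resolves to a `NativePRNG` variant from the SUN provider, and the blocking variants read from `/dev/random`, which stalls when kernel entropy runs low. A quick diagnostic sketch (illustrative only, not part of the Linkis code) to see which implementation a given JVM picks:

```java
import java.security.SecureRandom;

public class ShowSecureRandomImpl {
    public static void main(String[] args) {
        // The no-arg constructor selects the highest-priority registered PRNG;
        // on Linux this is usually a NativePRNG variant from the SUN provider.
        SecureRandom rng = new SecureRandom();
        System.out.println(rng.getProvider().getName() + " / " + rng.getAlgorithm());
    }
}
```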

Expected behavior

The task should not get stuck.

Your environment

  • Linkis version used: 1.1.2
  • Environment name and version:
    • cdh-5.14.2
    • hdp-3.1.5
    • hive-2.1.1
    • spark-3.2.1
    • scala-2.12.2
    • jdk 1.8.0_121
    • ....

Anything else

No response

Are you willing to submit a PR?

  • Yes I am willing to submit a PR!
@peacewong peacewong added the bug Something isn't working label Nov 5, 2024

github-actions bot commented Nov 5, 2024

😊 Welcome to the Apache Linkis community!!

We are glad that you are contributing by opening this issue.

Please make sure to include all the relevant context.
We will be here shortly.

If you are interested in contributing to our website project, please let us know!
You can check out our contributing guide on
👉 How to Participate in Project Contribution.


@peacewong (Contributor, Author)

ping ~ @pjfanning

@pjfanning (Contributor)

@peacewong Secure random requires entropy, and on some VMs there is not enough entropy available.

Sometimes you can work around this by adding -Djava.security.egd=file:/dev/./urandom to the JVM options.

When I have had issues like this in the past, I have installed haveged on the affected machine.

https://www.digitalocean.com/community/tutorials/how-to-setup-additional-entropy-for-cloud-servers-using-haveged
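An in-code alternative to the JVM flag, sketched here purely as an illustration (this is not the actual Linkis fix), is to request the non-blocking PRNG explicitly. On Linux/Unix JDKs, `NativePRNGNonBlocking` reads from `/dev/urandom` and never blocks waiting for kernel entropy:

```java
import java.security.NoSuchAlgorithmException;
import java.security.SecureRandom;

public class NonBlockingSecureRandomDemo {
    public static void main(String[] args) throws NoSuchAlgorithmException {
        // Explicitly select the /dev/urandom-backed PRNG so nextBytes()
        // cannot block waiting for kernel entropy (Linux/Unix JDKs only).
        SecureRandom rng = SecureRandom.getInstance("NativePRNGNonBlocking");
        byte[] buf = new byte[16];
        rng.nextBytes(buf);
        System.out.println("generated " + buf.length + " non-blocking random bytes");
    }
}
```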

@pjfanning (Contributor)

One potential change is to allow users to opt out of the change in #5143. I would prefer that we use secure random strings by default, but we could allow users to choose to opt out.

@peacewong (Contributor, Author)

Yes, I think we should add that option and turn off secure random by default, i.e. default to the mode that uses ordinary (non-secure) random numbers.
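The opt-out discussed above could look roughly like the following sketch. Note that the property key `linkis.random.string.secure` and the class name are illustrative assumptions, not the actual names introduced by #5197:

```java
import java.security.SecureRandom;
import java.util.Random;

public class RandomStringSource {
    // Hypothetical configuration key, for illustration only; the real key
    // added by the fix may differ.
    private static final boolean USE_SECURE_RANDOM =
            Boolean.parseBoolean(System.getProperty("linkis.random.string.secure", "false"));

    // Plain Random by default (never blocks on entropy); SecureRandom on opt-in.
    private static final Random RANDOM =
            USE_SECURE_RANDOM ? new SecureRandom() : new Random();

    private static final String ALPHANUMERIC =
            "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";

    /** Generates a random alphanumeric string of the given length. */
    public static String randomAlphanumeric(int length) {
        StringBuilder sb = new StringBuilder(length);
        for (int i = 0; i < length; i++) {
            sb.append(ALPHANUMERIC.charAt(RANDOM.nextInt(ALPHANUMERIC.length())));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(randomAlphanumeric(32));
    }
}
```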

peacewong added a commit to WeDataSphere/linkis that referenced this issue Nov 6, 2024
casionone pushed a commit that referenced this issue Nov 12, 2024
* Turn off use secure random by default close #5196

* Update Notification Mailing List

* Fix ds meta service build