add vmargs=-XX:-UseContainerSupport in config #136
Conversation
I'll just note that this issue, along with workarounds and fixes, has shown up across different inference toolkits; the links below show it has been a recurring problem. This fix may also create other problems, since turning off container support means the JVM no longer respects Docker container memory limits. We should address this uniformly across the inference toolkits and deep-learning-containers, while letting users customize easily without onerous workarounds such as building derived deep-learning-container images or even forking the toolkit. Links:
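For context, a minimal sketch of what the change in the PR title amounts to in a model-server `config.properties` file (the exact file location and surrounding keys depend on the toolkit and model-server version, so treat this as illustrative only):

```properties
# Sketch of a model-server config.properties entry.
# -XX:-UseContainerSupport disables the JVM's container awareness,
# so heap sizing is based on host memory rather than the Docker
# container's memory limit -- the trade-off discussed above.
vmargs=-XX:-UseContainerSupport
```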
I still don't think this is an exhaustive list.
The PR will be updated to allow customization of vmargs.
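A hedged sketch of what user-level customization could look like once vmargs is configurable. The key name and values below are assumptions for illustration, not the toolkit's documented interface:

```properties
# Hypothetical user-supplied override (key name is an assumption):
# keep container support on and cap the heap as a fraction of the
# container's memory limit instead of disabling the limit entirely.
vmargs=-XX:+UseContainerSupport -XX:MaxRAMPercentage=80.0
```

Exposing this as a plain config or environment override is what would let users tune JVM behavior without building derived deep-learning-container images or forking the toolkit.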
Python 3.7 tests are failing at the coverage report step (failing to invoke the coverage command); they work fine on Python 3.6.
Issue #, if available:
#99
Description of changes:
Apply the fix in the PyTorch inference toolkit.
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.