I have a use case where a piece of jsonnet code that is costly to evaluate returns an object, and I want to use this object in subsequent invocations of other files in separate VMs as an external code var. This object is big, and not everything that is subsequently evaluated needs to use it.
Each eval runs in an isolated VM, partly because of concurrent processing of components and partly because every invocation has a different set of vars, which makes the import cache useless anyway. That is, instead of reusing VMs in complex, concurrency-aware ways, my code just creates a fresh VM every time.
What would be really helpful is if I could turn the initial object into an `ast.Node` using `SnippetToAST` and set it with something like `vm.ExtCodeNode(name, node)`.
Passing in the serialized version of the object negates the gains from the preprocessing: the extVar is eagerly turned into a node on every invocation, even if the component code doesn't happen to use the value of the var. And being a giant object, it is costly to parse.
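Roughly what I have in mind (a minimal sketch; `vm.ExtCodeNode` is the hypothetical API I'm asking for, the rest is existing go-jsonnet API, and the file names are made up):

```go
package main

import (
	"log"

	"github.com/google/go-jsonnet"
)

func main() {
	// Pre-evaluate the expensive object once.
	pre := jsonnet.MakeVM()
	bigJSON, err := pre.EvaluateFile("expensive.jsonnet") // made-up file name
	if err != nil {
		log.Fatal(err)
	}

	// Today: each component VM receives the serialized object and has to
	// parse the giant string again before the var can be used.
	vm := jsonnet.MakeVM()
	vm.ExtCode("big", bigJSON)

	// What I'd like: parse once, then hand the resulting ast.Node to every VM.
	node, err := jsonnet.SnippetToAST("big.json", bigJSON)
	if err != nil {
		log.Fatal(err)
	}
	_ = node
	// vm.ExtCodeNode("big", node) // hypothetical API, not in go-jsonnet today
}
```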
I hacked together some crappy code to implement `ExtCodeNode` in my local vendor tree and verified that what I'm describing is indeed the bottleneck that makes everything slow.
Other ideas also welcome.
We could also allow providing lazy values as extVars/TLAs: #338. I believe that would also solve this problem. It's a more general solution that would help with other use cases as well (e.g. passing secrets).
We can introduce `ExtCodeNode` and `TLACodeNode` upstream, if you feel like polishing your thing a little bit. We already have evaluation of nodes in the API, so we can just as well accept that for extVars and TLAs.
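For context, a minimal sketch of the node-based evaluation that already exists (the snippet content is just an example); an `ExtCodeNode`/`TLACodeNode` setter would accept an `ast.Node` in the same spirit:

```go
package main

import (
	"fmt"
	"log"

	"github.com/google/go-jsonnet"
)

func main() {
	// Parse once into an ast.Node...
	node, err := jsonnet.SnippetToAST("example.jsonnet", `{ a: 1, b: self.a + 1 }`)
	if err != nil {
		log.Fatal(err)
	}

	// ...and evaluate the pre-parsed node directly.
	vm := jsonnet.MakeVM()
	out, err := vm.Evaluate(node)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(out)
}
```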
> every invocation has a different set of vars which makes the import cache useless anyway.
Not really. ExtVars only invalidate the value cache. The parsing cache (which keeps `ast.Node`s) survives adding extVars.
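For example, something along these lines reuses a single VM from one goroutine (the component list and its fields are made up); re-setting the extVar between evaluations only drops the value cache, not the parsed imports:

```go
package main

import (
	"fmt"
	"log"

	"github.com/google/go-jsonnet"
)

type component struct{ Path, ConfigJSON string } // made-up shape

func main() {
	components := []component{
		{Path: "a.jsonnet", ConfigJSON: `{replicas: 2}`},
		{Path: "b.jsonnet", ConfigJSON: `{replicas: 5}`},
	}

	// Single goroutine: reuse one VM so the parsing cache for imported files
	// survives across evaluations, even though each call sets a different extVar.
	vm := jsonnet.MakeVM()
	for _, c := range components {
		vm.ExtCode("config", c.ConfigJSON) // only the value cache is invalidated
		out, err := vm.EvaluateFile(c.Path)
		if err != nil {
			log.Fatal(err)
		}
		fmt.Println(out)
	}
}
```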
> concurrent processing of components
Yes, this is a problem, especially when evaluating with different extVars/TLAs.
So there is also this crazy hack I can implement (just for the use case I have in mind): set the extVar to `import 'rando-namespace://var'` and arrange for a custom importer that supports `rando-namespace://`-style imports and returns the actual object as a string.
This takes care of one problem, in that the giant object is not parsed except on demand, but it still requires parsing the giant object when it is needed and doesn't provide the speedup I could have had with an `ast.Node`.
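A rough sketch of that hack, assuming the standard `Importer` interface (the `rando-namespace://` scheme, `lazyImporter`, and `bigJSON` are stand-ins):

```go
package main

import (
	"github.com/google/go-jsonnet"
)

// lazyImporter serves the pre-evaluated object for one magic path and
// delegates everything else to the regular FileImporter.
type lazyImporter struct {
	files   jsonnet.FileImporter
	bigJSON string // serialized result of the expensive pre-eval
}

func (l *lazyImporter) Import(importedFrom, importedPath string) (jsonnet.Contents, string, error) {
	if importedPath == "rando-namespace://var" {
		return jsonnet.MakeContents(l.bigJSON), importedPath, nil
	}
	return l.files.Import(importedFrom, importedPath)
}

func main() {
	bigJSON := `{"huge": "object"}` // stand-in for the real pre-evaluated object

	vm := jsonnet.MakeVM()
	vm.Importer(&lazyImporter{bigJSON: bigJSON})
	// The extVar is now a tiny import expression, so the giant object is only
	// read (and parsed) when a component actually uses the var.
	vm.ExtCode("big", "import 'rando-namespace://var'")
	// (EvaluateFile calls for the components omitted for brevity.)
}
```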