feat: Improve set variable performance #3257

Draft · wants to merge 1 commit into base: main
Conversation

@thedist (Member) commented Feb 6, 2025

Part of a set of changes (the other being to companion-module-base) to improve the performance of modules setting variables.

Currently, modules set variable values as an object of key/value pairs, such as { someVariableId: 'variableValue' }, but companion-module-base transforms this into an array such as [{ variableId: 'someVariableId', value: 'variableValue' }], and then, after being sent from the child process to Companion, it has to be transformed back into { someVariableId: 'variableValue' }. This iteration over the variables adds avoidable overhead.
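A minimal sketch of that round trip, using illustrative names rather than the actual Companion internals:

```ts
// What a module passes to setVariableValues()
const fromModule: Record<string, string | undefined> = { someVariableId: 'variableValue' }

// companion-module-base currently flattens it into an array before IPC
const overIpc = Object.entries(fromModule).map(([variableId, value]) => ({ variableId, value }))

// Companion then has to rebuild the object it started with
const inCompanion: Record<string, string | undefined> = {}
for (const { variableId, value } of overIpc) {
  inCompanion[variableId] = value
}
```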

A simple test I set up, generating arrays of fixed sizes and transforming them to an object, took this long:
1,000 variables: 0.2ms
10,000 variables: 1.8344ms
100,000 variables: 23.3069ms
1,000,000 variables: 148.7271ms

Compared to when the data sent by the instance is already an object:
1,000 variables: 0.0024ms
10,000 variables: 0.0006ms
100,000 variables: 0.0006ms
1,000,000 variables: 0.0014ms

This PR should be fully backwards compatible: all existing modules would continue to send arrays of variables that need to be transformed, but any module that updates its companion-module-base (with a PR that will be coming shortly) would send an object and bypass the transformation completely.

@Julusian (Member) commented Feb 6, 2025

I think the reason I did this is that sending { abc: undefined } over IPC will likely result in abc disappearing from the object entirely. Passing it as an array avoids this.
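For example, with a plain JSON round trip (the actual IPC layer uses ejson, but I'd expect undefined to be handled similarly):

```ts
JSON.stringify({ abc: undefined })
// '{}' (the key disappears entirely)

JSON.stringify([{ variableId: 'abc', value: undefined }])
// '[{"variableId":"abc"}]' (the entry survives; only the value field is dropped)
```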

What would be the impact of leaving the object -> array conversion inside module-base, and passing it around as an array inside companion? We are simply iterating over the whole object, so it could easily remain as an array (will require some simple fixing up of other callsites of that method)

I suppose another hazard that would be nice to avoid is a 'batch' of changes that sets the same variable multiple times. It would be nice to optimise this out, but maybe we can reason that this should never happen, and that it's a bug in module-base if it does.

@thedist (Member, Author) commented Feb 6, 2025

One of the reasons I didn't want it to remain an array is that the performance improvements I'm testing in companion-module-base involve using a Map rather than an object (but trying to send a Map over IPC results in an empty object {}), and transforming it to a Map rather than an object on Companion's side didn't show much in the way of performance gains (at least not with the current setup).

Edit: I just checked and you are indeed correct: undefined couldn't be sent, but null can, so perhaps that could be a solution for that particular issue.

The only time that undefined should be set is either when the module dev intentionally wants to remove a variable, or when validation is enabled and the value being set is for a variable id that isn't defined. These are quite niche situations, so perhaps another option would be to remove these entirely from the object being sent from instance to Companion, and provide a separate IPC message such as deleteVariableValues, containing an array of variable id strings for Companion to delete.
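A rough sketch of that split (the deleteVariableValues message name and the shapes here are hypothetical, purely to illustrate the idea):

```ts
type VariableValue = string | number | boolean // illustrative value type

// Hypothetical: split a module's update into "set" and "delete" halves before IPC
function splitVariableUpdates(values: Record<string, VariableValue | undefined>): {
  changed: Record<string, VariableValue>
  deleted: string[]
} {
  const changed: Record<string, VariableValue> = {}
  const deleted: string[] = []
  for (const [id, value] of Object.entries(values)) {
    if (value === undefined) deleted.push(id)
    else changed[id] = value
  }
  // changed would go out as 'setVariableValues', deleted as 'deleteVariableValues',
  // so undefined never has to survive serialization
  return { changed, deleted }
}
```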

@jswalden (Contributor) commented Feb 7, 2025

Letting the internal value encoding (or encoding constrained by IPC functionality) bleed into the user-facing UI seems like kind of a mistake. If you're going to require modules to affirmatively opt in to the newer, faster UI, maybe it makes sense to add a new, parallel UI -- and then you can implement the old slow UI in terms of the new one. (And by "new UI" maybe that also includes separating the variable-setting case from the variable-deleting case into two new functions.)

@thedist (Member, Author) commented Feb 7, 2025

> Letting the internal value encoding (or encoding constrained by IPC functionality) bleed into the user-facing UI seems like kind of a mistake. If you're going to require modules to affirmatively opt in to the newer, faster UI, maybe it makes sense to add a new, parallel UI -- and then you can implement the old slow UI in terms of the new one. (And by "new UI" maybe that also includes separating the variable-setting case from the variable-deleting case into two new functions.)

What user-facing UI are you talking about? This has nothing whatsoever to do with UI, or anything directly user-facing at all, so I'm confused about your comment.

@thedist marked this pull request as draft February 7, 2025 00:21
@thedist (Member, Author) commented Feb 7, 2025

Converted to draft while I'm working on some changes to companion-module-base to assist in this.

@jswalden (Contributor) commented Feb 7, 2025

> What user-facing UI are you talking about?

Sorry, by "user" I meant "module developer" as opposed to "Companion implementer". Didn't think of the obvious confusion that word would have here! In my head modules use Companion, so they/module developers are "users" of it, even if that's plainly not the best name to be thinking of here.

@thedist (Member, Author) commented Feb 7, 2025

Well, in terms of integration for module developers, I personally don't see what benefit there would be in splitting this into multiple ways for developers to have their module send variables to Companion.

Currently:
1. Module developers use instance.setVariableValues and pass it an object.
2. From there, companion-module-base (still within the instance of that module) iterates over it and builds an array of objects, which is then sent to Companion over IPC.
3. Companion takes that array, turns it back into an object, and processes the variables internally.

The changes I'm working on are to:
1. Lower the overhead between companion-module-base and Companion. This would be entirely transparent to module developers; you'd still set your variables exactly the same way.
2. Update setVariableValues within companion-module-base to support a Map. On the module developer's side this can bring performance gains when generating the variables and values before they are even passed to setVariableValues, as currently a Map first has to be converted to an object, which has some overhead itself.
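A rough sketch of the shape point 2 could take (illustrative only, not the final companion-module-base API):

```ts
type VariableValue = string | number | boolean | undefined // illustrative value type

const pendingUpdates: { variableId: string; value: VariableValue }[] = []

// Accept either form; existing modules keep passing a plain object
function setVariableValues(values: Record<string, VariableValue> | Map<string, VariableValue>): void {
  const entries = values instanceof Map ? values : Object.entries(values)
  for (const [variableId, value] of entries) {
    // queue the update for the next IPC message, exactly as today
    pendingUpdates.push({ variableId, value })
  }
}
```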

What will this mean for module developers?
- If you want to send an object to setVariableValues, like all modules currently do, that'll work without changes.
- If you send an object to setVariableValues but are on a version of Companion and companion-module-base with these changes, you'll get the performance benefits without having to do anything.
- If you want to send a Map to setVariableValues, that'll work too with a version of companion-module-base that supports it.
- If you want to delete variables, use setVariableValues the same way as before, with undefined as the value; how they are deleted is handled behind the scenes, so module developers don't need to get into the specifics of it.

So my plan is to be fully backwards compatible: no need for separate functions to send variables using the new way, and no need for a separate function to delete variables.

@Julusian (Member) commented Feb 7, 2025

@thedist That all sounds fine with me.

I am still curious on this though:

> What would be the impact of leaving the object -> array conversion inside module-base, and passing it around as an array inside companion? We are simply iterating over the whole object, so it could easily remain as an array (will require some simple fixing up of other callsites of that method)

How much does it cost to do the object/Map to array conversion there, especially compared to the alternatives? It sounds like it's going to have to do some kind of conversion (to handle both object and Map, and to preserve setting to undefined), so it feels like keeping it transported as an array could be the best option.
The key thing we can do is minimise the cost inside Companion, as the array to object change is pretty pointless.

If you are going to change the IPC format, don't feel like you need to make it work within the existing IPC message. Adding a new one with a different parameter layout is perfectly fine; I did this already in (I think) 1.2 for how we send updated configs to modules.

@thedist (Member, Author) commented Feb 18, 2025

This is some testing I did, essentially breaking down each function from generating variables within a module, to passing them to the module base, stringifying, then on Companion's side parsing and checking for changes, plus any conversions that need to happen for different combinations.

Essentially, Maps are the fastest to work with, but all the performance gains are lost either by needing to transform the Map into another form (such as when using ejson), or, when using superjson, by a stringify that takes about 3 times longer.

I've tried several variations of how variables are generated, at which point they are transformed, and different stringify libs, and haven't been able to find any change to the current setup that would make a significant difference. The only things I can suggest right now are, firstly, that modules will need to work to reduce the number of variables used (or at least provide users with options to enable/disable sets of variables) to cut the total number being processed, and secondly, to perhaps rethink how we handle variables entirely to see if there are any alternative ways it could be done.

Module

| Function | 1,000 | 10,000 | 100,000 | 1,000,000 |
| --- | --- | --- | --- | --- |
| Generating Variables as Obj | 0.997ms | 4.0264ms | 53.7642ms | 565.5797ms |
| Generating Variables as Map | 0.1418ms | 1.5503ms | 14.7072ms | 259.631ms |
| Converting Map to Obj | 12.1193ms | 3.5406ms | 183.9174ms | 414.7543ms |
| setVariableValues with Obj | 0.5981ms | 4.8589ms | 53.8081ms | 1007.8314ms |
| setVariableValues with Map | 0.2899ms | 3.0237ms | 23.6834ms | 359.5598ms |
| Converting Map to Array | 0.1228ms | 0.6213ms | 5.483ms | 251.4124ms |
| ejson stringify with Array | 3.0806ms | 12.4404ms | 79.5981ms | 866.2747ms |
| Superjson stringify with Array | 7.3859ms | 27.8722ms | 202.8095ms | 2470.637ms |
| Superjson stringify with Map | 1.7032ms | 19.6336ms | 201.4564ms | 2432.1415ms |

Companion

| Function | 1,000 | 10,000 | 100,000 | 1,000,000 |
| --- | --- | --- | --- | --- |
| ejson parse with Array | 1.1377ms | 6.6636ms | 72.2971ms | 772.8875ms |
| Superjson parse with Array | 2.1457ms | 10.4405ms | 74.6938ms | 777.4161ms |
| Superjson parse with Map | 0.8677ms | 8.8555ms | 74.2021ms | 768.9108ms |
| Array to Obj | 0.3515ms | 1.9336ms | 13.9112ms | 265.8113ms |
| Map to Obj (not needed) | 0.0011ms | 0.0003ms | 0.0004ms | 0.0003ms |
| Check Obj for changes | 0.2663ms | 2.3594ms | 38.8668ms | 393.3443ms |
| Superjson parse with Map | 0.3413ms | 2.3992ms | 33.205ms | 351.2783ms |

Stringify string sizes

| Type | 1,000 | 10,000 | 100,000 | 1,000,000 |
| --- | --- | --- | --- | --- |
| ejson | 29795 | 317795 | 3377795 | 35777795 |
| superjson arr | 29804 | 317804 | 3377804 | 35777804 |
| superjson map | 29804 | 317804 | 3377804 | 35777804 |

Totals (ms)

| Type | 1,000 | 10,000 | 100,000 | 1,000,000 |
| --- | --- | --- | --- | --- |
| Current with ejson | 6.43 | 32.28 | 312.25 | 3871.73 |
| Current with superjson | 11.74 | 51.49 | 437.85 | 5480.62 |
| Map with superjson | 3.35 | 35.46 | 347.25 | 4171.52 |
| Map to Arr with ejson | 10.70 | 47.80 | 374.15 | 4777.81 |
| Map to Obj with ejson | 17.70 | 33.35 | 457.11 | 3980.53 |

@Julusian (Member) commented
I'm curious if JSON.stringify would be any faster?

I am thinking that maybe we could do something weird like:

```ts
ejson.stringify({ // the general ipc layer
  type: 'setVariables',
  values: JSON.stringify(.....)
})
```

Essentially, pre-stringify the values with some cheaper algorithm/lib that supports fewer types.
I would be open to taking this to the extreme of some custom serializer if that provides enough benefit.
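On the receiving side that would look roughly like this (a sketch, assuming the same ejson outer layer plus JSON inner layer as above; ejson and raw are placeholders for the existing IPC pieces):

```ts
// Placeholder declarations standing in for the existing ipc layer
declare const ejson: { parse(text: string): any }
declare const raw: string

const msg = ejson.parse(raw) // the general ipc layer, unchanged
if (msg.type === 'setVariables') {
  // only the hot payload pays for the second, cheaper parse
  const values: Record<string, unknown> = JSON.parse(msg.values)
  // ...apply the values as before
}
```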

But from these numbers, I do see 2 smaller gains we can make: setVariableValues with Map would be a safe and easy change to support, and I still think we should drop the Array to Obj step inside Companion; all we do with it is iterate over the array in one place, so it may as well stay as an array. Presumably that would also make Check Obj for changes a bit quicker, due to fewer random lookups. Between them that would be about 10% less time, which is a start.
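For the Companion side of that, dropping the Array to Obj step might look something like this (illustrative names, not the actual call site):

```ts
type VariableValue = string | number | boolean | undefined // illustrative value type

interface VariableUpdate {
  variableId: string
  value: VariableValue
}

// Check for changes straight off the array that arrived over IPC,
// instead of first rebuilding it into an object
function applyUpdates(updates: VariableUpdate[], currentValues: Map<string, VariableValue>): Set<string> {
  const changed = new Set<string>()
  for (const { variableId, value } of updates) {
    if (currentValues.get(variableId) !== value) {
      currentValues.set(variableId, value)
      changed.add(variableId)
    }
  }
  return changed
}
```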

Something else that might be worth looking into is whether it would be faster to drop ejson entirely and instead enable serialization: 'advanced' for the child process, letting the IPC use structured clone. I suspect that will cover enough of the types we need. I think I wanted to avoid anything too nodejs-specific, as I want to keep the option open of supporting modules in other languages, but all that would mean is retaining support for ejson/json later on.
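A minimal sketch of what that could look like when forking the module child process (assuming a child_process.fork based setup; the entrypoint path is a placeholder):

```ts
import { fork } from 'node:child_process'

// 'advanced' switches the IPC channel to v8/structured-clone serialization,
// which natively handles types like Map and Date that plain JSON cannot
const child = fork('./module-entrypoint.js', [], { serialization: 'advanced' })

// Values could then be sent without a separate stringify step
child.send({ type: 'setVariables', values: new Map([['someVariableId', 'variableValue']]) })
```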
