[BUG] Rendering specific text crashes Companion #3285
Comments
I wonder if this may be the cause of the issue in #3282
I have just tried to reproduce this problem but I could not generate a crash. I tried rendering text from a custom variable that I doubled repeatedly. I stopped at 100-something million characters, because it started getting slow, but no crash. What do you mean with "large"? GB?
I checked today. You're right, it doesn't happen every time. But it does if you add the Generic HTTP GET module. (See attached video: 2025-02-14.15-20-11.mp4)
OK, so far I can say that there are no fancy characters in the response that could cause the crash.
Just tried on a fresh Companion install on Win 11... Same crash when storing HTML in a variable.
I'm also noticing how slow it is to draw buttons with this much text on them; the CPU looks to be maxing out a core for multiple seconds, which looks to be our algorithm for splitting the text into lines. I'm going to see what I can do about this.
One potential solution I am wondering about is adding some arbitrary limit to how much text we try to draw. That doesn't necessarily address the underlying issue, though, just makes it less likely to occur.
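The "arbitrary limit" idea could be sketched like this (a minimal illustration only; `limitTextForDrawing` and `MAX_DRAW_CHARS` are hypothetical names, not Companion's actual code or constants):

```javascript
// Sketch: cap the amount of text handed to the line-breaking step.
// MAX_DRAW_CHARS is an arbitrary, hypothetical limit: beyond a few thousand
// characters, nothing extra can ever be visible on a button anyway.
const MAX_DRAW_CHARS = 5000

function limitTextForDrawing (text, maxChars = MAX_DRAW_CHARS) {
  // Split into code points (Array.from) rather than UTF-16 units, so a
  // surrogate pair (emoji etc.) is never cut in half by the truncation.
  const codepoints = Array.from(text)
  if (codepoints.length <= maxChars) return text
  return codepoints.slice(0, maxChars).join('')
}
```

This only makes the pathological case cheaper; it does not fix whatever the canvas submodule trips over, which is why it is framed above as a mitigation rather than a fix.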
I thought I'd already done that when speeding up the line-breaking algorithm for Auto size. Maybe it got lost.
It was not drawing any lines that wouldn't be visible, but it was still chunking up the text that will never be visible. I'm also seeing each call to
… will be visible.
fix handling of some multi-codepoint utf characters #3285
Optimise line drawing, don't try measuring more characters than would be visible at 1px each.
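The "multi-codepoint utf characters" fix referenced above is about slicing text safely. A small sketch of the underlying hazard (not Companion's actual code):

```javascript
// Sketch: why multi-codepoint characters need care when slicing text.
// '😀' is one code point but two UTF-16 code units; String.prototype.slice
// counts code units, so it can split the surrogate pair and produce an
// invalid string that a native canvas library may choke on.
const s = 'a😀b'

const byUnits = s.slice(0, 2)                            // 'a' plus a lone high surrogate
const byCodepoints = Array.from(s).slice(0, 2).join('')  // 'a😀' — intact emoji
```

Slicing via `Array.from` (which iterates code points) is one straightforward way to keep every character intact; a grapheme-aware segmenter would be needed for combined sequences like flags or skin-tone modifiers, but that is beyond what this thread discusses.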
@Julusian with your fixes I could not reproduce the error any more, not even when I take out the slice and feed in the whole text. But while looking at the code I saw that it has evolved quite a bit from what I wrote back in the day, and there are now some new errors introduced.
As you say, the speed is also not good any more. Last time I revisited that part of the code I could speed it up to something like 14ms for a worst-case scenario with large text and Auto size. I didn't measure now, but it feels like a second with fixed size and two seconds with Auto. The original code was at many points not pretty, but it was tweaked a little for speed, e.g. often working with bytes and not with characters. I wonder if this part should not only get its own class but be done from scratch (the 3rd time then ;-) ).
Do you have any example of 1 and 2 I can look at? Yeah, 3 is a side effect of making it possible to edit custom dropdown values; might need to opt out of that behaviour on this dropdown.
Re performance, I admit I only really looked at this for long strings, so my focus was on breaking out of the loops early and trying to avoid computations that would never yield a sensible result (measuring the width of 10000 chars on one line). But those got down to being in the scale of 10-50ms (I think), so I don't know why shorter strings would have suddenly become much slower. Some of the other changes I did while in here were: using a predictable string for measuring line height, otherwise the line height grew if you added an emoji to line 1; and using
I don't particularly fancy doing that tbh, even this change which felt safe appears to have side effects...
Yeah, I think that is going to be a challenge for headless versions; typically Chrome/Electron/CEF don't run without at least a partial desktop environment, which makes the headless version not particularly server friendly. It is doable, but people (including me) won't be happy about it. Another option to consider is whether we just drop some of the auto-sizing behaviour when doing the drawing overhaul. I don't know what could be dropped and what would be helpful to drop, but it could be worth considering.
I'm not seeing that line-break issue locally (on Linux, but that shouldn't matter). The top and bottom one I am seeing; I forgot to trim the last line off..
I mean it was a recent change which doesn't play well here: #2910. I've fixed this case so that it behaves like before.
The problem I was looking to solve here is the latest comment of #2731. It was only sampling the height of the first line, so the spacing may be correct, but also may not be, depending on what that line was. I did try using 'correct' spacing for each line, but then emoji/unicode characters had a weirdly long space after them; this felt like the best compromise, and closer to how it behaved before but more consistently. I am not set on my solution here though, I don't mind it being refined or reverted.
One challenge here will be that the current headless builds are the user downloading the electron build and discarding some of the files. https://github.com/bitfocus/companion-pi/blob/c976f482ded0e9552143b02d371d19ce8bc7c427/update.sh#L148-L152
I am not really after faster as such, mainly more features. But I am pretty sure that a pure native version would be faster; the interop between JS and C++ does add some overhead.
Interesting, my screenshot is from Linux too, running the dev environment on Ubuntu 22.04. There should be a line break at the space.
Is this a bug in companion itself or a module?
Is there an existing issue for this?
Describe the bug
When I insert large text (like the response of an HTTP GET request from the Generic HTTP module), either directly or in the custom variable editor, and then use this variable in the text for a knob; or some variable (for example a vMix title text) contains large text; or I just insert large text into the button text field — Companion crashes with this stack trace:
Seems like an error in the canvas submodule...
Steps To Reproduce
Paste large text into the button text field
Expected Behavior
Companion does not crash
Environment (please complete the following information)
Additional context
No response