Measure performance impact of recent async changes #444

Open
radekmie opened this issue Dec 18, 2023 · 1 comment
@radekmie
Collaborator

While getting ready for Meteor 3.0, we've made a lot of internal changes (#412, #413, #424, #428; soon also one for #443). Most of them simply add on top of the existing logic to support Promises virtually everywhere; some are actual redesigns (e.g., the shape of _scopeBindings).

Of course, adding more logic always comes with a cost, and we should check whether it's acceptable. So far we haven't heard about any visible regressions, but it should be verified nevertheless.

This is strongly connected to #383, since the majority of the benchmarks should remain synchronous-only, so we can compare the results with previous versions.
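
For context, here is a minimal sketch (not taken from any of the PRs above) of the kind of helper these Promise-supporting code paths now have to handle; the template name, helper, and endpoint are made up for illustration:

import { Template } from 'meteor/templating';

// Hypothetical async helper: it returns a Promise, so the Blaze internals
// touched by the changes above have to resolve it before the value can
// reach the rendered template.
Template.example.helpers({
  greeting: async () => {
    const response = await fetch('/api/greeting'); // assumed endpoint
    return response.text();
  },
});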

@radekmie radekmie added this to the Blaze 3.0 milestone Dec 18, 2023
@radekmie radekmie self-assigned this Dec 18, 2023
@radekmie
Collaborator Author

I've looked into it over the week, and I have a problem with how to approach it.

  • On the one hand, we'd like to measure the performance change in a real-life scenario. That involves a complete data change -> render -> DOM change cycle. However, the non-DOM operations account for so little computation here that they're barely visible in the profiler.
  • On the other hand, we could focus on the underlying helpers, like Spacebars.dot, and benchmark them in isolation (see the sketch after this list). However, as written above, in a real-life scenario these account for only 1-3% of the CPU time.
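
As a rough illustration of that second option, here is a hypothetical micro-benchmark of Spacebars.dot in isolation; the object shape and iteration count are arbitrary, and it deliberately skips the render cycle, so its numbers say nothing about real-life performance:

import { Spacebars } from 'meteor/spacebars';

// Hypothetical isolated benchmark of the deep property lookup only;
// it exercises Spacebars.dot without any rendering or reactivity.
const target = { a: { b: { c: { d: { e: { f: '' } } } } } };

const start = performance.now();
for (let i = 0; i < 1000000; ++i) {
  Spacebars.dot(target, 'a', 'b', 'c', 'd', 'e', 'f');
}
console.log(`1e6 lookups: ${Math.round(performance.now() - start)}ms`);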

I did prepare a basic example that rerenders a lot of times on click.

Templates
<head>
  <title>blaze-benchmarks</title>
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
</head>

<body>
  {{> benchmark template='let_alias' iterations=1000}}
  {{> benchmark template='let_object_shallow' iterations=1000}}
  {{> benchmark template='let_object_deep' iterations=1000}}
</body>

<template name="benchmark">
  <button>Run</button>
  <code style="display: inline-block; width: 20ch"><b>{{template}}</b></code>
  {{#if summary}}
    <code>{{summary}}</code>
  {{/if}}
  <br>

  <div style="display: none">
    {{> Template.dynamic data=data template=template}}
  </div>
</template>

<template name="let_alias">
  {{#let x=a}}{{x}}{{/let}}
</template>

<template name="let_object_shallow">
  {{#let x=a.b}}{{x}}{{/let}}
</template>

<template name="let_object_deep">
  {{#let x=a.b.c.d.e.f}}{{x}}{{/let}}
</template>
JavaScript
import { Template } from 'meteor/templating';
import { ReactiveVar } from 'meteor/reactive-var';
import { Tracker } from 'meteor/tracker';

import './main.html';

const values = new ReactiveVar({
  a: { b: { c: { d: { e: { f: '' } } } } }
});

Template.benchmark.onCreated(function () {
  this.summary = new ReactiveVar(null);
});

Template.benchmark.events({
  'click button'(event, { data: { iterations }, summary }) {
    const timings = [];

    // Warmup.
    for (let iteration = 0; iteration < iterations; ++iteration) {
      values.dep.changed();
      Tracker.flush();
    }

    // Measurement.
    for (let round = 0; round < 100; ++round) {
      const now = performance.now();
      for (let iteration = 0; iteration < iterations; ++iteration) {
        values.dep.changed();
        Tracker.flush();
      }
      timings.push(performance.now() - now);
    }

    // Summary: with 100 sorted rounds, the element at index p - 1
    // approximates the p-th percentile.
    const ps = [25, 50, 90, 95];
    timings.sort((a, b) => a - b);
    summary.set(ps.map(p => `p${p}=${Math.floor(timings[p - 1])}`).join(' '));
  },
});

Template.benchmark.helpers({
  data: () => values.get(),
  summary: () => Template.instance().summary.get(),
});

But, as expected, these all take basically the same amount of time.
[Screenshot: UI]

What do others think about it?
