
Conversation

@ben-grande
Contributor

@ben-grande ben-grande force-pushed the preload-dispvm branch 4 times, most recently from 67b3ebe to fda9b2a on April 29, 2025 at 15:43
@marmarek
Member

marmarek commented May 7, 2025

@deeplow do you have any opinion on this regarding your usage of "internal" qubes? With this change it's still possible to make qrexec calls to them if the policy says "allow", but they cannot be targeted with "ask" anymore.

@deeplow

deeplow commented May 7, 2025

Conceptually it makes sense, since the user is not supposed to be aware of them. But I'll double check and get back to you.

@deeplow

deeplow commented May 7, 2025

I have confirmed: as far as the SecureDrop Workstation's use case is concerned, it's OK for internal qubes to no longer be the target of "ask" policies. Thanks!

@marmarek
Member

marmarek commented May 7, 2025

Ok, thanks :)

@ben-grande now just black complains

@codecov

codecov bot commented May 7, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 78.93%. Comparing base (6fd3696) to head (c717c92).
⚠️ Report is 16 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #198      +/-   ##
==========================================
- Coverage   79.02%   78.93%   -0.10%     
==========================================
  Files          55       55              
  Lines       10502    10508       +6     
==========================================
- Hits         8299     8294       -5     
- Misses       2203     2214      +11     


@ben-grande
Contributor Author

Resolved merge conflicts.

@qubesos-bot

qubesos-bot commented May 22, 2025

OpenQA test summary

Complete test suite and dependencies: https://openqa.qubes-os.org/tests/overview?distri=qubesos&version=4.3&build=2025052321-4.3&flavor=pull-requests

Test run included the following:

New failures, excluding unstable

Compared to: https://openqa.qubes-os.org/tests/overview?distri=qubesos&version=4.3&build=2025031804-4.3&flavor=update

  • system_tests_whonix

    • whonix_torbrowser: unnamed test (unknown)
    • whonix_torbrowser: Failed (test died)
      # Test died: no candidate needle with tag(s) 'anon-whonix-tor-brows...
  • system_tests_dispvm

  • system_tests_kde_gui_interactive

    • gui_keyboard_layout: wait_serial (wait serial expected)
      # wait_serial expected: "echo -e '[Layout]\nLayoutList=us,de' | sud...

    • gui_keyboard_layout: Failed (test died)
      # Test died: command 'test "$(cd ~user;ls e1*)" = "$(qvm-run -p wor...

  • system_tests_qwt_win10_seamless@hw13

    • windows_clipboard_and_filecopy: unnamed test (unknown)
    • windows_clipboard_and_filecopy: Failed (test died)
      # Test died: no candidate needle with tag(s) 'windows-Edge-address-...
  • system_tests_qwt_win11@hw13

    • windows_install: wait_serial (wait serial expected)
      # wait_serial expected: qr/dcWzE-\d+-/...

    • windows_install: Failed (test died + timed out)
      # Test died: command 'script -e -c 'bash -x /usr/bin/qvm-create-win...

  • system_tests_gui_tools@hw7

    • qui_widgets_devices: unnamed test (unknown)
    • qui_widgets_devices: Failed (test died)
      # Test died: no candidate needle with tag(s) 'qui-devices-dev-opene...

Failed tests

18 failures
  • system_tests_whonix

    • whonix_torbrowser: unnamed test (unknown)
    • whonix_torbrowser: Failed (test died)
      # Test died: no candidate needle with tag(s) 'anon-whonix-tor-brows...
  • system_tests_dispvm

  • system_tests_kde_gui_interactive

    • gui_keyboard_layout: wait_serial (wait serial expected)
      # wait_serial expected: "echo -e '[Layout]\nLayoutList=us,de' | sud...

    • gui_keyboard_layout: Failed (test died)
      # Test died: command 'test "$(cd ~user;ls e1*)" = "$(qvm-run -p wor...

  • system_tests_qwt_win10_seamless@hw13

    • windows_clipboard_and_filecopy: unnamed test (unknown)
    • windows_clipboard_and_filecopy: Failed (test died)
      # Test died: no candidate needle with tag(s) 'windows-Edge-address-...
  • system_tests_qwt_win11@hw13

    • windows_install: wait_serial (wait serial expected)
      # wait_serial expected: qr/dcWzE-\d+-/...

    • windows_install: Failed (test died + timed out)
      # Test died: command 'script -e -c 'bash -x /usr/bin/qvm-create-win...

  • system_tests_gui_tools@hw7

    • qui_widgets_devices: unnamed test (unknown)
    • qui_widgets_devices: Failed (test died)
      # Test died: no candidate needle with tag(s) 'qui-devices-dev-opene...

Fixed failures

Compared to: https://openqa.qubes-os.org/tests/132953#dependencies

14 fixed
  • system_tests_whonix

    • whonixcheck: fail (unknown)
      Whonixcheck for sys-whonix failed...

    • whonixcheck: unnamed test (unknown)

  • system_tests_suspend

    • suspend: unnamed test (unknown)
    • suspend: Failed (test died)
      # Test died: no candidate needle with tag(s) 'SUSPEND-FAILED' match...
  • system_tests_basic_vm_qrexec_gui

  • system_tests_qrexec

  • system_tests_kde_gui_interactive

    • clipboard_and_web: unnamed test (unknown)

    • clipboard_and_web: Failed (test died)
      # Test died: no candidate needle with tag(s) 'qubes-website' matche...

    • clipboard_and_web: wait_serial (wait serial expected)
      # wait_serial expected: "lspci; echo 2E8vz-\$?-"...

  • system_tests_guivm_vnc_gui_interactive

    • gui_filecopy: unnamed test (unknown)
    • gui_filecopy: Failed (test died)
      # Test died: no candidate needle with tag(s) 'files-work' matched...
  • system_tests_audio

  • system_tests_whonix@hw7

    • whonixcheck: fail (unknown)
      Whonixcheck for sys-whonix failed...

    • whonixcheck: unnamed test (unknown)

Unstable tests

  • system_tests_update

    update2/Failed (1/5 times with errors)
    • job 139051 # Test died: command 'script -c 'qubes-vm-update --force-update --l...
  • system_tests_update@hw1

    update2/Failed (1/5 times with errors)
    • job 139051 # Test died: command 'script -c 'qubes-vm-update --force-update --l...
  • system_tests_update@hw7

    update2/Failed (1/5 times with errors)
    • job 139051 # Test died: command 'script -c 'qubes-vm-update --force-update --l...
  • system_tests_update@hw13

    update2/Failed (1/5 times with errors)
    • job 139051 # Test died: command 'script -c 'qubes-vm-update --force-update --l...

Performance Tests

Performance degradation:

15 performance degradations
  • debian-12-xfce_exec: 8.56 🔺 ( previous job: 7.12, degradation: 120.17%)
  • fedora-41-xfce_exec-data-duplex: 79.08 🔺 ( previous job: 71.56, degradation: 110.51%)
  • dom0_root_seq1m_q8t1_read 3:read_bandwidth_kb: 214696.00 🔺 ( previous job: 446963.00, degradation: 48.03%)
  • dom0_root_seq1m_q1t1_read 3:read_bandwidth_kb: 224919.00 🔺 ( previous job: 294295.00, degradation: 76.43%)
  • dom0_varlibqubes_seq1m_q8t1_write 3:write_bandwidth_kb: 141171.00 🔺 ( previous job: 250795.00, degradation: 56.29%)
  • dom0_varlibqubes_rnd4k_q1t1_write 3:write_bandwidth_kb: 3093.00 🔺 ( previous job: 4903.00, degradation: 63.08%)
  • fedora-41-xfce_root_seq1m_q1t1_read 3:read_bandwidth_kb: 271230.00 🔺 ( previous job: 318716.00, degradation: 85.10%)
  • fedora-41-xfce_root_seq1m_q1t1_write 3:write_bandwidth_kb: 39245.00 🔺 ( previous job: 87940.00, degradation: 44.63%)
  • fedora-41-xfce_root_rnd4k_q32t1_write 3:write_bandwidth_kb: 2673.00 🔺 ( previous job: 3599.00, degradation: 74.27%)
  • fedora-41-xfce_private_seq1m_q8t1_write 3:write_bandwidth_kb: 143005.00 🔺 ( previous job: 170062.00, degradation: 84.09%)
  • fedora-41-xfce_private_rnd4k_q32t1_write 3:write_bandwidth_kb: 1473.00 🔺 ( previous job: 2215.00, degradation: 66.50%)
  • fedora-41-xfce_private_rnd4k_q1t1_write 3:write_bandwidth_kb: 795.00 🔺 ( previous job: 1130.00, degradation: 70.35%)
  • fedora-41-xfce_volatile_seq1m_q8t1_write 3:write_bandwidth_kb: 120234.00 🔺 ( previous job: 179949.00, degradation: 66.82%)
  • fedora-41-xfce_volatile_rnd4k_q32t1_write 3:write_bandwidth_kb: 2890.00 🔺 ( previous job: 5672.00, degradation: 50.95%)
  • fedora-41-xfce_volatile_rnd4k_q1t1_write 3:write_bandwidth_kb: 1117.00 🔺 ( previous job: 1953.00, degradation: 57.19%)

Remaining performance tests:

41 tests
  • debian-12-xfce_exec-root: 29.75 🔺 ( previous job: 28.65, degradation: 103.83%)
  • debian-12-xfce_socket: 8.19 🟢 ( previous job: 8.60, improvement: 95.16%)
  • debian-12-xfce_socket-root: 7.81 🟢 ( previous job: 8.52, improvement: 91.56%)
  • debian-12-xfce_exec-data-simplex: 76.07 🔺 ( previous job: 71.62, degradation: 106.21%)
  • debian-12-xfce_exec-data-duplex: 74.54 🔺 ( previous job: 70.34, degradation: 105.96%)
  • debian-12-xfce_exec-data-duplex-root: 71.93 🟢 ( previous job: 82.72, improvement: 86.96%)
  • debian-12-xfce_socket-data-duplex: 147.66 🟢 ( previous job: 156.96, improvement: 94.08%)
  • fedora-41-xfce_exec: 9.63 🔺 ( previous job: 9.27, degradation: 103.96%)
  • fedora-41-xfce_exec-root: 59.70 🟢 ( previous job: 61.51, improvement: 97.06%)
  • fedora-41-xfce_socket: 8.37 🟢 ( previous job: 8.63, improvement: 96.95%)
  • fedora-41-xfce_socket-root: 8.76 🔺 ( previous job: 8.71, degradation: 100.58%)
  • fedora-41-xfce_exec-data-simplex: 75.95 🔺 ( previous job: 75.53, degradation: 100.54%)
  • fedora-41-xfce_exec-data-duplex-root: 94.74 🟢 ( previous job: 109.13, improvement: 86.82%)
  • fedora-41-xfce_socket-data-duplex: 153.88 🔺 ( previous job: 150.61, degradation: 102.17%)
  • dom0_root_seq1m_q8t1_write 3:write_bandwidth_kb: 140551.00 🟢 ( previous job: 129298.00, improvement: 108.70%)
  • dom0_root_seq1m_q1t1_write 3:write_bandwidth_kb: 142793.00 🟢 ( previous job: 95454.00, improvement: 149.59%)
  • dom0_root_rnd4k_q32t1_read 3:read_bandwidth_kb: 95102.00 🟢 ( previous job: 79803.00, improvement: 119.17%)
  • dom0_root_rnd4k_q32t1_write 3:write_bandwidth_kb: 7184.00 🟢 ( previous job: 6149.00, improvement: 116.83%)
  • dom0_root_rnd4k_q1t1_read 3:read_bandwidth_kb: 12079.00 🟢 ( previous job: 10795.00, improvement: 111.89%)
  • dom0_root_rnd4k_q1t1_write 3:write_bandwidth_kb: 4642.00 🔺 ( previous job: 4826.00, degradation: 96.19%)
  • dom0_varlibqubes_seq1m_q8t1_read 3:read_bandwidth_kb: 505825.00 🟢 ( previous job: 382273.00, improvement: 132.32%)
  • dom0_varlibqubes_seq1m_q1t1_read 3:read_bandwidth_kb: 443560.00 🟢 ( previous job: 437636.00, improvement: 101.35%)
  • dom0_varlibqubes_seq1m_q1t1_write 3:write_bandwidth_kb: 204187.00 🟢 ( previous job: 184752.00, improvement: 110.52%)
  • dom0_varlibqubes_rnd4k_q32t1_read 3:read_bandwidth_kb: 92496.00 🟢 ( previous job: 62195.00, improvement: 148.72%)
  • dom0_varlibqubes_rnd4k_q32t1_write 3:write_bandwidth_kb: 10526.00 🟢 ( previous job: 6479.00, improvement: 162.46%)
  • dom0_varlibqubes_rnd4k_q1t1_read 3:read_bandwidth_kb: 7759.00 🟢 ( previous job: 7669.00, improvement: 101.17%)
  • fedora-41-xfce_root_seq1m_q8t1_read 3:read_bandwidth_kb: 375295.00 🟢 ( previous job: 368309.00, improvement: 101.90%)
  • fedora-41-xfce_root_seq1m_q8t1_write 3:write_bandwidth_kb: 187051.00 🟢 ( previous job: 162081.00, improvement: 115.41%)
  • fedora-41-xfce_root_rnd4k_q32t1_read 3:read_bandwidth_kb: 86047.00 🟢 ( previous job: 82694.00, improvement: 104.05%)
  • fedora-41-xfce_root_rnd4k_q1t1_read 3:read_bandwidth_kb: 8695.00 🟢 ( previous job: 8485.00, improvement: 102.47%)
  • fedora-41-xfce_root_rnd4k_q1t1_write 3:write_bandwidth_kb: 488.00 🔺 ( previous job: 542.00, degradation: 90.04%)
  • fedora-41-xfce_private_seq1m_q8t1_read 3:read_bandwidth_kb: 377185.00 🟢 ( previous job: 373957.00, improvement: 100.86%)
  • fedora-41-xfce_private_seq1m_q1t1_read 3:read_bandwidth_kb: 343795.00 🟢 ( previous job: 334687.00, improvement: 102.72%)
  • fedora-41-xfce_private_seq1m_q1t1_write 3:write_bandwidth_kb: 87088.00 🟢 ( previous job: 61534.00, improvement: 141.53%)
  • fedora-41-xfce_private_rnd4k_q32t1_read 3:read_bandwidth_kb: 84817.00 🟢 ( previous job: 80283.00, improvement: 105.65%)
  • fedora-41-xfce_private_rnd4k_q1t1_read 3:read_bandwidth_kb: 7690.00 🟢 ( previous job: 7540.00, improvement: 101.99%)
  • fedora-41-xfce_volatile_seq1m_q8t1_read 3:read_bandwidth_kb: 371440.00 🟢 ( previous job: 369868.00, improvement: 100.43%)
  • fedora-41-xfce_volatile_seq1m_q1t1_read 3:read_bandwidth_kb: 311797.00 🔺 ( previous job: 324737.00, degradation: 96.02%)
  • fedora-41-xfce_volatile_seq1m_q1t1_write 3:write_bandwidth_kb: 29757.00 🟢 ( previous job: 17567.00, improvement: 169.39%)
  • fedora-41-xfce_volatile_rnd4k_q32t1_read 3:read_bandwidth_kb: 72455.00 🔺 ( previous job: 79021.00, degradation: 91.69%)
  • fedora-41-xfce_volatile_rnd4k_q1t1_read 3:read_bandwidth_kb: 8489.00 🟢 ( previous job: 7867.00, improvement: 107.91%)

@ben-grande
Contributor Author

ben-grande commented May 22, 2025

https://openqa.qubes-os.org/tests/139759#step/TC_20_DispVM_debian-12-xfce/6
https://openqa.qubes-os.org/tests/139759#step/TC_20_DispVM_fedora-41-xfce/6

Something isn't being cleaned up, and then every dispvm test after it breaks because of that.

Locally, I have only encountered the first issue (which doesn't break the following tests); I have not reproduced the selectors issue locally and will debug.

@marmarek
Member

Something isn't being cleaned up, and then every dispvm test after it breaks because of that.

Yes, some object leaked. We have quite strict checking for that between tests. Outside of tests it's easy to miss: usually it will "just" result in a memory leak that accumulates over time, so you'd notice it only after days or weeks of uptime. The test VM (intentionally) has just 8GB of memory, so issues like this surface earlier even when the leak check misses them. And indeed, it looks like that happened here too:

ERROR:vm.disp8059:Start failed: Not enough memory to start domain 'disp8059'
VM disp8059 start failed at 2025-05-21 21:39:51
WARNING:vm.test-inst-dvm:Not preloading '1' disposable(s) due to insufficient memory
ERROR

See also the terminal log from the test run: https://openqa.qubes-os.org/tests/139759/file/system_tests-tests-qubes.tests.integ.dispvm.log. It looks like preloading keeps running beyond the end of the test, so you may need to extend tearDown of that test class (possibly to interrupt preloading? maybe killing all dispvms being preloaded will be enough?). The AttributeError: 'DispVM' object has no attribute 'app' may look confusing, but it happens because test cleanup had already kicked in while some background job (preloading?) was still accessing VM objects.

Quite possibly you will need to save references to tasks scheduled in the background (cases of ensure_future instead of await), at least so you can wait for them in the tearDown method.
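Keeping strong references to tasks launched with `ensure_future` so tearDown can wait for them could look roughly like the sketch below; `BackgroundTaskTracker`, `schedule`, and `drain` are invented names for illustration, not the actual qubes test-framework API.

```python
import asyncio

class BackgroundTaskTracker:
    """Keep strong references to fire-and-forget tasks so a test's
    tearDown can wait for them instead of leaking them mid-flight."""

    def __init__(self):
        self._tasks = set()

    def schedule(self, coro):
        # asyncio holds only weak references to tasks, so an untracked
        # ensure_future() result can be garbage-collected while running.
        task = asyncio.ensure_future(coro)
        self._tasks.add(task)
        task.add_done_callback(self._tasks.discard)
        return task

    async def drain(self):
        # Call from tearDown: wait for everything still in flight,
        # collecting exceptions so cleanup itself does not blow up.
        await asyncio.gather(*list(self._tasks), return_exceptions=True)
```

A tearDown override would then `await tracker.drain()` (or cancel first) before the framework destroys VM objects, avoiding the confusing `AttributeError` from background jobs touching already-torn-down state.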

@ben-grande
Contributor Author

you may need to extend tearDown

Makes sense. Preloading triggered by setting the feature runs as a future, so I will make sure to wait until every domain that was preloaded is no longer in the VM collection.
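That wait could be a bounded poll of the domain collection; a sketch with invented names (`domains` standing in for the qubes VM collection, `wait_preloaded_gone` is hypothetical):

```python
import asyncio

async def wait_preloaded_gone(domains, preloaded_names,
                              timeout=60.0, interval=0.5):
    """Poll until none of the preloaded disposables remain in the
    collection; fail loudly instead of hanging if cleanup is stuck."""
    loop = asyncio.get_running_loop()
    deadline = loop.time() + timeout
    while any(name in domains for name in preloaded_names):
        if loop.time() > deadline:
            leftover = [n for n in preloaded_names if n in domains]
            raise TimeoutError(
                "preloaded disposables still present: %s"
                % ", ".join(leftover))
        await asyncio.sleep(interval)
```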

See also terminal log from the test run: https://openqa.qubes-os.org/tests/139759/file/system_tests-tests-qubes.tests.integ.dispvm.log

https://openqa.qubes-os.org/tests/139759/#downloads

And now I see I can download the test logs; that is useful.

Failed to create preloaded disposable, limit of preloaded DispVMs reached

Just note that although this raises an exception and clutters the log, it happens because the "used" event is called 5 times simultaneously, and each of those 5 calls tries to preload 5 new disposables. This happens because I changed the "used" event to also refill preloaded disposables, but that doesn't work well when multiple are used at the same time. Nothing actually fails and everything necessary gets preloaded; the exception is ignored because the "used" event is called as a future. The only problem is the log garbage.

I could use an async lock, but that would make preloading slower (think of 1 qube being used and then 4 more being used: waiting for the first to finish preloading before starting to preload the other 4 seems inefficient), or I could handle the exception with a pass, since the code above already checks how many it could preload.

What do you think?
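For what it's worth, the over-preloading race described above comes from checking the limit only after an await has happened. A toy sketch (invented names, not the actual qubes code) of reserving a slot before any await, which keeps preloads parallel without needing a lock:

```python
import asyncio

class PreloadPool:
    """Toy model: concurrent refill() calls never overshoot `limit`,
    because the check-and-reserve happens with no await in between,
    which is atomic on a single-threaded event loop."""

    def __init__(self, limit):
        self.limit = limit
        self.ready = 0      # disposables fully preloaded
        self.in_flight = 0  # preloads currently starting
        self._tasks = set()

    def refill(self):
        while self.ready + self.in_flight < self.limit:
            self.in_flight += 1  # reserve the slot synchronously
            task = asyncio.ensure_future(self._preload_one())
            self._tasks.add(task)
            task.add_done_callback(self._tasks.discard)

    async def _preload_one(self):
        try:
            await asyncio.sleep(0.01)  # stand-in for starting a dispvm
            self.ready += 1
        finally:
            self.in_flight -= 1
```

Five simultaneous "used" events would each call `refill()`, but only the first actually launches preloads; the rest see the limit already reserved and return without logging noise.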

@marmarek
Member

Just note that although this is an exception and clutters the log, it happens because the used event is being simultaneously called 5 times and each of those 5 try to preload 5 new ones.

Maybe change that from exception to a warning log message (and simple return)?
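That variant could be as simple as a guard at the top of the preload path; a sketch with invented names (`may_preload` is hypothetical, not the actual qubes function):

```python
import logging

log = logging.getLogger("vm.dispvm")

def may_preload(ready, in_flight, limit):
    """Return True if another preload may start; on a full pool, log a
    warning and return False instead of raising an exception that only
    clutters the log of fire-and-forget "used" handlers."""
    if ready + in_flight >= limit:
        log.warning(
            "Not preloading: limit of %d preloaded DispVM(s) reached",
            limit)
        return False
    return True
```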

@ben-grande
Contributor Author

Maybe change that from exception to a warning log message (and simple return)?

Also possible; I will take that route then.

@marmarek marmarek merged commit fc01331 into QubesOS:main May 25, 2025
4 of 6 checks passed
