HSLU CaptureStudio: A Fast Volumetric Capture and Reconstruction Pipeline for Dynamic Point Clouds and Gaussian Splats
This is the official code repository for the paper:
A Fast Volumetric Capture and Reconstruction Pipeline for Dynamic Point Clouds and Gaussian Splats
Athanasios Charisoudis, Simone Croci, Lam Kit Yung, Pascal Frossard, Aljosa Smolic
European Conference on Visual Media Production (CVMP ’25)
DOI: 10.1145/3756863.3769713
Project page: https://irc-hslu.github.io/capturestudio
Paper: https://doi.org/10.1145/3756863.3769713
Status of the code for the HSLU CaptureStudio pipeline (capture, reconstruction, and export of dynamic point clouds and Gaussian splats):

- [x] cleaned up and modularized
- [x] documented
- [x] prepared for a public release
- [x] initial version out (v1)
- [ ] incorporate v2 improvements (in progress)
- [ ] complete requirements.txt
- [ ] scripts for reconstruction and export
If you urgently need access to parts of the code, or have specific research / industry use cases, feel free to reach out:
Athanasios Charisoudis
Lucerne University of Applied Sciences and Arts (HSLU)
📧 athanasios.charisoudis@hslu.ch
I’m happy to discuss:
- early access to parts of the pipeline (where possible)
- collaborations
- reproducibility questions
- integration into your own projects
The repository is organized as follows:
- `docs/`: Project website (based on the aifolio template)
- `recon-viewer/`: Interactive viewer for reconstructed dynamic point clouds and Gaussian splats
- `src/`: Main source code for the HSLU CaptureStudio pipeline
- Install Celery system-wide:

  ```bash
  sudo apt update
  sudo apt install -y celery
  ```

- Install the Python dependencies:

  ```bash
  python -m pip install celery redis
  ```

- Open 3 terminals and run one of the following commands in each (after `cd`ing to the `src` directory):

  Start 12 CPU workers:

  ```bash
  celery -A tasks worker --loglevel=INFO --concurrency=12 --max-tasks-per-child=1 -Q cpu --hostname=cpu@%h
  ```

  Start 2 GPU workers:

  ```bash
  celery -A tasks worker --loglevel=INFO --concurrency=2 --max-tasks-per-child=1 -Q gpu --hostname=gpu@%h
  ```

  Start the Flower monitoring tool:

  ```bash
  celery -A tasks flower --port=5555
  ```
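The `cpu` and `gpu` queues that the workers consume from are declared on the Celery app side. The following is a minimal configuration sketch only: the task names are hypothetical placeholders (the real ones live in the `tasks` module under `src/`), and the broker/backend URLs assume the local RabbitMQ and Redis instances used in these steps.

```python
# Hypothetical celeryconfig.py sketch -- task names are placeholders; the
# actual tasks module in src/ defines the real ones.
broker_url = "amqp://guest:guest@localhost:5672//"   # local RabbitMQ broker
result_backend = "redis://localhost:6379/0"          # local Redis result backend

# Route CPU-bound and GPU-bound tasks to the matching worker queues,
# so the -Q cpu / -Q gpu workers above each pick up the right work.
task_routes = {
    "tasks.extract_images": {"queue": "cpu"},
    "tasks.reconstruct_frames": {"queue": "gpu"},
}

# Conservative settings that pair well with --max-tasks-per-child=1:
# acknowledge a task only after it finishes, and prefetch one at a time.
task_acks_late = True
worker_prefetch_multiplier = 1
```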
- Run the submission script to start the tasks:

  ```bash
  python src/_misc/submission_scripts/apr_may_2025.py
  ```

  Edit `src/_misc/submission_scripts/apr_may_2025.py` to specify the performances that you want to run the tasks on. See also the other scripts in `src/_misc/submission_scripts/`.
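As a rough sketch of what such a submission script does (this is illustrative only, not the actual script; the task names and helper are hypothetical, and the real scripts submit work via the Celery app):

```python
# Illustrative sketch of a submission script. The task names below are
# placeholders; the real scripts in src/_misc/submission_scripts/ submit
# the pipeline's actual tasks, e.g. via app.send_task(...) or task.delay(...).
PERFORMANCES = ["performance_a", "performance_b"]  # edit: performances to process

def build_jobs(performances):
    """Pair each performance with the queue-routed tasks to submit."""
    jobs = []
    for perf in performances:
        jobs.append(("tasks.extract_images", perf))      # CPU-queue work
        jobs.append(("tasks.reconstruct_frames", perf))  # GPU-queue work
    return jobs

if __name__ == "__main__":
    for task_name, perf in build_jobs(PERFORMANCES):
        # A real script would submit here, e.g. app.send_task(task_name, args=[perf])
        print(f"would submit {task_name}({perf!r})")
```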
- Open your browser and go to `http://localhost:5555` to access the Flower monitoring tool.
- Handling failed tasks: if any tasks fail, you can retry them by restarting the Celery workers and re-running the submission script (successfully finished tasks are not redone). To do that, first clear the queues by running the following from the `src` directory:

  ```bash
  sudo rabbitmqctl purge_queue cpu && sudo rabbitmqctl purge_queue gpu && redis-cli flushdb && celery -A tasks control shutdown
  ```
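If you want to check for failures programmatically before clearing the queues, Flower exposes a small REST API (`GET /api/tasks` returns a JSON object mapping task IDs to task info, including a `state` field). A hedged sketch, assuming the default port from the steps above; this helper is illustrative and not part of the repository:

```python
# Illustrative helper (not part of the repo): list failed task IDs via
# Flower's REST API, which serves task info at GET /api/tasks.
import json
from urllib.request import urlopen

FLOWER_URL = "http://localhost:5555"  # assumption: port used in the steps above

def failed_task_ids(tasks):
    """Given the dict returned by /api/tasks, return IDs of FAILURE tasks."""
    return [tid for tid, info in tasks.items() if info.get("state") == "FAILURE"]

def fetch_tasks(base_url=FLOWER_URL):
    """Fetch the task map from a running Flower instance."""
    with urlopen(f"{base_url}/api/tasks") as resp:
        return json.load(resp)
```

Usage (with Flower running): `failed_task_ids(fetch_tasks())` returns the IDs of tasks in the `FAILURE` state, so you can decide whether a retry run is needed.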
Please note that this project is under active development, and the code may change frequently. If you encounter any issues or have suggestions, feel free to open an issue on the project's GitHub repository.
If you use this work in your research, please cite:
@inproceedings{10.1145/3756863.3769713,
author = {Charisoudis, Athanasios and Croci, Simone and Lam, Kit Yung and Frossard, Pascal and Smolic, Aljosa},
title = {A Fast Volumetric Capture and Reconstruction Pipeline for Dynamic Point Clouds and Gaussian Splats},
year = {2025},
isbn = {9798400721175},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3756863.3769713},
doi = {10.1145/3756863.3769713},
booktitle = {Proceedings of the 22nd ACM SIGGRAPH European Conference on Visual Media Production},
articleno = {9},
numpages = {11},
keywords = {Volumetric video capture, point clouds, Gaussian splats, dynamic reconstruction},
series = {CVMP '25}
}