
adding model and dataset flags to harness #18

Open
stamcenter wants to merge 5 commits into fhe-benchmarking:main from stamcenter:harness-add-params

Conversation


stamcenter commented Mar 19, 2026

This pull request extends the harness interface by introducing two new parameters:
--model
--dataset

These parameters let the harness dynamically select the model and dataset used during submission execution. The PR also moves the fault MLP model provided in the submission folder into the submissions folder for testing.
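A minimal sketch of how the two new flags might be wired up with argparse (only the flag names `--model` and `--dataset` come from the PR description; the defaults and help strings here are assumptions):

```python
import argparse

def parse_args(argv=None):
    # Hypothetical harness CLI sketch: --model and --dataset mirror the PR,
    # the default values are assumptions for illustration.
    parser = argparse.ArgumentParser(description="FHE benchmarking harness")
    parser.add_argument("--model", default="mlp",
                        help="model subdirectory to run from the submissions folder")
    parser.add_argument("--dataset", default="mnist",
                        help="dataset to load for submission execution")
    return parser.parse_args(argv)
```

Passing `argv` explicitly keeps the function testable without touching `sys.argv`.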

stamcenter changed the title from "Harness add params" to "adding model and dataset flags to harness" on Mar 19, 2026
andreea-alexandru (Contributor) left a comment:


Looks good to me, I just left a few minor comments.

labels_file=LABELS_PATH,
num_samples=num_samples,
seed=seed)
if dataset_name == "mnist":

Turn this into a switch statement in preparation for multiple models?
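Python's analogue of a switch here is either a `match` statement or a loader registry; a sketch of the dict-based version (the loader names and return values are placeholders, not the harness's real API):

```python
def select_dataset_loader(dataset_name):
    # Placeholder registry: a real harness would map names to actual
    # dataset-loading callables, not strings.
    loaders = {
        "mnist": lambda: "mnist data",
        # future datasets slot in here, e.g. "cifar10": load_cifar10,
    }
    try:
        return loaders[dataset_name]
    except KeyError:
        raise ValueError(f"unsupported dataset: {dataset_name!r}")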

- exec_dir = params.rootdir / ("submission_remote" if remote_be else "submission")
+ exec_dir = params.rootdir / ("submission_remote" if remote_be else "submissions")

# check whether the exec_dir contains a subdirector equals to the model name.

typo: "subdirector" should be "subdirectory" (and "equals to" should be "equal to").

@@ -28,20 +28,26 @@ def main():


Update the README to reflect the new python3 harness/run_submission.py -h.

"total_latency_ms": round(sum(_timestamps.values()), 4),
"per_stage": _timestampsStr,
"bandwidth": _bandwidth,
"mnist_model_quality" : _model_quality,

Just "model_quality"?
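If the key is made dataset-agnostic as suggested, the results dict might look like this (dummy values for illustration; the variable names mirror the snippet above):

```python
_timestamps = {"encrypt": 1.2, "eval": 3.4}  # dummy per-stage timings (ms)
_model_quality = 0.97                        # dummy accuracy score

results = {
    "total_latency_ms": round(sum(_timestamps.values()), 4),
    "model_quality": _model_quality,  # generic key instead of "mnist_model_quality"
}
```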

#include <chrono>
#include <fstream>
#include <iomanip>
#include <nlohmann/json.hpp>

It doesn't seem like anything relevant changed in build_task.sh, but now I get a compilation error here:
fatal error: nlohmann/json.hpp: No such file or directory
24 | #include <nlohmann/json.hpp>
