vllm-cpu-release Container Remains Running after test completes #91

@jharriga

Description

The DUT continues to run the vLLM container after the test completes.

Invocation:

ansible-playbook -v -i inventory/hosts.yml llm-benchmark-auto.yml \
  -e "test_model=TinyLlama/TinyLlama-1.1B-Chat-v1.0" \
  -e "workload_type=chat" \
  -e "requested_cores=16"

Final Ansible messages:

PLAY [Auto-Configured LLM Test - Optional Cleanup] *****************************

TASK [Gathering Facts] *********************************************************
ok: [vllm-server]

TASK [Stop and remove vLLM container] ******************************************
skipping: [vllm-server] => {"changed": false, "false_condition": "cleanup_after_test | default(false) | bool", "skip_reason": "Conditional result was False"}

TASK [Display cleanup status] **************************************************
skipping: [vllm-server] => {"false_condition": "cleanup_after_test | default(false) | bool"}

PLAY RECAP *********************************************************************
guidellm-client : ok=41 changed=7 unreachable=0 failed=0 skipped=22 rescued=0 ignored=0
localhost : ok=8 changed=0 unreachable=0 failed=0 skipped=4 rescued=0 ignored=0
vllm-server : ok=76 changed=6 unreachable=0 failed=0 skipped=22 rescued=0 ignored=0

dut# podman ps
CONTAINER ID  IMAGE                                   COMMAND               CREATED            STATUS            PORTS  NAMES
e541b9771546  docker.io/vllm/vllm-openai-cpu:v0.18.0  --model --host 0....  About an hour ago  Up About an hour         vllm-server
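The skip messages above show the cleanup play is gated on `cleanup_after_test | default(false) | bool`, so the container is left running by design unless that variable is set. A possible workaround, assuming the playbook honors the variable named in the skip message and the container name `vllm-server` from the `podman ps` output:

```shell
# Re-run with cleanup enabled so the final play stops and removes the
# container (variable name taken from the skip_reason in the log above):
ansible-playbook -v -i inventory/hosts.yml llm-benchmark-auto.yml \
  -e "test_model=TinyLlama/TinyLlama-1.1B-Chat-v1.0" \
  -e "workload_type=chat" \
  -e "requested_cores=16" \
  -e "cleanup_after_test=true"

# Or remove the leftover container manually on the DUT;
# -f stops a running container before removing it:
podman rm -f vllm-server
```

If the intent is for cleanup to happen by default, flipping the `default(false)` in the playbook's conditional to `default(true)` would also resolve this.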
