In January, I set up a Serverless vLLM endpoint on Runpod using allenai/olmOCR-2-7B-1025-FP8 as the model, and it deployed and worked. However, when I follow the same steps today, it fails.
Clicking on the endpoint doesn't open its information page, so I can't see what is failing. When I deploy the agent with the model as the parameter, I'm redirected to a 404 page instead of the endpoint's info page:
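Since the console page 404s, one way I can still probe the endpoint is through Runpod's serverless REST API, which exposes a `/health` route alongside `/run`. This is a minimal sketch; the endpoint ID and API key below are placeholders, not real values:

```python
import json
import urllib.request


def health_url(endpoint_id: str) -> str:
    """Build the serverless health-check URL for a Runpod endpoint."""
    return f"https://api.runpod.ai/v2/{endpoint_id}/health"


def check_health(endpoint_id: str, api_key: str) -> dict:
    """Query worker/job counts even when the console page won't load."""
    req = urllib.request.Request(
        health_url(endpoint_id),
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)


# Usage (with your own endpoint ID and API key):
# print(check_health("your-endpoint-id", "your-runpod-api-key"))
```

If this call also fails (rather than returning worker counts), that would suggest the endpoint itself never came up, not just the console page.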
