Raising this issue again. Everything else works great, but this undermines the project, since viewing model logs for debugging (without having to read the Pod's logs or run any additional services) is a crucial feature for small ML teams.
Currently, if a user specifies a model format with a custom name in the inference service manifest and deploys it, the Models UI on Kubeflow becomes dysfunctional and does not show any model deployed under the same namespace. This has been mentioned in #46, but there has been no update.

For example, this happens with a `ClusterServingRuntime` that defines a custom model format name. This is because `PredictorType` is currently hardcoded as an Enum (see here), but the `getPredictorType` function returns the name of the model format as an Enum value directly instead of first checking whether it is a member of the Enum. TypeScript does not raise an error by default when this happens and will return `undefined`, which breaks all downstream tasks, and the web app's logs show nothing related to the failure.

Suggestion:
Check whether the model format is a member of the `PredictorType` Enum and return `PredictorType.Custom` if it is not (similar to what we do with old predictor formats from before KServe 0.7). I believe #47 has already attempted to fix this problem, but I can also make a PR for it.

Question:
How should we make `PredictorType` more flexible for custom model types? Say a user has custom model types "A", "B", and "C". Even with the above suggestion, the Models UI will still show `custom` for all three.
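A minimal TypeScript sketch of both the failure mode and the suggested membership check. This is not the web app's actual code: the enum values, the display-name map, and everything except the `getPredictorType` idea are illustrative assumptions.

```typescript
// Hypothetical enum standing in for the web app's hardcoded PredictorType.
enum PredictorType {
  Tensorflow = "tensorflow",
  Sklearn = "sklearn",
  Custom = "custom",
}

// Buggy pattern: casting an arbitrary string to the enum type compiles,
// but the resulting value may not be an actual enum member.
function getPredictorTypeUnchecked(formatName: string): PredictorType {
  return formatName as PredictorType;
}

// Suggested fix: check enum membership and fall back to Custom.
function getPredictorTypeChecked(formatName: string): PredictorType {
  const known = Object.values(PredictorType) as string[];
  return known.includes(formatName)
    ? (formatName as PredictorType)
    : PredictorType.Custom;
}

// Illustrative downstream lookup keyed on enum members: it silently
// yields undefined for the unchecked value, matching the blank-UI symptom.
const displayNames: { [key in PredictorType]?: string } = {
  [PredictorType.Tensorflow]: "TensorFlow",
  [PredictorType.Sklearn]: "Scikit-learn",
  [PredictorType.Custom]: "Custom",
};

console.log(displayNames[getPredictorTypeUnchecked("my-format")]); // undefined
console.log(displayNames[getPredictorTypeChecked("my-format")]); // "Custom"
console.log(getPredictorTypeChecked("sklearn")); // "sklearn"
```

Note that this fallback answers the first part but not the flexibility question: with it, "A", "B", and "C" all collapse to `custom`, so distinguishing them would require carrying the raw format name alongside the enum.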