Description
Add the ability to load offline models, so that users can still run their projects with spotty internet access or from parts of the world where access to online models is restricted (e.g., China).
See prior discussion in ml5js/ml5-library#1254.
Notably, now that we're using TensorFlow.js version ^4.2.0, we should be able to specify where the model is loaded from. To quote Joey's reply in the issue mentioned above:
> If we do end up updating our tf versions to some of the more recent versions, then it looks like in the latest face-landmarks-detection lib we can specify where our model files should be loaded from -- https://github.com/tensorflow/tfjs-models/tree/master/face-landmarks-detection/src/mediapipe#create-a-detector -- which, in this case, would be somewhere on a local server.
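A minimal sketch of what that could look like, based on the `createDetector` config described in the linked tfjs-models README. The local `/models/face_mesh` path is hypothetical -- it would be wherever the project (or the user) serves the MediaPipe FaceMesh assets from:

```js
import * as faceLandmarksDetection from '@tensorflow-models/face-landmarks-detection';

// Point the detector at locally hosted model files instead of the default CDN.
// '/models/face_mesh' is a placeholder for wherever the assets are served from.
const detector = await faceLandmarksDetection.createDetector(
  faceLandmarksDetection.SupportedModels.MediaPipeFaceMesh,
  {
    runtime: 'mediapipe',
    solutionPath: '/models/face_mesh', // local path instead of the hosted solution
  }
);
```

On the ml5 side, we'd presumably expose something like this path as a model-loading option so users can opt into fully offline operation.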