{{% expand "Want more information about model framework and hardware support for each ML model service? Click here." %}}
Viam currently supports the following frameworks:
For some models of the ML model service, like the [Triton ML model service](https://github.com/viamrobotics/viam-mlmodelservice-triton/) for Jetson boards, you can configure the service to use either the available CPU or a dedicated GPU.
{{< /alert >}}
{{% /expand %}}
To deploy a model, click **Select model** and select the model from your organization or the registry.
Save your config.
### Models available to deploy on the ML model service
You can search the machine learning models that are available to deploy on an ML model service from the registry here:
{{<mlmodels>}}
Enter **JSON** mode and find the `"packages"` section of your config.
Replace `"version": "latest"` with `"version"` from the package reference you just copied, for example `"version": "2024-11-14T15-05-26"`.
Save your config to use your specified version of the ML model.
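As a sketch, the resulting `"packages"` entry might look like the following. The organization ID and model name here are placeholders; use the package reference you copied from the registry:

```json
{
  "packages": [
    {
      "name": "my-model",
      "package": "my-org-id/my-model",
      "type": "ml_model",
      "version": "2024-11-14T15-05-26"
    }
  ]
}
```

With an explicit timestamped `"version"` like this, the machine continues to use that exact model version instead of pulling the latest one.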
## How the ML model service works
The service works with models trained inside and outside the Viam app:
- You can [train TFLite models](/data-ai/ai/train-tflite/) or [models in other frameworks](/data-ai/ai/train/) on data from your machines.
- You can use [ML models](https://app.viam.com/registry?type=ML+Model) from the [Viam Registry](https://app.viam.com/registry).
- You can upload externally trained models from a model file on the [**MODELS** tab](https://app.viam.com/data/models) in the **DATA** section of the Viam app.
- You can use a [model](/data-ai/ai/deploy/#deploy-your-ml-model) trained outside the Viam platform whose files are on your machine. See the documentation of the ML model service model you're using (pick one that supports your model framework) for instructions.
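For instance, a model file already on your machine might be referenced with a service configuration along these lines. This is a hedged sketch: the exact attribute names depend on which ML model service you chose, and the service name and file paths are placeholders:

```json
{
  "name": "my-mlmodel-service",
  "type": "mlmodel",
  "model": "tflite_cpu",
  "attributes": {
    "model_path": "/path/to/model.tflite",
    "label_path": "/path/to/labels.txt"
  }
}
```

Consult the chosen service's documentation for the attributes it actually supports.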
On its own, the ML model service only runs the model.
After deploying your model, you need to configure an additional service to use the deployed model.
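For example, a vision service can consume the deployed model for inference. A minimal sketch, assuming an ML model service named `my-mlmodel-service` (both names here are placeholders):

```json
{
  "name": "my-vision-service",
  "type": "vision",
  "model": "mlmodel",
  "attributes": {
    "mlmodel_name": "my-mlmodel-service"
  }
}
```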
---

**docs/data-ai/ai/train-tflite.md**
Follow the guide to [create a dataset](/data-ai/ai/create-dataset/) if you haven't already.
{{% /expand%}}
## Train a machine learning (ML) model
Now that you have a dataset with your labeled images, you are ready to train a machine learning model.
Once your model has finished training, you can test it.
Ideally, you want your ML model to be able to work with a high level of confidence.
As you test it, if you notice faulty predictions or confidence scores, you will need to adjust your dataset and retrain your model.
If you trained a _classification_ model, you can test it with the following instructions.
1. Navigate to the [**DATA** tab](https://app.viam.com/data/view) and click on the **Images** subtab.
1. Click on an image to open the side menu, and select the **Actions** tab.
You can test both detection models and classifier models using the following resources:
## Next steps
Now your machine can make inferences about its environment.
The next step is to [deploy](/data-ai/ai/deploy/) the ML model and then [act](/data-ai/ai/act/) or [alert](/data-ai/ai/alert/) based on these inferences.
See the following tutorials for examples of using machine learning models to make your machine do things based on its inferences about its environment: