Adds visual-based tactile sensor with shape sensing example #3420

@@ -69,3 +69,5 @@ tests/
# Docker history
.isaac-lab-docker-history

**/tactile_record/*

@@ -190,6 +190,8 @@
    "nvidia.srl",
    "flatdict",
    "IPython",
    "cv2",
    "imageio",
    "ipywidgets",
    "mpl_toolkits",
]

@@ -0,0 +1,195 @@

.. _overview_sensors_tactile:

.. currentmodule:: isaaclab

Visuo-Tactile Sensor
====================

The visuo-tactile sensor in Isaac Lab provides realistic tactile feedback through integration with TacSL (Tactile Sensor Learning) [Akinola2025]_. It is designed to simulate high-fidelity tactile interactions, generating both visual and force-based data that mirror real-world tactile sensors like GelSight devices. The sensor can provide tactile RGB images, force field distributions, and other intermediate tactile measurements essential for robotic manipulation tasks requiring fine tactile feedback.

.. figure:: ../../../_static/overview/sensors/tacsl_diagram.jpg
    :align: center
    :figwidth: 100%
    :alt: Tactile sensor with RGB visualization and force fields

Configuration
~~~~~~~~~~~~~

Tactile sensors require specific configuration parameters to define their behavior and data collection properties. The sensor can be configured with various parameters including sensor resolution, force sensitivity, and output data types.

.. code-block:: python

    import isaaclab.sim as sim_utils
    from isaaclab.sensors import TiledCameraCfg  # used for the camera_cfg below
    from isaaclab.sensors.tacsl_sensor import VisuoTactileSensorCfg
    from isaaclab_assets.sensors import GELSIGHT_R15_CFG

    # Tactile sensor configuration
    tactile_sensor = VisuoTactileSensorCfg(
        prim_path="{ENV_REGEX_NS}/Robot/tactile_sensor",
        ## Sensor configuration
        render_cfg=GELSIGHT_R15_CFG,
        enable_camera_tactile=True,
        enable_force_field=True,
        ## Elastomer configuration
        elastomer_rigid_body="elastomer",
        ## Force field configuration
        num_tactile_rows=20,
        num_tactile_cols=25,
        tactile_margin=0.003,
        ## Indenter configuration (will be set based on indenter type)
        indenter_rigid_body="indenter",
        indenter_sdf_mesh="factory_nut_loose/collisions",
        ## Force field physics parameters
        tactile_kn=1.0,
        tactile_mu=2.0,
        tactile_kt=0.1,
        ## Camera configuration
        camera_cfg=TiledCameraCfg(
            prim_path="{ENV_REGEX_NS}/Robot/elastomer_tip/cam",
            update_period=1 / 60,  # 60 Hz
            height=320,
            width=240,
            data_types=["distance_to_image_plane"],
            spawn=None,  # camera already spawned in USD file
        ),
    )

The configuration supports customization of:

* **Render Configuration**: Specify the GelSight sensor rendering parameters using predefined configs
  (e.g., ``GELSIGHT_R15_CFG``, ``GELSIGHT_MINI_CFG`` from ``isaaclab_assets.sensors``)
* **Tactile Modalities**:

  * ``enable_camera_tactile`` - Enable tactile RGB imaging through camera sensors
  * ``enable_force_field`` - Enable force field computation and visualization

* **Force Field Grid**: Set the tactile grid dimensions (``num_tactile_rows``, ``num_tactile_cols``) and margins, which directly affect the spatial resolution of the computed force field
* **Indenter Configuration**: Define properties of interacting objects, including the rigid body name and collision mesh
* **Physics Parameters**: Control the sensor's force field computation:

  * ``tactile_kn``, ``tactile_mu``, ``tactile_kt`` - Normal stiffness, friction coefficient, and tangential stiffness

* **Camera Settings**: Configure resolution, focal length, update rates, and 6-DOF positioning relative to the sensor

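As a small illustration of these options, the sketch below derives a second sensor configuration from the one above, swapping in the GelSight Mini render config and a denser force-field grid. The grid values are illustrative rather than tuned defaults.

.. code-block:: python

    import copy

    from isaaclab_assets.sensors import GELSIGHT_MINI_CFG

    # Start from the configuration above and change only the render config
    # and the force-field grid resolution.
    tactile_sensor_mini = copy.deepcopy(tactile_sensor)
    tactile_sensor_mini.render_cfg = GELSIGHT_MINI_CFG
    tactile_sensor_mini.num_tactile_rows = 30  # illustrative grid size
    tactile_sensor_mini.num_tactile_cols = 40  # illustrative grid size
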
Configuration Requirements
~~~~~~~~~~~~~~~~~~~~~~~~~~

.. important::
    The following requirements must be satisfied for proper sensor operation:

    **Camera Tactile Imaging**
        If ``enable_camera_tactile=True``, a valid ``camera_cfg`` (``TiledCameraCfg``) must be provided with appropriate camera parameters.

    **Force Field Computation**
        If ``enable_force_field=True``, the following parameters are required:

        * ``indenter_rigid_body`` - The specific rigid body within the interacting actor
        * ``indenter_sdf_mesh`` - The collision mesh used for SDF computation
        * ``elastomer_rigid_body`` - The elastomer rigid body; this is required to track the elastomer's pose and velocity

    **SDF Computation**
        When force field computation is enabled, penalty-based normal and shear forces are computed using Signed Distance Field (SDF) queries. To achieve GPU acceleration:

        * Interacting objects should have SDF collision meshes
        * An ``SDFView`` must be defined during initialization, so interacting objects should be specified before the simulation starts

    **Elastomer Configuration**
        Elastomer properties (``elastomer_rigid_body``, ``elastomer_tip_link_name``) must match the robot model where the sensor is attached.

    **Physics Materials**
        The sensor uses physics materials to configure the compliant contact properties of the elastomer.
        By default, physics material properties are pre-configured in the USD asset. However, you can override
        these properties by specifying the following parameters in ``UsdFileWithPhysicsMaterialOnPrimsCfg`` when
        spawning the robot:

        * ``compliant_contact_stiffness`` - Contact stiffness for the elastomer surface
        * ``compliant_contact_damping`` - Contact damping for the elastomer surface
        * ``apply_physics_material_prim_path`` - Prim path where the physics material is applied (typically ``"elastomer/collisions"``)

        If any parameter is set to ``None``, the corresponding property from the USD asset will be retained.

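The snippet below is a sketch of such an override when spawning the robot asset. The parameter names follow the list above, while the import location, the ``usd_path`` argument (assuming the config extends the usual USD-file spawner), and the stiffness/damping values are assumptions to adapt to your setup.

.. code-block:: python

    # Assumed import location for the spawner config; adjust it to wherever
    # UsdFileWithPhysicsMaterialOnPrimsCfg is defined in your Isaac Lab checkout.
    from isaaclab.sim.spawners import UsdFileWithPhysicsMaterialOnPrimsCfg

    robot_spawn_cfg = UsdFileWithPhysicsMaterialOnPrimsCfg(
        usd_path="/path/to/robot_with_gelsight_r15.usd",  # hypothetical asset path
        # Override the compliant-contact material on the elastomer collision prim.
        compliant_contact_stiffness=1.0e5,  # illustrative value
        compliant_contact_damping=1.0e2,    # illustrative value
        apply_physics_material_prim_path="elastomer/collisions",
    )
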
Usage Example
~~~~~~~~~~~~~

To use the tactile sensor in a simulation environment, run the demo:

.. code-block:: bash

    cd scripts/demos/sensors/tacsl
    python tacsl_example.py --use_tactile_rgb --use_tactile_ff --indenter_type nut --num_envs 16 --save_viz

Available command-line options include:

* ``--use_tactile_rgb``: Enable camera-based tactile sensing
* ``--use_tactile_ff``: Enable force field tactile sensing
* ``--indenter_type``: Specify the type of indenter object (nut, cube, etc.)
* ``--num_envs``: Number of parallel environments
* ``--save_viz``: Save visualization outputs for analysis

For a complete list of available options:

.. code-block:: bash

    python tacsl_example.py -h

.. note::
    The demo examples are based on the GelSight R1.5, a prototype sensor that has since been discontinued. The same procedure can be adapted to other visuo-tactile sensors.

.. figure:: ../../../_static/overview/sensors/tacsl_demo.jpg
    :align: center
    :figwidth: 100%
    :alt: TacSL tactile sensor demo showing RGB tactile images and force field visualizations

The tactile sensor supports multiple data modalities that provide comprehensive information about contact interactions:

Output Tactile Data
~~~~~~~~~~~~~~~~~~~

**RGB Tactile Images**
    Real-time generation of tactile RGB images as objects make contact with the sensor surface. These images show deformation patterns and contact geometry similar to gel-based tactile sensors [Si2022]_.

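Assuming ``tactile_rgb_image`` is a per-environment image tensor of shape ``(num_envs, H, W, 3)``, a minimal sketch for writing one frame to disk with ``imageio`` (listed in the dependency diff above) could look like the following; verify the shape, dtype, and value range against your build before reusing it.

.. code-block:: python

    import imageio
    import numpy as np

    def save_tactile_rgb(scene, path="tactile_rgb.png", env_id=0):
        """Write the tactile RGB image of one environment to disk (sketch)."""
        rgb = scene["tactile_sensor"].data.tactile_rgb_image[env_id]
        rgb = rgb.detach().cpu().numpy()
        if rgb.dtype != np.uint8:
            # Assume float values in [0, 1] and convert to 8-bit for writing.
            rgb = (np.clip(rgb, 0.0, 1.0) * 255.0).astype(np.uint8)
        imageio.imwrite(path, rgb)
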
**Force Fields**
    Detailed contact force field and pressure distributions across the sensor surface, including normal and shear components.

.. list-table::
    :widths: 50 50
    :class: borderless

    * - .. figure:: ../../../_static/overview/sensors/tacsl_taxim_example.jpg
            :align: center
            :figwidth: 80%
            :alt: Tactile output with RGB visualization

      - .. figure:: ../../../_static/overview/sensors/tacsl_force_field_example.jpg
            :align: center
            :figwidth: 80%
            :alt: Tactile output with force field visualization

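A hedged sketch of inspecting the force-field output with Matplotlib is shown below. It assumes the normal-force tensor has shape ``(num_envs, rows, cols)`` and the shear-force tensor has shape ``(num_envs, rows, cols, 2)``; adjust the indexing if your build lays the buffers out differently.

.. code-block:: python

    import matplotlib.pyplot as plt

    def plot_force_field(scene, env_id=0):
        """Visualize normal and shear force fields for one environment (sketch)."""
        data = scene["tactile_sensor"].data
        normal = data.tactile_normal_force[env_id].detach().cpu().numpy()
        shear = data.tactile_shear_force[env_id].detach().cpu().numpy()

        fig, (ax0, ax1) = plt.subplots(1, 2, figsize=(8, 4))
        # Normal (pressure-like) component as a heatmap over the tactile grid.
        im = ax0.imshow(normal, cmap="viridis")
        fig.colorbar(im, ax=ax0)
        ax0.set_title("Normal force")
        # Tangential (shear) components as a quiver plot.
        ax1.quiver(shear[..., 0], shear[..., 1])
        ax1.set_title("Shear force")
        plt.show()
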
Integration with Learning Frameworks
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The tactile sensor is designed to integrate seamlessly with reinforcement learning and imitation learning frameworks. The structured tensor outputs can be directly used as observations in learning algorithms:

.. code-block:: python

    def get_tactile_observations(self):
        """Extract tactile observations for learning."""
        tactile_data = self.scene["tactile_sensor"].data

        # tactile RGB image
        tactile_rgb = tactile_data.tactile_rgb_image

        # force field
        tactile_normal_force = tactile_data.tactile_normal_force
        tactile_shear_force = tactile_data.tactile_shear_force

        return [tactile_rgb, tactile_normal_force, tactile_shear_force]

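For instance, a minimal sketch of packing these outputs into a single flat observation tensor per environment (the helper name and the flattening scheme are illustrative, not part of the sensor API):

.. code-block:: python

    import torch

    def flatten_tactile_obs(tactile_rgb, tactile_normal_force, tactile_shear_force):
        """Concatenate tactile outputs into a (num_envs, obs_dim) tensor (sketch)."""
        num_envs = tactile_rgb.shape[0]
        parts = [
            tactile_rgb.reshape(num_envs, -1).float(),
            tactile_normal_force.reshape(num_envs, -1).float(),
            tactile_shear_force.reshape(num_envs, -1).float(),
        ]
        return torch.cat(parts, dim=-1)
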
References
~~~~~~~~~~

.. [Akinola2025] Akinola, I., Xu, J., Carius, J., Fox, D., & Narang, Y. (2025). TacSL: A library for visuotactile sensor simulation and learning. *IEEE Transactions on Robotics*.
.. [Si2022] Si, Z., & Yuan, W. (2022). Taxim: An example-based simulation model for GelSight tactile sensors. *IEEE Robotics and Automation Letters*, 7(2), 2361-2368.