Adds visual-based tactile sensor with shape sensing example #3420
Conversation
clean up license and configs; use full import path; add doc for visuo_tactile_sensor
```rst
References
~~~~~~~~~~
```
@iakinola23 Could you help view this documentation here? Thanks!
Hi @iakinola23 updated the documentation with your edits.
@kellyguo11 Hi Kelly, I will upload the npz binaries, and the USD files to nucleus once the licensing concerns are resolved. Thanks!
just a few high-level comments, haven't gone through the code in detail yet
```python
taxim_gelsight = gelsightRender("gelsight_r15")
import ipdb

ipdb.set_trace()
```
why is there a breakpoint here?
deleted in the newest commit.
```python
self._camera_sensor = TiledCamera(self.cfg.camera_cfg)

# Initialize camera if not already done
# TODO: Juana: this is a hack to initialize the camera sensor. Should camera_sensor be managed by TacSL sensor or InteractiveScene?
```
maybe we can do something similar as the Raycaster camera?
Hi Kelly, I checked the RayCasterCamera implementation, and it differs slightly from the tacsl sensor logic. The tacsl_sensor includes a camera sensor that can be enabled or disabled via the cfg, while RayCasterCamera only uses CameraData without having a camera or other sensor as a member. I’ve kept the current implementation and cleaned up the comments. Please let me know if you have any further concerns or suggestions.
images should be .jpg
all changed to jpg format and documentation is updated accordingly.
```rst
.. code-block:: bash

   conda activate env_isaaclab
   pip install opencv-python==4.11.0 trimesh==4.5.1 imageio==2.37.0
```
do we need to add these to setup.py?
Added opencv-python to setup.py; trimesh is already included, and imageio's dependency is not required in the newest commit.
```text
license agreement from NVIDIA CORPORATION is strictly prohibited.

----------------
Tensorized implementation of RGB rendering of gelsight-style visuo-tactile sensors
```
can you also add any licenses required for gelsight under docs/licenses?
Updated the license header to be consistent with other files.
please also make sure to run `./isaaclab.sh -f` for the linter checks
```python
    return dzdx, dzdy


def generate_normals(height_map):
```
generate_normals not needed and deprecated. only the tensorized version is used.
```python
        self.background_tensor = torch.tensor(self.background, device=self.device)
        print("Gelsight initialization done!")

    def render(self, heightMap):
```
Delete `render` as it is not used. Only the tensorized version (`render_tensorized`) is used.
deleted `render`, `generate_normals`, and `compute_image_gradient`; `padding` as well.
Thank you for making TacSL available on IsaacLab! I have three quick questions:
Thanks!

@ooctipus @Mayankm96 could you please help do a review of this one?

@zxhuang97 Hi, thanks for your questions and for checking out TacSL on IsaacLab.
Hope this helps!
Only this diagram seems to be getting used. Please remove any unused images.
```python
conf_r15 = {
    "data_dir": "gelsight_r15_data",
    "background_path": "bg.jpg",
    "calib_path": "polycalib.npz",
    "real_bg": "real_bg.npy",
    "h": 320,
    "w": 240,
    "numBins": 120,
    "pixmm": 0.0877,
}

conf_gs_mini = {
    "data_dir": "gs_mini_data",
    "background_path": "bg.jpg",
    "calib_path": "polycalib.npz",
    "real_bg": "real_bg.npy",
    "h": 240,
    "w": 320,
    "numBins": 120,
    "pixmm": 0.065,
}
```
Following all other sensors, please make these into configclass objects with the correct doc-strings. We usually don't want to keep sensor-specific settings in the core codebase.
Sensors configuration live here: https://github.com/isaac-sim/IsaacLab/tree/main/source/isaaclab_assets/isaaclab_assets/sensors
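Following this suggestion, the two dictionaries could become configuration objects with documented fields. Below is a minimal sketch using a plain `dataclass` as a stand-in for Isaac Lab's `@configclass` decorator; the class and constant names (`GelSightRenderCfg`, `GELSIGHT_R15_CFG`, `GS_MINI_CFG`) and the renamed fields are hypothetical:

```python
from dataclasses import dataclass  # stand-in for isaaclab.utils.configclass


@dataclass
class GelSightRenderCfg:
    """Calibration settings for a GelSight-style sensor (hypothetical sketch).

    In Isaac Lab this would use @configclass and live under
    isaaclab_assets/sensors alongside the other sensor configurations.
    """

    data_dir: str = "gelsight_r15_data"
    """Directory containing the calibration data files."""

    background_path: str = "bg.jpg"
    """Filename of the simulated background image."""

    calib_path: str = "polycalib.npz"
    """Filename of the polynomial calibration table."""

    real_bg: str = "real_bg.npy"
    """Filename of the real background image array."""

    image_height: int = 320
    """Tactile image height in pixels (replaces the opaque "h")."""

    image_width: int = 240
    """Tactile image width in pixels (replaces the opaque "w")."""

    num_bins: int = 120
    """Number of gradient bins used by the calibration lookup."""

    pixmm: float = 0.0877
    """Pixel-to-millimeter conversion factor."""


# Per-sensor instances, mirroring conf_r15 and conf_gs_mini above.
GELSIGHT_R15_CFG = GelSightRenderCfg()
GS_MINI_CFG = GelSightRenderCfg(
    data_dir="gs_mini_data", image_height=240, image_width=320, pixmm=0.065
)
```

This also addresses the naming concern: each field carries a docstring, so `image_height` and `num_bins` are self-describing where `"h"` and `"numBins"` were not.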
Please choose variable names that are intuitive to read and understand. "h" and "w" are insufficient.
```python
}


def generate_normals_tensor(img_tensor):
```
Are users expected to call these functions directly? If not, I think it is better to move them inside the GelSightRender class and mark them as private with a leading underscore.
Suggest dropping the "tensor" from variable names if they are obvious (i.e. no back and forth is happening between different data-types).
```python
    return grad_mag_tensor, grad_dir_tensor, None


def get_filtering_kernel(kernel_sz=5):
```
Please use proper type-hints and type-annotations as mentioned in coding style-guide.
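As an illustration of the requested style, here is a sketch of what a fully annotated `get_filtering_kernel` could look like. The Gaussian body is an assumption (the function's actual contents are not shown in the diff) and NumPy stands in for torch for brevity; only the type hints, annotated return type, and docstring convention are the point:

```python
import numpy as np


def get_filtering_kernel(kernel_size: int = 5, sigma: float = 1.0) -> np.ndarray:
    """Return a normalized 2D Gaussian smoothing kernel.

    Args:
        kernel_size: Side length of the square kernel, in pixels.
        sigma: Standard deviation of the Gaussian, in pixels.

    Returns:
        A (kernel_size, kernel_size) array that sums to 1.
    """
    # distances of each tap from the kernel center
    coords = np.arange(kernel_size, dtype=np.float64) - (kernel_size - 1) / 2.0
    gauss_1d = np.exp(-(coords**2) / (2.0 * sigma**2))
    # separable kernel: outer product of the 1D profile with itself
    kernel = np.outer(gauss_1d, gauss_1d)
    return kernel / kernel.sum()  # normalize so the kernel sums to 1
```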
```python
elastomer_rigid_body: str = "elastomer"
"""Prim path of the elastomer rigid body for tactile sensing."""

elastomer_tactile_mesh: str = "elastomer/visual"
"""Prim path of the elastomer mesh for tactile point generation."""

elastomer_tip_link_name: str = "elastomer_tip"
"""Prim path of the elastomer tip link."""
```
This is confusing. Why are there three paths that a user must specify? Is it possible to keep this simple with one path where the sensor needs to be made?
```python
tactile_normal_force: torch.Tensor | None = None
"""Normal forces at each tactile point. Shape: (num_instances, num_tactile_points)."""

tactile_shear_force: torch.Tensor | None = None
```
Are these in sensor frame or world frame?
```python
tactile_camera_depth: torch.Tensor | None = None
"""Tactile depth images. Shape: (num_instances, height, width)."""

taxim_tactile: torch.Tensor | None = None
```
Is this a common name used for this variable? Otherwise, would suggest calling it taxim_nominal or something more intuitive.
What is taxim btw?
@greptileai for more comments and feedback.
Greptile Overview
Greptile Summary
This PR integrates TacSL (Tactile Simulation Library) into Isaac Lab, adding visual-based tactile sensor capabilities with dual sensing modalities: camera-based RGB tactile imaging (using GelSight rendering) and SDF-based force field sensing. The implementation introduces a new VisuoTactileSensor class that processes depth images into realistic tactile RGB outputs and computes contact forces at discrete tactile points. The sensor integrates with Isaac Lab's existing sensor architecture, following established patterns for configuration (VisuoTactileSensorCfg), data output (VisuoTactileSensorData), and scene management. A comprehensive demo script demonstrates three scene configurations (base, cube, nut indenters) with visualization utilities for force fields and penetration depth. The feature adds opencv-python as a dependency and includes full documentation and API references.
Critical Issues:
- Syntax error in demo script (lines 341-342 of `tacsl_example.py`): the expression `2 ** 30` is split across lines incorrectly, causing a parsing error that will crash the script
- Redundant assertions in `visuotactile_sensor.py` (lines 270-281): multiple assertions that check a condition and then immediately assert the same condition again, suggesting copy-paste errors
- Unhandled None returns in `gelsight_render.py`: `get_gs_render_data()` can return None, which will cause crashes in `cv2.imread()` when used without validation (lines 172-177)
- State mutation in getter (line 910 of `visuotactile_sensor.py`): `get_timing_summary()` modifies `_timing_call_count`, breaking idempotency and causing incorrect results on subsequent calls
- RST table formatting error in `sensors/__init__.py` (line 36): table separator positioned incorrectly, breaking documentation rendering
- Typo in documentation (line 106 of `visuo_tactile_sensor.rst`): "deom" should be "demo"
- Debug print statement (line 69 of `visuotactile_viz_utils.py`): production code contains a debugging print that should be removed
- Documentation inconsistency in `tacsl_example.py`: usage example references a non-existent `--enable_cameras` flag
Confidence: 2/5 - Multiple critical bugs that will prevent the feature from working correctly, including syntax errors and unhandled exceptions. Requires significant fixes before merge.
16 files reviewed, 8 comments
```python
loc1_x = loc0_x + tactile_shear_force[row, col][0] / shear_force_threshold * resolution
loc1_y = loc0_y + tactile_shear_force[row, col][1] / shear_force_threshold * resolution
```
logic: division by threshold values could cause ZeroDivisionError if either threshold is 0.0
```python
max(0.0, 1.0 - tactile_normal_force[row][col] / normal_force_threshold),
min(1.0, tactile_normal_force[row][col] / normal_force_threshold),
```
logic: division by normal_force_threshold could cause ZeroDivisionError if threshold is 0.0
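One way to address both threshold divisions is to guard the divisor before scaling. The sketch below uses a hypothetical helper (`normalized_force`); the PR code divides inline by `normal_force_threshold` / `shear_force_threshold`:

```python
def normalized_force(force: float, threshold: float, eps: float = 1e-8) -> float:
    """Scale a force by its visualization threshold, guarding against a
    zero (or near-zero) threshold that would raise ZeroDivisionError.

    Clamps the result to [0, 1], matching the min/max pair in the
    quoted snippet above.
    """
    safe_threshold = max(threshold, eps)  # avoid dividing by zero
    return min(1.0, max(0.0, force / safe_threshold))
```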
```python
bg_path = get_gs_render_data(self.conf["data_dir"], self.conf["background_path"])
calib_path = get_gs_render_data(self.conf["data_dir"], self.conf["calib_path"])

self.background = cv2.cvtColor(cv2.imread(bg_path), cv2.COLOR_BGR2RGB)
```
logic: if get_gs_render_data returns None (line 126), cv2.imread(bg_path) will fail with TypeError when bg_path is None
```python
idx_x_tensor = torch.floor(grad_mag_tensor / self.x_binr).long()
idx_y_tensor = torch.floor((grad_dir_tensor + np.pi) / self.y_binr).long()
```
logic: no bounds checking before indexing - if gradients fall outside expected ranges, indices could exceed tensor dimensions causing runtime errors
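A defensive version of the binning step would clamp the indices into the calibration table's valid range. The sketch below uses NumPy so it is self-contained; in the PR's torch code the equivalent guard would be `torch.clamp(idx, 0, num_bins - 1)` after the floor/long conversion. The helper name and the `num_bins` parameter are assumptions mirroring the names in the quoted snippet:

```python
import numpy as np


def safe_bin_indices(
    grad_mag: np.ndarray,
    grad_dir: np.ndarray,
    x_binr: float,
    y_binr: float,
    num_bins: int,
) -> tuple[np.ndarray, np.ndarray]:
    """Compute calibration-table bin indices with explicit bounds clamping."""
    idx_x = np.floor(grad_mag / x_binr).astype(np.int64)
    idx_y = np.floor((grad_dir + np.pi) / y_binr).astype(np.int64)
    # clamp so out-of-range gradients cannot index past the table
    idx_x = np.clip(idx_x, 0, num_bins - 1)
    idx_y = np.clip(idx_y, 0, num_bins - 1)
    return idx_x, idx_y
```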
```python
if not self._camera_sensor.is_initialized:
    assert self._camera_sensor.is_initialized, "Camera sensor is not initialized"
```
logic: redundant check - already checked on line 270, will always fail
Yes. Also please raise an error of a specific type (RuntimeError, for example). AssertionError is too generic and should be avoided.
You're right on both counts. The check on line 271 is redundant since we already checked on line 270, and we should use a more specific exception type.
```diff
 if not self._camera_sensor.is_initialized:
-    assert self._camera_sensor.is_initialized, "Camera sensor is not initialized"
+    raise RuntimeError("Camera sensor is not initialized")
```
```python
# get the initial render
initial_render = self._camera_sensor.data.output
if initial_render is None:
    assert False, "Initial render is None"
```
logic: redundant check - initial_render is already asserted non-None on line 277-278
```python
    assert False, "Initial render is None"

if self._nominal_tactile is not None:
    assert False, "Nominal tactile is not None"
```
logic: redundant check - self._nominal_tactile is initialized to None and only set in this method, so this will always fail after first call
```python
}

# skip the first two calls
self._timing_call_count -= 2
```
logic: modifying _timing_call_count changes state during read-only getter, breaks subsequent calls
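An idempotent variant would apply the warmup correction to a local copy rather than mutating the counter. Minimal sketch with hypothetical names (the real method's fields are not fully shown in the diff):

```python
class TimingMixin:
    """Sketch of an idempotent get_timing_summary.

    The warmup correction is computed on a local value instead of
    mutating self._timing_call_count, so repeated calls return the
    same result.
    """

    WARMUP_CALLS = 2  # first calls include warmup/allocation overhead

    def __init__(self) -> None:
        self._timing_total = 0.0
        self._timing_call_count = 0

    def record(self, elapsed: float) -> None:
        self._timing_total += elapsed
        self._timing_call_count += 1

    def get_timing_summary(self) -> dict:
        # read-only: derive the corrected count locally
        effective_calls = max(self._timing_call_count - self.WARMUP_CALLS, 1)
        return {
            "total_time": self._timing_total,
            "num_calls": self._timing_call_count,
            "mean_time": self._timing_total / effective_calls,
        }
```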
Greptile Overview
Greptile Summary
This review covers only the changes made since the last review, not the entire PR. Since the previous review, two critical issues remain unresolved in the core implementation files:
- Critical syntax error in demo script (lines 341-342 of `tacsl_example.py`): invalid Python syntax with the `** 30` token will prevent the script from running at all.
- Unhandled None return value (lines 467-479 of `visuotactile_sensor.py`): when `locations` is None after a ray intersection failure, the code attempts to use it on lines 471 and 479, causing guaranteed TypeErrors.
These issues were not present or flagged in previous reviews but block the PR from functioning correctly. The first is a parsing error that will crash immediately, while the second creates a runtime failure path when the ray-casting mesh intersection returns only 2 values instead of 3.
Important Files Changed
| Filename | Score | Overview |
|---|---|---|
| `scripts/demos/sensors/tacsl/tacsl_example.py` | 1/5 | Contains critical syntax error on lines 341-342 that prevents script execution |
| `source/isaaclab/isaaclab/sensors/tacsl_sensor/visuotactile_sensor.py` | 2/5 | Unhandled None case for locations variable (lines 467-479) will cause TypeErrors |
| `source/isaaclab/isaaclab/sensors/tacsl_sensor/gelsight_render.py` | 3/5 | Multiple error handling issues including bare Exception catch and unvalidated None returns |
| `source/isaaclab/isaaclab/sensors/tacsl_sensor/visuotactile_viz_utils.py` | 3/5 | Potential division-by-zero issues when threshold parameters are 0.0 |
| `source/isaaclab/isaaclab/sensors/tacsl_sensor/visuotactile_sensor_cfg.py` | 4/5 | Missing validation for sensor_type values and ambiguous camera_cfg None behavior |
| `docs/source/overview/core-concepts/sensors/visuo_tactile_sensor.rst` | 4/5 | Minor typo ('deom' instead of 'demo' on line 106) in otherwise comprehensive documentation |
| `source/isaaclab/isaaclab/sensors/tacsl_sensor/__init__.py` | 4/5 | Standard package initialization following IsaacLab patterns |
| `source/isaaclab/isaaclab/scene/interactive_scene.py` | 5/5 | Proper integration of VisuoTactileSensorCfg with environment namespace formatting |
| `CONTRIBUTORS.md` | 5/5 | Simple contributor name addition following proper alphabetical ordering |
| `docs/source/overview/core-concepts/sensors/index.rst` | 5/5 | Added toctree entry for new visuo_tactile_sensor documentation |
| `.gitignore` | 5/5 | Excludes tactile_record directories from version control |
| `source/isaaclab/isaaclab/sensors/__init__.py` | 5/5 | Added documentation and import for new tacsl_sensor module |
| `source/isaaclab/setup.py` | 5/5 | Added opencv-python dependency for tactile sensor image processing |
| `docs/source/api/lab/isaaclab.sensors.rst` | 5/5 | Added API documentation entries for new VisuoTactileSensor classes |
| `source/isaaclab/isaaclab/sensors/tacsl_sensor/visuotactile_sensor_data.py` | 5/5 | Clean data container class following established sensor data patterns |
| `docs/conf.py` | 5/5 | Added cv2 and imageio to autodoc_mock_imports for documentation builds |
Confidence score: 1/5
- This PR will cause immediate failures and cannot be merged in its current state due to syntax and runtime errors.
- Score reduced to 1/5 because: (1) the syntax error in the demo script (lines 341-342) causes a parsing failure preventing execution, (2) the unhandled None case in visuotactile_sensor.py lines 467-479 will cause TypeErrors during ray-casting operations, (3) multiple error handling gaps in gelsight_render.py with unvalidated None returns, (4) potential division-by-zero issues in visualization utilities.
- Critical attention required on `scripts/demos/sensors/tacsl/tacsl_example.py` (fix syntax error) and `source/isaaclab/isaaclab/sensors/tacsl_sensor/visuotactile_sensor.py` (add None validation for the locations variable before usage).
16 files reviewed, 9 comments
```python
# Juana: this should be manually called before running any simulation ?
scene["tactile_sensor"].get_initial_render()
```
style: remove leftover TODO comment before merging
```python
self._camera_sensor._initialize_impl()
self._camera_sensor._is_initialized = True
```
style: directly setting the protected `_is_initialized` flag bypasses the normal initialization flow; consider using a public API or inheriting properly
```python
# Extract results based on what was returned
if len(result) == 3:
    index_tri, index_ray, locations = result
else:
    index_tri, index_ray = result
    locations = None
```
logic: if the ray intersection returns 2 values instead of 3, `locations` is set to None but used on lines 471 and 479, causing an AttributeError
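One way to make this failure explicit is to validate `locations` at the unpacking site, so the error surfaces where the cause is obvious rather than in later math. Hypothetical helper (the PR unpacks inline):

```python
def unpack_intersection(result: tuple) -> tuple:
    """Unpack trimesh intersects_id-style results, failing fast when
    intersection locations are missing instead of deferring a None."""
    if len(result) == 3:
        index_tri, index_ray, locations = result
    else:
        index_tri, index_ray = result
        locations = None
    if locations is None:
        # raise here rather than letting a None flow into downstream code
        raise RuntimeError(
            "Ray casting returned no intersection locations; "
            "was the query run with return_locations=True?"
        )
    return index_tri, index_ray, locations
```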
In the USD for the R15 sensor, the camera distance differs from the original TacSL repo, where the gelsight_r15.yaml file specifies a camera distance of 0.0203, which is close but not identical. What motivates this adjustment, and why isn't the exact YAML value used in the USD? Thanks! @JuanaDd
Greptile Overview
Greptile Summary
This PR integrates TacSL (Tactile Sensor Learning) into Isaac Lab, adding visual-based tactile sensing capabilities with both camera-based RGB imaging and force field computation.
Major Changes:
- Adds `VisuoTactileSensor` class implementing dual-modality tactile sensing (camera + force field)
- Implements GelSight rendering using the example-based Taxim approach for realistic tactile RGB images
- Uses SDF-based penalty methods to compute normal and shear forces at tactile grid points
- Integrates with Isaac Lab's sensor framework, including special handling in `InteractiveScene`
- Provides comprehensive documentation and a demo script showing nut/cube indenter interactions
- Adds trimesh as an optional dependency for tactile point generation via ray casting
Architecture:
The sensor generates tactile points on the elastomer surface using trimesh ray casting, then transforms these points to world coordinates each step. For force field computation, it queries the indenter's SDF to detect penetration depth, computes normal forces based on penetration, and applies friction model for shear forces. Camera-based sensing captures depth differences from nominal state and renders them as tactile RGB using calibrated GelSight parameters.
Issues Found:
- Critical syntax error in demo script (line 341-342) prevents execution
- Multiple logic errors including redundant checks, state mutation in getters, and potential null pointer dereferences
- Division by zero vulnerabilities in rendering and visualization code
- Overly broad exception handling masks specific errors
- Debug artifacts (print statements, TODO comments) left in production code
Confidence Score: 2/5
- This PR contains critical bugs that will prevent execution and cause runtime errors
- Score of 2 reflects the presence of a critical syntax error in the demo script that prevents execution, multiple logic errors including state mutation in getters and redundant checks, and several division-by-zero vulnerabilities. While the overall architecture is sound and the integration is well-documented, these bugs must be fixed before merging to avoid runtime failures
- Critical: `scripts/demos/sensors/tacsl/tacsl_example.py` (syntax error), `source/isaaclab/isaaclab/sensors/tacsl_sensor/visuotactile_sensor.py` (logic errors), `source/isaaclab/isaaclab/sensors/tacsl_sensor/gelsight_render.py` (error handling and null safety)
Important Files Changed
File Analysis
| Filename | Score | Overview |
|---|---|---|
| source/isaaclab/isaaclab/sensors/tacsl_sensor/visuotactile_sensor.py | 2/5 | Core sensor implementation with multiple critical logic issues including redundant checks, state mutation in getters, and potential null pointer errors |
| source/isaaclab/isaaclab/sensors/tacsl_sensor/gelsight_render.py | 2/5 | GelSight rendering implementation with potential ZeroDivisionError, unhandled None values, and overly broad exception handling |
| source/isaaclab/isaaclab/sensors/tacsl_sensor/visuotactile_viz_utils.py | 3/5 | Visualization utilities with potential division by zero issues and debug print statement left in code |
| scripts/demos/sensors/tacsl/tacsl_example.py | 1/5 | Demo script with critical syntax error in PhysX configuration and TODO comment left in code |
| docs/source/overview/core-concepts/sensors/visuo_tactile_sensor.rst | 4/5 | Comprehensive documentation with one typo ('deom' instead of 'demo') - otherwise well-written |
Sequence Diagram
```mermaid
sequenceDiagram
    participant User
    participant Scene as InteractiveScene
    participant Sensor as VisuoTactileSensor
    participant Camera as TiledCamera
    participant Render as gelsightRender
    participant SDF as SDFView
    participant Data as VisuoTactileSensorData
    User->>Scene: Initialize scene with VisuoTactileSensorCfg
    Scene->>Sensor: __init__(cfg)
    Sensor->>Data: Create VisuoTactileSensorData()
    Sensor->>Sensor: _initialize_impl()
    alt enable_camera_tactile
        Sensor->>Camera: Create TiledCamera(camera_cfg)
        Sensor->>Camera: _initialize_impl()
        Sensor->>Render: Create gelsightRender(sensor_type)
        Render->>Render: Load calibration data & background
    end
    alt enable_force_field
        Sensor->>Sensor: _find_body_handles()
        Sensor->>Sensor: _generate_tactile_points()
        Note over Sensor: Use trimesh ray casting to<br/>generate tactile grid points
        Sensor->>SDF: Create SdfShapePrim(indenter_sdf_mesh)
        SDF->>SDF: Initialize with physics_sim_view
        Sensor->>Data: Initialize force field buffers
    end
    User->>Sensor: setup_compliant_materials()
    Sensor->>Sensor: Apply compliant contact materials
    User->>Sensor: get_initial_render()
    Sensor->>Camera: update(dt=0.0)
    Camera-->>Sensor: initial depth data
    Sensor->>Sensor: Store as _nominal_tactile
    loop Simulation Step
        User->>Scene: update(sim_dt)
        Scene->>Sensor: _update_buffers_impl(env_ids)
        alt Camera Tactile Enabled
            Sensor->>Camera: update(sim_physics_dt)
            Camera-->>Sensor: distance_to_image_plane
            Sensor->>Sensor: Compute depth difference
            Sensor->>Render: render_tensorized(diff)
            Render-->>Sensor: Tactile RGB image
            Sensor->>Data: Store taxim_tactile
        end
        alt Force Field Enabled
            Sensor->>Sensor: Get elastomer transforms
            Sensor->>Sensor: _transform_tactile_points_to_world()
            Sensor->>Data: Update tactile_points_pos_w
            Sensor->>Sensor: Transform points to indenter local frame
            Sensor->>SDF: get_sdf_and_gradients(points)
            SDF-->>Sensor: SDF values & gradients
            Sensor->>Sensor: _compute_tactile_forces_from_sdf()
            Note over Sensor: Compute normal forces (F_n = k_n * depth)<br/>Compute shear forces with friction
            Sensor->>Data: Update tactile_normal_force, tactile_shear_force
        end
    end
    User->>Sensor: data property
    Sensor-->>User: VisuoTactileSensorData with all modalities
```
16 files reviewed, 3 comments
```rst
.. note::
    The demo examples are based on the Gelsight R1.5, which is a prototype sensor that is now discontinued. The same procedure can be adapted for other visuotactile sensors.

.. figure:: ../../../_static/overview/sensors/tacsl_demo.jpg
```
actually all added images are used in the doc. tacsl_demo.jpg is used here, and the other two jpg files are used at lines 152 and 157.
Greptile Overview
Greptile Summary
This PR integrates TacSL (Tactile Sensor Learning) with Isaac Lab, adding visual-based and force field tactile sensing capabilities for robotic manipulation tasks.
Key Changes
- New tactile sensor module (`source/isaaclab/isaaclab/sensors/tacsl_sensor/`) implementing dual-mode sensing:
  - Camera-based tactile imaging using the GelSight renderer (Taxim approach) to convert depth maps into realistic tactile RGB images
  - Force field computation using SDF queries with penalty-based spring-damper models for normal and shear forces
- Demo and examples (`scripts/demos/sensors/tacsl/tacsl_example.py`) demonstrating sensor usage with different indenter types (cube, nut)
- USD spawner extension (`source/isaaclab/isaaclab/sim/spawners/from_files/`) adding the ability to apply physics materials to specific prims during spawning
- Comprehensive documentation with usage examples and visual diagrams
Issues Found
The implementation has several critical issues that need resolution:
- Syntax error in demo script (line 360): operator precedence causes `gpu_collision_stack_size=2 ** 30` to evaluate incorrectly
- Redundant initialization checks in `visuotactile_sensor.py` (lines 270, 280) that will always fail after first execution
- State mutation in getter (`get_timing_summary`) modifies an internal counter, breaking subsequent timing calculations
- Division by zero risks in visualization functions when thresholds are 0.0
- Potential AttributeError if ray intersection returns an unexpected tuple length (line 463)
- Documentation typo: 'deom' should be 'demo'
- Leftover TODO comment in production code (line 392)
Most of these issues were already identified in previous review comments and need to be addressed before merging.
Confidence Score: 2/5
- This PR contains critical syntax errors and logic bugs that will cause runtime failures
- Score reflects multiple critical issues: syntax error in demo preventing execution, redundant checks causing logic failures, and state mutation bugs. While the feature implementation is comprehensive with good documentation, the presence of runtime-blocking bugs and previous unresolved review comments necessitates fixes before merge
- Most attention needed: `visuotactile_sensor.py` (redundant checks, state mutation), `tacsl_example.py` (syntax error), `gelsight_utils.py` (division by zero risks)
Important Files Changed
File Analysis
| Filename | Score | Overview |
|---|---|---|
| source/isaaclab/isaaclab/sensors/tacsl_sensor/visuotactile_sensor.py | 2/5 | Core tactile sensor implementation with several critical issues: redundant initialization checks (line 270, 280), state mutation in getter (line 463), and potential AttributeError when ray intersection returns unexpected format |
| source/isaaclab/isaaclab/sensors/tacsl_sensor/gelsight_utils.py | 2/5 | GelSight rendering utilities with division by zero risks in visualization functions (lines 81-82, 85-86) when thresholds are 0.0, and proper error handling added for file retrieval |
| scripts/demos/sensors/tacsl/tacsl_example.py | 3/5 | Demo script with syntax error on line 360 (exponentiation precedence issue) and leftover TODO comment on line 392, otherwise functional demonstration of tactile sensor usage |
| source/isaaclab/isaaclab/sim/spawners/from_files/from_files.py | 4/5 | Added new function to apply physics materials to specific prims during USD spawning, properly handles optional parameters and provides appropriate logging |
| docs/source/overview/core-concepts/sensors/visuo_tactile_sensor.rst | 4/5 | Documentation file with comprehensive usage examples and configuration details, contains typo 'deom' instead of 'demo' that needs correction |
Sequence Diagram
sequenceDiagram
participant User
participant Scene as InteractiveScene
participant Sensor as VisuoTactileSensor
participant Camera as TiledCamera
participant Render as GelsightRender
participant SDF as SdfShapePrim
participant Physics as PhysicsSimView
User->>Scene: Initialize scene with tactile sensor
Scene->>Sensor: __init__(cfg)
Sensor->>Sensor: _initialize_impl()
alt Camera Tactile Enabled
Sensor->>Camera: Create TiledCamera
Sensor->>Render: Create GelsightRender
Render->>Render: Load calibration data & background
end
alt Force Field Enabled
Sensor->>Physics: Get body views (elastomer, indenter)
Sensor->>Sensor: Generate tactile points on surface
Sensor->>SDF: Initialize SdfShapePrim for indenter
end
User->>Sensor: get_initial_render()
Sensor->>Camera: update(dt=0.0)
Camera-->>Sensor: Initial depth image
Sensor->>Sensor: Store nominal tactile baseline
loop Simulation Step
User->>Scene: update(sim_dt)
Scene->>Sensor: _update_buffers_impl(env_ids)
alt Camera Tactile Enabled
Sensor->>Camera: update(sim_dt)
Camera-->>Sensor: Current depth image
Sensor->>Sensor: Compute depth difference from nominal
Sensor->>Render: render(depth_diff)
Render-->>Sensor: Tactile RGB image
end
alt Force Field Enabled
Sensor->>Physics: Get elastomer pose & velocity
Sensor->>Sensor: Transform tactile points to world frame
Sensor->>Physics: Get indenter pose & velocity
Sensor->>Sensor: Transform points to indenter local frame
Sensor->>SDF: Query SDF values and gradients
SDF-->>Sensor: Penetration depths & normals
Sensor->>Sensor: Compute penalty-based forces (F_n = k_n * depth)
Sensor->>Sensor: Compute friction forces (F_t = min(k_t * |v_t|, mu * F_n))
end
Sensor-->>User: Return tactile data (RGB, forces, depths)
end
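The "compute depth difference from nominal" step in the loop above can be sketched as follows. This is a minimal illustration; the function name, the max_indent cap, and the normalization are assumptions rather than the sensor's actual parameters:

```python
import numpy as np

def depth_to_indentation(nominal_depth, current_depth, max_indent=0.002):
    """Sketch: per-pixel indentation map from a depth change (meters).

    Positive values mean the elastomer surface moved toward the camera,
    i.e. something pressed into the gel; max_indent is an assumed cap.
    """
    indentation = nominal_depth - current_depth
    indentation = np.clip(indentation, 0.0, max_indent)
    return indentation / max_indent  # normalized to [0, 1]
```

The normalized map would then be fed to the GelSight renderer to produce the tactile RGB image.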
16 files reviewed, no comments
Greptile Overview
Greptile Summary
This PR adds TacSL (Tactile Sensor Learning) integration to Isaac Lab, providing visual-based tactile sensing capabilities for robotic manipulation tasks. The implementation includes both camera-based tactile imaging (using GelSight-style rendering) and force field computation (using SDF-based penalty methods).
Key additions:
- New VisuoTactileSensor class with dual sensing modalities (camera RGB + force field)
- GelSight rendering utilities for realistic tactile image synthesis from depth maps
- SDF-based force computation with penalty spring-damper models
- Comprehensive demo script and documentation
- Physics material spawner enhancements for compliant contact configuration
Architecture:
The sensor operates in two modes that can be enabled independently:
- Camera tactile: Captures depth from elastomer tip camera, computes difference from nominal state, renders through GelSight model
- Force field: Generates tactile points on elastomer surface, queries SDF for penetration, computes normal/shear forces based on physics parameters
Issues found:
Several logic errors were identified in previous comments that need addressing before merge, particularly around error handling, redundant checks, and potential runtime failures in edge cases. The most critical issues involve division by zero risks and improper None handling that could cause TypeErrors or AttributeErrors during execution.
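The division-by-zero risk described here typically appears when a visualization scale or threshold is user-configurable and may legitimately be 0.0. A hedged sketch of the guard (hypothetical function, not the actual gelsight_utils code):

```python
import numpy as np

def normalize_for_viz(values, threshold):
    """Sketch: scale values into [0, 1] for visualization while guarding
    against a zero threshold (the division-by-zero risk noted above)."""
    scale = max(float(threshold), 1e-8)  # never divide by exactly zero
    return np.clip(np.asarray(values, dtype=float) / scale, 0.0, 1.0)
```

With threshold=0.0 the output saturates instead of producing inf/NaN pixels.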
Confidence Score: 3/5
- Contains critical logic errors that could cause runtime failures in production environments
- While the feature implementation is comprehensive and well-documented, multiple logic errors exist in the core sensor implementation that could lead to runtime crashes. These include division by zero vulnerabilities, improper None handling, redundant checks that will always fail, and state mutation in getter methods. The issues are fixable but need to be addressed before merging.
- Primary attention needed on source/isaaclab/isaaclab/sensors/tacsl_sensor/visuotactile_sensor.py and source/isaaclab/isaaclab/sensors/tacsl_sensor/gelsight_utils.py for logic error fixes
Important Files Changed
File Analysis
| Filename | Score | Overview |
|---|---|---|
| source/isaaclab/isaaclab/sensors/tacsl_sensor/visuotactile_sensor.py | 3/5 | Core sensor implementation with multiple logic issues in error handling and state management that could cause runtime failures |
| scripts/demos/sensors/tacsl/tacsl_example.py | 4/5 | Demo script with minor syntax error in operator precedence (line 360) and leftover TODO comment |
| source/isaaclab/isaaclab/sensors/tacsl_sensor/visuotactile_sensor_cfg.py | 5/5 | Configuration file is well-structured with clear documentation and appropriate default values |
| source/isaaclab/isaaclab/sensors/tacsl_sensor/gelsight_utils.py | 4/5 | Utility functions for GelSight rendering with potential division by zero issues in visualization functions |
Sequence Diagram
sequenceDiagram
participant User
participant InteractiveScene
participant VisuoTactileSensor
participant TiledCamera
participant GelsightRender
participant PhysicsSimView
participant SdfShapePrim
User->>InteractiveScene: Initialize scene with config
InteractiveScene->>VisuoTactileSensor: Create sensor from config
alt Camera Tactile Enabled
VisuoTactileSensor->>TiledCamera: Initialize camera sensor
VisuoTactileSensor->>GelsightRender: Initialize GelSight renderer
VisuoTactileSensor->>TiledCamera: get_initial_render()
TiledCamera-->>VisuoTactileSensor: Return nominal depth
VisuoTactileSensor->>VisuoTactileSensor: Store nominal_tactile baseline
end
alt Force Field Enabled
VisuoTactileSensor->>PhysicsSimView: Create body views for elastomer/indenter
VisuoTactileSensor->>VisuoTactileSensor: Generate tactile points on surface
VisuoTactileSensor->>SdfShapePrim: Initialize SDF view for indenter
VisuoTactileSensor->>VisuoTactileSensor: Allocate force field buffers
end
User->>InteractiveScene: Start simulation loop
loop Every Update Step
InteractiveScene->>VisuoTactileSensor: update(dt)
alt Camera Tactile Enabled
VisuoTactileSensor->>TiledCamera: Update camera sensor
TiledCamera-->>VisuoTactileSensor: Return current depth
VisuoTactileSensor->>VisuoTactileSensor: Compute depth difference from nominal
VisuoTactileSensor->>GelsightRender: render(depth_diff)
GelsightRender-->>VisuoTactileSensor: Return tactile RGB image
end
alt Force Field Enabled
VisuoTactileSensor->>PhysicsSimView: Get elastomer pose/velocity
VisuoTactileSensor->>VisuoTactileSensor: Transform tactile points to world coords
VisuoTactileSensor->>PhysicsSimView: Get indenter pose/velocity
VisuoTactileSensor->>VisuoTactileSensor: Transform points to indenter local frame
VisuoTactileSensor->>SdfShapePrim: Query SDF values and gradients
SdfShapePrim-->>VisuoTactileSensor: Return penetration depth and normals
VisuoTactileSensor->>VisuoTactileSensor: Compute normal forces (F_n = k_n * depth)
VisuoTactileSensor->>VisuoTactileSensor: Compute shear forces (F_t = min(k_t * |v_t|, mu * F_n))
end
VisuoTactileSensor-->>User: Sensor data available via .data property
end
20 files reviewed, no comments
Greptile Overview
Greptile Summary
Adds three new files implementing TacSL visual-based tactile sensor integration with Isaac Lab:
- gelsight_utils.py provides GelSight rendering utilities using the Taxim approach for tactile image synthesis
- visuotactile_sensor_cfg.py defines configuration classes for sensor parameters and calibration
- gelsight.py provides predefined configurations for GelSight R1.5 and Mini sensors
The implementation includes proper error handling, validation of critical parameters (mm_per_pixel), and graceful fallback when sensor data files are unavailable. Code follows project conventions with comprehensive documentation.
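The mm_per_pixel validation mentioned here can be illustrated with a small sketch; the class below is hypothetical and only mirrors the validation pattern, not the real GelSightRenderCfg:

```python
class GelSightRenderCfgSketch:
    """Hypothetical config sketch showing validation of mm_per_pixel
    at construction time, so later derived quantities are always safe."""

    def __init__(self, mm_per_pixel: float):
        if mm_per_pixel <= 0.0:
            raise ValueError(f"mm_per_pixel must be positive, got {mm_per_pixel}")
        self.mm_per_pixel = mm_per_pixel
        # Safe after validation: no division-by-zero possible here.
        self.pixels_per_mm = 1.0 / mm_per_pixel
```

Failing fast in the constructor turns a latent division-by-zero into an immediate, descriptive configuration error.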
Confidence Score: 4/5
- Safe to merge with minor style improvement recommended
- The three new files are well-structured with proper error handling and validation. Critical parameters like mm_per_pixel are validated to prevent division by zero. File retrieval includes fallback mechanisms. The only minor issue is a TODO comment that should be cleaned up.
- source/isaaclab/isaaclab/sensors/tacsl_sensor/gelsight_utils.py contains a TODO comment on line 224
Important Files Changed
File Analysis
| Filename | Score | Overview |
|---|---|---|
| source/isaaclab/isaaclab/sensors/tacsl_sensor/gelsight_utils.py | 4/5 | Adds GelSight rendering utilities for tactile image synthesis with proper error handling and validation |
| source/isaaclab/isaaclab/sensors/tacsl_sensor/visuotactile_sensor_cfg.py | 5/5 | Defines configuration classes for visuo-tactile sensors with comprehensive documentation |
| source/isaaclab_assets/isaaclab_assets/sensors/gelsight.py | 5/5 | Provides predefined configurations for GelSight R1.5 and Mini sensors |
Sequence Diagram
sequenceDiagram
participant User
participant VisuoTactileSensor
participant GelsightRender
participant Utils as gelsight_utils
participant Config as GelSightRenderCfg
User->>Config: Create sensor config (GELSIGHT_R15_CFG)
User->>VisuoTactileSensor: Initialize with config
VisuoTactileSensor->>GelsightRender: __init__(cfg, device)
GelsightRender->>Config: Validate mm_per_pixel
GelsightRender->>Utils: get_gelsight_render_data(bg_path)
Utils-->>GelsightRender: Return cached/downloaded file path
GelsightRender->>Utils: get_gelsight_render_data(calib_path)
Utils-->>GelsightRender: Return cached/downloaded file path
GelsightRender->>GelsightRender: Load background & calibration data
GelsightRender->>GelsightRender: Initialize tensors on device
User->>VisuoTactileSensor: Get tactile data
VisuoTactileSensor->>GelsightRender: render(heightMap)
GelsightRender->>GelsightRender: _gaussian_filtering(height_map)
GelsightRender->>GelsightRender: _generate_normals(height_map)
GelsightRender->>GelsightRender: Compute gradient indices
GelsightRender->>GelsightRender: Lookup calibration parameters
GelsightRender->>GelsightRender: Generate RGB tactile image
GelsightRender-->>VisuoTactileSensor: Return tactile image
VisuoTactileSensor-->>User: Return sensor data
3 files reviewed, 1 comment
    params_g = self.calib_data_grad_g[idx_x, idx_y, :]
    params_b = self.calib_data_grad_b[idx_x, idx_y, :]

    sim_img_rgb = torch.zeros((*idx_x.shape, 3), device=self.device)  # TODO: move to init
style: remove TODO comment or create issue to track
Suggested change:
- sim_img_rgb = torch.zeros((*idx_x.shape, 3), device=self.device)  # TODO: move to init
+ sim_img_rgb = torch.zeros((*idx_x.shape, 3), device=self.device)
Greptile Overview
Greptile Summary
This PR adds visual-based tactile sensor integration with TacSL to Isaac Lab, enabling realistic tactile feedback simulation using GelSight-style sensors. The implementation supports both camera-based tactile imaging and force field computation through SDF-based collision detection.
Key Changes
- Core sensor implementation (visuotactile_sensor.py): Provides dual-mode tactile sensing with camera-based RGB imaging and penalty-based force field computation using SDF queries
- GelSight rendering (gelsight_utils.py): Implements Taxim-based example rendering to convert depth maps into realistic tactile RGB images with proper calibration data handling
- Configuration system (visuotactile_sensor_cfg.py): Well-structured configuration with clear documentation for sensor parameters, elastomer properties, and physics parameters
- Demo script (tacsl_example.py): Comprehensive example showing sensor setup with multiple indenter types (nut, cube) and configurable tactile modalities
- Documentation (visuo_tactile_sensor.rst): Detailed documentation with configuration examples, usage instructions, and integration guidelines
Implementation Highlights
- Efficient tensor operations with pre-allocated buffers to minimize memory overhead
- GPU-accelerated SDF queries for force field computation
- Proper error handling for file retrieval and initialization failures
- Comprehensive timing statistics for performance monitoring
- Clean separation between camera-based and force-based sensing modalities
The implementation follows Isaac Lab conventions and provides a solid foundation for tactile sensor research and manipulation tasks.
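As a rough illustration of the first stage of GelSight-style rendering referenced throughout this review, the sketch below derives per-pixel surface normals from a height map; the real renderer additionally looks up per-channel calibration polynomials to shade R, G, and B. The function name and the mm_per_pixel default are assumptions:

```python
import numpy as np

def height_to_normals(height_map, mm_per_pixel=0.05):
    """Sketch: per-pixel surface normals from a height map, the step that
    precedes calibration-based shading in GelSight-style rendering."""
    # Gradients along rows (y) and columns (x), with physical pixel spacing
    gy, gx = np.gradient(height_map, mm_per_pixel)
    normals = np.stack([-gx, -gy, np.ones_like(height_map)], axis=-1)
    normals /= np.linalg.norm(normals, axis=-1, keepdims=True)
    return normals
```

A flat height map yields normals pointing straight at the camera; an indentation tilts the normals, which the calibration lookup then converts into the characteristic colored shading.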
Confidence Score: 4/5
- This PR is safe to merge with good implementation quality and comprehensive documentation
- The code is well-structured with proper error handling, comprehensive documentation, and follows best practices. Previous critical issues (syntax errors, logic bugs) mentioned in earlier comments have been addressed. The implementation shows careful attention to performance optimization and provides clear APIs for integration.
- No files require special attention - all implementations are solid with proper documentation and error handling
Important Files Changed
File Analysis
| Filename | Score | Overview |
|---|---|---|
| docs/source/overview/core-concepts/sensors/visuo_tactile_sensor.rst | 5/5 | Documentation file introducing TacSL tactile sensor with comprehensive usage examples and configuration details - no issues found |
| scripts/demos/sensors/tacsl/tacsl_example.py | 5/5 | Demo script showing tactile sensor integration with proper simulation setup - previous syntax error has been corrected |
| source/isaaclab/isaaclab/sensors/tacsl_sensor/gelsight_utils.py | 5/5 | GelSight rendering utilities with proper error handling and tensor operations - implementation is sound |
| source/isaaclab/isaaclab/sensors/tacsl_sensor/visuotactile_sensor.py | 4/5 | Core tactile sensor implementation with camera and force field sensing - well-structured with comprehensive docstrings |
| source/isaaclab/isaaclab/sensors/tacsl_sensor/visuotactile_sensor_cfg.py | 5/5 | Configuration dataclass for tactile sensor with detailed documentation - well-designed configuration structure |
Sequence Diagram
sequenceDiagram
participant User
participant Scene
participant VisuoTactileSensor
participant TiledCamera
participant GelsightRender
participant SDFView
participant PhysicsSimView
User->>Scene: Initialize InteractiveScene(scene_cfg)
Scene->>VisuoTactileSensor: __init__(cfg)
VisuoTactileSensor->>VisuoTactileSensor: _initialize_impl()
alt enable_camera_tactile
VisuoTactileSensor->>GelsightRender: Initialize with render_cfg
GelsightRender->>GelsightRender: Load calibration data & background
VisuoTactileSensor->>TiledCamera: Initialize TiledCamera(camera_cfg)
TiledCamera-->>VisuoTactileSensor: Camera ready
end
alt enable_force_field
VisuoTactileSensor->>VisuoTactileSensor: _generate_tactile_points()
VisuoTactileSensor->>VisuoTactileSensor: Generate grid on elastomer mesh
VisuoTactileSensor->>PhysicsSimView: create_rigid_body_view(elastomer)
VisuoTactileSensor->>PhysicsSimView: create_rigid_body_view(indenter)
VisuoTactileSensor->>SDFView: Initialize SdfShapePrim for indenter
SDFView-->>VisuoTactileSensor: SDF ready for collision queries
end
User->>VisuoTactileSensor: get_initial_render()
VisuoTactileSensor->>TiledCamera: update(dt=0.0)
TiledCamera-->>VisuoTactileSensor: Initial depth image
VisuoTactileSensor->>VisuoTactileSensor: Store nominal_tactile baseline
loop Simulation Step
User->>Scene: update(sim_dt)
Scene->>VisuoTactileSensor: _update_buffers_impl(env_ids)
alt enable_camera_tactile
VisuoTactileSensor->>TiledCamera: update(sim_dt)
TiledCamera-->>VisuoTactileSensor: Current depth image
VisuoTactileSensor->>VisuoTactileSensor: Compute diff from nominal
VisuoTactileSensor->>GelsightRender: render(depth_diff)
GelsightRender-->>VisuoTactileSensor: Tactile RGB image
end
alt enable_force_field
VisuoTactileSensor->>PhysicsSimView: Get elastomer pose & velocity
VisuoTactileSensor->>VisuoTactileSensor: Transform tactile points to world
VisuoTactileSensor->>PhysicsSimView: Get indenter pose & velocity
VisuoTactileSensor->>VisuoTactileSensor: Transform points to indenter local
VisuoTactileSensor->>SDFView: get_sdf_and_gradients(points)
SDFView-->>VisuoTactileSensor: SDF values & gradients
VisuoTactileSensor->>VisuoTactileSensor: Compute normal forces (F_n = k_n * depth)
VisuoTactileSensor->>VisuoTactileSensor: Compute shear forces (F_t = min(k_t*|v_t|, mu*F_n))
end
VisuoTactileSensor-->>Scene: Updated tactile data
end
User->>VisuoTactileSensor: data (property access)
VisuoTactileSensor-->>User: VisuoTactileSensorData with RGB & force fields
5 files reviewed, no comments
Description
This PR integrates TacSL with Isaac Lab and demonstrates how to properly configure and use tactile sensors to obtain realistic sensor outputs, including tactile RGB images, force fields, and other relevant tactile measurements.
Fixes # (issue)
Type of change
Screenshots
Screenshots of the added documentation and simulation outputs:



Checklist
- I have run the pre-commit checks with ./isaaclab.sh --format
- I have updated the corresponding version in the extension's config/extension.toml file
- I have added my name to CONTRIBUTORS.md or my name already exists there