Generate batch motion trajectories for a UR5e robot arm with collision avoidance using PyRoKi.

Parameters

depth_image
str | PIL.Image | np.ndarray
required
Depth image of the scene.
camera_intrinsics
str | np.ndarray
required
3x3 camera intrinsics matrix.
T_cam_world
str | np.ndarray
required
4x4 camera-to-world transformation matrix.
robot_joint_positions
str | np.ndarray | list
required
Current robot joint positions (7,).
curr_xyzs
str | np.ndarray | List[List[float]]
required
Current end-effector positions (N, 3).
curr_wxyzs
str | np.ndarray | List[List[float]]
required
Current end-effector quaternions (N, 4) in wxyz order.
target_xyzs
str | np.ndarray | List[List[float]]
required
Target positions (N, 3).
target_wxyzs
str | np.ndarray | List[List[float]]
required
Target quaternions (N, 4) in wxyz order.
capsule_transform
str | np.ndarray
required
4x4 transformation matrix for collision capsules.
aux_args
Dict[str, Any]
Auxiliary parameters:
  • robot_radius — Robot collision radius
  • radius_removal_nb — Neighbor count for radius removal
  • radius_removal_dist — Distance threshold for radius removal
  • num_points — Number of scene points for collision checking
  • workspace_xy_dist — Workspace XY distance limit
timeout
float | None
Optional HTTP request timeout in seconds.
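Several of the matrix parameters follow common conventions worth spelling out: `camera_intrinsics` is a standard 3x3 pinhole matrix, and all quaternions use wxyz (scalar-first) order, whereas many libraries such as SciPy emit xyzw. A minimal sketch of both, using placeholder focal lengths and principal point for a 640x480 camera (these values are illustrative, not part of this API):

```python
import numpy as np

# Build a 3x3 pinhole intrinsics matrix from focal lengths and principal
# point. fx, fy, cx, cy below are placeholder values for a 640x480 camera.
fx, fy, cx, cy = 525.0, 525.0, 320.0, 240.0
camera_intrinsics = np.array([
    [fx, 0.0, cx],
    [0.0, fy, cy],
    [0.0, 0.0, 1.0],
])

# Many libraries (e.g. SciPy's Rotation.as_quat) return quaternions in
# xyzw order; this endpoint expects wxyz, so move the scalar to the front.
xyzw = np.array([0.0, 0.0, 0.70710678, 0.70710678])  # 90 deg about z, xyzw
wxyz = np.roll(xyzw, 1)  # -> [0.70710678, 0.0, 0.0, 0.70710678]
```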

Returns

dict with keys:
  • output — Joint-space trajectories, shape (N, timesteps, 7)
  • latency_ms — Processing time in milliseconds
  • debug — Debug payload with pc (scene point cloud), scene_capsules, and transformed_capsules
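Since output is a batch of joint-space trajectories, a quick sanity check before execution is to verify every waypoint stays within joint bounds. A minimal sketch, assuming output is a (N, timesteps, 7) NumPy array as described above; the ±2π limit is an illustrative placeholder, not an official UR5e specification:

```python
import numpy as np

def within_limits(trajectories: np.ndarray, limit: float = 2 * np.pi) -> np.ndarray:
    """Return a boolean mask of shape (N,) marking trajectories whose
    waypoints all lie inside [-limit, +limit] radians."""
    return np.all(np.abs(trajectories) <= limit, axis=(1, 2))

# Stand-in for result["output"]: 20 trajectories, 50 timesteps, 7 joints.
trajectories = np.zeros((20, 50, 7))
mask = within_limits(trajectories)
print(mask.shape)  # (20,)
```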

Example

from grid_cortex_client import CortexClient
import numpy as np

client = CortexClient()
N = 20
result = client.run(
    model_id="ur5e-pyroki-collision-batch",
    depth_image=np.random.uniform(0.5, 0.6, size=(480, 640)),
    camera_intrinsics=np.eye(3),
    T_cam_world=np.eye(4),
    robot_joint_positions=np.zeros(7),
    curr_xyzs=np.random.uniform(-1, 1, size=(N, 3)),
    curr_wxyzs=np.tile([1.0, 0.0, 0.0, 0.0], (N, 1)),
    target_xyzs=np.random.uniform(-1, 1, size=(N, 3)),
    target_wxyzs=np.tile([1.0, 0.0, 0.0, 0.0], (N, 1)),
    capsule_transform=np.eye(4),
    aux_args={
        "robot_radius": 0.05, "radius_removal_nb": 10,
        "radius_removal_dist": 0.06, "num_points": 1200,
        "workspace_xy_dist": 1.0,
    },
)
print(result["output"].shape)  # (20, timesteps, 7)
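Each batch element of output can be sliced out and stepped through waypoint by waypoint. A sketch, stubbing the response with a placeholder array so it runs standalone (the shapes mirror the documented return value):

```python
import numpy as np

# Stub standing in for the client response above.
result = {"output": np.zeros((20, 50, 7)), "latency_ms": 12.3}

traj0 = result["output"][0]   # first batch element, shape (timesteps, 7)
final_config = traj0[-1]      # joint configuration at the goal, shape (7,)
for q in traj0:               # step through waypoints, e.g. to stream to the robot
    pass
```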