This tutorial demonstrates how to use domain randomization techniques in AirGen to generate diverse training data for computer vision models. Domain randomization involves varying simulation parameters (object positions, lighting, textures, weather conditions) to help models generalize better to real-world scenarios.

The notebook for this example can be found here.

Setting Up the AirGen Car

First, let’s import and initialize the AirGen car simulation:

from grid.robot.wheeled.airgen_car import AirGenCar
airgen_car_0 = AirGenCar()

This code creates an instance of a simulated car in AirGen. You will use this car to interact with the simulation, manipulate objects, and collect data.
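Before manipulating the scene, it can help to confirm that the simulator is reachable and to look up the names of objects already present (names such as Car_56 used later in this tutorial can be discovered this way). The snippet below is a minimal sketch that assumes the AirGen client exposes the AirSim-style confirmConnection() and simListSceneObjects() calls.

# Verify that the simulator is reachable before issuing commands
airgen_car_0.client.confirmConnection()

# List the objects currently present in the scene
scene_objects = airgen_car_0.client.simListSceneObjects()
print(f"Scene contains {len(scene_objects)} objects, e.g. {scene_objects[:5]}")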

Spawning and Manipulating Objects

Let’s explore how to manipulate objects in the simulation environment:

import airgen

# List available assets in the simulation
airgen_car_0.client.simListAssets()

# Get the current vehicle pose
pose = airgen_car_0.client.simGetVehiclePose()
pose_obj = pose

# Modify the position (shift 10 units along the negative x-axis)
pose_obj.position.x_val -= 10

# Spawn a new car object with the modified pose
# Parameters: name, asset, pose, scale, attach_to_existing, is_static
airgen_car_0.client.simSpawnObject("Car_New", "Car_01", 
                    pose_obj, airgen.Vector3r(1, 1, 1), True, False)

Now we’ve created a new car in the scene. Let’s modify its position:

# Get the current pose of our new car
curr_pose = airgen_car_0.client.simGetObjectPose("Car_New")
new_pose = curr_pose

# Move it 20 units further along the negative x-axis
new_pose.position.x_val = curr_pose.position.x_val - 20

# Apply the new pose
airgen_car_0.client.simSetObjectPose("Car_New", new_pose)
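For domain randomization, you can repeat this spawn-and-place pattern with randomized offsets to generate varied scenes. The sketch below reuses only the calls shown above; the object names, offset ranges, and scale are illustrative choices, not fixed requirements.

import random

# Use the vehicle's pose as a reference point for randomized placements
base_pose = airgen_car_0.client.simGetVehiclePose()

for i in range(5):
    # Fetch a fresh pose object each iteration so spawned cars don't share one
    rand_pose = airgen_car_0.client.simGetVehiclePose()
    rand_pose.position.x_val = base_pose.position.x_val - random.uniform(10, 40)
    rand_pose.position.y_val = base_pose.position.y_val + random.uniform(-5, 5)

    # Same spawn call as above; names and offsets are illustrative
    airgen_car_0.client.simSpawnObject(f"Car_Rand_{i}", "Car_01",
                        rand_pose, airgen.Vector3r(1, 1, 1), True, False)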

Adjusting Environmental Conditions

Setting Time of Day

Changing the time of day alters the lighting and shadows in the scene, which is crucial for training models that are robust to different real-world lighting conditions.

# Set to a specific time (10 PM on July 11, 2024)
airgen_car_0.client.simSetTimeOfDay(True, "2024-07-11 22:00:00")
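You can also sweep through several times of day to capture a range of lighting conditions. This is a small sketch using the same simSetTimeOfDay call; the timestamps are arbitrary examples.

import time

# Sweep through several times of day to vary sun angle and shadow length
for hour in ["06:00:00", "12:00:00", "18:00:00", "22:00:00"]:
    airgen_car_0.client.simSetTimeOfDay(True, f"2024-07-11 {hour}")
    time.sleep(1)  # give the lighting a moment to settle before capturing images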

Changing Object Textures

Randomizing textures helps vision models learn to recognize objects by shape and context rather than by a single appearance. Applying a variety of textures prevents the model from overfitting to one look of an object, making it more robust to variations in the real world.

# Apply a custom texture to an object
base_path = "/mnt/azure_blobfuse_mount/user/sessions/"
session_id = "663b8240-f6c5-4e85-95ba-5d7ad5afdadf"
texture_file = "sample_texture.jpg"
texture_path = f"{base_path}{session_id}/{texture_file}"

airgen_car_0.client.simSetObjectMaterialFromTexture(
    "Car_56",
    texture_path
)
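To randomize textures at scale, you can pick from a pool of texture files and apply them to several objects. The sketch below reuses the call and path variables defined above; the texture filenames and target object names are hypothetical placeholders.

import random

# Hypothetical texture files assumed to live in the same session directory
texture_files = ["asphalt.jpg", "brick.jpg", "gravel.jpg", "rust.jpg"]

# Apply a randomly chosen texture to each target object
for obj_name in ["Car_New", "Car_56"]:
    chosen_texture = random.choice(texture_files)
    airgen_car_0.client.simSetObjectMaterialFromTexture(
        obj_name,
        f"{base_path}{session_id}/{chosen_texture}"
    )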

Setting Up Computer Vision Models

Now let’s integrate some vision models to test how they perform in our randomized environments:

from grid.model.perception.detection.gdino import GroundingDINO
from grid.model.perception.segmentation.clipseg import CLIPSeg

# Initialize models
detection_gdino_0 = GroundingDINO()
seg_clipseg_0 = CLIPSeg()

# Define a detection function
def detect(client, object_name="car"):
    rgb_image, pose = client.getImages("front_center", 
                            [airgen.ImageType.Scene])[0]
    boxes, phrases = detection_gdino_0.detect_object(rgb_image, 
                                                object_name)
    return boxes, phrases

# Define a segmentation function
def segment(client, object_name="car"):
    rgb_image, pose = client.getImages("front_center", 
                            [airgen.ImageType.Scene])[0]
    result = seg_clipseg_0.segment_image(rgb_image, object_name)
    return result

  • GroundingDINO is used for object detection, while CLIPSeg is used for segmentation.
  • The detect and segment functions run the respective models on images captured from the simulation; a quick usage sketch follows below.
  • By running these models in randomized environments, you can assess their robustness to changes in scene appearance.
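Before adding any randomization, it is worth sanity-checking that the pipeline runs end to end. The snippet below is a minimal sketch; it assumes only that the detector returns boxes and phrases as used in the functions above, while the exact output format of the segmenter depends on the GRID model wrappers.

# Run a single detection and segmentation pass on the current scene
boxes, phrases = detect(airgen_car_0.client, object_name="car")
print(f"Detected {len(boxes)} boxes with phrases: {phrases}")

mask = segment(airgen_car_0.client, object_name="car")
print(f"Segmentation result type: {type(mask)}")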

Testing Vision Models with Weather Variations

One of the most important aspects of domain randomization is varying weather conditions:

import time

# Enable weather effects
airgen_car_0.client.simEnableWeather(True)

# Test vision models across different weather conditions
for i in range(10):
    # Calculate weather intensity (0% to 90%)
    weather_intensity = i / 10
    
    # Gradually increase rain and fog
    airgen_car_0.client.simSetWeatherParameter(
        airgen.WeatherParameter.Rain, 
        weather_intensity
    )
    airgen_car_0.client.simSetWeatherParameter(
        airgen.WeatherParameter.Fog, 
        weather_intensity
    )
    
    # Run detection and segmentation at each step
    detect(airgen_car_0.client)
    segment(airgen_car_0.client)
    
    # Wait a second between iterations
    time.sleep(1)

  • This loop simulates a range of weather conditions by gradually increasing rain and fog.
  • At each step, the detection and segmentation models are evaluated, providing insight into their performance under adverse conditions.
  • Such systematic variation is a core part of domain randomization, ensuring the model is not sensitive to specific weather or visibility conditions; a sketch that logs detection results across the sweep follows below.
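To turn this sweep into a quantitative robustness check, you can record how the detector's output changes as conditions degrade. The sketch below is one possible way to do that, assuming as above that detect() returns a list of boxes.

# Record how many boxes the detector returns at each weather intensity
detections_per_intensity = {}

for i in range(10):
    weather_intensity = i / 10
    airgen_car_0.client.simSetWeatherParameter(
        airgen.WeatherParameter.Rain, weather_intensity)
    airgen_car_0.client.simSetWeatherParameter(
        airgen.WeatherParameter.Fog, weather_intensity)

    boxes, phrases = detect(airgen_car_0.client)
    detections_per_intensity[weather_intensity] = len(boxes)
    time.sleep(1)

# Inspect how detection counts fall off as rain and fog intensify
print(detections_per_intensity)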