mimicgen.utils package#

Submodules#

mimicgen.utils.config_utils module#

A collection of utilities for working with config generators. These generators are reused from robomimic (https://robomimic.github.io/docs/tutorials/hyperparam_scan.html).

mimicgen.utils.config_utils.set_basic_settings(generator, group, source_dataset_path, source_dataset_name, generation_path, guarantee, num_traj, num_src_demos=None, max_num_failures=25, num_demo_to_render=10, num_fail_demo_to_render=25, verbose=False)#

Sets config generator parameters for some basic data generation settings.

Parameters
  • generator (robomimic ConfigGenerator instance) – config generator object

  • group (int) – parameter group for these settings

  • source_dataset_path (str) – path to source dataset

  • source_dataset_name (str) – name to give source dataset in experiment name

  • generation_path (str) – folder for generated data

  • guarantee (bool) – whether to ensure @num_traj successes

  • num_traj (int) – number of trajectories for generation

  • num_src_demos (int or None) – number of source demos to take from @source_dataset_path

  • max_num_failures (int) – max failures to keep

  • num_demo_to_render (int) – max demos to render to video

  • num_fail_demo_to_render (int) – max fail demos to render to video

  • verbose (bool) – if True, make experiment name verbose using the passed settings
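
A minimal sketch of a typical call, assuming robomimic's ConfigGenerator is constructed with base_config_file and script_file keyword arguments; all file paths below are hypothetical placeholders:

from robomimic.utils.hyperparam_utils import ConfigGenerator
import mimicgen.utils.config_utils as ConfigUtils

# hypothetical paths for the base config and the generated launch script
generator = ConfigGenerator(
    base_config_file="/path/to/base_mimicgen_config.json",
    script_file="/tmp/run_datagen.sh",
)

# parameter group 0 holds the basic data generation settings
ConfigUtils.set_basic_settings(
    generator=generator,
    group=0,
    source_dataset_path="/path/to/source_demos.hdf5",
    source_dataset_name="stack",
    generation_path="/tmp/generated_data",
    guarantee=True,    # keep generating until num_traj successes are collected
    num_traj=1000,
    num_src_demos=10,
)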

mimicgen.utils.config_utils.set_debug_settings(generator, group)#

Sets config generator parameters for a quick debug run.

Parameters
  • generator (robomimic ConfigGenerator instance) – config generator object

  • group (int) – parameter group for these settings

mimicgen.utils.config_utils.set_learning_settings_for_bc_rnn(generator, group, modality, seq_length=10, low_dim_keys=None, image_keys=None, crop_size=None)#

Sets config generator parameters for robomimic BC-RNN training runs.

Parameters
  • generator (robomimic ConfigGenerator instance) – config generator object

  • group (int) – parameter group for these settings

  • modality (str) – whether this is a low-dim or image observation run

  • seq_length (int) – BC-RNN context length

  • low_dim_keys (list or None) – if provided, set low-dim observation keys, else use defaults

  • image_keys (list or None) – if provided, set image observation keys, else use defaults

  • crop_size (tuple or None) – if provided, size of crop to use for pixel shift augmentation
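
A minimal sketch of configuring an image-based BC-RNN run, continuing the generator from the example above; the observation key names are illustrative, not required defaults:

ConfigUtils.set_learning_settings_for_bc_rnn(
    generator=generator,
    group=1,
    modality="image",
    seq_length=10,
    image_keys=["agentview_image", "robot0_eye_in_hand_image"],  # illustrative keys
    crop_size=(76, 76),
)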

mimicgen.utils.config_utils.set_obs_settings(generator, group, collect_obs, camera_names, camera_height, camera_width)#

Sets config generator parameters for collecting observations.

mimicgen.utils.config_utils.set_subtask_settings(generator, group, base_config_file, select_src_per_subtask, subtask_term_offset_range=None, selection_strategy=None, selection_strategy_kwargs=None, action_noise=None, num_interpolation_steps=None, num_fixed_steps=None, verbose=False)#

Sets config generator parameters for each subtask.

Parameters
  • generator (robomimic ConfigGenerator instance) – config generator object

  • group (int) – parameter group for these settings

  • base_config_file (str) – path to base config file being used for generating configs

  • select_src_per_subtask (bool) – whether to select src demo for each subtask

  • subtask_term_offset_range (list or None) – if provided, should be list of 2-tuples, one entry per subtask, with the last entry being None

  • selection_strategy (str or None) – src demo selection strategy

  • selection_strategy_kwargs (dict or None) – kwargs for selection strategy

  • action_noise (float or list or None) – action noise for all subtasks

  • num_interpolation_steps (int or list or None) – interpolation steps for all subtasks

  • num_fixed_steps (int or list or None) – fixed steps for all subtasks

  • verbose (bool) – if True, make experiment name verbose using the passed settings
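
A minimal sketch for a task with two subtasks, continuing the generator from above; the base config path, the selection strategy name, and its kwargs are illustrative assumptions:

ConfigUtils.set_subtask_settings(
    generator=generator,
    group=2,
    base_config_file="/path/to/base_mimicgen_config.json",  # hypothetical path
    select_src_per_subtask=False,
    subtask_term_offset_range=[(10, 20), None],    # one entry per subtask, last is None
    selection_strategy="nearest_neighbor_object",  # illustrative strategy name
    selection_strategy_kwargs=dict(nn_k=3),        # illustrative kwargs
    action_noise=0.05,
    num_interpolation_steps=5,
    num_fixed_steps=0,
)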

mimicgen.utils.file_utils module#

A collection of utilities related to files.

mimicgen.utils.file_utils.config_generator_to_script_lines(generator, config_dir)#

Takes a robomimic ConfigGenerator and uses it to generate a set of training configs, and a set of bash command lines that correspond to each training run (one per config). Note that the generator’s script_file will be overridden to be a temporary file that will be removed from disk.

Parameters
  • generator (ConfigGenerator instance or list) – generator(s) to use for generating configs and training runs

  • config_dir (str) – path to directory where configs will be generated

Returns
  • config_files (list) – a list of config files that were generated

  • run_lines (list) – a list of strings that are training commands, one per config
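
A minimal sketch that generates configs from the generator built in the config_utils examples above and writes one training command per line to a shell script; paths are hypothetical placeholders:

import mimicgen.utils.file_utils as MG_FileUtils

config_files, run_lines = MG_FileUtils.config_generator_to_script_lines(
    generator=generator,
    config_dir="/tmp/mimicgen_configs",
)

with open("/tmp/train_runs.sh", "w") as f:
    for line in run_lines:
        f.write(line.strip() + "\n")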

mimicgen.utils.file_utils.download_file_from_hf(repo_id, filename, download_dir, check_overwrite=True)#

Downloads a file from Hugging Face.

Reference: https://huggingface.co/docs/huggingface_hub/main/en/guides/download

Example usage:

repo_id = "amandlek/mimicgen_datasets"
filename = "core/coffee_d0.hdf5"
download_dir = "/tmp"
download_file_from_hf(repo_id, filename, download_dir, check_overwrite=True)

Parameters
  • repo_id (str) – Hugging Face repo ID

  • filename (str) – path to file in repo

  • download_dir (str) – path to directory where file should be downloaded

  • check_overwrite (bool) – if True, will sanity check the download fpath to make sure a file of that name doesn’t already exist there

mimicgen.utils.file_utils.download_url_from_gdrive(url, download_dir, check_overwrite=True)#

Downloads a file at a URL from Google Drive.

Example usage:

url = "https://drive.google.com/file/d/1DABdqnBri6-l9UitjQV53uOq_84Dx7Xt/view?usp=drive_link"
download_dir = "/tmp"
download_url_from_gdrive(url, download_dir, check_overwrite=True)

Parameters
  • url (str) – url string

  • download_dir (str) – path to directory where file should be downloaded

  • check_overwrite (bool) – if True, will sanity check the download fpath to make sure a file of that name doesn’t already exist there

mimicgen.utils.file_utils.get_all_demos_from_dataset(dataset_path, filter_key=None, start=None, n=None)#

Helper function to get demonstration keys from robomimic hdf5 dataset.

Parameters
  • dataset_path (str) – path to hdf5 dataset

  • filter_key (str or None) – name of filter key

  • start (int or None) – demonstration index to start from

  • n (int or None) – number of consecutive demonstrations to retrieve

Returns

list of demonstration keys

Return type

demo_keys (list)
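
A minimal sketch that fetches the first 10 demonstration keys from a source dataset (the path is a hypothetical placeholder):

import mimicgen.utils.file_utils as MG_FileUtils

demo_keys = MG_FileUtils.get_all_demos_from_dataset(
    dataset_path="/path/to/source_demos.hdf5",
    filter_key=None,
    start=0,
    n=10,
)
print(demo_keys)  # e.g. ["demo_0", "demo_1", ...]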

mimicgen.utils.file_utils.get_env_interface_info_from_dataset(dataset_path, demo_keys)#

Gets environment interface information from source dataset.

Parameters
  • dataset_path (str) – path to hdf5 dataset

  • demo_keys (list) – list of demonstration keys to extract info from

Returns
  • env_interface_name (str) – name of environment interface class

  • env_interface_type (str) – type of environment interface
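
A minimal sketch, reusing MG_FileUtils and the demo_keys list from the example above (the dataset path is a hypothetical placeholder):

env_interface_name, env_interface_type = MG_FileUtils.get_env_interface_info_from_dataset(
    dataset_path="/path/to/source_demos.hdf5",
    demo_keys=demo_keys,
)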

mimicgen.utils.file_utils.merge_all_hdf5(folder, new_hdf5_path, delete_folder=False, dry_run=False, return_horizons=False)#

Helper function to take all hdf5s in @folder and merge them into a single one. Returns the number of hdf5s that were merged.

mimicgen.utils.file_utils.parse_source_dataset(dataset_path, demo_keys, task_spec=None, subtask_term_signals=None, subtask_term_offset_ranges=None)#

Parses a source dataset to extract info needed for data generation (DatagenInfo instances) and subtask indices that split each source dataset trajectory into contiguous subtask segments.

Parameters
  • dataset_path (str) – path to hdf5 dataset

  • demo_keys (list) – list of demo keys to use from dataset path

  • task_spec (MG_TaskSpec instance or None) – task spec object, which will be used to infer the sequence of subtask termination signals and offset ranges.

  • subtask_term_signals (list or None) – sequence of subtask termination signals, which should only be provided if not providing @task_spec. Should have an entry per subtask and the last subtask entry should be None, since the final subtask ends when the task ends.

  • subtask_term_offset_ranges (list or None) – sequence of subtask termination offset ranges, which should only be provided if not providing @task_spec. Should have an entry per subtask and the last subtask entry should be None or (0, 0), since the final subtask ends when the task ends.

Returns
  • datagen_infos (list) – list of DatagenInfo instances, one per source demonstration. Each instance has entries with leading dimension [T, …], the length of the trajectory.

  • subtask_indices (np.array) – array of shape (N, S, 2) where N is the number of demos and S is the number of subtasks for this task. Each entry is a pair of integers that represents the index at which a subtask segment starts and where it is completed.

  • subtask_term_signals (list) – sequence of subtask termination signals

  • subtask_term_offset_ranges (list) – sequence of subtask termination offset ranges
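
A minimal sketch of parsing a two-subtask source dataset without a task spec, reusing MG_FileUtils and demo_keys from the examples above; the termination signal name "grasp" and the dataset path are illustrative assumptions:

datagen_infos, subtask_indices, term_signals, term_offset_ranges = MG_FileUtils.parse_source_dataset(
    dataset_path="/path/to/source_demos.hdf5",
    demo_keys=demo_keys,
    subtask_term_signals=["grasp", None],         # last subtask ends when the task ends
    subtask_term_offset_ranges=[(10, 20), None],
)

# subtask_indices has shape (N, S, 2): per-demo start/end index of each subtask segment
print(subtask_indices.shape)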

mimicgen.utils.file_utils.write_demo_to_hdf5(folder, env, initial_state, states, observations, datagen_info, actions, src_demo_inds=None, src_demo_labels=None)#

Helper function to write demonstration to an hdf5 file (robomimic format) in a folder. It will be named using a timestamp.

Parameters
  • folder (str) – folder to write hdf5 to

  • env (robomimic EnvBase instance) – simulation environment

  • initial_state (dict) – dictionary corresponding to initial simulator state (see robomimic dataset structure for more information)

  • states (list) – list of simulator states

  • observations (list) – list of observation dictionaries

  • datagen_info (list) – list of DatagenInfo instances

  • actions (np.array) – actions per timestep

  • src_demo_inds (list or None) – if provided, list of selected source demonstration indices for each subtask

  • src_demo_labels (np.array or None) – same as @src_demo_inds, but repeated to have a label for each timestep of the trajectory

mimicgen.utils.file_utils.write_json(json_dic, json_path)#

Write dictionary to json file.

mimicgen.utils.misc_utils module#

A collection of miscellaneous utilities.

class mimicgen.utils.misc_utils.Grid(values, initial_ind=0)#

Bases: object

Keep track of a list of values, and point to a single value at a time.

get()#
next()#
prev()#
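
A minimal sketch of how the Grid helper might be used, assuming get() returns the value currently pointed to and next()/prev() move the pointer forward and backward:

from mimicgen.utils.misc_utils import Grid

noise_grid = Grid(values=[0.0, 0.05, 0.1], initial_ind=0)
print(noise_grid.get())   # 0.0
noise_grid.next()
print(noise_grid.get())   # 0.05
noise_grid.prev()
print(noise_grid.get())   # 0.0
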
class mimicgen.utils.misc_utils.Rate(hz)#

Bases: object

Convenience class for enforcing rates in loops. Modeled after rospy.Rate.

See http://docs.ros.org/en/jade/api/rospy/html/rospy.timer-pysrc.html#Rate.sleep

sleep()#

Attempt to sleep at the specified rate in hz, by taking the time elapsed since the last call to this function into account.

update_hz(hz)#

Update rate to enforce.
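
A minimal sketch of enforcing an approximate loop frequency:

from mimicgen.utils.misc_utils import Rate

rate = Rate(hz=20)
for _ in range(100):
    # ... do one control step of work ...
    rate.sleep()  # sleep the remainder of the 1/20 s period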

class mimicgen.utils.misc_utils.RateMeasure(name=None, history=100, freq_threshold=None)#

Bases: object

Measure approximate time intervals of code execution by calling @measure.

disable()#

Disable measurements.

enable()#

Enable measurements.

measure()#

Take a measurement of the time elapsed since the last @measure call and also return the time elapsed.

report_stats(verbose=False)#

Report statistics over measurements, converting timer measurements into frequencies.

class mimicgen.utils.misc_utils.Timer(history=100, ignore_first=False)#

Bases: object

A simple timer.

disable()#

Disable measurements with this timer.

enable()#

Enable measurements with this timer.

report_stats(verbose=False)#
tic()#
timed()#
toc()#
mimicgen.utils.misc_utils.add_red_border_to_frame(frame, ratio=0.02)#

Add a red border to an image frame.

mimicgen.utils.pose_utils module#

A collection of utilities for working with poses.

mimicgen.utils.pose_utils.axisangle2quat(axis, angle)#

Converts axis-angle to (x, y, z, w) quat.

NOTE: this differs from robosuite’s function because it accepts both axis and angle as arguments, not axis * angle.

mimicgen.utils.pose_utils.interpolate_poses(pose_1, pose_2, num_steps=None, step_size=None, perturb=False)#

Linear interpolation between two poses.

Parameters
  • pose_1 (np.array) – 4x4 start pose

  • pose_2 (np.array) – 4x4 end pose

  • num_steps (int) – if provided, specifies the number of desired interpolated points (not including the start and end points). Passing 0 corresponds to no interpolation, and passing None means that @step_size must be provided to determine the number of interpolated points.

  • step_size (float) – if provided, will be used to infer the number of steps, by taking the norm of the delta position vector, and dividing it by the step size

  • perturb (bool) – if True, randomly move all the interpolated position points in a uniform, non-overlapping grid.

Returns
  • pose_steps (np.array) – array of shape (N + 2, 3) corresponding to the interpolated pose path, where N is @num_steps

  • num_steps (int) – the number of interpolated points (N) in the path
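
A minimal sketch that interpolates 10 intermediate poses between two 4x4 poses:

import numpy as np
import mimicgen.utils.pose_utils as PoseUtils

pose_1 = np.eye(4)
pose_2 = np.eye(4)
pose_2[:3, 3] = [0.1, 0.0, 0.2]  # translate the end pose

pose_steps, num_steps = PoseUtils.interpolate_poses(
    pose_1=pose_1,
    pose_2=pose_2,
    num_steps=10,
)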

mimicgen.utils.pose_utils.interpolate_rotations(R1, R2, num_steps, axis_angle=True)#

Interpolate between 2 rotation matrices. If @axis_angle, interpolate the axis-angle representation of the delta rotation, else, use slerp.

NOTE: I have verified empirically that both methods are essentially equivalent, so pick your favorite.

mimicgen.utils.pose_utils.make_pose(pos, rot)#

Make homogeneous pose matrices from a set of translation vectors and rotation matrices.

Parameters
  • pos (np.array) – batch of position vectors with last dimension of 3

  • rot (np.array) – batch of rotation matrices with last 2 dimensions of (3, 3)

Returns

batch of pose matrices with last 2 dimensions of (4, 4)

Return type

pose (np.array)

mimicgen.utils.pose_utils.pose_in_A_to_pose_in_B(pose_in_A, pose_A_in_B)#

Converts homogeneous matrices corresponding to a point C in frame A to homogeneous matrices corresponding to the same point C in frame B.

Parameters
  • pose_in_A (np.array) – batch of homogeneous matrices corresponding to the pose of C in frame A

  • pose_A_in_B (np.array) – batch of homogeneous matrices corresponding to the pose of A in frame B

Returns

batch of homogeneous matrices corresponding to the pose of C in frame B

Return type

pose_in_B (np.array)

mimicgen.utils.pose_utils.pose_inv(pose)#

Computes the inverse of homogeneous pose matrices.

Note that the inverse of a pose matrix is the following: [R t; 0 1]^-1 = [R.T -R.T*t; 0 1]

Parameters

pose (np.array) – batch of pose matrices with last 2 dimensions of (4, 4)

Returns

batch of inverse pose matrices with last 2 dimensions of (4, 4)

Return type

inv_pose (np.array)
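
A minimal sanity-check sketch, assuming a single 4x4 matrix is accepted as a batch of one:

import numpy as np
import mimicgen.utils.pose_utils as PoseUtils

pose = np.eye(4)
pose[:3, 3] = [0.5, -0.2, 0.3]
inv_pose = PoseUtils.pose_inv(pose)
assert np.allclose(inv_pose @ pose, np.eye(4))  # composing the inverse with the pose gives identity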

mimicgen.utils.pose_utils.quat2axisangle(quat)#

Converts (x, y, z, w) quaternion to axis-angle format. Returns a unit vector direction and an angle.

NOTE: this differs from robosuite’s function because it returns both axis and angle, not axis * angle.

mimicgen.utils.pose_utils.quat_slerp(q1, q2, tau)#

Adapted from robosuite.

mimicgen.utils.pose_utils.transform_source_data_segment_using_object_pose(obj_pose, src_eef_poses, src_obj_pose)#

Transform a source data segment (object-centric subtask segment from source demonstration) such that the relative poses between the target eef pose frame and the object frame are preserved. Recall that each object-centric subtask segment corresponds to one object, and consists of a sequence of target eef poses.

Parameters
  • obj_pose (np.array) – 4x4 object pose in current scene

  • src_eef_poses (np.array) – pose sequence (shape [T, 4, 4]) for the sequence of end effector control poses from the source demonstration

  • src_obj_pose (np.array) – 4x4 object pose from the source demonstration

Returns

transformed pose sequence (shape [T, 4, 4])

Return type

transformed_eef_poses (np.array)
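
A minimal sketch that re-targets a source end effector pose segment to a new object pose; in practice src_eef_poses and src_obj_pose come from the parsed source dataset and obj_pose from the current scene, so the arrays below are placeholders:

import numpy as np
import mimicgen.utils.pose_utils as PoseUtils

src_eef_poses = np.tile(np.eye(4), (50, 1, 1))  # [T, 4, 4] target eef poses from the source demo
src_obj_pose = np.eye(4)                        # object pose in the source demo
obj_pose = np.eye(4)
obj_pose[:3, 3] = [0.1, 0.05, 0.0]              # object pose in the current scene

transformed_eef_poses = PoseUtils.transform_source_data_segment_using_object_pose(
    obj_pose=obj_pose,
    src_eef_poses=src_eef_poses,
    src_obj_pose=src_obj_pose,
)
print(transformed_eef_poses.shape)  # (50, 4, 4)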

mimicgen.utils.pose_utils.unmake_pose(pose)#

Split homogeneous pose matrices back into translation vectors and rotation matrices.

Parameters

pose (np.array) – batch of pose matrices with last 2 dimensions of (4, 4)

Returns
  • pos (np.array) – batch of position vectors with last dimension of 3

  • rot (np.array) – batch of rotation matrices with last 2 dimensions of (3, 3)
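
A minimal sketch of a make_pose / unmake_pose round trip over a batch of poses:

import numpy as np
import mimicgen.utils.pose_utils as PoseUtils

pos = np.random.rand(10, 3)            # batch of translation vectors
rot = np.tile(np.eye(3), (10, 1, 1))   # batch of rotation matrices
poses = PoseUtils.make_pose(pos, rot)  # shape (10, 4, 4)
pos_out, rot_out = PoseUtils.unmake_pose(poses)
assert np.allclose(pos, pos_out) and np.allclose(rot, rot_out)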

mimicgen.utils.robomimic_utils module#

Collection of utilities related to robomimic.

mimicgen.utils.robomimic_utils.create_env(env_meta, env_name=None, env_class=None, robot=None, gripper=None, camera_names=None, camera_height=84, camera_width=84, render=None, render_offscreen=None, use_image_obs=None, use_depth_obs=None)#

Helper function to create the environment from dataset metadata and arguments.

Parameters
  • env_meta (dict) – environment metadata compatible with robomimic, see https://robomimic.github.io/docs/modules/environments.html

  • env_name (str or None) – if provided, override environment name in @env_meta

  • env_class (class or None) – if provided, use this class instead of the one inferred from @env_meta

  • robot (str or None) – if provided, override the robot argument in @env_meta. Currently only supported by robosuite environments.

  • gripper (str or None) – if provided, override the gripper argument in @env_meta. Currently only supported by robosuite environments.

  • camera_names (list of str or None) – list of camera names that correspond to image observations

  • camera_height (int) – camera height for all cameras

  • camera_width (int) – camera width for all cameras

  • render (bool or None) – optionally override rendering behavior

  • render_offscreen (bool or None) – optionally override rendering behavior

  • use_image_obs (bool or None) – optionally override rendering behavior

  • use_depth_obs (bool or None) – optionally override rendering behavior
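
A minimal sketch of creating an environment from a dataset, assuming robomimic's get_env_metadata_from_dataset helper; the dataset path and camera name are illustrative:

import robomimic.utils.file_utils as RM_FileUtils
import mimicgen.utils.robomimic_utils as RobomimicUtils

env_meta = RM_FileUtils.get_env_metadata_from_dataset(dataset_path="/path/to/demos.hdf5")
env = RobomimicUtils.create_env(
    env_meta=env_meta,
    camera_names=["agentview"],
    camera_height=84,
    camera_width=84,
    render=False,
    render_offscreen=True,
    use_image_obs=True,
)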

mimicgen.utils.robomimic_utils.get_default_env_cameras(env_meta)#

Get the default set of cameras for a particular robomimic environment type.

Parameters

env_meta (dict) – environment metadata compatible with robomimic, see https://robomimic.github.io/docs/modules/environments.html

Returns

list of camera names that correspond to image observations

Return type

camera_names (list of str)

mimicgen.utils.robomimic_utils.make_dataset_video(dataset_path, video_path, num_render=None, render_image_names=None, use_obs=False, video_skip=5)#

Helper function to set up args and call @playback_dataset from robomimic to get video of generated dataset.
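
A minimal sketch that renders the first few demos of a generated dataset to a video file (paths are hypothetical placeholders):

import mimicgen.utils.robomimic_utils as RobomimicUtils

RobomimicUtils.make_dataset_video(
    dataset_path="/tmp/generated_data/demo.hdf5",
    video_path="/tmp/generated_data/playback.mp4",
    num_render=5,
)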

mimicgen.utils.robomimic_utils.make_print_logger(txt_file)#

Makes a logger that mirrors stdout and stderr to a text file.

Parameters

txt_file (str) – path to txt file to write

Module contents#