cosense3d.utils package

Submodules

cosense3d.utils.box_utils module

cosense3d.utils.box_utils.bbox_cxcywh_to_xyxy(bbox)[source]

Convert bbox coordinates from (cx, cy, w, h) to (x1, y1, x2, y2).

Parameters:

bbox (Tensor) – Shape (n, 4) for bboxes.

Returns:

Tensor: Converted bboxes.

cosense3d.utils.box_utils.bbox_xyxy_to_cxcywh(bbox)[source]

Convert bbox coordinates from (x1, y1, x2, y2) to (cx, cy, w, h).

Parameters:

bbox (Tensor) – Shape (n, 4) for bboxes.

Returns:

Tensor: Converted bboxes.
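
Example (a minimal usage sketch; the two converters are inverses of each other):
>>> import torch
>>> from cosense3d.utils.box_utils import bbox_cxcywh_to_xyxy, bbox_xyxy_to_cxcywh
>>> bbox = torch.tensor([[5., 5., 10., 10.]])  # (cx, cy, w, h)
>>> xyxy = bbox_cxcywh_to_xyxy(bbox)
>>> assert torch.allclose(xyxy, torch.tensor([[0., 0., 10., 10.]]))
>>> assert torch.allclose(bbox_xyxy_to_cxcywh(xyxy), bbox)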

cosense3d.utils.box_utils.boxes3d_to_standup_bboxes(boxes)[source]
Parameters:

boxes – Tensor(N, 7)

Returns:

Tensor (N, 4): [x_min, y_min, x_max, y_max]

cosense3d.utils.box_utils.boxes_to_corners_2d(boxes_np)[source]

Convert boxes to 4 corners in the xy plane.

Parameters:

boxes_np – np.ndarray (N, 7), cols: (x, y, z, dx, dy, dz, det_r)

Returns:

corners – np.ndarray (N, 4, 2); corner order is back left, front left, front right, back right

cosense3d.utils.box_utils.boxes_to_corners_3d(boxes3d: ndarray | Tensor, order: str = 'lwh') ndarray | Tensor[source]

    4 -------- 5            ^ z
   /|         /|            |
  7 -------- 6 .            |
  | |        | |            | . x
  . 0 -------- 1            |/
  |/         |/             +-------> y
  3 -------- 2

Parameters:

  • boxes3d – (N, 7 + (2: optional)) [x, y, z, dx, dy, dz, yaw] or [x, y, z, dx, dy, dz, roll, pitch, yaw], where (x, y, z) is the box center

  • order – ‘lwh’ or ‘hwl’

Returns:

(N, 8, 3), the 8 corners of the bounding box.
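
Example (a shape-level sketch based on the documented signature; exact corner values follow the ordering in the diagram above):
>>> import numpy as np
>>> from cosense3d.utils.box_utils import boxes_to_corners_3d, corners_to_boxes_3d
>>> boxes = np.array([[0., 0., 0., 4., 2., 1.5, 0.]])  # x, y, z, dx, dy, dz, yaw
>>> corners = boxes_to_corners_3d(boxes, order='lwh')
>>> assert corners.shape == (1, 8, 3)
>>> assert corners_to_boxes_3d(corners, mode=7).shape == (1, 7)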

cosense3d.utils.box_utils.compute_iou(box, boxes)[source]

Compute IoU between a box and a list of boxes.

Parameters:
  • box – shapely.geometry.Polygon, the bounding box polygon.

  • boxes – list of shapely.geometry.Polygon.

Returns:

iou : np.ndarray, array of IoU values between box and each polygon in boxes.

cosense3d.utils.box_utils.convert_box_to_polygon(boxes_array)[source]

Convert boxes array to shapely.geometry.Polygon format.

Parameters:

boxes_array – np.ndarray, (N, 4, 2) or (N, 8, 3).

Returns:

List of converted shapely.geometry.Polygon objects.
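
Example (a usage sketch assuming shapely is installed; the IoU values follow from plain polygon geometry):
>>> import numpy as np
>>> from cosense3d.utils.box_utils import convert_box_to_polygon, compute_iou
>>> boxes = np.array([[[0., 0.], [2., 0.], [2., 2.], [0., 2.]],
...                   [[1., 0.], [3., 0.], [3., 2.], [1., 2.]]])  # (N, 4, 2)
>>> polygons = convert_box_to_polygon(boxes)
>>> ious = compute_iou(polygons[0], polygons)  # overlap is 2, union is 6
>>> assert np.allclose(ious, [1.0, 1.0 / 3.0])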

cosense3d.utils.box_utils.corners_to_boxes_3d(corners: ndarray | Tensor, mode: int = 9) ndarray | Tensor[source]

    4 -------- 5            ^ z
   /|         /|            |
  7 -------- 6 .            |
  | |        | |            | . x
  . 0 -------- 1            |/
  |/         |/             +-------> y
  3 -------- 2

Parameters:
  • corners – (N, 8, 3)

  • mode – 9 | 7

Returns:

boxes, (N, 9 | 7)

cosense3d.utils.box_utils.decode_boxes(reg, points, lwh_mean)[source]
cosense3d.utils.box_utils.denormalize_bbox(normalized_bboxes)[source]
cosense3d.utils.box_utils.enlarge_box3d(boxes3d, extra_width=(0, 0, 0))[source]
Parameters:
  • boxes3d – [x, y, z, dx, dy, dz, heading], (x, y, z) is the box center

  • extra_width – [extra_x, extra_y, extra_z]

Returns:

The enlarged boxes, same shape as boxes3d.

cosense3d.utils.box_utils.find_rigid_alignment(A, B)[source]

Find the rigid rotation and translation that align A to B.

Parameters:
  • A – (B, N, 3)

  • B – (B, N, 3)

Returns:

The rotation and translation that align A to B.
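
The name suggests a least-squares rigid alignment (Kabsch algorithm). A minimal torch sketch of that algorithm for a single point cloud, for illustration only (the actual implementation and return format may differ):

import torch

def kabsch_sketch(A, B):
    # A, B: (N, 3) float tensors; find R, t such that A @ R.T + t ≈ B.
    a_mean, b_mean = A.mean(dim=0), B.mean(dim=0)
    H = (A - a_mean).T @ (B - b_mean)              # 3x3 cross-covariance
    U, S, Vt = torch.linalg.svd(H)
    d = torch.sign(torch.linalg.det(Vt.T @ U.T))   # correct for reflections
    D = torch.diag(torch.tensor([1.0, 1.0, d.item()], dtype=A.dtype))
    R = Vt.T @ D @ U.T
    t = b_mean - R @ a_mean
    return R, t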

cosense3d.utils.box_utils.limit_period(val, offset=0.5, period=6.283185307179586)[source]
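
limit_period presumably implements the standard angle-wrapping formula used in 3D detection codebases; with the default offset of 0.5 and period 2*pi it wraps a value into [-pi, pi). A sketch under that assumption:

import numpy as np

def limit_period_sketch(val, offset=0.5, period=2 * np.pi):
    # Wrap val into the interval [-offset * period, (1 - offset) * period).
    return val - np.floor(val / period + offset) * period
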
cosense3d.utils.box_utils.mask_boxes_outside_range_numpy(boxes: ndarray, limit_range: list, order: str, min_num_corners: int = 2) ndarray[source]
Parameters:
  • boxes – (N, 7) [x, y, z, dx, dy, dz, heading], (x, y, z) is the box center

  • limit_range – [minx, miny, minz, maxx, maxy, maxz]

  • order – ‘lwh’ or ‘hwl’

  • min_num_corners – The required minimum number of corners to be considered as in range.

Returns:

The filtered boxes.

cosense3d.utils.box_utils.mask_boxes_outside_range_torch(boxes, lidar_range)[source]
cosense3d.utils.box_utils.normalize_bbox(bboxes)[source]
cosense3d.utils.box_utils.remove_points_in_boxes3d(points, boxes3d, x_idx=0)[source]
Parameters:
  • points – (num_points, x_idx + 3 + C)

  • boxes3d – (N, 7) [x, y, z, dx, dy, dz, heading], (x, y, z) is the box center; the boxes must not overlap each other

Returns:

The points lying outside all boxes.

cosense3d.utils.box_utils.transform_boxes_3d(boxes_in, transform, mode=7)[source]
Parameters:
  • boxes_in – (N, 7)

  • transform – (4, 4)

  • mode – 7 | 9

cosense3d.utils.eval_detection_utils module

cosense3d.utils.eval_detection_utils.cal_ap_all_point(scores, tps, n_pred, n_gt)[source]

source: https://github.com/rafaelpadilla/Object-Detection-Metrics/blob/7c0bd0489e3fd4ae71fc0bc8f2a67dbab5dbdc9c/lib/Evaluator.py#L292

cosense3d.utils.eval_detection_utils.cal_precision_recall(scores, tps, n_pred, n_gt)[source]
cosense3d.utils.eval_detection_utils.calculate_ap(result_stat, iou, global_sort_detections)[source]

Calculate the average precision and recall, and save them into a txt.

Parameters:
  • result_stat – dict, a dictionary containing the fp, tp and gt numbers.

  • iou – float

  • global_sort_detections – bool, whether to sort the detection results globally.

cosense3d.utils.eval_detection_utils.caluclate_tp_fp(det_boxes, det_score, gt_boxes, result_stat, iou_thresh, det_range=None)[source]

Calculate the true positive and false positive numbers of the current frames.

Parameters:
  • det_boxes – torch.Tensor, the detected bounding boxes, shape (N, 8, 3), (N, 4, 2) or (N, 7).

  • det_score – torch.Tensor, the confidence score for each predicted bounding box.

  • gt_boxes – torch.Tensor, the ground-truth bounding boxes.

  • result_stat – dict, a dictionary containing the fp, tp and gt numbers.

  • iou_thresh – float, the IoU threshold.

  • det_range – list, [left_range, right_range], the left and right bounds of the evaluation range.

cosense3d.utils.eval_detection_utils.eval_final_results(result_stat, iou_thrs, global_sort_detections=False)[source]
cosense3d.utils.eval_detection_utils.ops_cal_tp(pred_boxes, gt_boxes, iou_mode='3d', IoU_thr=0.7)[source]
cosense3d.utils.eval_detection_utils.voc_ap(rec, prec)[source]

VOC 2010 Average Precision.
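
A condensed sketch of the all-point AP computation used by the referenced Evaluator (precision is first made monotonically non-increasing, then integrated over the recall steps):

import numpy as np

def ap_all_point_sketch(recall, precision):
    mrec = np.concatenate(([0.0], recall, [1.0]))
    mpre = np.concatenate(([0.0], precision, [0.0]))
    # Make the precision envelope monotonically non-increasing.
    for i in range(len(mpre) - 2, -1, -1):
        mpre[i] = max(mpre[i], mpre[i + 1])
    # Sum precision over the points where recall changes.
    idx = np.where(mrec[1:] != mrec[:-1])[0]
    return np.sum((mrec[idx + 1] - mrec[idx]) * mpre[idx + 1])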

cosense3d.utils.iou2d_calculator module

cosense3d.utils.iou2d_calculator.bbox_overlaps(bboxes1, bboxes2, mode='iou', is_aligned=False, eps=1e-06)[source]

Calculate overlap between two sets of bboxes.

FP16 contributed by https://github.com/open-mmlab/mmdetection/pull/4889. Note: assume bboxes1 is M x 4 and bboxes2 is N x 4. When mode is ‘iou’, the following intermediate variables are generated when computing IoU with the bbox_overlaps function:

  1. is_aligned is False

    area1: M x 1
    area2: N x 1
    lt: M x N x 2
    rb: M x N x 2
    wh: M x N x 2
    overlap: M x N x 1
    union: M x N x 1
    ious: M x N x 1

    Total memory:

    S = (9 x N x M + N + M) * 4 Byte,

    When using FP16, we can reduce:

R = (9 x N x M + N + M) * 4 / 2 Byte

R larger than (N + M) * 4 * 2 always holds when N and M >= 1, since N + M <= N * M < 3 * N * M when N >= 2 and M >= 2, and N + 1 < 3 * N when N or M is 1.

Given M = 40 (ground truths) and N = 400000 (three anchor boxes per grid, FPN, R-CNNs):

R = 275 MB (one pass)

    A special case (dense detection), M = 512 (ground truth),

    R = 3516 MB = 3.43 GB

    When the batch size is B, reduce:

    B x R

    Therefore, CUDA memory runs out frequently.

    Experiments on GeForce RTX 2080Ti (11019 MiB):

    dtype | M | N | Use | Real | Ideal |
  2. is_aligned is True

    area1: N x 1
    area2: N x 1
    lt: N x 2
    rb: N x 2
    wh: N x 2
    overlap: N x 1
    union: N x 1
    ious: N x 1

    Total memory:

    S = 11 x N * 4 Byte

    When using FP16, we can reduce:

    R = 11 x N * 4 / 2 Byte

The same applies to ‘giou’ (which uses more memory than ‘iou’).

Time-wise, FP16 is generally faster than FP32.

When gpu_assign_thr is not -1, assignment takes more time on the CPU but does not reduce memory. Here, FP16 lets us halve the memory while keeping the speed.

If is_aligned is False, then calculate the overlaps between each bbox of bboxes1 and bboxes2, otherwise the overlaps between each aligned pair of bboxes1 and bboxes2.

Parameters:
  • bboxes1 – (Tensor) shape (B, m, 4) in <x1, y1, x2, y2> format or empty.

  • bboxes2 – (Tensor) shape (B, n, 4) in <x1, y1, x2, y2> format or empty.

  • mode – (str) “iou” (intersection over union), “iof” (intersection over foreground) or “giou” (generalized intersection over union). Default “iou”.

  • is_aligned – (bool, optional) If True, then m and n must be equal. Default False.

  • eps – (float, optional) A value added to the denominator for numerical stability. Default 1e-6.

B indicates the batch dim, in shape (B1, B2, …, Bn). If is_aligned is True, then m and n must be equal.

Returns:

Tensor: shape (m, n) if is_aligned is False else shape (m,)

Example:
>>> bboxes1 = torch.FloatTensor([
>>>     [0, 0, 10, 10],
>>>     [10, 10, 20, 20],
>>>     [32, 32, 38, 42],
>>> ])
>>> bboxes2 = torch.FloatTensor([
>>>     [0, 0, 10, 20],
>>>     [0, 10, 10, 19],
>>>     [10, 10, 20, 20],
>>> ])
>>> overlaps = bbox_overlaps(bboxes1, bboxes2)
>>> assert overlaps.shape == (3, 3)
>>> overlaps = bbox_overlaps(bboxes1, bboxes2, is_aligned=True)
>>> assert overlaps.shape == (3, )
Example:
>>> empty = torch.empty(0, 4)
>>> nonempty = torch.FloatTensor([[0, 0, 10, 9]])
>>> assert tuple(bbox_overlaps(empty, nonempty).shape) == (0, 1)
>>> assert tuple(bbox_overlaps(nonempty, empty).shape) == (1, 0)
>>> assert tuple(bbox_overlaps(empty, empty).shape) == (0, 0)
cosense3d.utils.iou2d_calculator.cast_tensor_type(x, scale=1.0, dtype=None)[source]
cosense3d.utils.iou2d_calculator.fp16_clamp(x, min=None, max=None)[source]

cosense3d.utils.logger module

class cosense3d.utils.logger.LogMeter(total_iter, logdir, delimiter='\t', log_every=20, wandb_project=None)[source]

Bases: object

add_meter(name, meter)[source]
log(epoch, iteration, lr, **kwargs)[source]
update(**kwargs)[source]
class cosense3d.utils.logger.SmoothedValue(window_size=20, fmt=None)[source]

Bases: object

property avg
property global_avg
property max
property median
update(value, n=1)[source]
property value
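
A usage sketch, assuming the conventional smoothed-value semantics (a sliding window of the last window_size updates plus global running totals):
>>> from cosense3d.utils.logger import SmoothedValue
>>> meter = SmoothedValue(window_size=3)
>>> for v in [1.0, 2.0, 3.0, 4.0]:
...     meter.update(v)
>>> meter.avg         # mean of the last 3 values (assumed)
3.0
>>> meter.global_avg  # mean of all updates (assumed)
2.5
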
class cosense3d.utils.logger.TestLogger(logdir)[source]

Bases: object

log(msg)[source]
cosense3d.utils.logger.setup_logger(exp_name, debug)[source]

cosense3d.utils.lr_scheduler module

class cosense3d.utils.lr_scheduler.LRUpdater(optimizer, total_iter, policy, **kwargs)[source]

Bases: object

Unified API for updating LR with different LR schedulers.

get_last_lr()[source]
load_state_dict(state_dict)[source]
state_dict()[source]
step_epoch(epoch)[source]
step_itr(itr)[source]
class cosense3d.utils.lr_scheduler.TransformerAdaptiveScheduler(optimizer: Optimizer, dim_embed: int, warmup_steps: int, itrs_per_epoch: int, last_epoch: int = -1, global_fade_ratio: float = 1, verbose: bool = False)[source]

Bases: _LRScheduler

calc_lr(step, dim_embed, warmup_steps)[source]
get_lr() float[source]
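
calc_lr presumably follows the warmup schedule from “Attention Is All You Need”; a sketch of that formula (global_fade_ratio is assumed to scale the result):

def calc_lr_sketch(step, dim_embed, warmup_steps):
    # lr = d_model^-0.5 * min(step^-0.5, step * warmup_steps^-1.5)
    step = max(step, 1)
    return dim_embed ** -0.5 * min(step ** -0.5, step * warmup_steps ** -1.5)
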
cosense3d.utils.lr_scheduler.build_lr_scheduler(optimizer, cfg, total_iter)[source]

cosense3d.utils.metrics module

class cosense3d.utils.metrics.Metric(cfg, log_dir)[source]

Bases: object

add_samples(data_dict)[source]
save_detections(filename)[source]
summary()[source]
class cosense3d.utils.metrics.MetricBev(cfg, run_path, logger, name='test')[source]

Bases: Metric

add_samples(out_dict)[source]
Args:

out_dict:
  bev:
    conf: Tensor, (B, H, W, C) or (N, C)
    unc: Tensor (optional), (B, H, W, C) or (N, C)
    gt: Tensor, (B, H, W, C) or (N, C)

format_str(result_dict)[source]
iou(conf, gt, unc=None)[source]

Compare the thresholded predicted BEV map with the full ground-truth BEV map (including the non-observable area).
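
A minimal sketch of such a comparison, assuming a binary positive class obtained by thresholding the confidence map (threshold value and channel layout are assumptions):

import torch

def bev_iou_sketch(conf, gt, thr=0.5):
    # conf, gt: (..., C) BEV maps; positive class assumed at channel 1.
    pred = conf[..., 1] > thr
    tgt = gt[..., 1] > 0.5
    inter = (pred & tgt).sum().float()
    union = (pred | tgt).sum().float()
    return (inter / union.clamp(min=1)).item()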

summary()[source]
summary_hook()[source]
class cosense3d.utils.metrics.MetricMOT(cfg, log_dir)[source]

Bases: Metric

add_samples(data_dict)[source]
class cosense3d.utils.metrics.MetricObjDet(cfg, log_dir, logger, bev=False)[source]

Bases: Metric

add_sample(name, pred_boxes, gt_boxes, confidences, ids=None)[source]
add_samples(out_dict)[source]
cal_ap_11_point(IoU_thr=0.5)[source]

source: https://github.com/rafaelpadilla/Object-Detection-Metrics/blob/7c0bd0489e3fd4ae71fc0bc8f2a67dbab5dbdc9c/lib/Evaluator.py#L315

cal_ap_all_point(IoU_thr=0.5)[source]

source: https://github.com/rafaelpadilla/Object-Detection-Metrics/blob/7c0bd0489e3fd4ae71fc0bc8f2a67dbab5dbdc9c/lib/Evaluator.py#L292

cal_precision_recall(IoU_thr=0.5)[source]
save_detections(filename)[source]
summary()[source]
class cosense3d.utils.metrics.MetricSemSeg(cfg, run_path, name='test')[source]

Bases: Metric

add_samples(data_dict)[source]
cal_ious_and_accs()[source]
save_detections(filename)[source]

cosense3d.utils.misc module

cosense3d.utils.misc.check_numpy_to_torch(x)[source]
cosense3d.utils.misc.ensure_dir(path)[source]
cosense3d.utils.misc.list_dirs(path)[source]
cosense3d.utils.misc.load_from_pl_state_dict(model, pl_state_dict)[source]
cosense3d.utils.misc.load_json(filename)[source]
cosense3d.utils.misc.load_yaml(filename, cloader=False)[source]

Load yaml file into dictionary.

Parameters:

filename – str, full path of the yaml file.

Returns:

params – dict, a dictionary that contains the defined parameters.

cosense3d.utils.misc.multi_apply(func, *args, **kwargs)[source]

Apply a function to a list of arguments.

Note:

This function applies func to multiple inputs and maps the multiple outputs of func into different lists. Each list contains the same type of outputs corresponding to different inputs.

Args:

func (Function): A function that will be applied to a list of arguments.

Returns:

tuple(list): A tuple containing multiple lists; each list contains one kind of result returned by the function.
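
A usage sketch (this matches the common map-and-transpose behavior of helpers with this name, e.g. in mmdetection):
>>> from cosense3d.utils.misc import multi_apply
>>> def square_and_cube(x):
...     return x ** 2, x ** 3
>>> squares, cubes = multi_apply(square_and_cube, [1, 2, 3])
>>> squares
[1, 4, 9]
>>> cubes
[1, 8, 27]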

cosense3d.utils.misc.pad_list_to_array_np(data)[source]

Pad a list of numpy arrays into one single numpy array.

Parameters:

data – list of np.ndarray

Returns:

np.ndarray

cosense3d.utils.misc.save_json(data, filename)[source]
cosense3d.utils.misc.save_yaml(data, filename, cdumper=False)[source]
cosense3d.utils.misc.setup_logger(exp_name, debug)[source]
cosense3d.utils.misc.torch_tensor_to_numpy(torch_tensor)[source]

Convert a torch tensor to numpy.

Parameters:

torch_tensor – torch.Tensor

Returns:

A numpy array.

cosense3d.utils.misc.update_dict(dict_out, dict_add)[source]

Merge dict_add into dict_out. Existing values in dict_out will be overwritten by those in dict_add.

Parameters:
  • dict_out – dict

  • dict_add – dict

Returns:

dict_out – dict, the updated dict_out.

cosense3d.utils.module_utils module

cosense3d.utils.module_utils.build_dropout(cfgs)[source]
cosense3d.utils.module_utils.build_norm_layer(cfgs, shape)[source]
cosense3d.utils.module_utils.digit_version(version_str: str, length: int = 4)[source]

Convert a version string into a tuple of integers.

This method is usually used for comparing two versions. For pre-release versions: alpha < beta < rc.

Args:

version_str (str): The version string.
length (int): The maximum number of version levels. Default: 4.

Returns:

tuple[int]: The version info in digits (integers).
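
Example (assuming the same behavior as mmcv’s helper of the same name, where pre-release tags compare lower than the release):
>>> from cosense3d.utils.module_utils import digit_version
>>> digit_version('1.3.0') < digit_version('1.4.0')
True
>>> digit_version('1.4.0rc1') < digit_version('1.4.0')
True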

cosense3d.utils.module_utils.get_target_module(target)[source]
cosense3d.utils.module_utils.instantiate_target_module(target, cfg=None, **kwargs)[source]

cosense3d.utils.pclib module

cosense3d.utils.pclib.cart2cyl(input_xyz)[source]
cosense3d.utils.pclib.cyl2cart(input_xyz_polar)[source]
cosense3d.utils.pclib.get_tf_matrix_torch(vectors, inv=False)[source]
cosense3d.utils.pclib.header(points)[source]
cosense3d.utils.pclib.lidar_bin2bin(bin_file, out_file)[source]
cosense3d.utils.pclib.lidar_bin2pcd(bin_file, out_file, replace=False)[source]
cosense3d.utils.pclib.lidar_bin2pcd_o3d(bin_file, out_file, replace=False)[source]
cosense3d.utils.pclib.lidar_ply2bin(ply_file, bin_file, fields=['x', 'y', 'z', 'intensity'], replace=False)[source]

Read ply and save to the cosense3d binary format.

Parameters:
  • ply_file – str, input file name

  • bin_file – str, output file name

  • fields – list of str, names that indicate ‘x’, ‘y’, ‘z’ and ‘intensity’

  • replace – replace the existing file if True

cosense3d.utils.pclib.load_pcd(pcd_file: str, return_o3d: bool = False)[source]

Read pcd and return numpy array.

Parameters:
  • pcd_file – The pcd file that contains the point cloud.

  • return_o3d – by default a numpy array is returned; set True to return the pcd as an o3d.geometry.PointCloud object

Returns:

If return_o3d is True, an o3d.geometry.PointCloud; otherwise a lidar_dict with entries: xyz: np.ndarray (n, 3), the lidar coordinates; intensity (optional): np.ndarray (n,); label (optional): np.ndarray (n,); time (optional): np.ndarray (n,); ray (optional): np.ndarray (n,).

cosense3d.utils.pclib.mask_points_in_box(points, pc_range)[source]
cosense3d.utils.pclib.mask_points_in_range(points: array, dist: float) array[source]
Return type:

np.array

cosense3d.utils.pclib.mask_values_in_range(values, min, max)[source]
cosense3d.utils.pclib.mat_pitch(cosa, sina, zeros=0, ones=1)[source]
cosense3d.utils.pclib.mat_roll(cosa, sina, zeros=0, ones=1)[source]
cosense3d.utils.pclib.mat_yaw(cosa, sina, zeros=0, ones=1)[source]
cosense3d.utils.pclib.pose2tf(pose)[source]
cosense3d.utils.pclib.pose_err_global2relative_torch(poses, errs)[source]

Calculate the relative pose transformation based on the erroneous global positioning.

Parameters:
  • poses – Nx2 or Nx3, the first row is the ego pose, the other rows are the coop poses

  • errs – Nx3, the first row is the ego pose error, the other rows are the coop pose errors

Returns:

(N-1)x3, relative localization errors between the ego and coop vehicles

cosense3d.utils.pclib.pose_to_transformation(pose)[source]
Parameters:

pose – list, [x, y, z, roll, pitch, yaw]

Returns:

transformation: np.ndarray, (4, 4)
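
A sketch of the presumed construction (rotation from Euler angles plus translation in a homogeneous matrix; the angle convention and units are assumptions):

import numpy as np
from scipy.spatial.transform import Rotation

def pose_to_tf_sketch(pose):
    # pose: [x, y, z, roll, pitch, yaw] (radians assumed)
    tf = np.eye(4)
    tf[:3, :3] = Rotation.from_euler('xyz', pose[3:]).as_matrix()
    tf[:3, 3] = pose[:3]
    return tf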

cosense3d.utils.pclib.project_points_by_matrix_torch(points, transformation_matrix)[source]

Project the points to another coordinate system based on the transformation matrix.

Parameters:
  • points – torch.Tensor, 3D points, (N, 3)

  • transformation_matrix – torch.Tensor, Transformation matrix, (4, 4)

Returns:

projected_points : torch.Tensor, The projected points, (N, 3)
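
The projection is the standard homogeneous transform; a minimal torch sketch of the same operation:

import torch

def project_points_sketch(points, tf):
    # points: (N, 3); tf: (4, 4) homogeneous transformation matrix.
    ones = torch.ones(points.shape[0], 1, dtype=points.dtype, device=points.device)
    homo = torch.cat([points, ones], dim=1)   # (N, 4)
    return (homo @ tf.T)[:, :3]               # rotate and translate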

cosense3d.utils.pclib.read_ply(filename)[source]
cosense3d.utils.pclib.rotate3d(points, euler)[source]

Rotate the point cloud by the given Euler angles.

Parameters:
  • points – np.ndarray, N x (3 + C), each row has the format [x, y, z, …]

  • euler – list or np.ndarray [roll, pitch, yaw]

Returns:

points: np.ndarray, the rotated point cloud

cosense3d.utils.pclib.rotate_box_corners_with_tf_np(corners: ndarray, tf_np: ndarray) ndarray[source]

Rotate box corners with a transformation matrix.

Parameters:
  • corners – Nx8x3 corner points array

  • tf_np – 4x4 transformation matrix

Returns:

corners – Nx8x3 rotated corner points array

cosense3d.utils.pclib.rotate_points_along_z_np(points, angle)[source]
Parameters:
  • points – (N, 3 + C or 2 + C)

  • angle – float, angle along z-axis, angle increases x ==> y

cosense3d.utils.pclib.rotate_points_along_z_torch(points, angle)[source]
Parameters:
  • points – (N, 2 + C) or (B, 2 + C)

  • angle – float or tensor of shape (B), angle along z-axis, angle increases x ==> y

cosense3d.utils.pclib.rotate_points_batch(points, angles, order='xyz')[source]
Parameters:
  • points – (B, N, 3 + C)

  • angles – (B, 1|3), in radians; rotation = R(3)R(2)R(1) if angles has shape (B, 3)

Returns:

points_rot: (B, N, 3 + C)

cosense3d.utils.pclib.rotate_points_with_tf_np(points: ndarray, tf_np: ndarray) ndarray[source]

Rotate points with transformation matrix.

Parameters:
  • points – np.ndarray, Nx3 points array

  • tf_np – np.ndarray, 4x4 transformation matrix

Returns:

points (np.ndarray): Nx3 points array

cosense3d.utils.pclib.rotation_mat2euler_torch(mat)[source]
cosense3d.utils.pclib.rotation_matrix(euler, degrees=True)[source]

Construct a rotation matrix from the given Euler angles.

Parameters:

euler – list or np.ndarray [roll, pitch, yaw]

Returns:

rot: np.ndarray, 3x3 rotation matrix

cosense3d.utils.pclib.save_cosense_ply(data, output_file_name)[source]
cosense3d.utils.pclib.tf2pose(tf_matrix)[source]

cosense3d.utils.tensor_utils module

cosense3d.utils.tensor_utils.check_numpy_to_torch(x)[source]
cosense3d.utils.tensor_utils.pad_list_to_array_torch(data)[source]

Pad a list of torch tensors into one single torch tensor.

Parameters:

data – list of torch.Tensor

Returns:

torch.Tensor

cosense3d.utils.train_utils module

cosense3d.utils.train_utils.build_lr_scheduler(optimizer, cfg, steps_per_epoch)[source]
cosense3d.utils.train_utils.build_optimizer(model, cfg)[source]
cosense3d.utils.train_utils.clip_grads(params, max_norm=35, norm_type=2)[source]
cosense3d.utils.train_utils.get_gpu_architecture()[source]
cosense3d.utils.train_utils.is_tensor_to_cuda(data, device=0)[source]
cosense3d.utils.train_utils.load_model_dict(model, pretrained_dict)[source]
cosense3d.utils.train_utils.load_tensors_to_gpu(batch_dict, device=0)[source]

Load all tensors in batch_dict to the GPU.

cosense3d.utils.train_utils.seed_everything(seed)[source]

cosense3d.utils.vislib module

cosense3d.utils.vislib.bbx2linset(bbx, color=(0, 1, 0))[source]

Convert the bounding box to o3d lineset for visualization.

Parameters:
  • bbx – np.ndarray, shape (n, 7), (n, 11) or (n, 8, 3)

  • color – tuple, the bounding box color

Returns:

line_set : open3d.LineSet

cosense3d.utils.vislib.draw_2d_bboxes_on_img(img, boxes2d, ax_in=None)[source]
Parameters:
  • img – np.ndarray

  • boxes2d – np.ndarray, (N, 4, 2) for 4 corners or (N, 2, 2) for top-left and bottom-right corners, in pixel coordinates

cosense3d.utils.vislib.draw_3d_points_boxes_on_img(ax, img, lidar2img, points=None, boxes=None)[source]

    1 -------- 6            ^ z
   /|         /|            |
  2 -------- 5 .            |
  | |        | |            | . x
  . 0 -------- 7            |/
  |/         |/             +-------> y
  3 -------- 4

Parameters:
  • ax – plt plot axis

  • img – np.ndarray, (H, W, 3)

  • lidar2img – np.ndarray, (4, 4), transformation matrix from lidar to camera coordinates

  • points – np.ndarray, (N, 3+C)

  • boxes – np.ndarray, (N, 8, 3) or (N, 7), in lidar coordinates

cosense3d.utils.vislib.draw_box_plt(boxes_dec, ax, color=None, linewidth_scale=2.0, linestyle='solid')[source]

Draw boxes in a given plt ax.

Parameters:
  • boxes_dec – (N, 5) or (N, 7), in metric coordinates

  • ax – plt plot axis

Returns:

ax with the drawn boxes

cosense3d.utils.vislib.draw_matched_boxes(boxes1, boxes2, match, out_file=None)[source]
cosense3d.utils.vislib.draw_points_boxes_plt(pc_range=None, points=None, boxes_pred=None, boxes_gt=None, wandb_name=None, points_c='gray', bbox_gt_c='green', bbox_pred_c='red', linewidth_scale=0.75, bbox_pred_label=None, bbox_gt_label=None, return_ax=False, ax=None, marker_size=2.0, filename=None)[source]
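
A usage sketch based on the signature above (the pc_range format [xmin, ymin, zmin, xmax, ymax, zmax] is assumed from the range conventions elsewhere in this package):
>>> import numpy as np
>>> from cosense3d.utils.vislib import draw_points_boxes_plt
>>> points = np.random.rand(1000, 3) * 50 - 25
>>> boxes_gt = np.array([[0., 0., 0., 4., 2., 1.5, 0.]])
>>> draw_points_boxes_plt(pc_range=[-25, -25, -3, 25, 25, 1],
...                       points=points, boxes_gt=boxes_gt,
...                       filename='demo.png')
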
cosense3d.utils.vislib.get_palette_colors(palette)[source]
cosense3d.utils.vislib.o3d_draw_agent_data(agent_dict, data_path)[source]
cosense3d.utils.vislib.o3d_draw_frame_data(frame_dict, data_path)[source]
cosense3d.utils.vislib.o3d_draw_pcds_bbxs(pcds: list, bbxs: list, bbxs_colors: list | None = None, pcds_colors: list | None = None)[source]
Parameters:
  • pcds – list of np array

  • bbxs – list of np array, bounding boxes in corner format

  • bbxs_colors – list of tuples

  • pcds_colors – list of np array, shape same as pcds

cosense3d.utils.vislib.o3d_play_sequence(meta_dict, data_path)[source]
cosense3d.utils.vislib.plot_cavs_points(cavs, points_key='points')[source]
cosense3d.utils.vislib.plt_draw_frame_data(frame_dict, data_path)[source]
cosense3d.utils.vislib.update_axis_linset(line_set, axis_len=5)[source]
cosense3d.utils.vislib.update_lineset_vbo(vbo, bbx, color=None)[source]
cosense3d.utils.vislib.visualization(func_list, batch_data)[source]

Module contents