
Visualization #36

Open
Adam1904 opened this issue Jul 21, 2023 · 2 comments

Comments

@Adam1904

Adam1904 commented Jul 21, 2023

Thanks for the great work. I have some questions:

1- How can I visualize real points and virtual points in BEV (bird's-eye view), both with and without an image, similar to Figure 1 and Figures 3c and 3d? Could you please explain how these figures were visualized?

2- Could you please explain how Figure 4 was visualized?

3- And one more question, please. In the paper:

For instance segmentation, we use CenterNet2 [73] which adds a cascade RoI heads [3] on top of the first stage proposal network. The overall network runs at 40 FPS and achieves 43.3 instance segmentation mAP on the nuScenes image dataset [2].

What hyperparameters did you use to get this result (43.3 instance segmentation mAP)? What learning rate, how many iterations, how many images per batch, and how many epochs?

@tianweiy
Owner

  1. Figure 1 uses Open3D, and only the virtual points inside bounding boxes are highlighted. The right side of Figure 1 is a zoom-in (in the UI) plus screenshots. For Figure 3, panels a, b, and c are illustration plots I drew in Keynote; panel d is also a zoom-in plus screenshot. (A minimal Open3D sketch is included after this list.)

  2. Figure 4 is just matplotlib or something similar. We collapsed the z-axis of all LiDAR points and drew the boxes. (See the BEV sketch after this list.)

  3. I don't remember the details now, but I think the config is similar to what we used: https://github.com/xingyizhou/CenterNet2/blob/master/configs/nuImages_CenterNet2_DLA_640_8x.yaml
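Below is a minimal Open3D sketch in the spirit of Figure 1, not the original plotting script: it colors the real points grey, the virtual points red, and overlays oriented boxes in green. The key names ('real_points', 'virtual_points') and the (N, 7) box layout [x, y, z, dx, dy, dz, yaw] are assumptions about the data format, so adjust them to whatever your .npy files actually contain.

# Minimal sketch (not the authors' original script): grey real points,
# red virtual points, green oriented boxes, roughly as in Figure 1.
# Assumed data format: a dict with 'real_points' / 'virtual_points' arrays
# and boxes given as [x, y, z, dx, dy, dz, yaw].
import numpy as np
import open3d as o3d

def show_virtual_points(npy_path, boxes=None):
    data = np.load(npy_path, allow_pickle=True).item()

    real = o3d.geometry.PointCloud()
    real.points = o3d.utility.Vector3dVector(data['real_points'][:, :3])
    real.paint_uniform_color([0.6, 0.6, 0.6])      # real points in grey

    virtual = o3d.geometry.PointCloud()
    virtual.points = o3d.utility.Vector3dVector(data['virtual_points'][:, :3])
    virtual.paint_uniform_color([1.0, 0.0, 0.0])   # virtual points in red

    geometries = [real, virtual]
    for box in (boxes if boxes is not None else []):
        x, y, z, dx, dy, dz, yaw = box[:7]
        rot = o3d.geometry.get_rotation_matrix_from_xyz((0.0, 0.0, yaw))
        obb = o3d.geometry.OrientedBoundingBox((x, y, z), rot, (dx, dy, dz))
        obb.color = (0.0, 1.0, 0.0)                # boxes in green
        geometries.append(obb)

    o3d.visualization.draw_geometries(geometries)

And here is a BEV sketch in the spirit of Figure 4, again only a sketch under the same assumed box layout: the z coordinate of the points is simply dropped and the box footprints are drawn as rotated rectangles. The scores / score_thresh arguments are a hypothetical convenience for filtering detections by confidence.

# Minimal BEV sketch (matplotlib, not the original plotting code):
# collapse the z-axis of the points and overlay box outlines.
# Boxes assumed as [x, y, z, dx, dy, dz, yaw].
import matplotlib.pyplot as plt
import numpy as np

def plot_bev(points, boxes, scores=None, score_thresh=0.3):
    fig, ax = plt.subplots(figsize=(8, 8))
    ax.scatter(points[:, 0], points[:, 1], s=0.1, c='black')  # drop z

    for i, box in enumerate(boxes):
        if scores is not None and scores[i] < score_thresh:
            continue
        x, y, _, dx, dy, _, yaw = box[:7]
        # four corners of the box footprint, rotated by yaw around (x, y)
        corners = np.array([[ dx / 2,  dy / 2], [ dx / 2, -dy / 2],
                            [-dx / 2, -dy / 2], [-dx / 2,  dy / 2]])
        rot = np.array([[np.cos(yaw), -np.sin(yaw)],
                        [np.sin(yaw),  np.cos(yaw)]])
        corners = corners @ rot.T + np.array([x, y])
        ax.add_patch(plt.Polygon(corners, fill=False, edgecolor='green'))

    ax.set_aspect('equal')
    ax.set_xlabel('x (m)')
    ax.set_ylabel('y (m)')
    plt.show()

For example, calling plot_bev(np.concatenate([data['real_points'][:, :3], data['virtual_points'][:, :3]]), boxes) would give a BEV view of both real and virtual points with box outlines, assuming the key names above.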

@Adam1904
Author

Adam1904 commented Jul 23, 2023

Is there any code for that, please? You mentioned that Figure 1 uses Open3D and only the virtual points inside bounding boxes are highlighted. I tried using this code https://github.com/tianweiy/CenterPoint/blob/master/tools/visual.py, but it still doesn't work. I executed the following command:

python ./tools/visual.py --path ./dataa/nuScenes/samples/LIDAR_TOP_VIRTUAL/n008-2018-05-21-11-06-59-0400__LIDAR_TOP__1526915243047392.pcd.bin.pkl.npy

and I modified the main block in visual.py:

import argparse

import numpy as np
import open3d as o3d

if __name__ == '__main__':
    parser = argparse.ArgumentParser(description="LIDAR_TOP_VIRTUAL")
    parser.add_argument('--path', help='path to visualization file', type=str)
    args = parser.parse_args()

    # the .npy file stores a pickled dict, so .item() is needed to recover it
    data = np.load(args.path, allow_pickle=True).item()
    virtual_points = data['virtual_points']

    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(virtual_points[:, :3])

    o3d.visualization.draw_geometries([pcd])

However, it hangs and the result is not displayed. Also, how can I use the detections and scores to plot boxes? Any help?
Thank you for your reply.
