
Using large dataset such as SensatUrban #1

Open

aymanmuk opened this issue Dec 23, 2024 · 3 comments

Comments

@aymanmuk commented Dec 23, 2024

Hi, thank you for sharing your work! I have been trying to implement your code on large datasets such as SensatUrban and Dales. However, I encountered a couple of challenges:

In scene_seg.py, while computing nearest neighbors per tile:

_, idxs = tiled_knn(q_pts, s_pts, k=1, tile_size=20.5, margin=2 * dl) # 3.5
Dividing the area by 3.5 meters takes a significant amount of time. Could you recommend any adjustments or optimizations for this parameter to improve performance on large datasets?
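For context, here is a rough sketch of what a tiled 1-NN search can look like, assuming semantics similar to the `tiled_knn` call above: split the queries into square XY tiles, and for each tile query a KD-tree built from the support points inside that tile plus a margin. The function name and all details here are illustrative, not the repository's actual implementation.

```python
import numpy as np
from scipy.spatial import cKDTree


def tiled_knn_sketch(q_pts, s_pts, tile_size=20.5, margin=1.0):
    """Hypothetical tiled 1-NN: for each query point, return the index of the
    nearest support point, searching one XY tile (plus margin) at a time."""
    idxs = np.empty(len(q_pts), dtype=np.int64)
    mins = q_pts[:, :2].min(axis=0)
    # Assign each query point to a square tile in the XY plane
    tiles = np.floor((q_pts[:, :2] - mins) / tile_size).astype(np.int64)
    for tile in np.unique(tiles, axis=0):
        q_mask = np.all(tiles == tile, axis=1)
        # Support points inside the tile extended by the margin
        lo = mins + tile * tile_size - margin
        hi = mins + (tile + 1) * tile_size + margin
        s_mask = np.all((s_pts[:, :2] >= lo) & (s_pts[:, :2] < hi), axis=1)
        s_idx = np.flatnonzero(s_mask)
        # KD-tree over the local support points only
        _, local = cKDTree(s_pts[s_idx]).query(q_pts[q_mask], k=1)
        idxs[q_mask] = s_idx[local]
    return idxs
```

The cost is dominated by how many tiles the scene is cut into and by the KD-tree size per tile, which is why the `tile_size` / `margin` trade-off matters so much on large outdoor scenes.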

Also, in Kpconv_blocks.py, I ran into the following error:
output_feats = torch.sum(neighbor_feats * neighbors_weights, dim=1) # -> (M, G, C//G)
RuntimeError: The size of tensor a (59584) must match the size of tensor b (931) at non-singleton dimension 0

Could you provide guidance on resolving this issue?

@HuguesTHOMAS (Collaborator)

Hi @aymanmuk,

Thanks for your interest!

For the tiled knn function, you can increase the tile size as long as you don't run into OOM errors; larger tiles should be faster. For larger scenes with lower point density, larger tiles make more sense. The parameters in the code were chosen for indoor scenes.

For the block, does the error also happen if you use the summation mode?

Best,
Hugues

@aymanmuk (Author) commented Dec 23, 2024

Hi @HuguesTHOMAS,
Thank you for the quick response.
For the second point, I think there is a mistake in the code. It should be:
neighbor_feats = neighbor_feats.view(-1, H, self.groups, 1) # (M, H, C, 1)
instead of:
neighbor_feats = neighbors_weights.view(-1, H, self.groups, 1) # (M, H, C, 1)
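To illustrate the mismatch, here is a minimal, hypothetical sketch of the grouped weighted sum with made-up shapes (M points, H kernel influences, C channels, G groups); the actual block in the repository may differ. Reshaping `neighbors_weights` where `neighbor_feats` was intended leaves the two tensors with incompatible leading dimensions, which produces exactly this kind of RuntimeError.

```python
import torch

M, H, C, G = 8, 15, 32, 4              # points, kernel points, channels, groups (illustrative)
neighbor_feats = torch.randn(M, H, C)
neighbors_weights = torch.randn(M, H, G)

# Correct version: group the feature channels, keep one weight per group
feats = neighbor_feats.view(M, H, G, C // G)      # (M, H, G, C//G)
weights = neighbors_weights.view(M, H, G, 1)      # (M, H, G, 1), broadcasts over C//G
output_feats = torch.sum(feats * weights, dim=1)  # (M, G, C//G)
```

The broadcast over the trailing singleton dimension is what lets one weight scale all `C // G` channels of its group.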

For the first point, I will try tile_size=50.
Following the previous DALES implementation, I changed the following:
cfg.model.kp_radius = 20
cfg.data.init_sub_size = 0.025
cfg.model.in_sub_size = 0.025
cfg.model.input_channels = 1

Please advise if these values need further adjustments. Thanks

@HuguesTHOMAS (Collaborator)

Indeed, you are right on the second point.

As for your parameters, these are not right. You should keep cfg.model.kp_radius at a relatively small value, between 2 and 4, and use cfg.train.in_radius = 20 instead.
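Put together, a hypothetical corrected version of the configuration quoted earlier in this thread might look as follows. The kp_radius value of 3 is only an illustrative pick from the suggested 2-4 range, and the unchanged lines are kept exactly as posted above.

```python
# Hypothetical config for a DALES-like outdoor dataset, per the advice above.
cfg.model.kp_radius = 3         # convolution radius: keep small (between 2 and 4)
cfg.train.in_radius = 20        # control the input sphere size here instead
cfg.data.init_sub_size = 0.025  # unchanged from the values posted earlier
cfg.model.in_sub_size = 0.025
cfg.model.input_channels = 1
```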
