Using large datasets such as SensatUrban #1
Comments
Hi @aymanmuk, Thx for your interest! For the tiled knn function, you can increase the tile size as long as there is no OOM error; it should be faster. For larger scenes with smaller density, larger tiles make more sense. The parameters in the code were chosen for indoor scenes. For the block, is the error happening as well if you use the summation mode? Best,
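The following is a minimal sketch of the tiling idea described above, not the repository's tiled_knn: query points are binned into square 2D tiles of side tile_size, and each tile only searches the support points lying within margin of its bounds. The function name, the 2D grid layout, and the use of SciPy's cKDTree are assumptions made for this illustration.

```python
import numpy as np
from scipy.spatial import cKDTree

def tiled_nn_sketch(q_pts, s_pts, tile_size=20.5, margin=1.0):
    """Nearest support point for every query point, computed tile by tile.

    Sketch only: not the actual tiled_knn from scene_seg.py.
    """
    idxs = np.empty(len(q_pts), dtype=np.int64)
    mins = q_pts[:, :2].min(axis=0)
    # Assign every query point to a square 2D tile.
    tile_ij = np.floor((q_pts[:, :2] - mins) / tile_size).astype(np.int64)
    for ij in np.unique(tile_ij, axis=0):
        q_mask = np.all(tile_ij == ij, axis=1)
        # Support points inside this tile, padded by `margin` on every side.
        lo = mins + ij * tile_size - margin
        hi = mins + (ij + 1) * tile_size + margin
        s_mask = np.all((s_pts[:, :2] >= lo) & (s_pts[:, :2] < hi), axis=1)
        s_idx = np.flatnonzero(s_mask)
        if s_idx.size == 0:
            continue
        # A larger tile_size means fewer (but bigger) KD-tree builds and queries.
        _, nn = cKDTree(s_pts[s_idx]).query(q_pts[q_mask], k=1)
        idxs[q_mask] = s_idx[nn]
    return idxs
```

In this structure, increasing tile_size shrinks the Python-level loop roughly quadratically, while margin should stay tied to the neighborhood radius (e.g. 2 * dl in the call quoted below) so that points near a tile border still find their true nearest neighbor.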
Hi @HuguesTHOMAS, For the first point, I will try tile_size=50. Please advise if these values need further adjustments. Thanks
Indeed, you are right about the second point. For your parameters, this is not right. You should keep …
Hi, thank you for sharing your work! I have been trying to run your code on large datasets such as SensatUrban and DALES. However, I encountered a couple of challenges:
In scene_seg.py, while computing nearest neighbors per tile:
_, idxs = tiled_knn(q_pts, s_pts, k=1, tile_size=20.5, margin=2 * dl) # 3.5
Splitting the area into 3.5 m tiles takes a significant amount of time. Could you recommend any adjustments or optimizations for this parameter to improve performance on large datasets?
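As a rough back-of-the-envelope check (the extent below is an invented number, not measured on SensatUrban), the number of tiles, and hence per-tile searches, grows as (extent / tile_size)^2, which is why a 3.5 m tile is costly on a city-scale scene:

```python
import numpy as np

# Hypothetical horizontal extent of a large urban scene, in meters.
extent = np.array([400.0, 400.0])
for ts in (3.5, 20.5, 50.0):
    n_tiles = int(np.prod(np.ceil(extent / ts)))
    print(f"tile_size={ts:>5.1f} m -> ~{n_tiles} tiles")
```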
Also, in Kpconv_blocks.py, I ran into the following error:
output_feats = torch.sum(neighbor_feats * neighbors_weights, dim=1) # -> (M, G, 0//G)
RuntimeError: The size of tensor a (59584) must match the size of tensor b (931) at non-singleton dimension 0
Could you provide guidance on resolving this issue?
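Purely to illustrate what the message means (the shapes below are invented, not the real tensors in Kpconv_blocks.py): the element-wise product has to broadcast the two tensors, and that fails when their first dimensions disagree.

```python
import torch

# Invented shapes that reproduce the same kind of failure: the two tensors
# disagree in dimension 0, so the element-wise product cannot broadcast.
neighbor_feats = torch.randn(59584, 8, 16)
neighbors_weights = torch.randn(931, 8, 1)
try:
    torch.sum(neighbor_feats * neighbors_weights, dim=1)
except RuntimeError as e:
    print(e)  # "The size of tensor a (59584) must match the size of tensor b (931) ..."

# With matching first dimensions the broadcast and the sum go through.
neighbors_weights = torch.randn(59584, 8, 1)
out = torch.sum(neighbor_feats * neighbors_weights, dim=1)
print(out.shape)  # torch.Size([59584, 16])
```

A first step could be to print neighbor_feats.shape and neighbors_weights.shape right before the sum to see which tensor carries the unexpected dim-0 size.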