
When exporting large point clouds, RAM usage can be too high and app exits #1375

Open
matlabbe opened this issue Nov 5, 2024 · 0 comments
matlabbe commented Nov 5, 2024

Related to this issue: #1368

Using the min and max limits of the Node filtering option helps to export one section of the map after another, but estimating good limits is tedious when node density varies across the map, and setting them wrong will crash the app once RAM is filled. We could add an option to export nodes up to a target maximum RAM usage: when the limit is reached, save the current cloud (with a counting suffix appended to a base file name), clear memory, automatically start a new cloud, and continue with the next nodes, as sketched below.
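
A rough sketch of that behavior, assuming PCL point cloud types and a hypothetical getNodeCloud() accessor in place of RTAB-Map's actual export pipeline; the byte accounting and file naming are assumptions, not the existing exporter API:

```cpp
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/io/ply_io.h>
#include <cstdio>
#include <string>
#include <vector>

typedef pcl::PointCloud<pcl::PointXYZRGB> Cloud;

// Placeholder: in RTAB-Map this would come from the database/export pipeline.
Cloud::Ptr getNodeCloud(int nodeId);

void exportChunked(const std::vector<int> & nodeIds,
                   const std::string & baseName,
                   size_t maxBytes) // target RAM budget, e.g. 4 GB
{
    Cloud::Ptr current(new Cloud);
    size_t usedBytes = 0;
    int chunk = 0;

    for(size_t i = 0; i < nodeIds.size(); ++i)
    {
        Cloud::Ptr nodeCloud = getNodeCloud(nodeIds[i]);
        if(!nodeCloud || nodeCloud->empty())
        {
            continue;
        }

        *current += *nodeCloud;
        // Approximate accounting: point count times point size,
        // ignoring allocator/container overhead.
        usedBytes += nodeCloud->size() * sizeof(pcl::PointXYZRGB);

        // When the accumulated cloud reaches the budget, flush it to disk
        // with a counting suffix, release the memory and keep going.
        if(usedBytes >= maxBytes)
        {
            char name[256];
            std::snprintf(name, sizeof(name), "%s_%03d.ply", baseName.c_str(), chunk++);
            pcl::io::savePLYFileBinary(name, *current);
            current.reset(new Cloud);
            usedBytes = 0;
        }
    }

    // Flush whatever remains after the last node.
    if(!current->empty())
    {
        char name[256];
        std::snprintf(name, sizeof(name), "%s_%03d.ply", baseName.c_str(), chunk);
        pcl::io::savePLYFileBinary(name, *current);
    }
}
```

Instead of counting point bytes, the check could also query the process's resident memory so that overhead from the rest of the app is included in the budget.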
