
Shredding heavy files can cause memory overflow #18

Open
sundowndev opened this issue Jan 26, 2023 · 0 comments
Labels
bug Something isn't working need-triage

Comments

@sundowndev (Owner)

Previously, we wrote random data to files in chunks of 1024 bytes, which means that for a 4096-byte file we performed 4 write operations to overwrite it entirely. Since #10, we perform a single write operation per file, which requires buffering the whole file's worth of random data in memory. This can cause the program to crash if the file is too large (we should run some benchmarks to confirm that). Shredding progressively in fixed-size chunks instead would solve that case.

@sundowndev sundowndev added bug Something isn't working need-triage labels Jan 26, 2023