
[torch.compile] Large cache size limit #2604

Open
anijain2305 opened this issue Dec 26, 2024 · 3 comments

Comments

anijain2305 commented Dec 26, 2024

Describe the bug

# Override Dynamo's (re)compilation cache limits with a very large value
torch._dynamo.config.accumulated_cache_size_limit = 1024
if hasattr(torch._dynamo.config, "cache_size_limit"):
    torch._dynamo.config.cache_size_limit = 1024

Starting with torch 2.5, we should not need such a large cache size limit. Could someone double-check and remove the override? A possible interim change is sketched below.
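
A minimal sketch of what a version-gated override could look like, assuming the large limit is only needed on torch < 2.5; the 2.5 cutoff and the 1024 value are carried over from the snippet above, not verified here:

# Sketch only: keep the enlarged Dynamo cache limits only on older torch
# versions. The "< 2.5" cutoff and the 1024 value are assumptions taken
# from this discussion, not confirmed behavior.
import torch
from packaging.version import Version

if Version(torch.__version__) < Version("2.5"):
    torch._dynamo.config.accumulated_cache_size_limit = 1024
    if hasattr(torch._dynamo.config, "cache_size_limit"):
        torch._dynamo.config.cache_size_limit = 1024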

Reproduction

NA

Environment

NA

zhyncs (Member) commented Dec 26, 2024

It was added in #2069. As I remember, we need to set it, otherwise something with FlashInfer will fail.

zhyncs (Member) commented Dec 26, 2024

We could test whether a reduced limit still works, rather than deleting the override outright.
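
One way to run that test would be to lower the limits and enable recompile logging so any cache overflows show up during a FlashInfer + torch.compile run; the values below are placeholders, not recommended settings:

# Sketch: try smaller Dynamo cache limits and surface recompilations so a
# test run can be checked for cache overflows. 64/256 are placeholders.
import torch
import torch._dynamo

torch._dynamo.config.cache_size_limit = 64
torch._dynamo.config.accumulated_cache_size_limit = 256

# Log every recompilation (available in recent torch releases) so hitting
# the cache limit is visible in the output.
torch._logging.set_logs(recompiles=True)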

anijain2305 (Author) commented
I see. Maybe a better way is to make FlashInfer kernels torch.compile compatible?
