Handle Too Many Open Files #3

Closed
Byron opened this issue Dec 1, 2010 · 1 comment
Comments

Byron commented Dec 1, 2010

Currently, gitdb memory-maps pack indices and loose objects for reading. If there are too many mapped files, which can happen on large databases, existing memory maps have to be unloaded gracefully and reloaded later. This could be as easy as deleting the data caches, which are memory maps most of the time.

This task could be handled by the PackedDB, and additionally by the git-db, which manages multiple object databases.
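The "delete the data caches and reload them later" idea could be sketched as a small LRU cache of memory maps. This is a hypothetical illustration, not gitdb's actual implementation; the class name `MappedFileCache` and the `max_maps` parameter are invented for this sketch.

```python
import mmap
from collections import OrderedDict


class MappedFileCache:
    """Keep at most `max_maps` files memory-mapped, evicting the least
    recently used map when the limit is reached.

    Hypothetical sketch: illustrates graceful unloading of memory maps
    as described in the issue, not gitdb's real data structures.
    """

    def __init__(self, max_maps=8):
        self.max_maps = max_maps
        self._maps = OrderedDict()  # path -> (file object, mmap object)

    def get(self, path):
        entry = self._maps.get(path)
        if entry is not None:
            self._maps.move_to_end(path)  # mark as most recently used
            return entry[1]
        if len(self._maps) >= self.max_maps:
            # Gracefully unload the least recently used map; it can be
            # re-created on the next access to that path.
            _, (old_f, old_m) = self._maps.popitem(last=False)
            old_m.close()
            old_f.close()
        f = open(path, "rb")
        m = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
        self._maps[path] = (f, m)
        return m

    def close(self):
        """Unload every remaining map and close its file."""
        while self._maps:
            _, (f, m) = self._maps.popitem()
            m.close()
            f.close()
```

A caller would simply call `get(path)` whenever it needs mapped data, and the cache transparently keeps the number of simultaneously open maps bounded.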

Byron modified the milestones: v0.3.2, v0.3.5 - bugfixes Nov 14, 2014
Byron commented Jan 7, 2015

I think this now works, see #60.
The memory manager is able to unload existing maps when it hits a resource limit (soft or hard).
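On Unix systems, the soft and hard limits mentioned above correspond to the process's open-file resource limit, which Python exposes via the standard `resource` module. As a hedged illustration (the `map_budget` function and the 0.5 fraction are inventions for this sketch, not what the actual memory manager uses), a manager could derive its map budget from the soft limit like this:

```python
import resource  # Unix-only standard library module


def map_budget(fraction=0.5):
    """Return a cap on simultaneous memory maps, derived from the
    process's RLIMIT_NOFILE soft limit.

    Illustrative sketch only: the 0.5 fraction is an arbitrary choice
    that leaves the rest of the file-descriptor budget for ordinary
    open files.
    """
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    return max(1, int(soft * fraction))
```

Once the number of live maps reaches this budget, the manager would start unloading existing maps before creating new ones.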

Byron closed this as completed Jan 7, 2015