Bob Hoffman wrote:
This goes out to you admins who manage servers with a heavy load of information.
I would like to know what you do about the number of files in a folder, or whether that is even a concern. I think there is a limitation, or at least a slowdown, if it gets too big, but what is optimal (if it matters at all)?
Example: running a website that allows users to upload some photos (small ones). You get, let's say, 300,000 users each uploading 10 photos. That's 3 million files.
Storing all of that in one folder seems like it would cause problems whenever that folder is used, is that right?
If it does, what do you do about that? How do you handle things?
If you have 300,000 clients you could give each of them their own folder, so each folder would hold only 10 photos, but then one parent folder would contain 300,000 subfolders.
So what is best for file management and system resources?
Enabling dir_index on ext3 (hashed directory indexing) or using a file system that hashes directory entries helps, but in many such contexts I've found that a multi-level directory hashing scheme works well: compute some reproducible hash from the file name or user name/ID and use it to index into a nested directory structure, so no single directory ever gets huge. -Alan
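As a rough illustration of the multi-level hashing idea Alan describes (the function name shard_path, the /var/www/photos base path, and the 2x2 hex-character layout are just assumptions for the example, not anything from his setup), a sketch in Python might look like this:

import hashlib
import os

def shard_path(base_dir, user_id, filename, levels=2, width=2):
    """Map a user/file pair to a multi-level hashed directory.

    e.g. user 12345 uploading cat.jpg might land in
    base_dir/ab/cd/12345_cat.jpg, so no single directory grows too large.
    """
    digest = hashlib.md5(f"{user_id}/{filename}".encode()).hexdigest()
    # Take `levels` slices of `width` hex characters each as nested directories.
    parts = [digest[i * width:(i + 1) * width] for i in range(levels)]
    target_dir = os.path.join(base_dir, *parts)
    os.makedirs(target_dir, exist_ok=True)
    return os.path.join(target_dir, f"{user_id}_{filename}")

# Two levels of two hex characters gives 256 * 256 = 65,536 buckets,
# so 3 million files average out to roughly 45 files per directory.
print(shard_path("/var/www/photos", 12345, "cat.jpg"))

Since the hash is recomputed from the same user ID and file name every time, lookups stay O(1) with no extra database mapping needed, which is the main appeal of the approach.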