Best practices for implementing a file-based cache

I'm implementing a file-based cache system for my PHP application. In use, it accumulates about 200,000 files in a single folder, which makes the folder difficult to manage: calculating its total size or listing the files inside it can take an unreasonably long time.

My question is: is it good for performance to save the files in separate folders? Can this reduce the I/O time?

On a modern file system (say ext3 or NTFS), 200K files in a directory won't be slow if you are opening a single file. Listing will be slower, of course, but distributing your files over many directories won't help you there.
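
That said, if you do decide to shard, a common scheme (similar to how Git lays out its object store) is to hash the cache key and use the first couple of hex characters as a subdirectory name, so files spread evenly across a fixed set of folders. Here is a minimal sketch in PHP; all function names are illustrative, not from any library:

```php
<?php
// Hypothetical sharded cache layout: the key is hashed, and the first two
// hex characters of the hash select one of 256 subdirectories. With ~200K
// files, each shard ends up holding roughly 780 files.

function cachePath(string $baseDir, string $key): string
{
    $hash  = md5($key);             // stable, evenly distributed hash
    $shard = substr($hash, 0, 2);   // 256 possible subdirectories
    return $baseDir . '/' . $shard . '/' . $hash . '.cache';
}

function cacheSet(string $baseDir, string $key, string $value): void
{
    $path = cachePath($baseDir, $key);
    $dir  = dirname($path);
    if (!is_dir($dir)) {
        mkdir($dir, 0775, true);    // create the shard directory on demand
    }
    file_put_contents($path, $value, LOCK_EX);
}

function cacheGet(string $baseDir, string $key): ?string
{
    $path  = cachePath($baseDir, $key);
    $value = @file_get_contents($path);
    return $value === false ? null : $value;
}
```

This won't make opening a single file noticeably faster on a modern file system, but it keeps each directory small enough that per-shard listings and size calculations stay quick, and you can sum across the 256 shards to get the total.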