Memory leaks happen.
You might be a neat and tidy person who tries to make sure no waste creeps in from the very start of a project, but leaks happen anyway.
I fell into this trap.
I was feeling nice and comfortable, believing that all my code was behaving well and that using a database was keeping memory consumption low.
Everything worked perfectly with a few hundred files.
However, when dealing with a little over 50 thousand files the picture turned grey and the dreaded java.lang.OutOfMemoryError: Java heap space appeared.
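To give an idea of what this kind of failure usually looks like (a sketch, not my actual code), the classic culprit is a collection that keeps growing for the lifetime of the JVM. The class and method names here are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical illustration of a common heap-space leak:
// a static collection that accumulates entries and is never cleared.
public class LeakDemo {
    // Lives as long as the JVM does, so nothing here is ever garbage collected.
    static final List<byte[]> CACHE = new ArrayList<>();

    static void processFile(String name) {
        // Imagine each processed file leaves a large record behind.
        CACHE.add(new byte[1024 * 1024]); // roughly 1 MB retained per file
    }

    public static void main(String[] args) {
        // With tens of thousands of files this pattern eventually
        // exhausts the heap; 100 iterations keep the demo safe to run.
        for (int i = 0; i < 100; i++) {
            processFile("file-" + i);
        }
        System.out.println("Retained entries: " + CACHE.size());
    }
}
```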
I tried my best to pinpoint the evildoers, but I couldn't point my finger at any guilty party, since no single part of the code seemed to blame.
Looking for solutions, most people recommend increasing the size of the JVM heap to cope with the load. But as mentioned on the Opcode website, http://blogs.opcodesolutions.com/roller/java/entry/solve_java_lang_outofmemoryerror_java, adding more memory only hides the problem; it doesn't really bring a scalable solution.
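For reference, this is what that "just add memory" advice boils down to in practice; the jar name is a placeholder:

```shell
# Raise the maximum heap from the default to 1 GB (-Xmx)
# and set the initial heap to 256 MB (-Xms).
java -Xms256m -Xmx1g -jar myapp.jar
```

It works in the sense that the error goes away for a while, but if the application leaks, a bigger heap just delays the crash.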
That blog post in particular turned out to be very useful: a good approach is to use profiling tools that "police" the application, showing how it behaves and which parts are not behaving nicely.
NetBeans is remarkable in that sense.
It already comes with a built-in profiling tool that is quite simple to use.
Just click on "Profile" --> "Main Project" and follow the directions.
It will launch your project and let you track what is really going on under the hood while the program is running.
Far better than watching your program from an external task manager.
I'm attaching a screenshot so that you can see what it looks like:
My application was bursting at 100 MB; with profiling I was able to track down the cause of the leak: the log entries were consuming too many resources.
With the log entries disabled, you can see in the screenshot how it goes well beyond the previous limit, using less than 10 MB of memory to index 100 000 files.
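If you are using the standard java.util.logging API, silencing a chatty logger is a one-liner; this is just a sketch, and the logger name "indexer" is a placeholder for whatever your application uses:

```java
import java.util.logging.Level;
import java.util.logging.Logger;

// Hypothetical sketch: turning off a logger at runtime so that
// log records are no longer formatted, buffered, or written.
public class QuietLogs {
    public static void main(String[] args) {
        Logger log = Logger.getLogger("indexer");
        log.setLevel(Level.OFF); // suppress every record from this logger

        log.info("this message is never published");
        System.out.println("Logger level: " + log.getLevel());
    }
}
```

Lowering the level to Level.WARNING instead of Level.OFF is often a gentler compromise: you keep the important messages without paying for thousands of per-file entries.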
There are still a lot of things to improve in my code, but this nifty tool sure helps to make it possible.