Questions in short:
1. Is initializing an object early to force an `OOM` error before a time-consuming operation really as bad as it sounds (or just a bad habit), and why?
2. What are the available options for preventing an `OOM` error in Java without using hard-coded restrictions, and what are their respective pros and cons?
More annoying/specific details:
I am writing a program that ran into `OutOfMemoryError`s pretty often. A quick calculation showed that the memory usage, and the consequent error, was expected and mostly due to a big 2D array (*first array*), whose actual size is defined by `args` or a settings file when the program starts but is otherwise stable, and another (*second array*), used at the end of the program to write to a file the result of a time-consuming manipulation of the *first array*; its size is defined in a similar way at startup. Some more memory may be needed depending on user input, but in my case the maximum possible memory usage seems reasonably predictable. For now I want to leave the heap size defined by the `-Xmx` option, without hard-coding a maximum size for the *first* or *second* array or setting a predefined heap size through a launcher, and just give the user a hint if something funny is likely to happen.
After some disappointing experiences, a “dirty trick” that worked pretty well and saved me a lot of time in early tests was to initialize the *second array* at the beginning of the program. This made the program exit with an `OOM` error in the early stages of troublesome cases, avoiding the frustration of waiting for the time-consuming part of the program to finish, only to lose the resulting file to a later `OOM` error.
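For concreteness, a minimal sketch of the trick (the array sizes and types here are made up; in my program they would come from `args` or the settings file):

```java
// Sketch of the "dirty trick": allocate the result buffer up front so an
// OutOfMemoryError surfaces at startup rather than after the long computation.
public class EarlyAllocDemo {
    public static void main(String[] args) {
        int rows = 1000, cols = 1000; // in practice, read from args or a settings file

        double[][] first = new double[rows][cols];   // main working array
        double[][] second = new double[rows][cols];  // result buffer, deliberately allocated early

        // ... the time-consuming manipulation of `first` would go here ...
        // `second` is only written at the very end, but its memory was
        // already reserved at startup; if it didn't fit, we failed fast.
        System.out.println("Both arrays allocated successfully.");
    }
}
```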
Things I’m already considering are reviewing and possibly improving memory usage, making sure that memory that is no longer used is freed before the *second array* is needed, finding a way to write to the file that requires less memory, and using `Runtime.freeMemory()` and related methods to predict possible trouble with a safe margin, so that the end user gets at least an early warning at the beginning of the program instead of an `OOM` error message.
I’m looking more for a good review of possible approaches to this kind of problem than for actual solutions to my specific case, and my two questions are:
1. What are the actual drawbacks of my “dirty trick”? What I can see is:
 1. It causes the program to quit ungracefully.
 2. It may affect readability (in my specific case, a relatively simple program with a single extra array, it doesn’t seem like a big issue, but it seems like a legitimate consideration otherwise).
 3. It may affect performance (I didn’t find a noticeable effect; it may be more of an issue for programs with more variable memory usage. This seems to be mostly about garbage collection, detailed treatments of which exist elsewhere and seem too broad for answers to this question, though it may still matter system-wide).
 4. It can cause false positives if some memory can be freed before the *second array* is used.
 5. It makes it easy to put off addressing more serious problems with memory usage.
2. What are other possible solutions to similar problems in Java? (Answers inappropriate for my specific case are welcome. Repeatedly calling methods from `java.lang.Runtime` before more memory is needed, or at regular intervals, seems a valid option, but not ideal for my program, because speed matters and maximum memory usage can be reasonably predicted beforehand. I’m writing this program mostly for fun, but answers motivated by UX or similar concerns are welcome as long as the pros and cons of specific solutions are clear.)
In some cases the Java specification doesn’t guarantee how much memory specific implementations actually use for some objects, but guesses usually seem safe enough for most practical purposes; reports on major exceptions are also welcome.
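As an example of the kind of guess I mean, here is a back-of-the-envelope estimate for a `double[rows][cols]`: the raw data is `8 * rows * cols` bytes, plus a small per-array-object overhead. The 16-byte header and 8-byte reference sizes below are assumptions about a typical 64-bit JVM, not guarantees from the specification:

```java
// Rough footprint estimate for a double[rows][cols], assuming ~16 bytes of
// header per array object and 8-byte references (implementation-dependent).
public class ArrayFootprint {
    static long estimateDouble2D(long rows, long cols) {
        long perRow = 16 + 8 * cols;   // row object: assumed header + cols doubles
        return 16 + 8 * rows           // outer array: assumed header + row references
             + rows * perRow;          // all row arrays
    }

    public static void main(String[] args) {
        // 4096 x 4096 doubles: 8 * 4096 * 4096 = 128 MB of raw data,
        // plus roughly 0.1 MB of per-row overhead under these assumptions.
        System.out.printf("~%d MB%n", estimateDouble2D(4096, 4096) >> 20); // prints "~128 MB"
    }
}
```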