I need 64GB to hold an entire dataset in memory for deep learning, but have only 12GB of RAM. Virtual memory is the next-best alternative, and I've learned it can be effectively extended by increasing the pagefile size - but this source suggests doing so increases system instability.
All other sources state the contrary, noting only reduced SSD lifespan, which isn't a problem for me - but I'd rather not take chances. That said, is there a limit to how much the pagefile size can be increased without causing instability?
Additional info: Windows 10; the OS-allocated pagefile is currently 26GB. I need 52GB + c, where c is a safe minimum margin.
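For reference, the pagefile can be sized manually via WMI (run from an elevated PowerShell). This is a config fragment, not a tested script; the 57344MB (56GB) figure is only an assumed stand-in for "52GB + c":

```powershell
# Assumption: 56GB illustrates "52GB + safety margin"; adjust to taste.
# Disable automatic pagefile management first.
$cs = Get-WmiObject Win32_ComputerSystem -EnableAllPrivileges
$cs.AutomaticManagedPagefile = $false
$cs.Put()

# Set initial = maximum so the pagefile never has to grow on the fly.
$pf = Get-WmiObject -Query "SELECT * FROM Win32_PageFileSetting"
$pf.InitialSize = 57344   # MB
$pf.MaximumSize = 57344   # MB
$pf.Put()
```

Setting initial size equal to maximum size avoids runtime pagefile growth (and the associated stalls and fragmentation), which is one commonly cited stability concern. A reboot is required for the change to take effect.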
PRE-ANSWER: I proceeded as described here, with ~70GB of memory-mapped data; the average data load speedup is 42-FOLD. I suspect this figure could be pushed to ~130x, though I won't work on it for now unless someone answers this. Lastly, this is sustainable and won't degrade the SSD, since the usage is 99.9%+ reads. I'll post a full answer with details eventually.
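For anyone curious what "memory-mapped data" looks like in practice: a minimal sketch using NumPy's `memmap`, which backs an array with a file on disk so only the pages actually touched occupy physical RAM. The file name, dtype, and shape here are assumptions for illustration, not my actual pipeline:

```python
import numpy as np

path = "dataset.dat"          # hypothetical file; mine is ~70GB
shape = (1000, 256)           # small stand-in shape for the example

# One-time creation: write the data through a writable memmap.
data = np.memmap(path, dtype=np.float32, mode="w+", shape=shape)
data[:] = np.random.rand(*shape).astype(np.float32)
data.flush()                  # ensure everything is on disk
del data

# Training-time use: reopen read-only. Pages are faulted in on demand,
# so the working set is only the slices a batch actually reads.
data = np.memmap(path, dtype=np.float32, mode="r", shape=shape)
batch = np.asarray(data[0:32])   # copy one batch into RAM for the model
print(batch.shape)
```

Because the access pattern is overwhelmingly reads (writes happen once, at creation), this is what keeps SSD wear negligible in my setup.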