Patch creation methods for deep learning on very large data with limited memory (32 GB)
I am trying to train a deep-learning semantic segmentation model on satellite imagery. As a test run, I created patches from a small AOI with `patchify` and `rasterio` without any issues. However, I now want to train the model on many more patches, so I have increased my AOI accordingly. For context, the previous AOI produced an ndarray of approximately 41848x14555x9 (x, y, n_bands); I am now looking to increase this to 84632x37000x9 (x, y, n_bands).