I'm training a CNN model on images. Initially I was training on image patches of size (256, 256) and everything was fine. Then I changed my dataloader to load full HD images (1080, 1920) and crop them after some processing. With this change, GPU memory keeps increasing with every batch. Why is this happening?
PS: While tracking losses, I'm doing loss.detach().item(), so the computation graph should not be retained through the stored loss values.
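For reference, here is a simplified sketch of my training loop (the model, the actual processing step, and the crop coordinates are placeholders, not my real code):

```python
import torch
import torch.nn as nn

# Placeholder model and optimizer, just to illustrate the loop structure.
model = nn.Conv2d(3, 3, kernel_size=3, padding=1)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

running_losses = []
for step in range(2):
    # Dataloader now yields full HD frames instead of (256, 256) patches.
    full = torch.randn(1, 3, 1080, 1920)

    # Stand-in for the processing I do before cropping.
    processed = full * 0.5 + 0.1

    # Crop AFTER processing (crop location is arbitrary here).
    crop = processed[:, :, :256, :256]

    out = model(crop)
    loss = nn.functional.mse_loss(out, torch.zeros_like(out))

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # Store a plain Python float, not a tensor attached to the graph.
    running_losses.append(loss.detach().item())
```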