Draft: Use a double-buffering-like technique to only partially load large images

Arjen Hiemstra requested to merge work/ahiemstra/doubleimage into master

I have a number of images on my machine that I consider "torture test" images for image viewers. These are all large to very large images of galaxies and other stellar phenomena, courtesy of hubblesite.org. The largest of these is a 22620 x 15200 JPEG of Galaxy M81: https://hubblesite.org/contents/media/images/2007/19/2127-Image.html?keyword=m81 .

When loading this image in Koko master, the memory usage looks like this:

[Screenshot_20210614_201046: memory usage on master]

The largest peak here is around 2.5 GiB and often overloads my system, triggering earlyoom, which will potentially kill Koko. We can avoid this by using the Image item's sourceSize and sourceClipRect properties.
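For illustration, a minimal QML sketch of the idea (the file path and coordinates are made up): sourceSize limits the decoded resolution, and sourceClipRect (available since Qt 5.15) restricts decoding to a sub-rectangle of the source, so only the currently visible region needs to be held in memory.

```qml
import QtQuick 2.15

Image {
    asynchronous: true
    source: "file:///path/to/m81.jpg"  // hypothetical path

    // Decode only the region of the source that is currently visible.
    // Coordinates are in source-image pixels.
    sourceClipRect: Qt.rect(4000, 3000, 1920, 1080)
    // With a clip rect set, sourceSize applies to the clipped region;
    // matching the clip size keeps it at native resolution.
    sourceSize: Qt.size(1920, 1080)
}
```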

Unfortunately, with an asynchronous image, modifying these properties first clears the Image before the new data is loaded. To avoid that, this MR implements a double-buffering-like approach with one active image and one inactive image: the inactive image is loaded with the new data, and once loading finishes the two are swapped, making the inactive image visible and hiding the old one.
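A rough sketch of that swap mechanism, not the actual MR code (viewer, imageSource, loadRegion, and the other identifiers are invented for the example):

```qml
import QtQuick 2.15

// Two Image items share the same geometry. The hidden one loads the
// new region while the visible one keeps showing the old pixels; they
// are swapped once loading finishes, so nothing is ever cleared on screen.
Item {
    id: viewer
    property url imageSource        // hypothetical; the image to view
    property Image active: frontImage
    property Image inactive: backImage

    // Ask the hidden image to load a new region of the source.
    function loadRegion(clip) {
        inactive.sourceClipRect = clip
    }

    // Make the freshly loaded image visible and hide the old one.
    function swap() {
        var previous = active
        active = inactive
        inactive = previous
        active.visible = true
        inactive.visible = false
    }

    Image {
        id: frontImage
        anchors.fill: parent
        asynchronous: true
        source: viewer.imageSource
        onStatusChanged: {
            if (status === Image.Ready && viewer.active !== frontImage)
                viewer.swap()
        }
    }
    Image {
        id: backImage
        anchors.fill: parent
        asynchronous: true
        visible: false
        source: viewer.imageSource
        onStatusChanged: {
            if (status === Image.Ready && viewer.active !== backImage)
                viewer.swap()
        }
    }
}
```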

Since we only have a partial view of the image now, a base layer is added: a scaled-down version of the whole image, which is displayed wherever we do not yet have proper data for the visible area.
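That base layer can be sketched as a third, always-visible Image decoded at a heavily reduced sourceSize (path and sizes are again illustrative):

```qml
import QtQuick 2.15

Image {
    // Always-visible fallback: the whole image decoded at a fraction
    // of its native size, drawn beneath the double-buffered pair.
    anchors.fill: parent
    asynchronous: true
    source: "file:///path/to/m81.jpg"  // hypothetical path
    sourceSize: Qt.size(1131, 760)     // 1/20 of the 22620x15200 original
    z: -1                              // keep it behind the sharp layers
}
```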

This changes the memory usage to this:

[Screenshot_20210806_180255: memory usage with this change]

The largest peak is now a little over 400 MiB with the same image.

This is what it looks like in action (the first image is the M81 image mentioned above, which is rather slow to load):

[Screen recording: Peek_2021-08-06_17-47]

Draft because:

  • Zooming in beyond 100% isn't handled well yet.
  • Zooming out also has problems.
  • Need to find a replacement for ExifExtractor for the image size.