Image usage, memory comparison and best practices in iOS (WWDC 2018)
Downsampling to process large images
The main memory footprint of an image in an application (incurred when the image is decoded for display) is largely unrelated to the file size of the image; what matters is its pixel dimensions. The decode buffer costs width * height * N bytes, where N depends on the pixel format and is usually 4 (the common ARGB8888 format, one byte per channel). Yet we often pass a UIImage directly to a UIImageView whose size is much smaller than the image itself, and end up using far more memory than we actually need.
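WWDC 2018 (Session 219, "Image and Graphics Best Practices") recommends downsampling with ImageIO so the decode buffer matches the target view rather than the full image. A sketch of that technique (the function name and parameter names are my own):

```swift
import UIKit
import ImageIO

// Decode only as many pixels as the target view needs. The decode buffer
// then costs roughly targetWidth * targetHeight * 4 bytes instead of
// fullWidth * fullHeight * 4.
func downsample(imageAt url: URL, to pointSize: CGSize, scale: CGFloat) -> UIImage? {
    // Don't decode the full image just to create the image source.
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else {
        return nil
    }

    let maxDimensionInPixels = max(pointSize.width, pointSize.height) * scale
    let downsampleOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceShouldCacheImmediately: true,       // decode now, on this thread
        kCGImageSourceCreateThumbnailWithTransform: true, // respect EXIF orientation
        kCGImageSourceThumbnailMaxPixelSize: maxDimensionInPixels
    ] as CFDictionary

    guard let downsampledImage =
            CGImageSourceCreateThumbnailAtIndex(source, 0, downsampleOptions) else {
        return nil
    }
    return UIImage(cgImage: downsampledImage)
}
```

Because `kCGImageSourceShouldCacheImmediately` forces decoding at the point of the call, this function is a good candidate for a background queue when used with scrolling content.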
The following are common ways to use an image:
I: This is the most frequently used approach. Some people might think of imageWithContentsOfFile; it differs from the call below in only one respect. Under normal circumstances, imageNamed caches the image for the lifetime of the app, so it never needs to be decoded again; imageWithContentsOfFile has no such cache, and the memory is freed when the image is no longer in use.
image1.image = UIImage(named: "000.jpg")
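For contrast, the uncached counterpart looks like this (assuming the file ships loose in the bundle rather than in an asset catalog; the resource name is illustrative):

```swift
import UIKit

// UIImage(contentsOfFile:) bypasses the system cache, so the decoded
// buffer can be released once the image is no longer referenced.
if let path = Bundle.main.path(forResource: "000", ofType: "jpg") {
    image1.image = UIImage(contentsOfFile: path)
}
```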
II: This is a widely used way to scale an image. It does not actually reduce memory use much, because drawing still decodes the full-size bitmap first.
+ (UIImage *)OriginImage:(UIImage *)image scaleToSize:(CGSize)size {
    UIGraphicsBeginImageContext(size);
    [image drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaledImage;
}
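Note that UIGraphicsBeginImageContext creates a context at scale 1.0. On iOS 10 and later, UIGraphicsImageRenderer is the more idiomatic way to do the same resize (a sketch; like the method above, it still decodes the full-size source in order to draw it, so it does not cap peak memory):

```swift
import UIKit

// Modern equivalent of the UIGraphicsBeginImageContext pattern (iOS 10+).
// The renderer picks an efficient backing format and honors screen scale,
// but drawing still decodes the full-size source image first.
func scaled(_ image: UIImage, to size: CGSize) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: size)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: size))
    }
}
```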