Tags: swift, swiftui, downsampling

Downsampling Images with SwiftUI


I'm displaying images in my app that are downloaded from the network, and I'd like to downsample them so they don't each take up multiple megabytes of memory. I could previously do this quite easily with UIKit:

import UIKit

// Redraw the image into a smaller bitmap of the given size.
func resizedImage(image: UIImage, for size: CGSize) -> UIImage? {
    let renderer = UIGraphicsImageRenderer(size: size)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: size))
    }
}

There are other methods as well, but they all depend on knowing the image view's desired size, which isn't straightforward in SwiftUI.
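
One such alternative, for reference, is Image I/O's thumbnail API, which decodes the data directly at a reduced pixel size instead of decoding the full bitmap first. This is only a sketch and assumes the downloaded image is still available as raw Data, but it illustrates the problem: the target size still has to be known up front.

import UIKit
import ImageIO

// Decode the image data directly at a reduced size using Image I/O,
// so the full-resolution bitmap is never held in memory.
func downsampledImage(from data: Data, maxPointSize: CGSize, scale: CGFloat) -> UIImage? {
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithData(data as CFData, sourceOptions) else { return nil }

    let maxPixelSize = max(maxPointSize.width, maxPointSize.height) * scale
    let thumbnailOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceShouldCacheImmediately: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ] as CFDictionary
    guard let thumbnail = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbnailOptions) else { return nil }

    return UIImage(cgImage: thumbnail)
}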

Is there a good API/method specifically for downsampling SwiftUI images?


Solution

  • I ended up solving it with GeometryReader, which isn't ideal since it affects the layout a bit.

    @State var image: UIImage

    var body: some View {
        GeometryReader { geo in
            Image(uiImage: self.image)
                .resizable()
                .aspectRatio(contentMode: .fit)
                .onAppear {
                    // Use the size proposed by the GeometryReader as the target frame.
                    let imageFrame = CGRect(origin: .zero, size: geo.size)
                    self.downsize(frame: imageFrame) // call whatever downsizing function you want
                }
        }
    }
    

    Use the geometry proxy to determine the image's frame, then downsample to that frame. I wish SwiftUI had its own API for this. A sketch of what that downsizing function might look like follows.
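
    For completeness, here is a rough sketch of one possible downsize(frame:); the name is just the placeholder used above, the method would live on the same view as the snippet, and it reuses the UIGraphicsImageRenderer approach from the question:

    func downsize(frame: CGRect) {
        // Render in pixels so the result matches the screen's density.
        let scale = UIScreen.main.scale
        let targetSize = CGSize(width: frame.width * scale, height: frame.height * scale)

        // Skip re-rendering if the image is already small enough
        // (assumes the downloaded image has scale 1, so size is in pixels).
        guard image.size.width > targetSize.width || image.size.height > targetSize.height else { return }

        let format = UIGraphicsImageRendererFormat()
        format.scale = 1 // targetSize is already expressed in pixels
        let renderer = UIGraphicsImageRenderer(size: targetSize, format: format)
        image = renderer.image { _ in
            image.draw(in: CGRect(origin: .zero, size: targetSize))
        }
    }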