I'm extracting pixel colors from a CGImage
using the code described in this answer.
However, I just realized that if I load an image that was created on another device, the pixel values look wrong. The first obvious problem is that the alpha is gone: the CGImageAlphaInfo reports .noneSkipLast, but I know the image is RGBA. If I read it from the same device it was created on, it looks fine. The second problem is that there is some color bleeding, as if the image had been resized or compressed somewhere along the way.
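For reference, this is the kind of quick check I'm doing to see what pixel format the loaded image actually has (just a throwaway diagnostic; describeFormat is not part of the app):

    import CoreGraphics

    // Throwaway diagnostic: dump the pixel format of a CGImage to the console.
    func describeFormat(of cgImage: CGImage) {
        print("size: \(cgImage.width)x\(cgImage.height)")
        print("bitsPerComponent: \(cgImage.bitsPerComponent), bitsPerPixel: \(cgImage.bitsPerPixel)")
        print("alphaInfo: \(cgImage.alphaInfo.rawValue)")   // the iCloud copy reports noneSkipLast here
        print("bytesPerRow: \(cgImage.bytesPerRow)")
        print("colorSpace: \(String(describing: cgImage.colorSpace))")
    }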
Here's an example:
The source image is this watermelon, 12x12 pixels. It was created on my iPad, but if I load it on my iPhone through iCloud using the code I linked, I get this:
The alpha channel is gone, and colors bleed.
If, from the same iPhone, I send the little watermelon to my Mac using AirDrop, send it back with AirDrop again (so it is supposedly the same image!), and load it now, I get the correct image:
(dark brown areas are where alpha is 0)
If you have a couple of iOS devices with iCloud enabled, you can reproduce this behavior in this app: http://swiftpixels.endavid.com, where I'm using that code to read pixel colors.
What could be the difference between those images? How can I read the correct image from iCloud? Should I look for hints in UIImage instead of CGImage?
Any clues? Thanks!
Update
For reference, I'm reading the image with a UIImagePickerController, using this code:
    func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
        if let image = info[UIImagePickerController.InfoKey.originalImage] as? UIImage {
            loadImageInMemory(image)
        }
        picker.dismiss(animated: true, completion: nil)
        self.presentingViewController?.dismiss(animated: true, completion: nil)
    }
    fileprivate func loadImageInMemory(_ image: UIImage) {
        // skipping some preparation
        guard let cgImage = image.cgImage else {
            return
        }
        // getImageData is a function like getPixelColor linked above,
        // but for a Rect
        self.imageData = cgImage.getImageData(rect, width, height)
    }
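In case it helps, getImageData follows the same idea as getPixelColor from the linked answer: it reads the CGImage's backing bytes through its data provider and indexes into them. A stripped-down sketch (my real function also handles the width/height arguments and some edge cases):

    import CoreGraphics

    extension CGImage {
        // Stripped-down sketch: copy the raw backing bytes for a given rect.
        // Note that the channel order and meaning depend on this image's alphaInfo/bitmapInfo,
        // which is exactly what changes when the image comes back from iCloud.
        func getImageData(in rect: CGRect) -> [UInt8]? {
            guard let data = dataProvider?.data,
                  let bytes = CFDataGetBytePtr(data) else {
                return nil
            }
            let bytesPerPixel = bitsPerPixel / 8
            var out: [UInt8] = []
            out.reserveCapacity(Int(rect.width) * Int(rect.height) * bytesPerPixel)
            for y in Int(rect.minY)..<Int(rect.maxY) {
                for x in Int(rect.minX)..<Int(rect.maxX) {
                    let offset = y * bytesPerRow + x * bytesPerPixel
                    for channel in 0..<bytesPerPixel {
                        out.append(bytes[offset + channel])
                    }
                }
            }
            return out
        }
    }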
I also found this question which may be related: UIImagePickerController and iCloud photos
As Rob suggested in the comments below, I changed the Photos settings on my phone to "Download and Keep Originals" (instead of "Optimize iPhone Storage"), and that fixes the problem. So I guess the question is why iCloud tries to compress a PNG image that is just 1,355 bytes, and whether it's possible to access the original image from the UIImagePickerController.
The reason the image looked blurry and lost its alpha seems to be that Photos streams JPEG-compressed images from iCloud by default, even if your original image would be smaller without compression, as in this pixel-art example.
As pointed out by Rob, a way to verify that this is the case is to change your Photos settings. From the Settings app:
Settings -> Photos -> Download and Keep Originals
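If you just want to confirm what the original asset actually is, without changing any settings, you can also list its resources with PhotoKit. A small diagnostic sketch (not part of the code below):

    import Photos

    // Diagnostic only: print the original resources that back a PHAsset.
    // For the watermelon, this should report a photo resource whose uniform type identifier is PNG.
    func printOriginalResources(of asset: PHAsset) {
        for resource in PHAssetResource.assetResources(for: asset) {
            print("\(resource.originalFilename): \(resource.uniformTypeIdentifier)")
        }
    }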
Changing that setting would fix the issue, but of course it's not desirable. If you want to keep using Photos (instead of implementing your own iCloud solution) while keeping the Optimize iPhone Storage setting, you can use PhotoKit to retrieve the original image.
Replace this code,
    func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
        if let image = info[UIImagePickerController.InfoKey.originalImage] as? UIImage {
            loadImageInMemory(image)
        }
        picker.dismiss(animated: true, completion: nil)
        self.presentingViewController?.dismiss(animated: true, completion: nil)
    }
with this other code:
    import Photos

    // ...

    func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
        // it will be loaded asynchronously
        loadImageFromPicker(info: info)
        picker.dismiss(animated: true, completion: nil)
        self.presentingViewController?.dismiss(animated: true, completion: nil)
    }

    private func loadImageFromPicker(info: [UIImagePickerController.InfoKey : Any]) {
        var phAsset: PHAsset?
        if #available(iOS 11.0, *) {
            phAsset = info[UIImagePickerController.InfoKey.phAsset] as? PHAsset
        } else {
            // Fallback on earlier versions
            if let referenceURL = info[UIImagePickerController.InfoKey.referenceURL] as? URL {
                let fetchResult = PHAsset.fetchAssets(withALAssetURLs: [referenceURL], options: nil)
                phAsset = fetchResult.firstObject
            }
        }
        guard let asset = phAsset else {
            return
        }
        // size doesn't matter, because resizeMode = .none
        let size = CGSize(width: 32, height: 32)
        let options = PHImageRequestOptions()
        options.version = .original
        options.deliveryMode = .highQualityFormat
        options.resizeMode = .none
        options.isNetworkAccessAllowed = true
        PHImageManager.default().requestImage(for: asset, targetSize: size, contentMode: .aspectFit, options: options) { [weak self] (image, info) in
            if let s = self, let image = image {
                s.loadImageInMemory(image)
            }
        }
    }
This code will work with both local images and iCloud images.
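If you need the untouched bytes rather than a decoded UIImage (for instance, to check that you really got the original 1,355-byte PNG), you can also ask PHImageManager for the image data instead. A sketch of that variant (requestImageDataAndOrientation is iOS 13+; the older requestImageData call works the same way):

    import Photos

    // Sketch: fetch the original, unmodified data for an asset (e.g. the original PNG bytes).
    private func loadOriginalData(for asset: PHAsset, completion: @escaping (Data?) -> Void) {
        let options = PHImageRequestOptions()
        options.version = .original           // the unedited original, not a rendition
        options.isNetworkAccessAllowed = true // allow downloading from iCloud if needed
        if #available(iOS 13.0, *) {
            PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, _ in
                completion(data)
            }
        } else {
            PHImageManager.default().requestImageData(for: asset, options: options) { data, _, _, _ in
                completion(data)
            }
        }
    }

From that Data you can build your own CGImage (via UIImage(data:) or an image source), so the pixel format you read is exactly what is stored in the photo library.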