I'm trying to do a mass conversion of images to data so they can be stored in Core Data. The conversion part works fine, and if I run it without updating Core Data, memory usage stays at less than 100MB.
If I update the Core Data object, it just keeps consuming memory until the app crashes.
func updateLocalImages() {
    let fetchRequest: NSFetchRequest<Picture> = Picture.fetchRequest()
    fetchRequest.predicate = NSPredicate(format: "pictureName != \"\"")
    do {
        let pictures = try moc.fetch(fetchRequest)
        print("Picture Update Count: \(pictures.count)")
        for picture in pictures {
            let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
            let path = paths[0]
            if let picName = picture.pictureName {
                let imagePath = path.appendingPathComponent(picName)
                if let uiImage = UIImage(contentsOfFile: imagePath.path) {
                    if let imageData = uiImage.jpegData(compressionQuality: 1.0) {
                        autoreleasepool {
                            picture.pictureData = imageData
                            print("Picture Updated")
                            saveContext()
                        }
                    }
                }
            }
        }
    } catch {
        print("Fetching Failed")
    }
}
If I comment out the picture.pictureData = imageData line, I don't get the memory issues.
What's the correct way of going about this? There is an unknown number of images (mine currently sits at about 5.5GB worth).
Found the problem: Core Data didn't appear to be releasing the objects. I changed the code to the below.
if let pictures = project.pictures {
    projectPicNumber = pictures.count
    for pic in pictures {
        currentPicNumber = currentPicNumber + 1
        let picture: Picture = pic as! Picture
        if let imgData = convertImage(picture: picture) {
            picture.pictureData = imgData
        }
    }
    project.converted = true
    saveContext()
    // Refreshing the context turns the updated objects back into faults,
    // letting Core Data release the image data it was holding in memory.
    viewContext.refreshAllObjects()
}
The most memory it reached was about 500MB when converting 33 images.
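If memory still grows on much larger sets, a further refinement is to save and refresh each object inside the loop instead of only once at the end. The sketch below is untested against the original project and assumes the same project, convertImage(picture:), and saveContext() helpers from the code above, plus access to the managed object context as moc; the wrapper function name is made up for illustration.

// A sketch: persist and fault out each Picture as it is converted,
// so its image bytes don't accumulate in memory across the whole loop.
func convertPicturesReleasingMemory(for project: Project) {
    guard let pictures = project.pictures else { return }

    for case let picture as Picture in pictures {
        autoreleasepool {
            // convertImage(picture:) is assumed to load the image file
            // and return its Data representation, as in the answer above.
            if let imgData = convertImage(picture: picture) {
                picture.pictureData = imgData
            }

            // Save this picture, then turn the object back into a fault
            // so the context releases the image data it just wrote.
            saveContext()
            moc.refresh(picture, mergeChanges: false)
        }
    }

    project.converted = true
    saveContext()
}

Refreshing with mergeChanges: false after each save turns the Picture back into a fault, so each image's data is released before the next one is loaded rather than being held until refreshAllObjects() runs at the end.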