I need some help again converting my Android app to iOS...
As I mentioned in another post, the app is about image processing and is based on ColorMatrix. The problem is that Core Image's CIColorMatrix only supports (as far as I know...) a 4x4 matrix, while on Android the ColorMatrix for brightness or contrast is a 4x5 matrix with an offset column.
In Android these are my matrices:
Brightness:
float[] brightMatrix = {
    1f, 0f, 0f, 0f, 32f,
    0f, 1f, 0f, 0f, 32f,
    0f, 0f, 1f, 0f, 32f,
    0f, 0f, 0f, 1f, 0f};
Contrast:
float[] contMatrix = {
    1.190f, 0f, 0f, 0f, -12.065f,
    0f, 1.190f, 0f, 0f, -12.065f,
    0f, 0f, 1.190f, 0f, -12.065f,
    0f, 0f, 0f, 1f, 0f};
But I really don't know how to get the same results in Swift...
For brightness I tried the following code, dividing 32 by 255 to get the value 0.12549:
func zoeFilter() -> UIImage? {
    let inImage = CIImage(image: self)
    let brightFactor = CIFilter(name: "CIColorControls")
    brightFactor?.setValue(inImage, forKey: kCIInputImageKey)
    brightFactor?.setValue(0.12549, forKey: kCIInputBrightnessKey)
    let brightImage = brightFactor?.outputImage
    let cgImage = CIContext().createCGImage(brightImage!, from: brightImage!.extent)
    return UIImage(cgImage: cgImage!)
}
But the result is not the same at all... on iOS it looks like a white fog over the image...
And as for contrast, I found the code below, but I don't see how to collapse the Android matrix into the single value in contFactor?.setValue(1, forKey: kCIInputContrastKey), knowing only that 1 means neutral:
func chiaFilter() -> UIImage? {
    let inImage = CIImage(image: self)
    let contFactor = CIFilter(name: "CIColorControls")
    contFactor?.setValue(inImage, forKey: kCIInputImageKey)
    contFactor?.setValue(1, forKey: kCIInputContrastKey)
    let contImage = contFactor?.outputImage
    let cgImage = CIContext().createCGImage(contImage!, from: contImage!.extent)
    return UIImage(cgImage: cgImage!)
}
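One way to see why a single kCIInputContrastKey value can't reproduce the matrix: an Android contrast matrix of this shape is a scale s around some pivot p, i.e. out = s * (in - p) + p, so the offset column equals p * (1 - s). Solving for p with the values above (a sketch in plain arithmetic, my own reasoning, not from the Core Image docs):

```swift
import Foundation

// The Android contrast matrix scales by s around a pivot p:
//   out = s * (in - p) + p  =  s * in + p * (1 - s)
// so the matrix offset is p * (1 - s). Recover p from the
// question's values (0...255 domain):
let s = 1.190
let offset = -12.065
let pivot = offset / (1 - s)   // 63.5, i.e. roughly 0.249 normalized
```

The matrix pivots around ~0.25, while CIColorControls pivots its contrast around mid gray, so no single contrast value matches it exactly; CIColorMatrix with a scale of 1.190 and a bias of -12.065/255 is the direct port.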
On the other hand, changing the exposure is easy, because the matrix in Android is just a diagonal scale:
float[] expMatrix = {
    1.3f, 0f, 0f, 0f, 0f,
    0f, 1.3f, 0f, 0f, 0f,
    0f, 0f, 1.3f, 0f, 0f,
    0f, 0f, 0f, 1f, 0f};
So in Swift I can use the following code and get exactly the same result:
func liaFilter() -> UIImage? {
    let inImage = CIImage(image: self)
    let expoMatrix = CIFilter(name: "CIColorMatrix")
    expoMatrix?.setDefaults()
    expoMatrix?.setValue(inImage, forKey: kCIInputImageKey)
    expoMatrix?.setValue(CIVector(x: 1.3, y: 0, z: 0, w: 0), forKey: "inputRVector")
    expoMatrix?.setValue(CIVector(x: 0, y: 1.3, z: 0, w: 0), forKey: "inputGVector")
    expoMatrix?.setValue(CIVector(x: 0, y: 0, z: 1.3, w: 0), forKey: "inputBVector")
    expoMatrix?.setValue(CIVector(x: 0, y: 0, z: 0, w: 1), forKey: "inputAVector")
    let expoImage = expoMatrix?.outputImage
    let cgImage = CIContext().createCGImage(expoImage!, from: expoImage!.extent)
    return UIImage(cgImage: cgImage!)
}
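For completeness, a usage sketch (assuming these methods live in a UIImage extension, as the self receiver suggests; "photo" is a placeholder asset name):

```swift
// Hypothetical call site for the filters above.
let original = UIImage(named: "photo")
let exposed = original?.liaFilter()   // same 1.3x exposure as the Android expMatrix
```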
I know this is all a bit confusing... but it would be great if someone had an idea about how to bridge these differences between Android and iOS.
Thanks in advance!
EDIT AFTER FRANK'S ANSWER
Here is the result after applying the filter on Android using:
float[] brightMatrix = {
    1f, 0f, 0f, 0f, 32f,
    0f, 1f, 0f, 0f, 32f,
    0f, 0f, 1f, 0f, 32f,
    0f, 0f, 0f, 1f, 0f};
And here is the same picture after applying the filter on iOS using:
brightMatrix?.setValue(CIVector(x: 1, y: 0, z: 0, w: 0), forKey: "inputRVector")
brightMatrix?.setValue(CIVector(x: 0, y: 1, z: 0, w: 0), forKey: "inputGVector")
brightMatrix?.setValue(CIVector(x: 0, y: 0, z: 1, w: 0), forKey: "inputBVector")
brightMatrix?.setValue(CIVector(x: 0, y: 0, z: 0, w: 1), forKey: "inputAVector")
brightMatrix?.setValue(CIVector(x: 32.0/255.0, y: 32.0/255.0, z: 32.0/255.0, w: 0), forKey: "inputBiasVector")
FINAL WORKING SOLUTION AFTER FRANK'S PROPOSAL
func zoeFilter() -> UIImage? {
    let inImage = CIImage(image: self)
    let SRGBMatrix = CIFilter(name: "CILinearToSRGBToneCurve")
    let brightMatrix = CIFilter(name: "CIColorMatrix")
    let linearMatrix = CIFilter(name: "CISRGBToneCurveToLinear")
    SRGBMatrix?.setValue(inImage, forKey: kCIInputImageKey)
    let SRGBImage = SRGBMatrix?.outputImage
    brightMatrix?.setDefaults()
    brightMatrix?.setValue(SRGBImage, forKey: kCIInputImageKey)
    brightMatrix?.setValue(CIVector(x: 1, y: 0, z: 0, w: 0), forKey: "inputRVector")
    brightMatrix?.setValue(CIVector(x: 0, y: 1, z: 0, w: 0), forKey: "inputGVector")
    brightMatrix?.setValue(CIVector(x: 0, y: 0, z: 1, w: 0), forKey: "inputBVector")
    brightMatrix?.setValue(CIVector(x: 0, y: 0, z: 0, w: 1), forKey: "inputAVector")
    brightMatrix?.setValue(CIVector(x: 32.0/255.0, y: 32.0/255.0, z: 32.0/255.0, w: 0), forKey: "inputBiasVector")
    let brightImage = brightMatrix?.outputImage
    linearMatrix?.setDefaults()
    linearMatrix?.setValue(brightImage, forKey: kCIInputImageKey)
    let linearImage = linearMatrix?.outputImage
    let cgImage = CIContext().createCGImage(linearImage!, from: linearImage!.extent)
    return UIImage(cgImage: cgImage!)
}
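Assuming the same sRGB-to-linear sandwich works for contrast (I haven't verified this on a device; the method name is mine, and the values are lifted straight from the Android contMatrix above), the contrast filter would look like:

```swift
import UIKit

extension UIImage {
    // Hypothetical contrast counterpart of zoeFilter: the same
    // CILinearToSRGBToneCurve -> CIColorMatrix -> CISRGBToneCurveToLinear
    // chain, with the Android scale (1.190) on the diagonal and the
    // offset (-12.065, rescaled from 0...255 to 0...1) as the bias.
    func chiaMatrixFilter() -> UIImage? {
        guard let inImage = CIImage(image: self) else { return nil }

        let toSRGB = CIFilter(name: "CILinearToSRGBToneCurve")
        toSRGB?.setValue(inImage, forKey: kCIInputImageKey)

        let contMatrix = CIFilter(name: "CIColorMatrix")
        contMatrix?.setDefaults()
        contMatrix?.setValue(toSRGB?.outputImage, forKey: kCIInputImageKey)
        contMatrix?.setValue(CIVector(x: 1.190, y: 0, z: 0, w: 0), forKey: "inputRVector")
        contMatrix?.setValue(CIVector(x: 0, y: 1.190, z: 0, w: 0), forKey: "inputGVector")
        contMatrix?.setValue(CIVector(x: 0, y: 0, z: 1.190, w: 0), forKey: "inputBVector")
        contMatrix?.setValue(CIVector(x: 0, y: 0, z: 0, w: 1), forKey: "inputAVector")
        contMatrix?.setValue(CIVector(x: -12.065/255.0, y: -12.065/255.0, z: -12.065/255.0, w: 0),
                             forKey: "inputBiasVector")

        let toLinear = CIFilter(name: "CISRGBToneCurveToLinear")
        toLinear?.setValue(contMatrix?.outputImage, forKey: kCIInputImageKey)

        guard let outImage = toLinear?.outputImage,
              let cgImage = CIContext().createCGImage(outImage, from: outImage.extent)
        else { return nil }
        return UIImage(cgImage: cgImage)
    }
}
```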
FRANK'S ANSWER
You should be able to use your brightness and contrast matrices by also setting the inputBiasVector parameter of the CIColorMatrix filter, like so:
// the first 4 should actually be the default, so you probably don't need to set them explicitly
brightMatrix?.setValue(CIVector(x: 1, y: 0, z: 0, w: 0), forKey: "inputRVector")
brightMatrix?.setValue(CIVector(x: 0, y: 1, z: 0, w: 0), forKey: "inputGVector")
brightMatrix?.setValue(CIVector(x: 0, y: 0, z: 1, w: 0), forKey: "inputBVector")
brightMatrix?.setValue(CIVector(x: 0, y: 0, z: 0, w: 1), forKey: "inputAVector")
brightMatrix?.setValue(CIVector(x: 32.0/255.0, y: 32.0/255.0, z: 32.0/255.0, w: 0), forKey: "inputBiasVector")
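In general, each Android 4x5 row [a, b, c, d, e] splits into a CIColorMatrix row vector (a, b, c, d) plus one component e/255 of inputBiasVector, since Core Image works in 0...1 rather than 0...255. A small sketch of that mapping (the function name is mine, for illustration only):

```swift
import Foundation

// Split one Android ColorMatrix row into the 4-component vector that
// CIColorMatrix expects and the matching inputBiasVector component
// (offset rescaled from 0...255 to 0...1).
func splitAndroidRow(_ row: [Double]) -> (vector: [Double], bias: Double) {
    precondition(row.count == 5, "expected a 4x5 ColorMatrix row")
    return (Array(row[0..<4]), row[4] / 255.0)
}

// Brightness row from the question: [1, 0, 0, 0, 32]
let (rVector, rBias) = splitAndroidRow([1, 0, 0, 0, 32])
// rVector == [1, 0, 0, 0], rBias == 32/255, roughly 0.12549
```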
It's strange, though, that you can't use the CIColorControls filter to achieve the same. Maybe you can post example images.