Using iOS CIFilter, I am trying to recreate the glitch effect from Photoshop shown in this video:
https://www.youtube.com/watch?v=1Ibreg9T168
Here's the original image:
At the start of the video, the person first desaturates the image to make it grayscale, then creates a copy of that layer, disables the Red channel on that copy, and shifts it slightly to the left to get this look at the 55-second mark:
This should be easily doable with CIFilter. Here is what I did:
if let image = UIImage(named: "demo13"), let edited = applyGlitchEffect(to: image) {
    imageView.image = edited
}
func applyGlitchEffect(to image: UIImage) -> UIImage? {
    guard let ciImage = CIImage(image: image) else { return nil }
    let context = CIContext(options: nil)

    // 1. Desaturate the image.
    let grayscaleFilter = CIFilter(name: "CIColorControls", parameters: [
        kCIInputImageKey: ciImage,
        kCIInputSaturationKey: 0.0
    ])

    // 2. Remove the red channel from a copy of the grayscale image.
    let noRedChannel = CIFilter(name: "CIColorMatrix", parameters: [
        kCIInputImageKey: grayscaleFilter?.outputImage as Any,
        "inputRVector": CIVector(x: 0, y: 0, z: 0, w: 0)
    ])

    // 3. Shift that copy slightly to the left.
    let shifted = noRedChannel?.outputImage?.transformed(by: CGAffineTransform(translationX: -20, y: 0))

    // 4. Blend the shifted copy over the grayscale image.
    let blended = CIFilter(name: "CIScreenBlendMode")
    blended?.setValue(shifted, forKey: kCIInputImageKey)
    blended?.setValue(grayscaleFilter?.outputImage as Any, forKey: kCIInputBackgroundImageKey)

    if let finalOutput = blended?.outputImage, let cgImage = context.createCGImage(finalOutput, from: ciImage.extent) {
        return UIImage(cgImage: cgImage)
    }
    return nil
}
This gives me the below result:
As you can see, my result is similar to the one in the video at the 55-second mark. However, my image has a bluish tint for some reason, whereas the one in the video is grayscale with shifted reds and blacks at the edges.
I suspect I might be using the wrong blend mode. I did try a few other blend modes but didn't get a similar result.
I looked up what blend mode Photoshop uses and according to this:
https://helpx.adobe.com/photoshop/using/blending-modes.html
the default is "Normal: Edits or paints each pixel to make it the result color. This is the default mode."
How can I recreate it? What am I doing wrong?
I think the problem arises because you are essentially adding the green and blue channels twice to your image:

GB (R removed) + original RGB (grayscale)

(+ being your blend mode of choice here.)
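To see why that produces the bluish tint, take a single mid-gray pixel and apply the screen blend formula (result = 1 - (1 - fg) * (1 - bg)) per channel, ignoring the horizontal shift for a moment; the 0.5 value is just an example, but the green/blue boost is the same everywhere:

// Standard screen blend for one channel.
func screen(_ fg: Double, _ bg: Double) -> Double { 1 - (1 - fg) * (1 - bg) }

let bg = (r: 0.5, g: 0.5, b: 0.5)   // grayscale background pixel
let fg = (r: 0.0, g: 0.5, b: 0.5)   // same pixel with the red channel removed

let result = (r: screen(fg.r, bg.r), g: screen(fg.g, bg.g), b: screen(fg.b, bg.b))
// result == (r: 0.5, g: 0.75, b: 0.75) -> green and blue are brightened, red is not,
// which is exactly the bluish/cyan cast you are seeing.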
So instead of using the grayscale image as the background of your blend, try using only the red channel:
let onlyRedChannel = CIFilter(name: "CIColorMatrix", parameters: [
    kCIInputImageKey: grayscaleFilter?.outputImage as Any,
    "inputGVector": CIVector(x: 0, y: 0, z: 0, w: 0),
    "inputBVector": CIVector(x: 0, y: 0, z: 0, w: 0)
])

// ...

blended?.setValue(onlyRedChannel?.outputImage as Any, forKey: kCIInputBackgroundImageKey)
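With the red-only image as the background and the shifted green/blue-only copy as the foreground, the two layers no longer share any channels, so the screen blend simply recombines them (screening a channel against 0 leaves it unchanged), and you should get the grayscale look with red fringes at the shifted edges, like in the video.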
By the way, I highly recommend using the "new" (iOS 13+) protocol-based CIFilter interface. You need to import CoreImage.CIFilterBuiltins for that. Then you can create and set up filters like this:
let onlyRedChannel = CIFilter.colorMatrix()
onlyRedChannel.inputImage = grayscaleFilter?.outputImage
onlyRedChannel.gVector = CIVector(x: 0, y: 0, z: 0, w: 0)
onlyRedChannel.bVector = CIVector(x: 0, y: 0, z: 0, w: 0)
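And just as a rough, untested sketch of how the whole chain could look with that API (it keeps your -20 pt shift and the screen blend, and only swaps the blend background for the red-only copy):

import CoreImage
import CoreImage.CIFilterBuiltins
import UIKit

func applyGlitchEffect(to image: UIImage) -> UIImage? {
    guard let ciImage = CIImage(image: image) else { return nil }
    let context = CIContext()

    // 1. Desaturate.
    let grayscale = CIFilter.colorControls()
    grayscale.inputImage = ciImage
    grayscale.saturation = 0

    // 2. Copy with the red channel removed, shifted to the left.
    let noRed = CIFilter.colorMatrix()
    noRed.inputImage = grayscale.outputImage
    noRed.rVector = CIVector(x: 0, y: 0, z: 0, w: 0)
    let shifted = noRed.outputImage?.transformed(by: CGAffineTransform(translationX: -20, y: 0))

    // 3. Copy that keeps only the red channel (not shifted).
    let onlyRed = CIFilter.colorMatrix()
    onlyRed.inputImage = grayscale.outputImage
    onlyRed.gVector = CIVector(x: 0, y: 0, z: 0, w: 0)
    onlyRed.bVector = CIVector(x: 0, y: 0, z: 0, w: 0)

    // 4. Recombine: shifted green/blue over the red-only background.
    let blend = CIFilter.screenBlendMode()
    blend.inputImage = shifted
    blend.backgroundImage = onlyRed.outputImage

    guard let output = blend.outputImage,
          let cgImage = context.createCGImage(output, from: ciImage.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}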