Applying filters and saving the edited photo

Generating a filtered image 🖼

The next step is to provide our ImageController with a function that takes an image, applies a specific filter, and outputs the edited image. But let’s define an enum first to make it easier to choose between the different filter types. Create a new Swift file “Helper.swift”, import the SwiftUI framework, and add the following enum to it.

import SwiftUI

enum FilterType {
    case Original
    case Sepia
}

For now, we only want to use two filters: one that returns the original image and one that applies a sepia effect. We will add more filters later. We can now add a function generateFilteredImage to our ImageController, which takes an “inputImage” and the selected “filter” as arguments and returns the filtered image as an optional UIImage.

func generateFilteredImage(inputImage: UIImage?, filter: FilterType) -> UIImage? {
    
}

To process and edit images, we use the CoreImage framework. Apple describes it as “an image processing and analysis technology that provides high-performance processing for still and video images”. To work with this framework within our generateFilteredImage function, we need to initialize a CIContext instance first. 

let context = CIContext(options: nil)

You can imagine the CIContext as a manager that is responsible for rendering the processed image.

Then, we make sure that the inputImage exists and is not nil by using a guard statement. While doing this, we also convert our inputImage to a so-called CIImage so that the CoreImage framework can read and process it.

guard let inputImage = inputImage,
      let imageToEdit = CIImage(image: inputImage) else {
    return nil
}

How we edit the CIImage should depend on the selected filter. For this purpose, we use a switch statement. For the .Original case, we simply return the inputImage unchanged.

switch filter {
case .Original:
    return inputImage
case .Sepia:
    
}

If the “Sepia” filter is selected, we use the built-in CoreImage sepia filter. Each CoreImage filter needs different keys, which we have to assign to specific values. The CISepiaTone filter is quite simple and only needs to know which image to apply the sepia effect to, represented by the “inputImage” key. We set this key to our imageToEdit.

case .Sepia:
    let filter = CIFilter(name: "CISepiaTone")
    filter?.setValue(imageToEdit, forKey: "inputImage")

Using an if-let statement, we then check whether the applied filter has successfully returned a new, edited image. Within this if-let statement, we again check whether we can render a new CGImage out of it with the help of our context.

case .Sepia:
    let filter = CIFilter(name: "CISepiaTone")
    filter?.setValue(imageToEdit, forKey: "inputImage")

    if let output = filter?.outputImage {
        if let cgimg = context.createCGImage(output, from: output.extent) {
            let processedImage = UIImage(cgImage: cgimg)
        }
    }

If that worked, we return the newly created processedImage as a UIImage.

case .Sepia:
    let filter = CIFilter(name: "CISepiaTone")
    filter?.setValue(imageToEdit, forKey: "inputImage")

    if let output = filter?.outputImage {
        if let cgimg = context.createCGImage(output, from: output.extent) {
            let processedImage = UIImage(cgImage: cgimg)
            return processedImage
        }
    }

If something goes wrong, we return the unedited inputImage instead.

func generateFilteredImage(inputImage: UIImage?, filter: FilterType) -> UIImage? {
    
    //...
    
    return inputImage
}
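Assembled from the snippets above, the complete function looks like this at this stage (later sections will extend the switch statement with more filter cases):

```swift
func generateFilteredImage(inputImage: UIImage?, filter: FilterType) -> UIImage? {
    // The CIContext is responsible for rendering the processed image.
    let context = CIContext(options: nil)
    
    // Make sure the input image exists and can be read by CoreImage.
    guard let inputImage = inputImage,
          let imageToEdit = CIImage(image: inputImage) else {
        return nil
    }
    
    switch filter {
    case .Original:
        return inputImage
    case .Sepia:
        let filter = CIFilter(name: "CISepiaTone")
        filter?.setValue(imageToEdit, forKey: "inputImage")
        
        if let output = filter?.outputImage,
           let cgimg = context.createCGImage(output, from: output.extent) {
            return UIImage(cgImage: cgimg)
        }
    }
    
    // If something went wrong, fall back to the unedited input image.
    return inputImage
}
```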

Applying a filter by tapping its thumbnail

We just implemented the necessary function to apply a filter to an existing image. When the user taps the existing ThumbnailView in the ContentView, we want to pass the unprocessedImage of the imageController through the .Original filter and assign the resulting UIImage to the displayedImage of the imageController.

For this purpose, we safely unwrap the unprocessedImage of our imageController within the if-statement of our ContentView.

if let imageToDisplay = imageController.displayedImage, let originalImage = imageController.unprocessedImage {
    //...
} else {
    //...
}

Then, we wrap the ThumbnailView into a Button.

Button(action: {
    imageController.displayedImage = imageController.generateFilteredImage(inputImage: originalImage, filter: .Original)
}) {
    ThumbnailView(imageToDisplay: originalImage, width: geometry.size.width*(21/100), height: geometry.size.height*(15/100), filterName: "Original")
}

Let’s move on to implementing the Sepia filter into our ContentView. But instead of inserting a separate ThumbnailView instance again, we declare an array in our ContentView containing the available FilterTypes.

let availableFilters: [FilterType] = [.Original, .Sepia]

Then, we cycle through our availableFilters using a ForEach loop, initializing one ThumbnailView for every FilterType in it.

HStack {
    ForEach(availableFilters, id: \.self) { filter in
        Button(action: {
            imageController.displayedImage = imageController.generateFilteredImage(inputImage: originalImage, filter: filter)
        }) {
            ThumbnailView(imageToDisplay: originalImage, width: geometry.size.width*(21/100), height: geometry.size.height*(15/100), filterName: "\(filter)")
        }
    }
}

Now we have to tell our ContentView to update its body to display the new displayedImage. We do this within our ImageController by using the @Published property wrapper again: 

@Published var displayedImage: UIImage?

Run the app to see if everything works fine!

Finally, we want the thumbnails to already show a preview of how the edited photo will look. To achieve this, we generate a filtered image for each respective filter and pass it to the corresponding ThumbnailView in our ContentView.

However, we should keep in mind that processing too much image data can lead to performance issues. So the ThumbnailViews should only display filtered previews of images that have been compressed beforehand.

Therefore, we create a new Swift file named “Extensions.swift”, import the UIKit framework, and add the following extension.

import UIKit

extension UIImage {
    func compressed() -> UIImage? {
        // Using JPEG data preserves the image's orientation metadata,
        // which pngData() would discard.
        guard let imageData = self.jpegData(compressionQuality: 1.0) else {
            return nil
        }
        
        let options = [
            kCGImageSourceCreateThumbnailWithTransform: true,
            kCGImageSourceCreateThumbnailFromImageAlways: true,
            kCGImageSourceThumbnailMaxPixelSize: 200] as CFDictionary
        
        guard let source = CGImageSourceCreateWithData(imageData as CFData, nil),
              let thumbnail = CGImageSourceCreateThumbnailAtIndex(source, 0, options) else {
            return nil
        }
        
        return UIImage(cgImage: thumbnail)
    }
}

The compressed() extension shrinks the affected image and thus saves a lot of processing work.

Now we can use the compressed version of the originalImage to apply the appropriate filter and show the result in the respective ThumbnailView.

ForEach(availableFilters, id: \.self) { filter in
    Button(action: {
        imageController.displayedImage = imageController.generateFilteredImage(inputImage: originalImage, filter: filter)
    }) {
        ThumbnailView(imageToDisplay: imageController.generateFilteredImage(inputImage: originalImage.compressed(), filter: filter) ?? originalImage, width: geometry.size.width*(21/100), height: geometry.size.height*(15/100), filterName: "\(filter)")
    }
}

If we run the app now, we will see that our thumbnails have already applied the corresponding filter. Pretty cool!

More filters! 🎨

We’ve already seen how to apply a sepia effect. But we want to add a few more filters. So, let’s add the following cases to our FilterType enum:

enum FilterType {
    case Original
    case Sepia
    case Mono
    case Vibrance
}

Let’s add the new FilterTypes to the availableFilters array of our ContentView.

let availableFilters: [FilterType] = [.Original, .Sepia, .Mono, .Vibrance]

Consequently, we now have to adapt our generateFilteredImage function of the ImageController. First, we extend our switch statement with the case “.Mono”. 

switch filter {
//...
case .Mono:
    
}

For this effect, we would like to apply a built-in Core Image Filter again. But how do we know the name of a suitable filter for a black-and-white effect?

Let’s take a look at the Core Image Filter Reference. It lists the many filters provided by CoreImage, organized into categories, and gives us the information needed to use each of them. For a black-and-white effect, for example, we go to the section “CICategoryColorEffect”. Looking through this category, we find a filter called CIPhotoEffectMono. This filter is pre-configured and only needs a CIImage for the “inputImage” key.

We go back to our switch statement and add the following below our .Mono case:

case .Mono:
    let filter = CIFilter(name: "CIPhotoEffectMono")
    filter?.setValue(imageToEdit, forKey: "inputImage")

Now, we can render and output the edited image just as we did with our .Sepia effect.

case .Mono:
    let filter = CIFilter(name: "CIPhotoEffectMono")
    filter?.setValue(imageToEdit, forKey: "inputImage")

    if let output = filter?.outputImage {
        if let cgimg = context.createCGImage(output, from: output.extent) {
            let processedImage = UIImage(cgImage: cgimg)
            return processedImage
        }
    }

Next, we extend our switch statement with the .Vibrance case. For this effect, we use the CIVibrance filter, which we find in the CoreImage Filter reference under CICategoryColorAdjustment. 

There we see that – besides the “inputImage” – we also need a value for the “inputAmount” key to adjust the effect’s intensity. Correspondingly, we add the following code to our switch statement:

case .Vibrance:
    let filter = CIFilter(name: "CIVibrance")
    filter?.setValue(imageToEdit, forKey: "inputImage")
    filter?.setValue(1.0, forKey: "inputAmount")

    if let output = filter?.outputImage {
        if let cgimg = context.createCGImage(output, from: output.extent) {
            let processedImage = UIImage(cgImage: cgimg)
            return processedImage
        }
    }

If we run our app now, we can choose between two more filters!

Challenge: Try adding two more filters to the app: one to apply a highlight effect and one to add a vignette to the photo. Tip: The CoreImage filters CIHighlightShadowAdjust and CIVignette are a good fit for this. In the reference documentation, you can look up which keys they require.

Have a look at the finished project to see how you can add these filters.
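As a hint, a vignette case could follow the same pattern as the existing ones. Assuming you add a matching case (here called .Vignette) to the FilterType enum and to availableFilters, the switch case might look like this; the key names come from the Core Image Filter Reference, while the intensity and radius values are just illustrative starting points:

```swift
case .Vignette:
    let filter = CIFilter(name: "CIVignette")
    filter?.setValue(imageToEdit, forKey: "inputImage")
    // Illustrative values: inputIntensity controls the strength of the
    // darkening, inputRadius how far it reaches into the image.
    filter?.setValue(1.0, forKey: "inputIntensity")
    filter?.setValue(2.0, forKey: "inputRadius")

    if let output = filter?.outputImage {
        if let cgimg = context.createCGImage(output, from: output.extent) {
            return UIImage(cgImage: cgimg)
        }
    }
```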

Saving the edited photo ⬇️

A photo editing app would be pointless if we could not save the edited image. The implementation of this functionality is, fortunately, very straightforward. We add the following function to our ImageController:

func saveImage() {
    if let imageToSave = displayedImage {
        UIImageWriteToSavedPhotosAlbum(imageToSave, nil, nil, nil)
    } else {
        print("There is no image to be saved")
    }
}

The saveImage function calls the UIImageWriteToSavedPhotosAlbum method and saves the displayedImage to the user’s gallery.
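Note that saving to the photo library requires the user’s permission. Make sure your app’s Info.plist contains an NSPhotoLibraryAddUsageDescription entry; without it, the app will crash when UIImageWriteToSavedPhotosAlbum is called. The description text below is just an example:

```xml
<key>NSPhotoLibraryAddUsageDescription</key>
<string>This app saves your edited photos to your photo library.</string>
```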

We can now add the imageController as an EnvironmentObject to our SaveButton view and use the saveImage function when the user taps the “save” icon.

struct SaveButton: View {
    
    @EnvironmentObject var imageController: ImageController
    
    var body: some View {
        Button(action: {
            imageController.saveImage()
        }) {
            Image(systemName: "square.and.arrow.down")
                .imageScale(.large)
        }
    }
}

Conclusion 🎊

Congratulations, you just created your own photo filter app! We learned how to integrate a PHPicker into SwiftUI, saw how to use the CoreImage framework to apply effects to images, and figured out how to save the edited results.

If you want, you can extend this app’s functionality and, for example, offer other editing settings such as brightness or contrast control. The finished project also contains two more filters, so make sure you check it out!
