Add filters to a Selfie Portrait App with CIFilters.

Wow, this week has been awesome so far for iOS developers. WWDC showed a lot of cool new features in Swift 4 and also gave us access to Xcode 9 and iOS 11. Although this post is not related to any of these new features, I can’t start without mentioning how excited I am about the new frameworks such as ARKit, my favorite one so far!

If you want to check out more about what’s new in Swift 4, go to this link.

Let’s go back to our project. Adding filters to images in camera-based apps has become a standard feature that gives users the chance to play around with their media, and it’s very simple to implement using Core Image.
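Before we touch the project, here is a minimal sketch of what applying a single filter with Core Image looks like in isolation; the helper function and the idea of passing in a filter name are just illustrative, not part of the starter project.

import UIKit
import CoreImage

// Minimal sketch: run a UIImage through a CIFilter and get a new UIImage back.
// The function name and parameters are illustrative only.
func applyFilter(named filterName: String, to image: UIImage) -> UIImage? {
    guard let ciImage = CIImage(image: image),
        let filter = CIFilter(name: filterName) else { return nil }

    filter.setValue(ciImage, forKey: kCIInputImageKey)

    let context = CIContext()
    guard let output = filter.outputImage,
        let cgImage = context.createCGImage(output, from: output.extent) else { return nil }

    return UIImage(cgImage: cgImage)
}

For example, applyFilter(named: "CISepiaTone", to: photo) would give you a sepia version of the photo. We will do essentially this for every cell of the collection view later on.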

I have to throw out a disclaimer before we start: this is not a tutorial like the others I have written before, it’s more like a documented sample project that I hope can be helpful in your own apps. Most of the camera logic is already done; you will download a Snapchat-like full-screen camera that I built using AVFoundation, and we will focus just on the process after taking a photo. So if you are not familiar with the AVFoundation framework, here is the resource that I used to build the camera, and here is the starter project if you want to follow along.

Let’s take a quick look at the content of the project. First, notice that we are doing this project 100% programmatically, and there are a few directories with files in them; let’s go to the directory called Model.

You can see a struct called CameraStateTracker that keeps track of the state of the camera, a CameraController class that encapsulates all of the camera configuration logic, and a PhotoCaptureDelegate class that is in charge of providing the captured image. In the Controller directory, we can find a view controller called CameraVC, our main view controller, which is in charge of mediating between our models and the UI. I know a lot is already done, but again, we will focus on the Core Image framework.

Finally, there are some assets for the camera that I made myself and some that I found on Google’s Material Design website. Now run the project on a device.

You can see that the app crashes with the following error in the console…

The app’s Info.plist must contain an NSCameraUsageDescription key with a string value explaining to the user how the app uses this data.

That’s an easy fix; we will add two keys to the Info.plist:

Privacy – Camera Usage Description 

Privacy – Photo Library Usage Description
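If you prefer editing the raw Info.plist source, these two entries correspond to the NSCameraUsageDescription and NSPhotoLibraryUsageDescription keys; the description strings below are just examples, write whatever fits your app.

<key>NSCameraUsageDescription</key>
<string>We need the camera so you can take selfie portraits.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>We need access to your photo library to save your filtered photos.</string>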

Either way, you need to provide a String for these two keys explaining why you need access. After that, run the app again, and you should see this…

[Photo]

So this camera handles flash and switches between the rear and front cameras, but as you can tell, it doesn’t take pictures yet, so let’s start implementing that now, beginning with the UI.

Because we want to give this camera the ability to filter an image by swiping, we will add a horizontal collection view inside a custom view that will handle the filter responsibility, plus a cell that will display the image filtered based on its index.
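To make “filtered based on its index” concrete, here is a hedged sketch of the kind of filter list the collection view could index into; the filter names are real built-in CIFilter names, but the array itself is only an assumption about how the final view might store them.

// Illustrative only: one built-in CIFilter name per cell index,
// with nil meaning "no filter" for the first cell.
let filterNames: [String?] = [
    nil,                      // original photo
    "CIPhotoEffectChrome",
    "CIPhotoEffectFade",
    "CIPhotoEffectInstant",
    "CIPhotoEffectNoir",
    "CISepiaTone"
]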

Let’s start with the cell. Create a new file, call it FilterCell, and add this…

import UIKit

final class FilterCell: UICollectionViewCell {
    // Image view that fills the cell and shows the filtered photo.
    let photoImageView: UIImageView = {
        let iv = UIImageView()
        iv.translatesAutoresizingMaskIntoConstraints = false
        iv.contentMode = .scaleAspectFill
        iv.clipsToBounds = true
        return iv
    }()
    
    override init(frame: CGRect) {
        super.init(frame: frame)
        setUpViews()
    }
    
    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
    
    // Pin the image view to all four edges of the cell's content view.
    private func setUpViews() {
        contentView.addSubview(photoImageView)
        NSLayoutConstraint.activate([
            photoImageView.topAnchor.constraint(equalTo: contentView.topAnchor),
            photoImageView.leadingAnchor.constraint(equalTo: contentView.leadingAnchor),
            photoImageView.trailingAnchor.constraint(equalTo: contentView.trailingAnchor),
            photoImageView.bottomAnchor.constraint(equalTo: contentView.bottomAnchor)
            ])
    }

    // Filtering may happen off the main thread, so hop back to the main queue
    // before touching the image view.
    func setUpcell(_ image: UIImage?) {
        DispatchQueue.main.async {
            self.photoImageView.image = image
        }
    }
}

This cell has an image view anchored to its edges. Now we need to create a UICollectionViewFlowLayout subclass for the horizontally scrollable collection view. Create a new file, call it ListLayout, and add this implementation…

import UIKit

final class ListLayout: UICollectionViewFlowLayout {
    override init() {
        super.init()
        // Full-screen, edge-to-edge pages that scroll horizontally.
        minimumLineSpacing = 0
        scrollDirection = .horizontal
    }
    
    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
    
    // Every item takes up the whole collection view, so we ignore any value
    // set from the outside and always return the collection view's size.
    override var itemSize: CGSize {
        set { }
        get {
            guard let collectionView = collectionView else { return .zero }
            return CGSize(width: collectionView.frame.width, height: collectionView.frame.height)
        }
    }
}
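We will plug these two pieces into the custom filter view next. As a preview, here is a hedged sketch of how a view could wire ListLayout and FilterCell together; the view name, the cell identifier, and the details are assumptions for illustration, not the final implementation.

import UIKit

// Hedged sketch only: shows how ListLayout and FilterCell could fit together.
// The real custom view we build may differ.
final class FilterViewSketch: UIView {
    private let cellId = "FilterCell"

    lazy var collectionView: UICollectionView = {
        let cv = UICollectionView(frame: .zero, collectionViewLayout: ListLayout())
        cv.translatesAutoresizingMaskIntoConstraints = false
        cv.isPagingEnabled = true          // snap to one filter per swipe
        cv.showsHorizontalScrollIndicator = false
        cv.register(FilterCell.self, forCellWithReuseIdentifier: cellId)
        return cv
    }()

    override init(frame: CGRect) {
        super.init(frame: frame)
        addSubview(collectionView)
        NSLayoutConstraint.activate([
            collectionView.topAnchor.constraint(equalTo: topAnchor),
            collectionView.leadingAnchor.constraint(equalTo: leadingAnchor),
            collectionView.trailingAnchor.constraint(equalTo: trailingAnchor),
            collectionView.bottomAnchor.constraint(equalTo: bottomAnchor)
            ])
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}

Setting isPagingEnabled to true is what gives the Snapchat-like feel of snapping from one filter to the next as you swipe.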