Photography changed for me when I rented the Leica M Monochrom. It was a lot of firsts: manual focus, an unreliable meter, a rangefinder instead of TTL, and no color.
It was hard.
Not annoying, but hard. I learned a lot from such a different experience. Even though I've never fancied film, the Monochrom made me realize how much I could learn from shooting it.
Most of my friends learned photography on black and white film. Shooting in black and white forces you to think about light, exposure, and tone without the distraction of color.
Since I know more about computer algorithms than film processing, I decided to create my own digital film, plus a RAW+JPEG camera app for iPhone to shoot with it. It was the perfect weekend project, and I thought others could benefit from what I learned.
In the next few articles I talk about how to build digital film (i.e., a series of CoreImage filters), and a camera app to test them out. Today, we're gonna discuss the infrastructure and basic app we need for our later experiments.

If you've never played around with AVFoundation or CoreImage, this could serve as a gentle introduction.
Before we go any further: This is naive code that works on my everyday carry1, and may not perform well on other devices. If you continue reading, I expect you will adhere to my disclaimer2.
The app is a single view controller, with a single button to take a photo, and a viewfinder to compose the shot. That’s it.
In the view controller, we need:

- An AVCaptureSession with a target triple: an AVCaptureDevice, an AVCaptureInput, and an AVCaptureOutput
- A CIFilter stack, which will serve as our film processor

If you've never used AVFoundation, setting it up to capture camera data is straightforward. Everything is stored in an AVCaptureSession. It contains inputs (based on your chosen device), and outputs that you wire into the session.
let session = AVCaptureSession()
let output = AVCapturePhotoOutput()
let device = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)

func setupAVSession() {
    session.beginConfiguration()

    // Configure for full-resolution still capture
    session.sessionPreset = AVCaptureSessionPresetPhoto

    // Wire the rear camera in as the session's input
    let input = try! AVCaptureDeviceInput(device: device)
    if session.canAddInput(input) {
        session.addInput(input)
    }

    // Wire the photo output into the session
    if session.canAddOutput(output) {
        session.addOutput(output)
    }

    session.commitConfiguration()
}
First, defaultDevice(withMediaType:) returns the rear camera for the AVMediaTypeVideo media type. Second, I use the AVCaptureSessionPresetPhoto session preset and pair it with an AVCapturePhotoOutput to configure the session to capture 12 megapixel stills. AVCapturePhotoOutput is a brand new class in iOS 10 which makes still photo capture a piece of cake.
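To give you a sense of that, here's a minimal sketch of triggering a capture with it. The shutterTapped action name is my own placeholder, and the view controller would also need to adopt AVCapturePhotoCaptureDelegate to receive the result; we'll cover that (and the RAW+JPEG settings) in the next article.

@IBAction func shutterTapped(_ sender: UIButton) {
    // Default settings produce a plain JPEG capture; RAW+JPEG settings come later.
    let settings = AVCapturePhotoSettings()
    output.capturePhoto(with: settings, delegate: self)
}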
In later articles, we'll write a more complicated chain of filters. For our first build, let's keep things simple and use the built-in CIPhotoEffectNoir filter. This yields a black and white image with exaggerated contrast. Before I got into more realistic film simulations, I took several nice photographs using just this filter.
First, I define a FilterType enum to delineate each filter:
enum FilterType {
    case noir, none

    func apply(toImage image: CIImage) -> CIImage {
        switch self {
        case .noir:
            return applyNoirFilter(toImage: image)
        case .none:
            return image
        }
    }

    private func applyNoirFilter(toImage image: CIImage) -> CIImage {
        let params = [kCIInputImageKey: image]
        guard let bwFilter = CIFilter(name: "CIPhotoEffectNoir", withInputParameters: params)
            else { return image }
        guard let result = bwFilter.outputImage
            else { return image }
        return result
    }
}
With FilterType defined, writing a simple manager to maintain the currently selected filter is trivial:
struct FilterProcessor {
    var selectedFilter: FilterType = .noir

    func convertedImage(forSelectedFilter input: CIImage) -> CIImage {
        return selectedFilter.apply(toImage: input)
    }
}
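As a quick sanity check, this is all it takes to run a frame through the processor. The cameraFrame name is just a stand-in for any CIImage; in the app, frames will come from the capture session.

var filterManager = FilterProcessor()
filterManager.selectedFilter = .noir    // or .none to pass frames through untouched
let filtered = filterManager.convertedImage(forSelectedFilter: cameraFrame)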
So far, both frameworks have provided exactly what we needed. Unfortunately, AVFoundation doesn't have a pre-built affordance for a filtered viewfinder on iOS.

The standard approach uses an AVCaptureVideoPreviewLayer and its constructor init(session:). That returns a CALayer which can be added to your view and yields a high frame-rate representation of your capture input. However, on iOS, you can't set the filters property on this layer.
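For reference, the conventional (unfiltered) viewfinder looks roughly like this; I'm assigning the session after construction, but init(session:) amounts to the same thing. It's fast, but since filters is ignored on iOS, our noir look would never show up in it.

// The conventional viewfinder: great frame rate, but it can't render our CIFilter on iOS.
let previewLayer = AVCaptureVideoPreviewLayer()
previewLayer.session = session
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
previewLayer.frame = view.bounds
view.layer.addSublayer(previewLayer)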
Instead, we have to manually filter the view. The manual approach requires the use of a UIImageView, and we have to add a second output to our session: AVCaptureVideoDataOutput.

In your setupAVSession() method, add the following:
let bufferQueue = DispatchQueue.global(qos: .userInteractive)

let videoOutput = AVCaptureVideoDataOutput()
videoOutput.setSampleBufferDelegate(self, queue: bufferQueue)
if session.canAddOutput(videoOutput) {
    session.addOutput(videoOutput)
}
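Note that passing self as the sample buffer delegate assumes the view controller adopts AVCaptureVideoDataOutputSampleBufferDelegate. The class name below is just a stand-in for whatever yours is called:

// Hypothetical declaration: the protocol conformance is what matters.
class ViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    // session, output, device, and setupAVSession() from above live here
}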
AVCaptureVideoDataOutput gives us access to each frame, vended from the capture source to captureOutput(_:didOutputSampleBuffer:from:). Then, we simply need to apply our FilterType to each frame, and set the resulting image in our UIImageView.
@IBOutlet weak var cameraView: UIImageView!
let filterManager = FilterProcessor()

func captureOutput(_ captureOutput: AVCaptureOutput!,
                   didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                   from connection: AVCaptureConnection!) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

    // Wrap the raw pixel buffer so CoreImage can work with it, then run our filter
    let cameraImage = CIImage(cvPixelBuffer: pixelBuffer)
    let filteredImage = filterManager.convertedImage(forSelectedFilter: cameraImage)
    let finalImage = UIImage(ciImage: filteredImage)

    // UIKit updates must happen on the main thread
    DispatchQueue.main.async {
        self.cameraView.image = finalImage
    }
}
We have to repackage the image buffer as a CIImage so we can filter it, then export the result as a UIImage. This can all be done on the same background queue where we received the original frame. To update the UI (i.e., our UIImageView), we need to be on the main thread.
Finally, the viewDidLoad() method of your view controller should look like this:
override func viewDidLoad() {
    super.viewDidLoad()

    setupAVSession()
    session.startRunning()
}
If you run the code as-is (and your device orientation is not .landscapeLeft), the viewfinder will be distorted, and/or upside-down. To simplify the calculations, I disable auto-rotation and lock device orientation to .portrait.
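Locking things down is just the standard UIViewController boilerplate; something along these lines in the view controller does the trick:

// Disable auto-rotation and pin the interface to portrait.
override var shouldAutorotate: Bool {
    return false
}

override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
    return .portrait
}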
Assuming device orientation is always portrait, you can add the following line to the delegate method after we define the cameraImage variable:
var cameraImage = CIImage(cvPixelBuffer: pixelBuffer)
// Image Origin for Portrait is (Right, Top), which is the number 6
// See docs for kCGImagePropertyOrientation
cameraImage = cameraImage.applyingOrientation(6)
The core infrastructure for our filter test harness is fleshed out. AVFoundation is vending camera data, we have a CIFilter stack to filter that data, and we have a viewfinder to see a live preview of the filtered camera data.
In the next article we'll get into the details of capturing a photo in RAW+JPEG format, and then saving it into the device photo library. That will give us a functioning camera app with which we can begin testing our digital film.