Taking control of the iPhone camera in iOS 8 with Swift (Part 2)

 

Using the AVFoundation API, we are going to set up a capture session and make an app that allows us to use all the new fine-grained controls added in iOS 8, including manually controlling focus, exposure, and ISO. In Part 1, we set up a basic camera preview and created a touch-based way to manually control focus. If you haven’t read that yet, you can read it here.

In this part of the tutorial, we’re going to dive into the API a little deeper and create a second axis for controlling our image, using the vertical touch position.

First off, let’s just clean up the code that determines how far into the screen a touch landed. Let’s create a function that takes in a UITouch object and returns a CGPoint specifying where we tapped, as a fraction of the screen’s total width and height. Since the screen’s origin is in the top-left corner, tapping the very top-left of the screen will return a CGPoint with values (0, 0), tapping the bottom-right will return (1, 1), the center of the screen (0.5, 0.5), and so on for every point in between.

func touchPercent(touch : UITouch) -> CGPoint {
    // Get the dimensions of the screen in points
    let screenSize = UIScreen.mainScreen().bounds.size
    
    // Create an empty CGPoint object set to 0, 0
    var touchPer = CGPointZero
    
    // Set the x and y values to be the value of the tapped position, divided by the width/height of the screen
    touchPer.x = touch.locationInView(self.view).x / screenSize.width
    touchPer.y = touch.locationInView(self.view).y / screenSize.height
    
    // Return the populated CGPoint
    return touchPer
}

This declares a method named touchPercent, which takes a UITouch object named touch as an argument and returns a CGPoint. The actual math is simple: divide the touched point’s coordinates by the total number of points across the width and height of the screen. We’re already using the x value for focus, but we need to adjust the touchesBegan and touchesMoved methods to use our new function.

override func touchesBegan(touches: NSSet, withEvent event: UIEvent) {
    let touchPer = touchPercent( touches.anyObject() as UITouch )
    focusTo(Float(touchPer.x))
}

override func touchesMoved(touches: NSSet, withEvent event: UIEvent) {
    let touchPer = touchPercent( touches.anyObject() as UITouch )
    focusTo(Float(touchPer.x))
}

Note: you can remove the old screenWidth variable we were declaring before.

Next, we want to specify the ISO. If you’re unfamiliar with the term, ISO is a setting that controls how sensitive a digital camera’s sensor is to light. A lower ISO gives cleaner, higher-quality pictures, but requires a slower shutter speed to gather enough light, resulting in blurrier photos of anything that moves; at a very low ISO a tripod becomes useful. With a very high ISO, you can take quick action shots with a fast shutter speed, but expect to see lower-quality images with more noise. I’m by no means a professional photographer, but that’s the gist of it.

Okay, moving on… let’s just add it to the app and see what it does for ourselves :)

First, let’s update the focusTo() method to take an isoValue in addition to a focusValue:

func updateDeviceSettings(focusValue : Float, isoValue : Float) {
    if let device = captureDevice {
        if(device.lockForConfiguration(nil)) {
            device.setFocusModeLockedWithLensPosition(focusValue, completionHandler: { (time) -> Void in
                //
            })
            
            // Map the 0.0-1.0 isoValue proportionally on to the ISO range supported by the active format
            let minISO = device.activeFormat.minISO
            let maxISO = device.activeFormat.maxISO
            let clampedISO = isoValue * (maxISO - minISO) + minISO
            
            device.setExposureModeCustomWithDuration(AVCaptureExposureDurationCurrent, ISO: clampedISO, completionHandler: { (time) -> Void in
                //
            })
            
            device.unlockForConfiguration()
        }
    }
}

The first new thing in this function is the name: it’s now updateDeviceSettings, because we’re adjusting more than just focus. We’ve also specified focusValue and isoValue as distinct parameters.

Next, we have the usual locking and setting of the focus, followed by an additional bit of code that stores the minISO and maxISO for the active format, then scales our isoValue to fit proportionally between those two values.

For example, if minISO is 50 and maxISO is 100, a value of 0 passed in as isoValue results in a clampedISO of 50, while a value of 0.5 results in 75 (0.5 × (100 − 50) + 50).

After adjusting this value, we can call the method that actually sets the ISO: setExposureModeCustomWithDuration().

The first parameter to this method is the exposure duration (shutter speed). We aren’t trying to specify a shutter speed here, so we simply pass the constant AVCaptureExposureDurationCurrent, which tells the device to keep its current exposure duration and modify only the ISO. Second is our clampedISO value, and finally we have a completionHandler, just as with the setFocusModeLockedWithLensPosition() method.
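The last piece, which the test below relies on, is updating the touch handlers to call the renamed method and pass the vertical touch position as the ISO value. Here’s a minimal sketch, reusing the touchPercent() helper from earlier:

override func touchesBegan(touches: NSSet, withEvent event: UIEvent) {
    let touchPer = touchPercent( touches.anyObject() as UITouch )
    // Horizontal position drives focus, vertical position drives ISO
    updateDeviceSettings(Float(touchPer.x), isoValue: Float(touchPer.y))
}

override func touchesMoved(touches: NSSet, withEvent event: UIEvent) {
    let touchPer = touchPercent( touches.anyObject() as UITouch )
    updateDeviceSettings(Float(touchPer.x), isoValue: Float(touchPer.y))
}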

Try running the app and sliding your finger from the top to the bottom of the screen. Do you see the difference in the general pixel brightness? This is the ISO being modified in real time.

In the next part, we’ll dig a little deeper and naturally, implement some more UI to control these settings.

Here is the final code from this post:
Part 2 on Github


Apple Watch Announced. Here’s what we know so far

 

This page is a work in progress.

There are three versions of Apple Watch:

  • Apple Watch
  • Apple Watch Sport
  • Apple Watch Edition – 18k Gold

The price starts at $349, and the watch will be available in early 2015.


An iPhone is required to use the Apple Watch.

The device uses the Digital Crown as an input device, in addition to two types of touch detection, a ‘tap’ and a ‘force push’, the difference being how hard you press on the screen.

The Apple Watch provides support for Maps, Calendar, iMessage, Siri, and Dictation, connects to wireless speakers, and ties in to a variety of other Apple ecosystem features.

The battery charges via a wireless charger that attaches magnetically to the back of the watch. The watch uses haptic feedback for navigation and other use cases.

The second button on the watch brings up your list of contacts. Digital Touch allows you to ‘tap’ someone remotely on their watch, and drawings sketched on the screen sync wirelessly to other Apple Watch wearers.

Users can also share their heartbeat, letting the other user feel it through the haptics, or chat using the walkie-talkie feature.

 


WatchKit allows developers to create Glances, apps, and notifications for the watch.

The device can be used to track heart rate, and makes use of an accelerometer.

The built-in Activity app measures standing, moving, and exercising. These categories are represented as rings, and hitting a daily quota visually fills the ring:

  • Move – measures calories burned
  • Exercise – measures brisk activity, at a brisk walk or above
  • Stand – measures how often you stood up from sitting


Taking control of the iPhone camera in iOS 8 with Swift (Part 1)

 

Updated on September 20, 2014 for Xcode 6 GM

Using the AVFoundation API, we are going to set up a capture session and make an app that allows us to use all the new fine-grained controls added to iOS 8. This includes manually controlling focus, exposure, and ISO. First off, we just need to set up a basic camera preview. By the end of Part 1 we’ll have that in place along with a nifty way to control focus. Ready? Let’s get going…

First off, we’ll create a new Xcode project using Swift as the language, and the Single View Application template.


Now, in the ViewController.swift file, we can start adding our custom code inside viewDidLoad().

First, we create an AVCaptureSession object to work with. Let’s do this as a class variable:

let captureSession = AVCaptureSession()

This may give an error due to the compiler not being able to find AVCaptureSession, so near the top of the file make sure to add:

import AVFoundation

Now, in viewDidLoad, let’s set our quality settings and find a device to record from.

First, let’s take a look at the list of available devices:

captureSession.sessionPreset = AVCaptureSessionPresetLow
let devices = AVCaptureDevice.devices()
println(devices)

Run this and you’ll see something like this:

[<AVCaptureFigVideoDevice: 0x16e7f720 [Back Camera][com.apple.avfoundation.avcapturedevice.built-in_video:0]>,
<AVCaptureFigVideoDevice: 0x16d91a00 [Front Camera][com.apple.avfoundation.avcapturedevice.built-in_video:1]>,
<AVCaptureFigAudioDevice: 0x16e88c00 [iPhone Microphone][com.apple.avfoundation.avcapturedevice.built-in_audio:0]>]

This is from my iPhone 5S. Looks like we have the front and back cameras, plus a microphone. Cool. For our purposes let’s try and grab the back camera.

Let’s add this to our ViewController, and store the back camera if we find one:

import UIKit
import AVFoundation

class ViewController: UIViewController {

    let captureSession = AVCaptureSession()

    // If we find a device we'll store it here for later use
    var captureDevice : AVCaptureDevice?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        captureSession.sessionPreset = AVCaptureSessionPresetLow

        let devices = AVCaptureDevice.devices()

        // Loop through all the capture devices on this phone
        for device in devices {
            // Make sure this particular device supports video
            if (device.hasMediaType(AVMediaTypeVideo)) {
                // Finally check the position and confirm we've got the back camera
                if(device.position == AVCaptureDevicePosition.Back) {
                    captureDevice = device as? AVCaptureDevice
                }
            }
        }

    }

}

After we set the captureDevice, let’s check that we actually found one, and begin the session:

if captureDevice != nil {
    beginSession()
}
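
One thing to note before we implement it: beginSession() references a previewLayer property, so declare it as a class property alongside captureSession (you can see it in place in the final listing at the end of this post):

var previewLayer : AVCaptureVideoPreviewLayer?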

…and later in the class we implement beginSession()…

func beginSession() {
    var err : NSError? = nil
    captureSession.addInput(AVCaptureDeviceInput(device: captureDevice, error: &err))

    if err != nil {
        println("error: \(err?.localizedDescription)")
    }

    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    self.view.layer.addSublayer(previewLayer)
    previewLayer?.frame = self.view.layer.frame
    captureSession.startRunning()
}

If you run the app on a device now, you should see a preview from the camera. So far this is pretty much just the standard iOS camera. Let’s now modify the focus mode. Add a new method called configureDevice() and have beginSession() call it on its first line, before starting the capture session.

func configureDevice() {
    if let device = captureDevice {
        device.lockForConfiguration(nil)
        device.focusMode = .Locked
        device.unlockForConfiguration()
    }
}

Add this method to the class; it locks the device, sets the focus mode to locked, and then unlocks the device.
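
For clarity, here’s a sketch of how the top of beginSession() looks once it calls configureDevice() first; it matches the final listing at the end of the post:

func beginSession() {

    // Lock in the focus configuration before the session starts running
    configureDevice()

    var err : NSError? = nil
    captureSession.addInput(AVCaptureDeviceInput(device: captureDevice, error: &err))

    // ... the rest of beginSession() is unchanged
}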

Run the app now and try tapping to focus on different parts of the scene. The default tap-to-focus behavior should now be disabled, which means we can control the focus on our own. Let’s add a touch-based way to do just that.

Now, let’s add a manual focusTo() function that takes a value from 0.0 to 1.0:

func focusTo(value : Float) {
    if let device = captureDevice {
        if(device.lockForConfiguration(nil)) {
            device.setFocusModeLockedWithLensPosition(value, completionHandler: { (time) -> Void in
                //
            })
            device.unlockForConfiguration()
        }
    }
}

First, we validate that the device exists, then we lock it for configuration. If the lock is successful, we call the setFocusModeLockedWithLensPosition() API to move the lens to the position value passed in to focusTo(), where 0.0 is the nearest focus position and 1.0 is the farthest.

Now let’s implement touch controls using these methods:

let screenWidth = UIScreen.mainScreen().bounds.size.width
override func touchesBegan(touches: NSSet, withEvent event: UIEvent) {
    var anyTouch = touches.anyObject() as UITouch
    var touchPercent = anyTouch.locationInView(self.view).x / screenWidth
    focusTo(Float(touchPercent))
}

override func touchesMoved(touches: NSSet, withEvent event: UIEvent) {
    var anyTouch = touches.anyObject() as UITouch
    var touchPercent = anyTouch.locationInView(self.view).x / screenWidth
    focusTo(Float(touchPercent))
}

This just gets a value from 0.0 to 1.0 based on how far across the screen you are touching horizontally, and passes it to focusTo(). Run the app now and slide a finger left to right on the screen. You can manually control the focus this way! Cool, right?

Next time we’ll add an option for manually setting the ISO and exposure. But for now, this is a start. Make sure to subscribe to my newsletter to be notified of Part 2. Coming soon!

Want a deeper look at the AVFoundation API? Pre-order my upcoming book on developing iOS 8 Apps in Swift.

Here is the final code from this post:

Part 1 on Github

import UIKit
import AVFoundation

class ViewController: UIViewController {

    let captureSession = AVCaptureSession()
    var previewLayer : AVCaptureVideoPreviewLayer?

    // If we find a device we'll store it here for later use
    var captureDevice : AVCaptureDevice?

    override func viewDidLoad() {
        super.viewDidLoad()

        // Do any additional setup after loading the view, typically from a nib.
        captureSession.sessionPreset = AVCaptureSessionPresetHigh

        let devices = AVCaptureDevice.devices()

        // Loop through all the capture devices on this phone
        for device in devices {
            // Make sure this particular device supports video
            if (device.hasMediaType(AVMediaTypeVideo)) {
                // Finally check the position and confirm we've got the back camera
                if(device.position == AVCaptureDevicePosition.Back) {
                    captureDevice = device as? AVCaptureDevice
                    if captureDevice != nil {
                        println("Capture device found")
                        beginSession()
                    }
                }
            }
        }

    }

    func focusTo(value : Float) {
        if let device = captureDevice {
            if(device.lockForConfiguration(nil)) {
                device.setFocusModeLockedWithLensPosition(value, completionHandler: { (time) -> Void in
                    //
                })
                device.unlockForConfiguration()
            }
        }
    }

    let screenWidth = UIScreen.mainScreen().bounds.size.width
    override func touchesBegan(touches: NSSet, withEvent event: UIEvent) {
        var anyTouch = touches.anyObject() as UITouch
        var touchPercent = anyTouch.locationInView(self.view).x / screenWidth
        focusTo(Float(touchPercent))
    }

    override func touchesMoved(touches: NSSet, withEvent event: UIEvent) {
        var anyTouch = touches.anyObject() as UITouch
        var touchPercent = anyTouch.locationInView(self.view).x / screenWidth
        focusTo(Float(touchPercent))
    }

    func configureDevice() {
        if let device = captureDevice {
            device.lockForConfiguration(nil)
            device.focusMode = .Locked
            device.unlockForConfiguration()
        }

    }

    func beginSession() {

        configureDevice()

        var err : NSError? = nil
        captureSession.addInput(AVCaptureDeviceInput(device: captureDevice, error: &err))

        if err != nil {
            println("error: \(err?.localizedDescription)")
        }

        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        self.view.layer.addSublayer(previewLayer)
        previewLayer?.frame = self.view.layer.frame
        captureSession.startRunning()
    }

}
