Apple Watch Announced. Here’s what we know so far

 

Apple Watch

This page is a work in progress.

There are three versions of Apple Watch:

  • Apple Watch
  • Apple Watch Sport
  • Apple Watch Edition – 18k Gold

Pricing starts at $349, and the watch will be available in early 2015.

Apple Watch

An iPhone is required to use Apple Watch.

The device uses the Digital Crown as an input device, in addition to two types of touch detection: a ‘tap’ and a ‘force push’, the difference being how hard you press on the screen.

The Apple Watch supports Maps, Calendar, iMessage, Siri, and Dictation, connects to wireless speakers, and offers a variety of other Apple ecosystem features.

The battery charges via a wireless charger that attaches magnetically to the back of the watch. The watch uses haptic feedback for navigation and other use cases.

The second button on the watch brings up your list of contacts. Digital Touch lets you ‘tap’ someone remotely on their watch, and sketches drawn on the screen sync wirelessly to other Apple Watch wearers.

Users can also share their heartbeat, letting the other person feel it on their wrist, or chat using the walkie-talkie feature.

 

Apple Watch Edition

WatchKit allows developers to create glances, apps, and notifications.

The device can track heart rate and makes use of accelerometers.

The built-in Activity app measures standing, moving, and exercising. These categories are represented as rings, and hitting a quota visually fills the corresponding ring.

  • Move – measures calories burned
  • Exercise – measures brisk activity, at the level of a brisk walk or above
  • Stand – measures how often you stand up from sitting


Taking control of the iPhone camera in iOS 8 with Swift (Part 1)

 

Using the AVFoundation API, we are going to set up a capture session and make an app that allows us to use all the new fine-grained controls added to iOS 8. This includes manually controlling focus, exposure, and ISO. First off, we just need to set up a basic camera preview. By the end of Part 1 we’ll have that in place along with a nifty way to control focus. Ready? Let’s get going…

To start, we’ll create a new Xcode project using Swift as the language and the Single View Application template.

Now, in the ViewController.swift file we can start adding in our custom code inside of viewDidLoad().

Next we create an AVCaptureSession object to work with. Let’s do this as a property on the class.

let captureSession = AVCaptureSession()

Now, in viewDidLoad let’s set our quality settings and find a device to record from.

First, let’s take a look at the list of devices available.

captureSession.sessionPreset = AVCaptureSessionPresetLow
let devices = AVCaptureDevice.devices()
println(devices)

Run this and you’ll see something like this:

[<AVCaptureFigVideoDevice: 0x16e7f720 [Back Camera][com.apple.avfoundation.avcapturedevice.built-in_video:0]>,
<AVCaptureFigVideoDevice: 0x16d91a00 [Front Camera][com.apple.avfoundation.avcapturedevice.built-in_video:1]>,
<AVCaptureFigAudioDevice: 0x16e88c00 [iPhone Microphone][com.apple.avfoundation.avcapturedevice.built-in_audio:0]>]

This is from my iPhone 5S. Looks like we have the front and back cameras, plus the built-in microphone. Cool. For our purposes, let’s try to grab the back camera.

Let’s add this to our ViewController, and store the back camera if we find one:

import UIKit
import AVFoundation

class ViewController: UIViewController {
    
    let captureSession = AVCaptureSession()
    
    // If we find a device we'll store it here for later use
    var captureDevice : AVCaptureDevice?
                            
    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        captureSession.sessionPreset = AVCaptureSessionPresetLow
        
        let devices = AVCaptureDevice.devices()
        
        // Loop through all the capture devices on this phone
        for device in devices {
            // Make sure this particular device supports video
            if (device.hasMediaType(AVMediaTypeVideo)) {
                // Finally check the position and confirm we've got the back camera
                if(device.position == AVCaptureDevicePosition.Back) {
                    captureDevice = device as? AVCaptureDevice
                }
            }
        }
        
    }

}
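As an aside, and not part of the original walkthrough: AVFoundation can also hand back a default video capture device directly, which on an iPhone is normally the back camera. A minimal sketch of that shortcut:

// Shorter alternative to looping over devices(): ask AVFoundation for the
// default video device, which is typically the back camera on an iPhone.
captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)

The loop above is still worth keeping, though, since it shows how to inspect positions and media types when you need a specific camera.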

After we set captureDevice, let’s begin the session by calling a function that starts it:

if captureDevice != nil {
    beginSession()
}

…and later in the class we implement beginSession(). (Note that previewLayer is declared as a property on the class, var previewLayer : AVCaptureVideoPreviewLayer?, as shown in the full listing at the end of this post.)

func beginSession() {
    var err : NSError? = nil
    captureSession.addInput(AVCaptureDeviceInput(device: captureDevice, error: &err))
    
    if err != nil {
        println("error: \(err?.localizedDescription)")
    }
    
    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    self.view.layer.addSublayer(previewLayer)
    previewLayer?.frame = self.view.layer.frame
    captureSession.startRunning()
}

If you run the app on a device now, you should see a preview of the camera. This is pretty much just the standard iOS camera. Let’s now modify the focus mode. Add a new method called configureDevice() and have beginSession() call it on its first line, before starting the capture session.

func configureDevice() {
    if let device = captureDevice {
        device.lockForConfiguration(nil)
        device.focusMode = .Locked
        device.unlockForConfiguration()
    }
}

This method locks the device for configuration, sets the focus mode to locked, and then unlocks the device.
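Note that we’re passing nil for the error pointer and ignoring the Bool that lockForConfiguration() returns. A slightly more defensive variant (a sketch, not required for this tutorial) would check that the lock succeeded before touching the focus mode:

func configureDevice() {
    if let device = captureDevice {
        var err : NSError? = nil
        // Only change the focus mode if we actually obtained the lock
        if device.lockForConfiguration(&err) {
            device.focusMode = .Locked
            device.unlockForConfiguration()
        } else {
            println("Could not lock device for configuration: \(err?.localizedDescription)")
        }
    }
}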

Run the app now and try tapping to focus on different parts of the scene. The default focus behavior should now be disabled, which means we can control the focus on our own. Next, let’s add a way to set the focus manually.

Now let’s add a focusTo() function that takes a value from 0.0 to 1.0:

func focusTo(value : Float) {
    if let device = captureDevice {
        if(device.lockForConfiguration(nil)) {
            device.setFocusModeLockedWithLensPosition(value, completionHandler: { (time) -> Void in
                //
            })
            device.unlockForConfiguration()
        }
    }
}

First we check that the device exists, then we lock it. If the lock succeeds, we call the setFocusModeLockedWithLensPosition() API to move the lens to the position given by value, which is passed into the focusTo() method.

Now let’s implement touch controls using these methods:

override func touchesBegan(touches: NSSet!, withEvent event: UIEvent!) {
    var touchPercent = Float(touches.anyObject().locationInView(self.view).x / 320.0)
    focusTo(touchPercent)
}

override func touchesMoved(touches: NSSet!, withEvent event: UIEvent!) {
    var touchPercent = Float(touches.anyObject().locationInView(self.view).x / 320.0)
    focusTo(touchPercent)
}

This just maps the horizontal touch position to a value from 0.0 to 1.0 (320.0 is the width, in points, of the iPhone 5S screen). Run the app now and slide a finger left to right across the screen; you can now control the focus manually. Cool, right?
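One small tweak worth considering (a sketch, not from the original post): instead of hard-coding 320.0, divide by the view’s actual width so the math holds on any screen size:

override func touchesMoved(touches: NSSet!, withEvent event: UIEvent!) {
    if let touch = touches.anyObject() as? UITouch {
        // Map the horizontal touch position to 0.0 to 1.0 using the real view width
        let touchPercent = Float(touch.locationInView(self.view).x / self.view.bounds.size.width)
        focusTo(touchPercent)
    }
}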

Next time we’ll add an option for manually setting the ISO and exposure. But for now, this is a start. Make sure to subscribe to my newsletter to be notified of Part 2. Coming soon!

Want a deeper look at the AVFoundation API? Pre-order my upcoming book on developing iOS 8 Apps in Swift.

Here is the final code from this post:

import UIKit
import AVFoundation

class ViewController: UIViewController {
    
    let captureSession = AVCaptureSession()
    var previewLayer : AVCaptureVideoPreviewLayer?
    
    // If we find a device we'll store it here for later use
    var captureDevice : AVCaptureDevice?
                            
    override func viewDidLoad() {
        super.viewDidLoad()

        // Do any additional setup after loading the view, typically from a nib.
        captureSession.sessionPreset = AVCaptureSessionPresetHigh
        
        let devices = AVCaptureDevice.devices()
        
        // Loop through all the capture devices on this phone
        for device in devices {
            // Make sure this particular device supports video
            if (device.hasMediaType(AVMediaTypeVideo)) {
                // Finally check the position and confirm we've got the back camera
                if(device.position == AVCaptureDevicePosition.Back) {
                    captureDevice = device as? AVCaptureDevice
                    if captureDevice != nil {
                        println("Capture device found")
                        beginSession()
                    }
                }
            }
        }
        
    }
    
    func focusTo(value : Float) {
        if let device = captureDevice {
            if(device.lockForConfiguration(nil)) {
                device.setFocusModeLockedWithLensPosition(value, completionHandler: { (time) -> Void in
                    //
                })
                device.unlockForConfiguration()
            }
        }
    }
    
    override func touchesBegan(touches: NSSet!, withEvent event: UIEvent!) {
        var touchPercent = Float(touches.anyObject().locationInView(self.view).x / 320.0)
        focusTo(touchPercent)
    }

    override func touchesMoved(touches: NSSet!, withEvent event: UIEvent!) {
        var touchPercent = Float(touches.anyObject().locationInView(self.view).x / 320.0)
        focusTo(touchPercent)
    }
    
    func configureDevice() {
        if let device = captureDevice {
            device.lockForConfiguration(nil)
            device.focusMode = .Locked
            device.unlockForConfiguration()
        }
        
    }
    
    func beginSession() {
        
        configureDevice()
        
        var err : NSError? = nil
        captureSession.addInput(AVCaptureDeviceInput(device: captureDevice, error: &err))
        
        if err != nil {
            println("error: \(err?.localizedDescription)")
        }
        
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        self.view.layer.addSublayer(previewLayer)
        previewLayer?.frame = self.view.layer.frame
        captureSession.startRunning()
    }

}


Using Equatable and NilLiteralConvertible to re-implement Optionals in Swift (Part 2)

 

A few weeks ago I did a little thought experiment on how we might re-implement Swift’s optional type by using the Swift enum. If you haven’t read that yet, you can read it here. In this post, we’re going to push a little further and see if we can get something that corresponds a little more closely to the Swift Optional type.

In the last post we ended up with this enum, JOptional:

enum JOptional<T> {
    case None
    case Some(T)
    
    init(_ value: T) {
        self = .Some(value)
    }
    
    init() {
        self = .None
    }
    
    func unwrap() -> Any {
        switch self {
        case .Some(let x):
            return x
        default:
            fatalError("Unexpectedly found nil while unwrapping an JOptional value")
        }
    }
    
}

It’s fairly straightforward to use the enum: pass your value to the generic initializer, then unwrap it with unwrap() or with the >! operator we overloaded in Part 1.

// Instantiate some optionals
var myOptionalString = JOptional("A String!")
var a = myOptionalString.unwrap() // "A String!"
var c = myOptionalString>!     // "A String!"

In Swift’s actual Optional implementation, as of Xcode 6 Beta 5, Optionals can also be compared to nil, without doing any unwrapping first. For example:

var myNilValue : String? = nil

if(myNilValue == nil) {
    println("This is nil!")
}
else {
    println("This isn't nil, carry on")
}
This is nil!

This simply creates a standard Swift optional set to a nil value, which we can then compare directly to nil and act on. If we try the same code with our version of Optional, we run into a problem:

var myNilJOptional = JOptional(myNilValue)

if(myNilJOptional == nil) {
    println("This is nil!")
}
else {
    println("This isn't nil, carry on")
}
This isn't nil, carry on

Wait, what? Now it’s not nil? Why is our version giving an incorrect result?

Let’s dig a little deeper… If we put a breakpoint right after these values are set, and check them out in the debugger, we see the following:

The inner value is nil, but the JOptional has a ‘Some’ case instead of ‘None’, because we used the init(_ value: T) initializer to create it. As a result, the JOptional itself is in fact *not* equal to nil!

We could change this behavior by checking for nil in that initializer and setting .None when the value is nil, but then the case would need to be updated whenever the inner value changed, which brings its own problems. A better approach is to implement two Swift protocols that address this: Equatable and NilLiteralConvertible.
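For reference, the rejected alternative might have looked something like this hypothetical sketch: replace the existing init(_ value: T) with one that takes an optional and collapses nil down to .None at construction time.

init(_ value: T?) {
    // Collapse a nil input to .None when the JOptional is created.
    // Downside (as noted above): the case can go stale if the wrapped
    // value is later changed.
    if let unwrapped = value {
        self = .Some(unwrapped)
    } else {
        self = .None
    }
}

Now, on to the two protocols.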

Equatable specifies that we will implement our own == operator for our type.
NilLiteralConvertible provides a way to convert a nil literal into our type; in our case, a nil literal of type JOptional should become JOptional.None.

So first, we add the interfaces to our JOptional definition:

enum JOptional<T> : Equatable, NilLiteralConvertible {
  ...

Next, we need to implement the equality operator. Because this is an operator overload, it goes at global scope (outside of the enum):

func == <T>(lhs: JOptional<T>, rhs: JOptional<T>) -> Bool {
    switch (lhs,rhs) {
    case (.Some(let lhsVal), .Some(let rhsVal)):
        // Both have a value, the *optionals* are equal, although the values might not be
        return true
    case (.None, .None):
        // Both are nil
        return true
    default:
        // One does not have a value, but the other does
        return false
    }
}

We’re using pattern matching of tuples here in order to handle three cases:

  1. Both values are non-nil
  2. Both values are nil
  3. One value is nil and the other is not

It’s important to note that this == only checks that the case is the same (Some vs. None); it does not compare the inner values, so you still need to unwrap to compare those.
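To make that concrete, here’s a quick illustration using the JOptional and == defined above (a sketch):

let one = JOptional(1)
let two = JOptional(2)
one == two      // true: both are .Some, so the cases match
one.unwrap()    // 1; unwrap to compare the actual values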

If a nil literal appears on either side of this == operator, Swift needs to know how to convert it to the .None case. So we implement the convertFromNilLiteral() method:

static func convertFromNilLiteral() -> JOptional {
    return .None
}

So now, not only can we write things like this:

myNilJOptional = nil

And the result will be that myNilJOptional is set to JOptional.None.

But thanks to the == we defined, we can also compare JOptional values directly, including to nil values. Like this:

if(myNilJOptional == nil) {
  // Do something
}

Running the earlier code again, we now get the expected result:

This is nil!
This is nil!

This topic is explored more in-depth in my upcoming book.

In the meantime, take a look at my other tutorials and posts about Swift.
