Designing Animations with UIViewPropertyAnimator in iOS 10 and Swift 3

This is part of a series of tutorials introducing new features in iOS 10, the Swift programming language, and the new Xcode 8 beta, which were just announced at WWDC 16.

UIKit in iOS 10 now has “new object-based, fully interactive and interruptible animation support that lets you retain control of your animations and link them with gesture-based interactions” through a family of new objects and protocols.

In short, the purpose is to give developers extensive, high-level control over timing functions (or easing), making it simple to scrub or reverse, to pause and restart animations, and change the timing and duration on the fly with smoothly interpolated results. These animation capabilities can also be applied to view controller transitions.

I hope to concisely introduce some of the basic usage and mention some sticking points that are not covered by the talk or documentation.

Building the Base App

We’re going to try some of the features of UIViewPropertyAnimator in a moment, but first, we need something to animate.

Create a single-view application and add the following to ViewController.swift.

import UIKit

class ViewController: UIViewController {
    // this records our circle's center for use as an offset while dragging
    var circleCenter: CGPoint!
    
    override func viewDidLoad() {
        super.viewDidLoad()
        
        // Add a draggable view
        let circle = UIView(frame: CGRect(x: 0.0, y: 0.0, width: 100.0, height: 100.0))
        circle.center = self.view.center
        circle.layer.cornerRadius = 50.0
        circle.backgroundColor = UIColor.green()
        
        // add pan gesture recognizer to circle
        circle.addGestureRecognizer(UIPanGestureRecognizer(target: self, action: #selector(self.dragCircle)))
        
        self.view.addSubview(circle)
    }
    
    func dragCircle(gesture: UIPanGestureRecognizer) {
        let target = gesture.view!
        
        switch gesture.state {
        case .began, .ended:
            circleCenter = target.center
        case .changed:
            let translation = gesture.translation(in: self.view)
            target.center = CGPoint(x: circleCenter!.x + translation.x, y: circleCenter!.y + translation.y)
        default: break
        }
    }
}

There’s nothing too complicated happening here. In viewDidLoad, we created a green circle and positioned it in the center of the screen. Then we attached a UIPanGestureRecognizer instance to it so that we can respond to pan events in dragCircle: by moving that circle across the screen. As you may have guessed, the result is a draggable circle:

Enjoy

About UIViewPropertyAnimator

UIViewPropertyAnimator is the main class we’ll be using to animate view properties, interrupt those animations, and change the course of animations mid-stream. Instances of UIViewPropertyAnimator maintain a collection of animations, and new animations can be attached at (almost) any time.

Note: “UIViewPropertyAnimator instance” is a bit of a mouthful, so I’ll be using the term animator for the rest of this tutorial.

If two or more animations change the same property on a view, the last animation to be added or started* “wins”. The interesting thing is that rather than a jarring shift, the new animation is combined with the old one: it’s faded in as the old animation is faded out.

* Animations added later to a UIViewPropertyAnimator instance, or added with a start delay, override earlier animations. Last-to-start wins.
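
Here’s a quick sketch of that blending behavior, using a hypothetical box view rather than anything from the project we’re about to build:

// Hypothetical example: a view with two competing animations on its center
let box = UIView(frame: CGRect(x: 0.0, y: 0.0, width: 40.0, height: 40.0))

let animator = UIViewPropertyAnimator(duration: 2.0, curve: .easeInOut) {
    box.center = CGPoint(x: 50.0, y: 50.0)    // first animation heads for the top-left
}
animator.addAnimations {
    box.center = CGPoint(x: 300.0, y: 400.0)  // added later, so this destination wins
}
animator.startAnimation()
// The box ends up at (300, 400); the first animation is blended out rather than
// cut off, so there's no sudden jump.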

Interruptible, Reversible Animations

Let’s dive in and add a simple animation to build upon later. With this animation, the circle will slowly expand to twice its original size when dragged. When released, it will shrink back to normal size.

First, let’s add properties to our class for the animator and a duration:

// ...
class ViewController: UIViewController {
    // We will attach various animations to this in response to drag events
    var circleAnimator: UIViewPropertyAnimator!
    let animationDuration = 4.0
// ...

Now we need to initialize the animator in viewDidLoad::

// ...
// animations argument is optional
circleAnimator = UIViewPropertyAnimator(duration: animationDuration, curve: .easeInOut, animations: { 
[unowned circle] in
    // 2x scale
    circle.transform = CGAffineTransform(scaleX: 2.0, y: 2.0)
})
// self.view.addSubview(circle) here
// ...

When we initialized circleAnimator, we passed in arguments for the duration and curve properties. The curve can be set to one of four simple, predefined timing curves; in our example we chose easeInOut. The other options are easeIn, easeOut, and linear. We also passed an animations closure which doubles the size of our circle.

Now we need a way to trigger the animation. Swap in this implementation of dragCircle:. This version starts the animation when the user begins dragging the circle, and manages the animation’s direction by setting the value of circleAnimator.isReversed.

func dragCircle(gesture: UIPanGestureRecognizer) {
    let target = gesture.view!
    
    switch gesture.state {
    case .began, .ended:
        circleCenter = target.center
        
        if (circleAnimator.isRunning) {
            circleAnimator.pauseAnimation()
            circleAnimator.isReversed = gesture.state == .ended
        }
        circleAnimator.startAnimation()
        
        // Three important properties on an animator:
        print("Animator isRunning, isReversed, state: \(circleAnimator.isRunning), \(circleAnimator.isReversed), \(circleAnimator.state)")
    case .changed:
        let translation = gesture.translation(in: self.view)
        target.center = CGPoint(x: circleCenter!.x + translation.x, y: circleCenter!.y + translation.y)
    default: break
    }
}

Run this version. Try to make the circle “breathe”. Hold it down for a second.

A Sticking Point

Take a look at this video of our circle after it has made it all the way to the end of the animation:

It's not moving.

It’s not moving. It’s stuck at the expanded size.

Ok, so what’s happening here? The short answer is that the animator threw away the reference it had to the animation when it finished.

Animators can be in one of three states:

  • inactive: the initial state, and the state the animator returns to after the animations reach an end point (transitions to active)
  • active: the state while animations are running (transitions to stopped or inactive)
  • stopped: a state the animator enters when you call the stopAnimation: method (returns to inactive)

Here it is, represented visually:

State Transitions

(source: UIViewAnimating protocol reference)

Any transition to the inactive state will cause all animations to be purged from the animator (along with the animator’s completion block, if it exists).
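
For example, here’s a small sketch of attaching a completion block to the circleAnimator from above; once the animator returns to inactive, this block is discarded along with the animations and must be added again before the next run:

circleAnimator.addCompletion { position in
    // position is a UIViewAnimatingPosition: .start, .end, or .current
    print("Animation finished at position: \(position)")
}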

We’ve already seen the startAnimation method, and we’ll delve into the other two shortly.

Let’s get our circle unstuck. We need to change up the initialization of circleAnimator:

circleAnimator = UIViewPropertyAnimator(duration: animationDuration, curve: .easeInOut)

…and modify dragCircle::

// ...
// dragCircle:
case .began, .ended:
    circleCenter = target.center
        
    if circleAnimator.state == .active {
        // reset animator to inactive state
        circleAnimator.stopAnimation(true)
    }
    
    if (gesture.state == .began) {
        circleAnimator.addAnimations({
            target.transform = CGAffineTransform(scaleX: 2.0, y: 2.0)
        })
    } else {
        circleAnimator.addAnimations({
            target.transform = CGAffineTransform.identity
        })
    }

case .changed:
// ...

Now, whenever the user starts or stops dragging, we stop and finalize the animator (if it’s active). The animator purges the attached animation and returns to the inactive state. From there, we attach a new animation that will send our circle towards the desired end state.

A nice benefit of using transforms to change a view’s appearance is that you can reset the view’s appearance easily by setting its transform property to CGAffineTransform.identity. No need to keep track of old values.

Note that circleAnimator.stopAnimation(true) is equivalent to:

circleAnimator.stopAnimation(false) // don't finish (stay in stopped state)
circleAnimator.finishAnimation(at: .current) // set view's actual properties to animated values at this moment

The finishAnimationAt: method takes a UIViewAnimatingPosition value. If we pass start or end, the circle will instantly transform to the scale it should have at the beginning or end of the animation, respectively.
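
For example, a small sketch of snapping straight to the end state (assuming the animator is mid-animation):

circleAnimator.stopAnimation(false)       // enter the stopped state, keeping current values
circleAnimator.finishAnimation(at: .end)  // circle snaps to its full 2x scale
// Passing .start instead would snap it back to its original size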

About Durations

There’s a subtle bug in this version. The problem is, every time we stop an animation and start a new one, the new animation will take 4.0 seconds to complete, no matter how close the view is to reaching the end goal.

Here’s how we can fix it:

// dragCircle:
// ...
case .began, .ended:
    circleCenter = target.center
    
    // Multiplier for the original duration that will be used for the new duration
    let durationFactor = circleAnimator.fractionComplete
    circleAnimator.stopAnimation(false)
    circleAnimator.finishAnimation(at: .current)
    
    if (gesture.state == .began) {
        circleAnimator.addAnimations({
            target.backgroundColor = UIColor.green()
            target.transform = CGAffineTransform(scaleX: 2.0, y: 2.0)
        })
    } else {
        circleAnimator.addAnimations({
            target.backgroundColor = UIColor.green()
            target.transform = CGAffineTransform.identity
        })
    }
    
    circleAnimator.startAnimation()
    circleAnimator.pauseAnimation()
    // set duration factor to change remaining time
    circleAnimator.continueAnimation(withTimingParameters: nil, durationFactor: durationFactor)
case .changed:
// ...

Now, we explicitly stop the animator, attach one of two animations depending on the direction, and restart the animator, using continueAnimationWithTimingParameters:durationFactor: to adjust the remaining duration. This is so that “deflating” from a short expansion does not take the full duration of the original animation. The method continueAnimationWithTimingParameters:durationFactor: can also be used to change an animator’s timing function on the fly*.

* When you pass in a new timing function, the transition from the old timing function is interpolated. If you go from a springy timing function to a linear one, for example, the animations may remain “bouncy” for a moment, before smoothing out.
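
For example, here’s a rough sketch of handing a paused animator new spring timing for the rest of its run (assuming circleAnimator started with a cubic curve):

circleAnimator.pauseAnimation()
let springTiming = UISpringTimingParameters(dampingRatio: 0.4)
// durationFactor is a multiplier on the original duration used for the remaining time;
// 0.5 finishes the rest of the animation in half the original duration
circleAnimator.continueAnimation(withTimingParameters: springTiming, durationFactor: 0.5)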

Timing Functions

The new timing options are far more flexible than the fixed set of curves we had before.

The old UIViewAnimationCurve options are still available (static curves like easeInOut, which I’ve used above), and there are two new timing objects available: UISpringTimingParameters and UICubicTimingParameters.

UISpringTimingParameters

UISpringTimingParameters instances are configured with a damping ratio, and an optional mass, stiffness, and initial velocity. These are all fed into the proper equation to give you realistically bouncy animations. The initializer for your view property animator will still expect a duration argument when passed an instance of UISpringTimingParameters, but that argument is ignored. The equation doesn’t have a place for it. This addresses a complaint about some of the old spring animation functions.
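
If all you need is a damping ratio, there’s a convenience initializer. A minimal sketch, assuming the circle view from earlier and a fresh animator of our own:

// Assumption: `circle` is the view created in viewDidLoad
let springTiming = UISpringTimingParameters(dampingRatio: 0.3) // < 1.0 gives a bouncier spring
let springAnimator = UIViewPropertyAnimator(duration: 1.0, timingParameters: springTiming)
springAnimator.addAnimations {
    circle.transform = CGAffineTransform(scaleX: 2.0, y: 2.0)
}
springAnimator.startAnimation()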

Let’s do something different and use a spring animator to keep the circle tethered to the center of the screen:

ViewController.swift

import UIKit

class ViewController: UIViewController {
    // this records our circle's center for use as an offset while dragging
    var circleCenter: CGPoint!
    // We will attach various animations to this in response to drag events
    var circleAnimator: UIViewPropertyAnimator?
    
    override func viewDidLoad() {
        super.viewDidLoad()
        
        // Add a draggable view
        let circle = UIView(frame: CGRect(x: 0.0, y: 0.0, width: 100.0, height: 100.0))
        circle.center = self.view.center
        circle.layer.cornerRadius = 50.0
        circle.backgroundColor = UIColor.green()
        
        // add pan gesture recognizer to circle
        circle.addGestureRecognizer(UIPanGestureRecognizer(target: self, action: #selector(self.dragCircle)))
        
        self.view.addSubview(circle)
    }
    
    func dragCircle(gesture: UIPanGestureRecognizer) {
        let target = gesture.view!
        
        switch gesture.state {
        case .began:
            if circleAnimator != nil && circleAnimator!.isRunning {
                circleAnimator!.stopAnimation(false)
            }
            circleCenter = target.center
        case .changed:
            let translation = gesture.translation(in: self.view)
            target.center = CGPoint(x: circleCenter!.x + translation.x, y: circleCenter!.y + translation.y)
        case .ended:
            let v = gesture.velocity(in: target)
            // 500 is an arbitrary value that looked pretty good, you may want to base this on device resolution or view size.
            // The y component of the velocity is usually ignored, but is used when animating the center of a view
            let velocity = CGVector(dx: v.x / 500, dy: v.y / 500)
            let springParameters = UISpringTimingParameters(mass: 2.5, stiffness: 70, damping: 55, initialVelocity: velocity)
            circleAnimator = UIViewPropertyAnimator(duration: 0.0, timingParameters: springParameters)
            
            circleAnimator!.addAnimations({
                target.center = self.view.center
            })
            circleAnimator!.startAnimation()
        default: break
        }
    }
}

Drag the circle and let it go. Not only will it bounce back to the starting point, it will even keep the momentum it had when you released it, since we passed a velocity argument to initialVelocity: when we initialized the spring timing parameters:

// dragCircle: .ended:
// ...
let velocity = CGVector(dx: v.x / 500, dy: v.y / 500)
let springParameters = UISpringTimingParameters(mass: 2.5, stiffness: 70, damping: 55, initialVelocity: velocity)
circleAnimator = UIViewPropertyAnimator(duration: 0.0, timingParameters: springParameters)
// ...
Springy

At an interval, I drew a small circle at our main circle’s center point in order to trace the animation path for this screenshot. The “straight” lines curve a little, because some momentum was retained as the circle was released and pulled inward by the spring.

UICubicTimingParameters

UICubicTimingParameters allows you to set control points to define a custom cubic Bézier curve. Just note that coordinate points outside of 0.0 – 1.0 are trimmed to that range:

// Same as setting the y arguments to 0.0 and 1.0 respectively
let curveProvider = UICubicTimingParameters(controlPoint1: CGPoint(x: 0.2, y: -0.48), controlPoint2: CGPoint(x: 0.79, y: 1.41))
circleAnimator = UIViewPropertyAnimator(duration: animationDuration, timingParameters: curveProvider)

If you’re not happy with those timing curve providers, you can implement and use your own by conforming to the UITimingCurveProvider protocol.
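
As a rough sketch of what a conforming type might look like, here’s a provider that always hands back a single cubic curve (the control points are arbitrary and the class name is made up). UITimingCurveProvider also inherits NSCoding and NSCopying, so those have to be satisfied too, minimally here:

import UIKit

final class OvershootTimingProvider: NSObject, UITimingCurveProvider {
    var timingCurveType: UITimingCurveType { return .cubic }
    var cubicTimingParameters: UICubicTimingParameters? {
        // Arbitrary control points chosen for a slight overshoot feel
        return UICubicTimingParameters(controlPoint1: CGPoint(x: 0.3, y: 0.0),
                                       controlPoint2: CGPoint(x: 0.6, y: 1.0))
    }
    var springTimingParameters: UISpringTimingParameters? { return nil }
    
    // Minimal NSCopying / NSCoding conformance for the sketch
    override init() { super.init() }
    init?(coder aDecoder: NSCoder) { super.init() }
    func encode(with aCoder: NSCoder) {}
    func copy(with zone: NSZone? = nil) -> Any { return OvershootTimingProvider() }
}

// usage: UIViewPropertyAnimator(duration: 2.0, timingParameters: OvershootTimingProvider())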

Animation Scrubbing

You can manually set the progress of a paused animation by passing a value between 0.0 and 1.0* to your animator’s fractionComplete property. A value of 0.5, for example, will place the animatable properties halfway towards their final values, regardless of timing curve. Note that the position you set is mapped to the timing curve when you restart an animation, so a fractionComplete of 0.5 does not necessarily mean the remaining duration will be half of the original duration.

Let’s try out a different example. First, let’s initialize our animator at the bottom of viewDidLoad: and pass in two animations:

// viewDidLoad:
// ...
circleAnimator = UIViewPropertyAnimator(duration: 1.0, curve: .linear, animations: {
    circle.transform = CGAffineTransform(scaleX: 3.0, y: 3.0)
})
    
circleAnimator?.addAnimations({ 
    circle.backgroundColor = UIColor.blue()
}, delayFactor: 0.75)
// ...

We aren’t going to call startAnimation this time. The circle should get larger as the animation progresses and start turning blue at 75%.

We need a new dragCircle: implementation as well:

func dragCircle(gesture: UIPanGestureRecognizer) {
    let target = gesture.view!
    
    switch gesture.state {
    case .began:
        circleCenter = target.center
    case .changed:
        let translation = gesture.translation(in: self.view)
        target.center = CGPoint(x: circleCenter!.x + translation.x, y: circleCenter!.y + translation.y)
        
        circleAnimator?.fractionComplete = target.center.y / self.view.frame.height
    default: break
    }
}

Now we’re updating the animator’s fractionComplete to the circle’s vertical position on the view as it’s dragged:


I’ve used the linear timing curve, but this sample would be a good way to get a feel for other curves or a timing curve provider instance. The animation that changes the circle blue follows a compressed version of the animator’s timing curve.

* Custom animators can accept values outside of the range 0.0 – 1.0, if they support animations going past their endpoints.

Extensibility

Finally, one of the most interesting things about this release was the apparent philosophy behind it. Don’t like something? Change it. Animation providers and timing curves providers can both be replaced with your own implementation, for example.

In fact, this is almost more of a release of protocols than classes. The underlying protocols got about as much fanfare as everything else, which is great. I love this direction of making more and more built-in functionality accessible to the developer. I’m hoping we see more in the future, and I’m looking forward to seeing what people do with this.



SiriKit Resolutions with Swift 3 in iOS 10 – SiriKit Tutorial (Part 2)

This tutorial was written on June 20th, 2016 using Xcode 8 Beta 1 and the Swift 3.0 toolchain.

This post is a follow-up in a multi-part SiriKit tutorial. If you have not read part 1 yet, I recommend starting there.

Resolving requests from SiriKit

In order to make our Siri integration more useful, we can help fill out the content of our message using a callback method from the INSendMessageIntentHandling protocol. Investigating this protocol, you can see these show up as optional methods.

resolveRecipients(forSendMessage intent: INSendMessageIntent, with completion: ([INPersonResolutionResult]) -> Swift.Void)
 
resolveContent(forSendMessage intent: INSendMessageIntent, with completion: (INStringResolutionResult) -> Swift.Void)
 
resolveGroupName(forSendMessage intent: INSendMessageIntent, with completion: (INStringResolutionResult) -> Swift.Void)
 
resolveServiceName(forSendMessage intent: INSendMessageIntent, with completion: (INStringResolutionResult) -> Swift.Void)
 
resolveSender(forSendMessage intent: INSendMessageIntent, with completion: (INPersonResolutionResult) -> Swift.Void)

So we can provide SiriKit with further information by implementing as many of these resolutions as we wish, effectively enabling us to provide information regarding the recipients, content, group name, service name, or sender. These should be relatively self-explanatory.

Let’s try providing some static data for our title and content, to demonstrate how resolutions work.

First, let’s add the resolution for the content of the message, by implementing the resolveContent protocol method.

func resolveContent(forSendMessage intent: INSendMessageIntent, with completion: (INStringResolutionResult) -> Void) {
    let message = "My message body!"
    let response = INStringResolutionResult.success(with: message)
    completion(response)
}

Here we create a string resolution result, and call the success function. This is the simplest way to proceed, but we also have the option of returning a disambiguation, confirmationRequired, or unsupported response. We’ll get to those later, but first let’s actually use the data Siri is providing us.

Siri will send in its own transcription of our message in the intent object. We’re interested in the content property, so let’s take that and embed it inside of a string.

func resolveContent(forSendMessage intent: INSendMessageIntent, with completion: (INStringResolutionResult) -> Void) {
    let message = "Dictated text: \(intent.content!)"
    let response = INStringResolutionResult.success(with: message)
 
    completion(response)
}

The content property is an optional, and as such we need to make sure Siri actually provided a transcription. If no transcription was provided then a message won’t be entirely useful, so we need to tell Siri that the information is missing and we need this value. We can do this by returning a resolution result calling the needsValue class method on INStringResolutionResult.

func resolveContent(forSendMessage intent: INSendMessageIntent, with completion: (INStringResolutionResult) -> Void) {
    if let content = intent.content {
        let message = "Dictated text: \(content)"
        let response = INStringResolutionResult.success(with: message)
        completion(response)
    }
    else {
        let response = INStringResolutionResult.needsValue()
        completion(response)
    }
}
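
The other result types mentioned earlier follow the same pattern. Here’s a rough sketch for reference; the strings below are placeholders of my own, not values Siri provides:

// Hypothetical examples of the other INStringResolutionResult options
func alternativeContentResolutions(with completion: (INStringResolutionResult) -> Void) {
    let suggestions = ["On my way!", "Running a few minutes late"]
    
    // Offer several candidate strings and let the user pick one:
    completion(INStringResolutionResult.disambiguation(with: suggestions))
    
    // Or ask Siri to confirm a specific value before handling the intent:
    // completion(INStringResolutionResult.confirmationRequired(with: suggestions[0]))
    
    // Or declare that this parameter isn't something our extension supports:
    // completion(INStringResolutionResult.unsupported())
}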

SiriKit requesting additional information

Now SiriKit knows that, when we try to send a message, the content value is a requirement. We should implement the same type of thing for the recipients. In this case, recipients can have multiple values, and we can look them up in a variety of ways. If you have a messaging app, you would need to take the INPerson intent object that is passed in and try to determine which of your own users the message is intended for.

This goes outside the scope of this Siri tutorial, so I’ll leave it up to you to implement your own application logic for the resolveRecipients method. If you want to see an example implementation, Apple has released some sample code here.
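
If you just want a feel for the shape of such an implementation, here’s a rough sketch; matchingContacts(for:) is a hypothetical stand-in for your own app’s user lookup, not part of SiriKit:

// Hypothetical helper representing your app's own user/contact lookup
func matchingContacts(for person: INPerson) -> [INPerson] {
    return [] // your user database query would go here
}

func resolveRecipients(forSendMessage intent: INSendMessageIntent, with completion: ([INPersonResolutionResult]) -> Void) {
    guard let recipients = intent.recipients, !recipients.isEmpty else {
        // No recipients at all: ask Siri to prompt the user for one
        completion([INPersonResolutionResult.needsValue()])
        return
    }
    
    let results = recipients.map { (person: INPerson) -> INPersonResolutionResult in
        let matches = matchingContacts(for: person)
        switch matches.count {
        case 0:  return INPersonResolutionResult.unsupported()                 // nobody by that name
        case 1:  return INPersonResolutionResult.success(with: matches[0])     // exactly one match
        default: return INPersonResolutionResult.disambiguation(with: matches) // let the user choose
        }
    }
    completion(results)
}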

More iOS 10 Tutorials

We’ll be continuing to investigate iOS 10 and publish more free tutorials in the future. If you want to follow along be sure to subscribe to our newsletter and follow me on Twitter.

Thanks,
Jameson



Creating an iMessage Sticker App on iOS 10 with Swift – Tutorial (Part 1)

Creating iMessage Apps with Xcode 8

This is part of a series of tutorials introducing new features in iOS 10, the Swift programming language, and the new Xcode 8 beta, which were just announced at WWDC 16.

Intro

Among the most exciting additions in iOS 10 are some of the new app extension types, and the new extension type that really caught my attention was the iMessage app, which allows you to add functionality to the built-in Messages app on iOS.

Unlike other extension types, which require you to bundle your extension with a standard app, iMessage apps can be released as standalone apps through the new iMessage App Store, which will be accessible from within the Messages app. While you can still include your Messages app extension as part of a standard app, with this new branch of the app store, for the first time, you don’t have to.

When users use your iMessage app to communicate with people who don’t have it, the recipient will be prompted with a link to download it from the store.

The purpose of this tutorial is to create a quick, searchable guide for creating a sticker app (which you can do without writing a line of code), and very briefly introduce the new Messages framework (Messages.framework) for building interactive apps.

Creating Content and Apps for iMessage

iMessage Sticker Apps

There’s a very easy option for artists (even non-programmers) who want to release content for the iMessage app store. You can offer sticker apps: purchasable packs of stickers (even animated stickers) that users can “peel” (using a long press) and slap onto any balloon in a conversation. Like all iMessage apps, these packs can be a bonus bundled with your existing app, or a standalone package offered for sale. Apart from creating the assets, releasing a sticker app couldn’t be much easier.

Still in Beta

A sticker pane, with a sick computer stuck onto a message

The sticker packs themselves don’t require any code to create. All you really need is a copy of Xcode 8 and an Apple Developer account. The few simple guidelines for creating the assets can be found under “Sticker Packs” in the Messages Framework reference. For convenience, I’ve copied a slightly-edited version here.

Each sticker image should meet these criteria:

  • The image must be a PNG, APNG, GIF or JPEG file.
  • The file must be less than 500KB.
  • The image must be between 100×100 and 206×206 points

PNG or APNG (Animated PNG) will generally look best. Make sure your stickers look good against different color backgrounds; they can be applied to photos.

Apple uses points (rather than pixels) for screen coordinates and dimensions on their devices. 10×10 points (pt) @ 2x equals 20×20 pixels (px). The original retina display crammed 4 pixels into the space of one (1×1 became 2×2). Simply by providing ‘@2x’ copies of images at that higher resolution, developers were able to quickly update their old apps to support the sharper display.

Always provide @3x images – between 300×300 and 618×618 pixels. The system generates the @2x and @1x versions by downscaling the @3x images at runtime.

In Xcode’s Attributes inspector, set the Sticker Size for the entire Sticker pack. The system lays out the stickers in the browser based on these sizes. To best match the browser, use sticker images of the specified size. The Sticker Size defaults to Medium.

The recommended maximum dimensions for each size setting are:

  • Small: 100x100pts.
  • Medium: 136x136pts.
  • Large: 206x206pts.

Attributes Inspector

Xcode’s Attributes Inspector after selecting the “Sticker Pack” group in Stickers.xcstickers

Building the App

Start a new project in Xcode, and you’ll see there’s a new template for creating a standalone sticker pack.

Sticker Pack Template

You’re almost done. Select Stickers.xcstickers and add some icons so that your sticker pack looks good in Messages and on the store, then drag your stickers into the “Sticker Pack” group.

Note: You can create animated (APNG) stickers with just Xcode. Right click inside your sticker pack group and select Add Assets -> New Sticker Sequence. Use the attributes inspector on the right to set options like the number of frames, frame rate, and repetition count.

Sticker Pack Template

Note: If the app won’t run, try the following:

  • Select your “Run” build scheme, ensure that ‘Executable’ is set to [projectname].app (stickers.app in my case)
  • Provide images for at least the 1x versions of the 27×20, 32×24, and 1024×768 point icons, as well as the 2x icon for “Settings” and “Messages” for your target platform
  • Note that 2x means to double the requested dimensions (2x for 29x29pt means provide a 58×58 px image)

If you are ramping up to release your app, make sure to provide images for all icon versions and sizes.

The new version of the simulator for iOS 10 includes a special version of the Messages app which allows you to see both sides of a conversation as you test your iMessage apps.

My sticker app (containing an animated GIF and the obligatory cat photo) may not win any awards, but it works. Here it is in action, with the sticker tray open and a few stickers strategically applied:

AAAAAAHHHHH!!

If you’ve made it this far (and provided more/better art) your app is now ready for submission to iTunes Connect and Apple’s review process (pending availability of the release version of Xcode 8).

Customizing the Sticker Browser

Developers should be aware that it is possible to customize the appearance of the sticker browser (the window that users peel stickers from) in a full-fledged iMessage app by subclassing MSStickerBrowserViewController. You can also load stickers dynamically by setting your StickerBrowserViewController’s data source to a class which conforms to MSStickerBrowserViewDataSource.
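
As a minimal sketch, a data source like the following could back a custom browser; it assumes sticker images named "dog.png" and "cat.png" are bundled with the extension, and the class name is my own:

import Messages

class BundledStickerDataSource: NSObject, MSStickerBrowserViewDataSource {
    // Build MSSticker instances from files assumed to be in the extension bundle
    let stickers: [MSSticker] = ["dog", "cat"].flatMap { (name: String) -> MSSticker? in
        guard let url = Bundle.main.url(forResource: name, withExtension: "png") else { return nil }
        return try? MSSticker(contentsOfFileURL: url, localizedDescription: name)
    }
    
    func numberOfStickers(in stickerBrowserView: MSStickerBrowserView) -> Int {
        return stickers.count
    }
    
    func stickerBrowserView(_ stickerBrowserView: MSStickerBrowserView, stickerAt index: Int) -> MSSticker {
        return stickers[index]
    }
}

// Assign an instance to your browser's stickerBrowserView.dataSource and keep a
// strong reference to it somewhere, since the dataSource property is weak.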

Developing Interactive iMessage Apps

There’s a lot more you can do with iMessage Apps. For now, I just want to briefly discuss Messages.framework. I hope to dive much deeper in the near future.

Note: There is one major limitation that may kill your idea. You won’t be allowed to read the content of users’ messages or even, necessarily, know anything about the participants in the session. Apple takes privacy very seriously; this is as it should be.

A number of the built-in apps provide good examples. Apart from the sticker features, the new drawing capabilities, the music sharing app and the Bing image search, in particular, are good examples of the type of thing that third-party developers could realistically build today.

There is a new, relatively bare-bones project template in Xcode 8: “Messages Application”. MessagesViewController, a subclass of MSMessagesAppViewController, is your main entry point and does at least provide a number of comments. This is the view controller behind your app’s view in the app drawer and when expanded.
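
The skeleton looks roughly like this; the method names are MSMessagesAppViewController lifecycle callbacks, and the bodies here are just placeholders:

import Messages

class MessagesViewController: MSMessagesAppViewController {
    
    override func willBecomeActive(with conversation: MSConversation) {
        super.willBecomeActive(with: conversation)
        // Called when the extension is about to be shown; set up your UI here
    }
    
    override func didTransition(to presentationStyle: MSMessagesAppPresentationStyle) {
        super.didTransition(to: presentationStyle)
        // .compact when sitting in the app drawer, .expanded when the user enlarges it
    }
}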

The documentation on these classes is there for anyone getting started and a helpful introduction and associated sample project are now available.



Siri Integration in iOS 10 with Swift – SiriKit Tutorial (Part 1)

Siri integration on iOS 10 – Swift Tutorial

This tutorial was written on June 13th, 2016 using Xcode 8 Beta 1 and the Swift 3.0 toolchain.

Get Xcode 8 set up for iOS 10 and Swift 3 compilation.

If you have not yet downloaded Xcode 8 Beta 1, please do so here.

(Optional) Compiling from the command line

To opt in to the Swift 3.0 toolchain you shouldn’t need to change anything unless you want to build from the command line. If you plan to build from the command line, open Xcode-beta and from the OS menu bar select Xcode > Preferences. Then select the Locations tab. At the bottom of the page here you will see “Command Line Tools”. Make sure this is set to Xcode 8.0.

Now if you navigate to the project directory containing the .xcodeproj file, you can optionally compile your project by calling xcodebuild from the command line.

(Optional) Migrating from an existing Swift 2 app

If you are working with an existing Swift 2 project and want to add Siri integration with Swift 3.0, click on the root of your project and select Build Settings. Under Swift Compiler – Version, find the field labeled Use Legacy Swift Language Version and set it to No. This will most likely lead to compiler errors that you will need to fix throughout your project, but it’s a step I recommend to keep up with Swift’s ever-changing semantics.

Getting started with SiriKit

First, in your app (or in a new single-view Swift app template if you are starting fresh), switch to the general view by selecting the root of your project. Under this tab you can click the (+) icon in the lower left-hand corner of the side-pane on the left. From the dropdown that appears, select iOS > Application Extension, and then select Intents Extension.

Select Intents Extension

This adds a new intent to the project, and we’ll use it to listen for Siri commands. The product name should be something similar to your app so it’s easy to identify, for example if your app is called MusicMatcher, you could call the Product Name of this intent MusicMatcherSiriIntent. Make sure to also check the checkbox to Include UI Extension. We will need this later in the tutorial, and it’s easiest to just include the additional extension now.

Intents Extension Options

What we’ve created are two new targets, as you can see in the project hierarchy. Let’s jump in to the boilerplate code and take a look at the example in the IntentHandler.swift file inside of the Intent extension folder. By default this will be populated with some sample code for the workout intent, allowing a user to say commands such as “Start my workout using MusicMatcher”, where MusicMatcher is the name of our app.

IntentHandler.swift

Run the Template App as-is

It’s helpful at this point to compile this code as-is and try out the command on an actual iOS device. So go ahead and build the app target by selecting the app MusicMatcher from the Scheme dropdown, and with the target device set to your test iOS device, press the Build & Run button.

Select the MusicMatcher target

You should see a blank app appear, and in the background your extensions will also be loaded in to the device’s file system. Now you can close your app using the Stop button in Xcode to kill the app.

Then, switch your scheme to select the Intent target, and press build & run again.

Select the MusicMatcherSiriIntent target

This will prompt you to choose which app to attach to; just select the app you just ran, MusicMatcher. This will present the app again on your device (a white screen/blank app most likely), but this time the debugger will be attached to the Intent extension.

Select the app to run with the extension

You can now exit to the home screen by pressing the home button, or the app may exit on its own since you are running the Intent and not the app itself (This is not a crash!!!)

Enable the extension

The extension should now be in place, but as an iOS device user we may still need to enable the extension in our Siri settings. On your test device, enter the Settings app. Select the Siri menu, and near the bottom you should see MusicMatcher listed as a Siri App. Make sure the app is enabled so that Siri can pick up the intents from the sample app.

Testing our first Siri command!

Try the Siri command. Activate Siri either by long pressing the Home button, or by saying “Hey Siri!” (note the “Hey Siri!” feature must be enabled in the settings first)

Try out the command “Start my workout using MusicMatcher”.

“Sorry, you’ll need to continue in the app.”

If you’re like me, this will bail with an error saying “Sorry, you’ll need to continue in the app.” (For some reason this occasionally was not a problem. Ghosts?)

In the console you may see something like this:

dyld: Library not loaded: @rpath/libswiftCoreLocation.dylib
  Referenced from: /private/var/containers/Bundle/Application/CC815FA3-EB04-4322-B2BB-8E3F960681A0/LockScreenWidgets.app/PlugIns/JQIntentWithUI.appex/JQIntentWithUI
  Reason: image not found
Program ended with exit code: 1

We need to add the CoreLocation library to our main project, to make sure it gets copied in with our compiled Swift code.

Select the project root again and then select your main MusicMatcher target. Here under General you’ll find an area for Linked Frameworks and Libraries. Click the (+) symbol and add CoreLocation.framework. Now you can rebuild and run your app on the device, then follow the same steps as above to rebuild and run your intent target.

Finally, you can activate Siri again from your home screen.

“Hey Siri!”
“Start my workout using MusicMatcher”

Siri should finally respond, “OK. exercise started on MusicMatcher” and a UI will appear saying “Workout Started”

MusicMatcher Workout Started

How does it work?

The IntentHandler class defined in the template uses a laundry list of protocols:

First and foremost is INExtension, which is what allows us to use the class as an intent extension in the first place. The remaining protocols are all intent handler types that we want to get callbacks for in our class:

INStartWorkoutIntentHandling
INPauseWorkoutIntentHandling
INResumeWorkoutIntentHandling
INCancelWorkoutIntentHandling
INEndWorkoutIntentHandling

The first one is the one we just tested, INStartWorkoutIntentHandling.

If you command-click this protocol name you’ll see in the Apple docs this documentation:

/*!
 @brief Protocol to declare support for handling an INStartWorkoutIntent
 @abstract By implementing this protocol, a class can provide logic for resolving, confirming and handling the intent.
 @discussion The minimum requirement for an implementing class is that it should be able to handle the intent. The resolution and confirmation methods are optional. The handling method is always called last, after resolving and confirming the intent.
 */

Or in other words, this protocol tells SiriKit that we’re prepared to handle the English phrase “Start my workout with AppName Here.”
This will vary based on the language spoken by the user, but the intent will always be to start a workout. The INStartWorkoutIntentHandling protocol calls on several more methods, and they are implemented in the sample code. I’ll leave you to learn more about them if you want to build a workout app, but what I’d rather do in the remainder of this tutorial is add a new intent handler for handling the sending of messages.

Let’s Add a New Message Intent

Now that we’ve confirmed that works, let’s move on to adding a new type of intent for sending messages. The docs here show the following:

Send a message
Handler:INSendMessageIntentHandling protocol
Intent:INSendMessageIntent
Response:INSendMessageIntentResponse

So let’s add the INSendMessageIntentHandling protocol to our class. First we’ll just specify we want to use it by appending it to the list of protocols our class adheres to in IntentHandler.swift. Since I don’t actually want the workout intents, I’ll also remove those, leaving us with just this for the class declaration:

class IntentHandler: INExtension, INSendMessageIntentHandling {
    ...

If we just left it at that, we wouldn’t be able to compile our code since we still need to implement the required methods from the INSendMessageIntentHandling protocol.

Again, if you ever need to check what those methods are, just command+click the text INSendMessageIntentHandling and take a look at what method signatures are present that are not marked with the optional keyword.

In this case we find only one required method:

/*!
 @brief handling method
 
 @abstract Execute the task represented by the INSendMessageIntent that's passed in
 @discussion This method is called to actually execute the intent. The app must return a response for this intent.
 
 @param  sendMessageIntent The input intent
 @param  completion The response handling block takes a INSendMessageIntentResponse containing the details of the result of having executed the intent
 
 @see  INSendMessageIntentResponse
 */
public func handle(sendMessage intent: INSendMessageIntent, completion: (INSendMessageIntentResponse) -> Swift.Void)

Adhering to the new Message Intent protocol

So back in our IntentHandler.swift, let’s add a line separator (useful for navigating code with the jump bar):

// MARK: - INSendMessageIntentHandling

Underneath this MARK, we can implement the function. I find it’s most useful with Xcode 8 to simply begin typing the method name, and let autocomplete take it from there, choosing the relevant option.

Fill out the handle method with autocomplete

In our handler, we’ll need to construct an INSendMessageIntentResponse in order to call back the completion handler. We’ll just assume all messages are successful here and return a success value for the user activity in the INSendMessageIntentResponse constructor, similar to how this is done in the template app. We’ll also add a print statement so we can see when this handle method is triggered by a Siri event:

func handle(sendMessage intent: INSendMessageIntent, completion: (INSendMessageIntentResponse) -> Void) {
    print("Message intent is being handled.")
    let userActivity = NSUserActivity(activityType: NSStringFromClass(INSendMessageIntent.self))
    let response = INSendMessageIntentResponse(code: .success, userActivity: userActivity)
    completion(response)
}

Adding the intent type to the Info.plist

Before this app will be capable of handling INSendMessageIntent, we need to add the value to our Info.plist. Think of this as something like an app entitlement.

In the Info.plist file of the intent, find and expand the NSExtension key. Then extend NSExtensionAttributes, and then IntentsSupported under that. Here we need to add a new row for our INSendMessageIntent to allow the app to process Message intents.

Add support for INSendMessageIntent aka messages intent

Testing the new intent

Now that we’ve got our new intent set up, let’s give it a try. Recall that you must build the app, run it on the device, and then run the extension in order to debug the extension. If you don’t run in this order, the extension will either not work, or it will not log to the Xcode console.

Try calling upon our intent in Siri, and you will now see a new message window appear! The window is pretty empty, and there isn’t much logic to tie in to our app just yet. We need to implement the remaining callbacks and add some of our app’s messaging logic to provide a better experience. We’ll cover that in Part 2, which is available now. If you want me to email you about it when other tutorials come out as well, sign up for my newsletter to get the scoop.

