Monday, April 6, 2015

Using CAAnimationGroups, reveal animations using CAShapeLayers as masks, and more.

iOS-CAAnimation-group-demo

This is a demo project that illustrates various animation techniques.
It shows 3 different kinds of animations:
  • A simple UIView animation that animates an image in a straight line while increasing the scale of the image and rotating it around its axis
  • A "clock wipe" animation that gradually reveals an image in a circular arc like a radar display, then hides it again
  • A complex sequence of animations that are managed using a CAAnimationGroup.
It also demonstrates how to detect taps on a layer that is being animated to move across the screen.

UIView animation (View Animation button)

The UIView animation is performed by the method -doViewAnimation:(id)sender in viewController.m. It uses the method animateWithDuration:delay:options:animations:completion: to do its job. UIView animations modify animatable properties of one or more views, and it is possible to animate multiple animatable properties of multiple view objects with a single UIView animation call. The doViewAnimation: method animates the view's center, scale, and rotation all at the same time.
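Here's a minimal Swift sketch of the same pattern (the demo project itself is Objective-C, and imageView here stands in for whatever view you want to animate):

// Animate position, scale, and rotation in a single UIView animation call.
UIView.animate(withDuration: 2.0,
               delay: 0.0,
               options: [.curveEaseInOut],
               animations: {
                 // Move the view to a new center point...
                 imageView.center = CGPoint(x: 300, y: 400)
                 // ...while scaling it up and rotating it about its own axis.
                 imageView.transform = CGAffineTransform(rotationAngle: .pi)
                   .scaledBy(x: 1.5, y: 1.5)
               },
               completion: nil)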

Clock Wipe animation (Mask Animation button)

The clock wipe animation is performed in the method - (IBAction)doMaskAnimation:(id)sender;. It works by creating a shape layer (CAShapeLayer) and setting it as the mask for an image view's layer. We set the shape layer to contain an arc that describes a full circle, where the radius of the arc is 1/2 of the center-to-corner distance of the view. The line width of the arc is set to the arc radius, so the arc actually fills the entire image bounds rectangle.
CAShapeLayers have two properties, strokeStart and strokeEnd. Both values range from 0.0 to 1.0. Normally strokeStart = 0.0 and strokeEnd = 1.0. If you set strokeEnd to a value less than 1, only a portion of the shape layer's path is drawn.
The doMaskAnimation method sets strokeEnd = 0 to start, which means the path is empty and the entire image view is hidden (masked). It then creates a CABasicAnimation that animates the strokeEnd property from 0.0 to 1.0. That causes the layer's path to animate as an ever-increasing arc. Since the line thickness for the shape layer is very thick, the arc fills the entire bounds of the image view, revealing an ever-increasing portion of the image view.
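Here's a rough Swift sketch of the whole clock wipe setup (the demo is Objective-C, imageView is assumed, and the radius/line-width arithmetic below is just one way to make the stroked arc cover the whole view):

let bounds = imageView.bounds
let center = CGPoint(x: bounds.midX, y: bounds.midY)
let centerToCorner = hypot(bounds.width, bounds.height) / 2.0

let maskLayer = CAShapeLayer()
maskLayer.frame = bounds
maskLayer.fillColor = nil                          // only the stroke should mask
maskLayer.strokeColor = UIColor.black.cgColor      // any opaque color works for a mask
maskLayer.lineWidth = centerToCorner               // wide enough to reach from the center to the corners
maskLayer.path = UIBezierPath(arcCenter: center,
                              radius: centerToCorner / 2.0,
                              startAngle: -CGFloat.pi / 2.0,
                              endAngle: CGFloat.pi * 1.5,
                              clockwise: true).cgPath
maskLayer.strokeEnd = 0.0                          // start with the image fully hidden
imageView.layer.mask = maskLayer

// Animate strokeEnd from 0 to 1 to sweep the image into view like a radar display.
let wipe = CABasicAnimation(keyPath: "strokeEnd")
wipe.fromValue = 0.0
wipe.toValue = 1.0
wipe.duration = 2.0
maskLayer.strokeEnd = 1.0                          // update the model value so it sticks
maskLayer.add(wipe, forKey: "clockWipe")
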
The animation looks like this:
[Animated GIF: clock wipe reveal]

CAAnimationGroup animation (CAAnimation button)

The "CAAnimation" button invokes the method - (IBAction)doAnimation:(id)sender. It performs a whole sequence of animations. It does this buy creating a CAAnimationGroup, and then creating a sequence of individual CAAnimation objects of different flavors. It sets the beginTime property of each animation so that each animation step in the animation group begins when the next animation finishes.

What you will learn:

This project demonstrates a wide variety of animation techniques:
  • Using CABasicAnimation to animate a property and move images around on the screen.
  • Using different animation timing functions like kCAMediaTimingFunctionLinear, kCAMediaTimingFunctionEaseIn, and kCAMediaTimingFunctionEaseInEaseOut to get different effects
  • Using CAKeyframeAnimation and a CGPath to animate a layer along a curved path (a figure 8).
  • Creating a custom subclass of UIView that has a CAShapeLayer as its backing layer so you can draw shapes in a view "for free."
  • Adding a CGPath to a shape layer to draw shapes on the screen.
  • Using CAAnimationGroup to create a linked series of animations that run in sequence
  • Creating a very clean "per animation" completion block scheme using the fact that CAAnimation objects support the setValue:forKey: method. I add a code block to an animation object and set up the animation delegate's animationDidStop:finished: method to check for a special key/value pair with the key kAnimationCompletionBlock (see the sketch after this list).
  • Using the cumulative property on animations to create a single repeating animation that continuously rotates a layer by any desired amount.
  • Using a UITapGestureRecognizer to detect taps on a view.
  • Detecting taps on a view while it animates "live" by using the hitTest method of the view's presentation layer
  • Pausing and resuming animation on a layer.
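To make the completion-block trick from the list above concrete, here's a small Swift sketch (the demo is Objective-C; the key string, closure type, and someLayer are assumptions for illustration):

let kAnimationCompletionBlock = "animationCompletionBlock"   // arbitrary key, it just has to be unique
typealias AnimationCompletion = () -> Void

class AnimationDelegate: NSObject, CAAnimationDelegate {
  // When an animation finishes, look for an attached completion closure and run it.
  func animationDidStop(_ anim: CAAnimation, finished flag: Bool) {
    if let completion = anim.value(forKey: kAnimationCompletionBlock) as? AnimationCompletion {
      completion()
    }
  }
}

// Usage: attach a closure to any CAAnimation with setValue(_:forKey:).
let fade = CABasicAnimation(keyPath: "opacity")
fade.fromValue = 1.0
fade.toValue = 0.0
fade.duration = 1.0
fade.delegate = AnimationDelegate()
fade.setValue({ print("fade finished") } as AnimationCompletion,
              forKey: kAnimationCompletionBlock)
someLayer.add(fade, forKey: "fade")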

Cropping Images from Swift

CropImg


This post describes a sample application for cropping images, written in Swift.
The application is available on Github and is called CropImg.

The CroppableImageView class:

The main class is the CroppableImageView class, which is a subclass of UIView.
To use a CroppableImageView in your project, drag a UIView into your XIB/storyboard. Then use the "Identity Inspector" to change the type to CroppableImageView.
If you need to be notified when there is a valid crop area defined, set up a delegate object that conforms to the CroppableImageViewDelegateProtocol. That protocol only has one method, haveValidCropRect(). The CroppableImageView will call your haveValidCropRect() method when the user selects/deselects a crop rectangle. You can use the haveValidCropRect() method to enable/disable a crop button, for example.
The CroppableImageView has a method croppedImage() that returns a new image containing the portion of the source image the user has selected, or nil if the selection rectangle isn't valid.
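As a rough illustration of how those pieces fit together (this is a sketch, not code copied from CropImg; the Bool parameter on haveValidCropRect() and the cropDelegate property name are assumptions based on the description above):

class CropViewController: UIViewController, CroppableImageViewDelegateProtocol {
  @IBOutlet var cropView: CroppableImageView!
  @IBOutlet var cropButton: UIButton!

  override func viewDidLoad() {
    super.viewDidLoad()
    cropView.cropDelegate = self        // assumed delegate property name
    cropButton.isEnabled = false        // no crop rectangle selected yet
  }

  // Called by the CroppableImageView when the user selects/deselects a crop rectangle.
  func haveValidCropRect(_ haveValidCropRect: Bool) {
    cropButton.isEnabled = haveValidCropRect
  }

  @IBAction func handleCropButton(_ sender: UIButton) {
    if let cropped = cropView.croppedImage() {
      // Use the cropped image, e.g. save it to the photo album.
      UIImageWriteToSavedPhotosAlbum(cropped, nil, nil, nil)
    }
  }
}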

The CornerpointView class:

The CroppableImageView class uses another class, CornerpointView, to draw the cornerpoints of the image view and allow dragging of the cornerpoints. A CroppableImageView sets up 4 CornerpointView objects and adds them as subviews in its init method.
The initializers for CornerpointView create pan gesture recognizers and connect them to the view, so CornerpointView objects are automatically draggable. The CornerpointView's centerPoint property is optional and is initially nil. The centerPoint property has a didSet observer that hides the CornerpointView if the centerPoint is nil and un-hides the corner point if the centerPoint is not nil.
The CornerpointView class has an optional cornerpointDelegate property. (If you set a cornerpointDelegate, it must conform to the CornerpointClientProtocol.) The CroppableImageView sets itself up as the delegate of its CornerpointViews.
The only method in the CornerpointClientProtocol is cornerHasChanged. It simply tells the delegate that the user has moved the corner point. It passes a pointer to itself so the delegate can tell which corner has changed.
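Based on that description, the protocol amounts to something like this (a sketch; the exact signature in the project may differ):

protocol CornerpointClientProtocol: AnyObject {
  // Tells the delegate that the user dragged this corner point to a new position.
  func cornerHasChanged(_ cornerpoint: CornerpointView)
}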

The ViewController class:

The ViewController class coordinates between the CroppableImageView and the button that triggers image cropping.
The ViewController class also offers a button to load a new image into the image view.
Loading a new image is handled by the handleSelectImgButton IBAction method. This method uses the UIAlertController class, added in iOS 8, instead of the now-deprecated UIAlertView. (Note that if you want your app to run under both iOS 7 and iOS 8, you will still have to use a UIAlertView, or write code that uses a UIAlertView on iOS 7 and a UIAlertController on iOS 8.)
UIAlertController uses a modern block-based design pattern, where you create one or more UIAlertAction objects and attach them to the UIAlertController. These UIAlertAction objects are usually drawn as buttons, and include a block of code that's executed when the user chooses that option.
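In Swift the pattern looks roughly like this (a sketch; the titles and message are illustrative, and pickImageFromSource is the project's method mentioned in the next paragraph):

// Inside a view controller:
let alert = UIAlertController(title: "Select an Image",
                              message: "Where should the image come from?",
                              preferredStyle: .actionSheet)
alert.addAction(UIAlertAction(title: "Take a New Picture", style: .default) { _ in
  // e.g. call pickImageFromSource with the camera as the source
})
alert.addAction(UIAlertAction(title: "Select Picture from library", style: .default) { _ in
  // e.g. call pickImageFromSource with the photo library as the source
})
alert.addAction(UIAlertAction(title: "Cancel", style: .cancel, handler: nil))
present(alert, animated: true, completion: nil)
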
The "Take a New Picture" action and the "Select Picture from library" action both call the method pickImageFromSource. This method creates and displays a UIImagePickerController. The docs for UIImagePickerController say that you must use a popover to display the picker controller in a popover on iPad for anything but taking a picture with the camera. However, I've found that displaying a full-screen picker works on iPad, and it gives the user more room to navigate their photo library.
The crop button on the view controller's view is linked to the handleCropButton() IBAction method. The handleCropButton() method calls the CroppableImageView's croppedImage() method to create a cropped image. It then plays a shutter sound, displays a white view on top of the image to simulate a flash of light, and finally calls the Cocoa Touch function UIImageWriteToSavedPhotosAlbum to save the cropped image to the user's photo album.
There is code at the bottom of the handleCropButton() method that will save the cropped image to the user's documents directory instead, in case that's what you need to do in your app.
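If that's the route you need, the save-to-documents step might look something like this (a sketch using modern Swift APIs, not the project's exact code):

func saveImageToDocuments(_ image: UIImage, fileName: String) throws {
  let docsURL = try FileManager.default.url(for: .documentDirectory,
                                            in: .userDomainMask,
                                            appropriateFor: nil,
                                            create: true)
  guard let data = image.pngData() else { return }   // PNG encoding of the cropped image
  try data.write(to: docsURL.appendingPathComponent(fileName), options: .atomic)
}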

Friday, April 3, 2015

Testing Swift's performance against C/Objective-C


When Apple first announced Swift, they claimed dramatic speed improvements. 

I was skeptical. I'm an old assembler programmer. I know how to write code that runs fast. 

When I write code that's churning through large amounts of data, I tend to use C arrays and pointer math, rather than trying to use NSArrays. It seems obvious that the overhead of everything being an object and message passing is going to slow things down.

Thus, I suspected that Apple's claims that Swift is faster than Objective-C were based on naive code written in Objective-C, where everything is an object.

Several years ago I wrote a prime number generator as a teaching aid for my son, who was starting to get interested in programming. It's mostly in C, with some Objective-C wrappers for displaying information and such-like.

I decided to implement the same algorithm in Swift and see which is faster. This algorithm makes heavy use of arrays. It creates an array of known prime numbers, and tests each candidate number to see if it is evenly divisible by any of the existing known primes. If not, it gets added to the list of primes. It ends up doing millions of array accesses, and an array append for each new prime.
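The core of the algorithm looks roughly like this in Swift (a sketch of the approach, not the benchmark's exact code; the square-root cutoff is a standard trial-division optimization):

func generatePrimes(count: Int) -> [Int] {
  var primes: [Int] = [2]
  var candidate = 3
  while primes.count < count {
    var isPrime = true
    for prime in primes {
      if prime * prime > candidate { break }   // no divisors past sqrt(candidate)
      if candidate % prime == 0 {              // evenly divisible by a known prime
        isPrime = false
        break
      }
    }
    if isPrime {
      primes.append(candidate)                 // one array append per new prime
    }
    candidate += 2                             // even numbers > 2 can't be prime
  }
  return primes
}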

I ended up with a test project that lets you run the same algorithm in either Swift or Objective-C, and use either array objects or memory buffers in either language. (Swift has a collection wrapper for memory buffers that gives you array syntax.)

You can download the project from Github and try it yourself. It's called 


The bottom line is this:

With the standard Debug settings, where optimizations are turned off, Swift performance is horrible. Code that takes a few seconds in Objective-C takes minutes in Swift (6 min 7 sec for Swift vs. about 9.2 sec for Objective-C). For array-intensive algorithms, Swift in debug mode is all but unusable.

All my tests therefore use release mode, with optimizations set to "Fastest" (or "Fastest, Unchecked"; see below).

C code beats out the fastest Swift code I could come up with, but the differences are modest. In Xcode 6.2, using Swift 1.1, the Swift version takes about 1.45 times longer than the Objective-C code using C arrays.

If you turn off array index range checking with a compiler switch, the Swift code gets a little faster, and if you use the Xcode 6.3 beta, it gets a little faster still. In that case the Swift code (using memory buffers with an array interface) is very close to Objective-C/C. Swift takes about 1.08 times longer than my C/Objective-C code. That's awfully close. However, turning off array bounds checking is dangerous, and I don't know of a way to do that for only one array object. As far as I know this change is for the entire project. It's probably possible to change compiler settings for a single source file, but I don't know how to do that off the top of my head.

Things are much different if you use the array classes in both languages.

In that case, Swift is faster than Objective-C, and by a lot. To generate 2 million primes, Swift takes about 7.35 seconds, down to about 5.9 seconds if you use Xcode 6.3 and use the compiler setting that turns off array bounds checking. Objective-C using NSArrays takes around 31 seconds to do the same job. Objective-C with NSArrays takes about 4.3 times longer than Swift using its Array class.

So the bottom line is that if you're using Array objects in both languages, Swift is much faster. If you're writing optimized code to squeeze the fastest performance out of each, though, C/Objective-C still has a slight edge.

Drawing pie charts: An early experiment in Swift

Lately I've been studying Swift. There was a question on Stack Overflow about how to generate a pie chart, so I decided to make that a test project. Thus the app PieChart was born.

PieChart

A sample iOS app written in Swift that generates pie charts.

You can download the project from Github: PieChart

This program demonstrates a number of techniques, both in using the Swift language and using UIKit and Core Animation.
The screen looks like this:


The app defines a structure Slice which describes a single slice of a pie chart:

struct Slice
{
  var radius: CGFloat
  var width:  CGFloat
  init(
    radius:     CGFloat = 1.0,
    width:      CGFloat = 0.125
    )
  {
    self.radius = radius
    self.width = width
  }
}

Both the radius and width parameters have default values, so you can create a Slice object with

Slice()
Slice(radius: 1.0)
Slice(width: 1.0)

Or

Slice(radius: 0.2, width: 0.5)

If you don't specify a radius, all the slices use the largest radius size. If you don't specify a width, all slices get the same width value, so each slice has the same arc angle.
The class PieChartView, a subclass of UIView, does most of the work.
It has a property slices: [Slice] that is an array of Slice objects.
The slices property of the PieChartView has a didSet property observer, so when you change the slices array, the view updates the pie chart to reflect the changes.
This is a cool trick with Swift. This simple property declaration

var slices: [Slice] = []
{
  didSet
  {
    self.updatePath()
  }
}

...defines the property observer.
The property observer invokes the method updatePath() if you change the slices array or any of its elements. The updatePath() method rebuilds the pie chart path and displays it.
The pie chart graph is drawn using a CAShapeLayer attached to the view. The PieChartView creates a UIBezierPath that contains "pie wedge" shapes for each slice in the graph.
It then installs the CGPath from the UIBezierPath into the path property of the view's CAShapeLayer.
If you change the values in the slices array without changing the number of elements, the PieChartView animates the changes to the graph by creating a CABasicAnimation that animates the change to the CAShapeLayer's path property.
Animating a CAShapeLayer's path property only works properly if the starting and ending path have the same number and type of control points.
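A bare-bones version of that path animation looks like this (a sketch; shapeLayer and newPath stand in for the view's shape layer and the freshly rebuilt pie chart path):

let pathAnimation = CABasicAnimation(keyPath: "path")
pathAnimation.fromValue = shapeLayer.path      // the current pie chart path
pathAnimation.toValue = newPath                // the rebuilt path from updatePath()
pathAnimation.duration = 0.3
pathAnimation.timingFunction = CAMediaTimingFunction(name: .easeInEaseOut)
shapeLayer.path = newPath                      // update the model value
shapeLayer.add(pathAnimation, forKey: "pathAnimation")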