Lecture #16: Action Sheets, Image Picker, Core Motion

Please note, this blog entry is from a previous course. You might want to check out the current one.

Lecture sixteen is named “16. Action Sheets, Image Picker, Core Motion (November 17, 2011)” and can be found at iTunes. Its slides are available at Stanford.

This lecture continues the previous one, providing more insight into timers, and also gives an overview of alerts and action sheets, image pickers, and Core Motion.

Perform after delay is an alternative to NSTimer, which was discussed in the previous lecture:

- (void)performSelector:(SEL)aSelector
             withObject:(id)argument
             afterDelay:(NSTimeInterval)seconds;

It executes the selector on the current thread (only the main thread should be used) after the specified delay. Like NSTimer it is not a real-time timer. Even when a delay of zero seconds is specified, it will not execute immediately but allows a task to reschedule itself.
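For example, a periodic task might reschedule itself like this (a sketch; poll and the two-second interval are illustrative, not from the lecture):

- (void)poll
{
    // ... do some periodic work on the main thread ...
    // reschedule this hypothetical method to run again in two seconds
    [self performSelector:@selector(poll) withObject:nil afterDelay:2.0];
}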

It is possible to cancel scheduled tasks:

+ (void)cancelPreviousPerformRequestsWithTarget:(id)target
                                       selector:(SEL)aSelector
                                         object:(id)object;
+ (void)cancelPreviousPerformRequestsWithTarget:(id)target;

While the first method cancels a specific request, the second cancels all scheduled requests for the target. However, there is no way to find out which requests are still outstanding.
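Canceling typically belongs where the object goes away; a sketch, assuming a hypothetical poll method scheduled as above:

- (void)viewWillDisappear:(BOOL)animated
{
    [super viewWillDisappear:animated];
    // cancel the pending perform request for the hypothetical poll method
    [NSObject cancelPreviousPerformRequestsWithTarget:self
                                             selector:@selector(poll)
                                               object:nil];
}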

The demo at this point of the lecture enhances the Kitchen-Sink application from the previous lecture showing the various timer mechanisms as well as further animations.

Action sheets and alerts are two kinds of pop-ups for interacting with the user. Action sheets slide in from the bottom of the screen on the iPhone and appear in a popover on the iPad. Alerts always pop up in the middle of the screen. While action sheets can ask questions with more than two answers, alerts are limited to at most two. Alerts should be used with care, however, as they tend to be disruptive to the user interface.

Buttons on action sheets are set up in the initializer or can be added programmatically later on:

-    (id)initWithTitle:(NSString *)title
              delegate:(id <UIActionSheetDelegate>)delegate
     cancelButtonTitle:(NSString *)cancelButtonTitle
destructiveButtonTitle:(NSString *)destructiveButtonTitle
     otherButtonTitles:(NSString *)otherButtonTitles, ...;

- (void)addButtonWithTitle:(NSString *)buttonTitle;

For the iPhone – as mentioned above – the action sheet has to be shown differently:

[actionSheet showInView:(UIView *)];

than on the iPad:

[actionSheet showFromRect:(CGRect) inView:(UIView *) animated:(BOOL)];
[actionSheet showFromBarButtonItem:(UIBarButtonItem *) animated:(BOOL)];

While showFromRect:inView:animated: can also be used on the iPhone, showFromBarButtonItem:animated: won't work there.
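Putting the pieces together, creation and presentation might look like this (a sketch; the titles and the sidebarButton outlet are hypothetical):

UIActionSheet *actionSheet = [[UIActionSheet alloc] initWithTitle:@"Choose"
                                                         delegate:self
                                                cancelButtonTitle:@"Cancel"
                                           destructiveButtonTitle:@"Delete"
                                                otherButtonTitles:@"Option A", @"Option B", nil];
if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPad) {
    // hypothetical bar button item outlet for the popover anchor
    [actionSheet showFromBarButtonItem:self.sidebarButton animated:YES];
} else {
    [actionSheet showInView:self.view];
}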

The return values from an action sheet are received by delegates:

- (void)actionSheet:(UIActionSheet *)sender clickedButtonAtIndex:(NSInteger)index;

using indexes to access the pressed button:

@property NSInteger cancelButtonIndex;
@property NSInteger destructiveButtonIndex;
@property (readonly) NSInteger firstOtherButtonIndex;
@property (readonly) NSInteger numberOfButtons;
- (NSString *)buttonTitleAtIndex:(NSInteger)index;

The action sheet can be dismissed via code:

- (void)dismissWithClickedButtonIndex:(NSInteger)index animated:(BOOL)animated;

Action sheets shown in a popover on the iPad do not need a cancel button because they are dismissed automatically by tapping outside the sheet. Special care has to be taken when the tap that opens an action sheet is repeated, as this would open multiple action sheets. Note that currently the human interface guidelines do not allow closing an action sheet by tapping its button a second time.
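The delegate method then distinguishes the buttons by index; a sketch:

- (void)actionSheet:(UIActionSheet *)sender clickedButtonAtIndex:(NSInteger)index
{
    if (index == sender.destructiveButtonIndex) {
        // perform the destructive operation
    } else if (index != sender.cancelButtonIndex) {
        // look up the title of the tapped button and react to it
        NSString *choice = [sender buttonTitleAtIndex:index];
    }
}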

Setting up and using an alert view is similar to action sheets:

- (id)initWithTitle:(NSString *)title
            message:(NSString *)message
           delegate:(id <UIAlertViewDelegate>)delegate
  cancelButtonTitle:(NSString *)cancelButtonTitle
  otherButtonTitles:(NSString *)otherButtonTitles, ...;

- (void)addButtonWithTitle:(NSString *)buttonTitle;

[alertView show];
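A minimal complete example (the title and message are illustrative; with a nil delegate the OK button simply dismisses the alert):

UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Login Failed"
                                                message:@"Please check your password."
                                               delegate:nil
                                      cancelButtonTitle:@"OK"
                                      otherButtonTitles:nil];
[alert show];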

The demo at this point of the lecture sets up an action sheet controlling the functionality of the Kitchen-Sink application.

The UIImagePickerController is a modal view controller for getting media from the camera or the photo library. What it can provide depends heavily on the capabilities of the device in use, e.g.:

+ (BOOL)isSourceTypeAvailable:(UIImagePickerControllerSourceType)sourceType;
+ (NSArray *)availableMediaTypesForSourceType:(UIImagePickerControllerSourceType)sourceType;

Note that it might be necessary to add the Mobile Core Services framework and import MobileCoreServices/MobileCoreServices.h depending on your setup.

Further available device checks are:

+ (BOOL)isCameraDeviceAvailable:(UIImagePickerControllerCameraDevice)cameraDevice;
+ (BOOL)isFlashAvailableForCameraDevice:(UIImagePickerControllerCameraDevice)cameraDevice;
+ (NSArray *)availableCaptureModesForCameraDevice:(UIImagePickerControllerCameraDevice)cameraDevice; 

When the device capabilities have been assessed, it is time to set the source and the type of the media (the lecture slides abbreviate UIImagePickerController as UIIPC; here it is written out), e.g.:

UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.delegate = self;
if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
}
NSString *desired = (NSString *)kUTTypeMovie; // requires MobileCoreServices
if ([[UIImagePickerController availableMediaTypesForSourceType:picker.sourceType]
                                            containsObject:desired]) {
    picker.mediaTypes = [NSArray arrayWithObject:desired];
    // proceed to the actual pick
} else {
    // failure
}

Note that the delegate above needs to implement both protocols UIImagePickerControllerDelegate and UINavigationControllerDelegate.

The user can be allowed to edit the media before it is sent to the delegate:

@property BOOL allowsEditing;

The actual capture can be limited in quality and duration:

@property UIImagePickerControllerQualityType videoQuality;
@property NSTimeInterval videoMaximumDuration;

The picker is presented as a modal view controller – on the iPad, camera and photo library have to be presented differently (the photo library in a popover).
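A sketch of the presentation, assuming the picker configured above and a hypothetical cameraBarButton outlet for the iPad case:

if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPad &&
    picker.sourceType == UIImagePickerControllerSourceTypePhotoLibrary) {
    // The photo library must be shown in a popover on the iPad.
    UIPopoverController *popover =
        [[UIPopoverController alloc] initWithContentViewController:picker];
    [popover presentPopoverFromBarButtonItem:self.cameraBarButton // hypothetical outlet
                    permittedArrowDirections:UIPopoverArrowDirectionAny
                                    animated:YES];
} else {
    [self presentModalViewController:picker animated:YES];
}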

When the user has finished, the delegate is called and has to dismiss the modal view, e.g.:

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // handle returned media
    [self dismissModalViewControllerAnimated:YES];
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker
{
    [self dismissModalViewControllerAnimated:YES];
}

The info dictionary above provides information about the captured media via the following keys:

UIImagePickerControllerMediaType
UIImagePickerControllerOriginalImage
UIImagePickerControllerEditedImage
UIImagePickerControllerCropRect
UIImagePickerControllerMediaMetadata
UIImagePickerControllerMediaURL
UIImagePickerControllerReferenceURL
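For a still image, for instance, the delegate might pull the picked photo out of the dictionary like this (a sketch, preferring the edited image when editing was allowed):

// inside imagePickerController:didFinishPickingMediaWithInfo:
UIImage *image = [info objectForKey:UIImagePickerControllerEditedImage];
if (!image) image = [info objectForKey:UIImagePickerControllerOriginalImage];
// use the image, e.g. show it in a UIImageView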

The demo at this point of the lecture adds the possibility to take pictures and add them to the kitchen sink. Note that this part of the demo works only on real devices with camera support and will not work in the simulator.

The Core Motion framework provides access to the motion-sensing hardware, namely the accelerometer, the gyroscope, and the magnetometer. As above, not all devices have all those features, so it is necessary to check which are available before using them:

@property (readonly) BOOL {accelerometer,gyro,magnetometer,deviceMotion}Available;

The sensors are started and stopped via

- (void)start{Accelerometer,Gyro,Magnetometer,DeviceMotion}Updates;
- (void)stop{Accelerometer,Gyro,Magnetometer,DeviceMotion}Updates;

Stopping the data collection whenever possible is essential for performance.

The properties

@property (readonly) BOOL {accelerometer,gyro,magnetometer,deviceMotion}Active;

tell which sensors are currently active.
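A minimal availability check might look like this (a sketch; Apple recommends creating only one CMMotionManager per application):

CMMotionManager *manager = [[CMMotionManager alloc] init];
if (manager.accelerometerAvailable) {
    [manager startAccelerometerUpdates]; // data then appears in manager.accelerometerData
}
// ... later, when the data is no longer needed ...
if (manager.accelerometerActive) {
    [manager stopAccelerometerUpdates];
}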

The accelerometer provides

@property (readonly) CMAccelerometerData *accelerometerData;

including

@property (readonly) CMAcceleration acceleration;
typedef struct { double x; double y; double z; } CMAcceleration; // in “g”

The gyroscope provides

@property (readonly) CMGyroData *gyroData;

including

@property (readonly) CMRotationRate rotationRate;
typedef struct { double x; double y; double z; } CMRotationRate; // in radians/second

The magnetometer provides

@property (readonly) CMMagnetometerData *magnetometerData;

including

@property (readonly) CMMagneticField magneticField; 
typedef struct { double x; double y; double z; } CMMagneticField; // in microteslas

An “intelligent” combination of those sensors is provided via

@property (readonly) CMDeviceMotion *deviceMotion;

including

@property (readonly) CMAcceleration gravity;
@property (readonly) CMAcceleration userAcceleration;
typedef struct { double x; double y; double z; } CMAcceleration;

@property (readonly) CMRotationRate rotationRate;
typedef struct { double x; double y; double z; } CMRotationRate;

@property (readonly) CMAttitude *attitude;
@interface CMAttitude : NSObject
@property (readonly) double roll;
@property (readonly) double pitch;
@property (readonly) double yaw;
@end

@property (readonly) CMCalibratedMagneticField magneticField;
typedef struct {
    CMMagneticField field;
    CMMagneticFieldCalibrationAccuracy accuracy;
} CMCalibratedMagneticField;
typedef enum {
    CMMagneticFieldCalibrationAccuracyUncalibrated,
    CMMagneticFieldCalibrationAccuracyLow,
    CMMagneticFieldCalibrationAccuracyMedium,
    CMMagneticFieldCalibrationAccuracyHigh
} CMMagneticFieldCalibrationAccuracy;

To use a sensor, a handler has to be registered with it:

- (void)startAccelerometerUpdatesToQueue:(NSOperationQueue *)queue
                             withHandler:(CMAccelerometerHandler)handler;
typedef void (^CMAccelerometerHandler)(CMAccelerometerData *data, NSError *error); 

- (void)startGyroUpdatesToQueue:(NSOperationQueue *)queue
                    withHandler:(CMGyroHandler)handler;
typedef void (^CMGyroHandler)(CMGyroData *data, NSError *error);

- (void)startMagnetometerUpdatesToQueue:(NSOperationQueue *)queue
                            withHandler:(CMMagnetometerHandler)handler;
typedef void (^CMMagnetometerHandler)(CMMagnetometerData *data, NSError *error);

- (void)startDeviceMotionUpdatesToQueue:(NSOperationQueue *)queue
                            withHandler:(CMDeviceMotionHandler)handler;
typedef void (^CMDeviceMotionHandler)(CMDeviceMotion *motion, NSError *error); 

- (void)startDeviceMotionUpdatesUsingReferenceFrame:(CMAttitudeReferenceFrame)frame
                                            toQueue:(NSOperationQueue *)queue
                                        withHandler:(CMDeviceMotionHandler)handler;
enum {
    CMAttitudeReferenceFrameXArbitraryZVertical,
    CMAttitudeReferenceFrameXArbitraryCorrectedZVertical,
    CMAttitudeReferenceFrameXMagneticNorthZVertical,
    CMAttitudeReferenceFrameXTrueNorthZVertical
};

@property (nonatomic) BOOL showsDeviceMovementDisplay;

The queue passed to those methods should be either a newly created queue or the main queue:

[[NSOperationQueue alloc] init]
[NSOperationQueue mainQueue]   // or [NSOperationQueue currentQueue]

The rate of the data collection is set via

@property NSTimeInterval accelerometerUpdateInterval;
@property NSTimeInterval gyroUpdateInterval;
@property NSTimeInterval magnetometerUpdateInterval;
@property NSTimeInterval deviceMotionUpdateInterval;
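Putting it together, registering a block-based handler might look like this (a sketch; the ten-updates-per-second rate is illustrative):

CMMotionManager *manager = [[CMMotionManager alloc] init]; // ideally one shared instance per app
manager.accelerometerUpdateInterval = 1.0 / 10.0;          // request ten updates per second
[manager startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue]
                              withHandler:^(CMAccelerometerData *data, NSError *error) {
                                  CMAcceleration a = data.acceleration;
                                  // react to a.x, a.y, a.z (in g)
                              }];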

The code of the demo shown during this lecture is available at Stanford. An extended version showing the image picker in a pop up and adding Core Motion functionality is also available at Stanford and on github.
