Lecture #5: Protocols and Gestures

Please note, this blog entry is from a previous course. You might want to check out the current one.

The first lecture of the third week of the course is named “Protocols and Gestures (October 11, 2011)” and can be found at iTunes. Its slides are available at Stanford.

It starts with a theoretical part presenting

  • Autorotation,
  • Protocols and
  • Gesture Recognizers.

When a device rotates, you can choose how your application should react:

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)orientation
{
    // only support portrait
    return UIInterfaceOrientationIsPortrait(orientation);
    // support all orientations
    return YES;
    // anything but upside down
    return (orientation != UIInterfaceOrientationPortraitUpsideDown);
}

The adjustment of all subviews is based on their struts and springs, set in the storyboard or via code. However, the subviews are normally not recreated with their new size parameters, but are stretched, squished or moved instead. This behavior can be changed via the

@property (nonatomic) UIViewContentMode contentMode; 

setting it to

  • UIViewContentMode{Left,Right,Top,Bottom,BottomLeft,BottomRight,TopLeft,TopRight} or
  • UIViewContentModeScale{ToFill,AspectFill,AspectFit} or
  • UIViewContentModeRedraw.

The latter will call drawRect: to recreate your content.
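For a view that draws itself in drawRect:, this could look like the following sketch (FaceView is just an illustrative name for such a custom view, not code from the lecture):

#import <UIKit/UIKit.h>

// somewhere in a view controller, e.g. in viewDidLoad
FaceView *faceView = [[FaceView alloc] initWithFrame:self.view.bounds];
// ask the view to redraw itself (drawRect:) after rotation instead of being stretched
faceView.contentMode = UIViewContentModeRedraw;
[self.view addSubview:faceView];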

Protocols define an additional API which anybody who wants to use the protocol has to implement (its required methods) or may choose to implement (its optional methods).
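A view could, for example, declare a data-source protocol like the following sketch (the names FaceViewDataSource, smileForFaceView: and faceViewShouldAnimate: are made up for illustration):

#import <UIKit/UIKit.h>

@class FaceView;

@protocol FaceViewDataSource <NSObject>
- (float)smileForFaceView:(FaceView *)sender;      // required method
@optional
- (BOOL)faceViewShouldAnimate:(FaceView *)sender;  // optional method
@end

@interface FaceView : UIView
// whoever sets itself as dataSource must conform to the protocol
@property (nonatomic, weak) id <FaceViewDataSource> dataSource;
@end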

Gesture recognizers define which gestures the application should react to and what should happen when they are recognized.
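A pinch recognizer could be wired up like this (a minimal sketch; the faceView outlet and its scale property are assumptions, not code from the lecture):

- (void)viewDidLoad
{
    [super viewDidLoad];
    // attach a pinch recognizer to the view and point it at the handler below
    UIPinchGestureRecognizer *pinchgr =
        [[UIPinchGestureRecognizer alloc] initWithTarget:self
                                                  action:@selector(pinch:)];
    [self.faceView addGestureRecognizer:pinchgr];
}

// called repeatedly while the gesture is in progress
- (void)pinch:(UIPinchGestureRecognizer *)gesture
{
    if ((gesture.state == UIGestureRecognizerStateChanged) ||
        (gesture.state == UIGestureRecognizerStateEnded)) {
        self.faceView.scale *= gesture.scale; // apply the incremental scale
        gesture.scale = 1;                    // reset so the next callback is incremental again
    }
}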

The lecture concludes with a 30-minute demo of what was presented during this and the previous lecture. The code from the lecture can be downloaded directly from Stanford or from github.

