Updating Mac Apps for Retina Displays, Including Lots of CGImageRefs

June 21, 2012

As expected, Apple made the surprise announcement of retina MacBook Pros at last week’s WWDC. This is tremendous.

This change is pretty straightforward: for every 1 pixel you used to have, you now have 4. Most classes in Cocoa get this awesomeness “for free,” which means you don’t really have to think about it all that much: plop down some “@2x” graphics and you’re golden, for the most part. (LOL, I know that’s easier said than done.)

The Mac version of Lens•Lab, unfortunately, did not get much for free. I use a lot of CGImageRefs, and these are not “retina aware.” So I had to figure out which parts of Lens•Lab’s drawing routines needed to be updated, and what else had to be altered to make the app behave the same for users with retina displays. That’s what I did this afternoon.

Step 0: prepare your development environment.

Make sure you’ve downloaded the Graphics Tools for Xcode package, which includes Quartz Debug, the utility that lets you enable HiDPI display modes. You’ll need a pretty big monitor for these modes to be useful. For example, my monitor is a 24″ with a native resolution of 1920×1200. Halving that on both axes makes for a HiDPI resolution of 960×600. Not huge, but enough to work with.

Step 0.5: Run your app!

This seems kind of silly but you should just run it to see if there are any major problems with it. I did and here’s what I found:

Obviously, something is going wrong. Let’s try to find out what!

Step 1: separate points from pixels.

Up to now, whenever you’ve created apps for the Mac you could safely assume that 1 pixel in your app would be represented by 1 pixel on your monitor. Retina displays change all that and we need to start thinking in terms of “points” rather than “pixels.” Interface elements like windows, views, and controls should actually be measured in points and the backing pixel data should be left to the operating system to figure out. Most of the time.
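For example, on 10.7 and later NSView can convert between the two coordinate spaces for you. A quick illustration (inside an NSView subclass; the rect values are made up):

// A rect measured in points, i.e. the units your layout code uses.
NSRect pointRect = NSMakeRect(0.0, 0.0, 100.0, 100.0);

// The same rect in backing pixels: 100×100 on a normal display,
// 200×200 on a retina display.
NSRect pixelRect = [self convertRectToBacking:pointRect];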

In my case with Lens•Lab, all the magic happens in the main NSView’s drawRect: method. The basic order of operations is this:

1) draw the background image (the blue/green gradient thing). No problem: it’s an NSImage drawn into an NSView, and both of those get retina for free.

2) draw the vector artwork at the appropriate scale. This is also no problem, it’s a CGPathRef; all vectors.

3) create a bitmap of the background. I do this by calling [self lockFocus] on the view, getting an NSBitmapImageRep from it, and creating a CGImageRef from that. Here’s our first bit of hairiness. NSBitmapImageRep is retina aware but CGImageRef is not. What’s going to happen is that we’ll create a CGImageRef that has the number of pixels appropriate for the display, but no information about the display’s native scaling factor. For example, on a normal display the NSView might be 800 points wide and our CGImageRef of it would be 800 pixels wide. But on a retina display the NSView might be 800 points wide but the CGImageRef will be 1600 pixels wide! At this point it would be nice to know the display’s scaling factor so we can compensate for it. Note: normally we would not care, but in our case we need to know because we’re using CGImageRefs that are retina ignorant.
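Here’s roughly what that pipeline looks like (a sketch with my own variable names, assuming ARC; error handling omitted):

// Snapshot the view’s current contents into a bitmap rep.
[self lockFocus];
NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithFocusedViewRect:[self bounds]];
[self unlockFocus];

// The CGImageRef picks up the backing store’s full pixel count,
// but carries no scale information along with it.
CGImageRef backgroundBitmap = [rep CGImage];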

Here’s how I get the display scaling information:

// Default to 1.0; pre-10.7 systems (and normal displays) use this value.
CGFloat displayScale = 1.f;
if ([[NSScreen mainScreen] respondsToSelector:@selector(backingScaleFactor)]) {
    displayScale = [NSScreen mainScreen].backingScaleFactor;
}

I use the respondsToSelector: check because the backingScaleFactor property is 10.7-only and I still target 10.6 for distribution. (There are no retina machines running 10.6, so this effectively sets displayScale to 1 for them.)

4) after we get a bitmap of the background + vector artwork, we chop it into two pieces: one for the blurred foreground, the other for the blurred background. Since the bitmap we created in step 3 has pixels but no scaling information, from here on we have to adjust for that fact. We create the two pieces using CGImageCreateWithImageInRect(). The CGRects that define the foreground and background blur areas need to be adjusted to account for possibly different scales: we basically multiply each member of the CGRect struct by the scaling factor.
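Here’s a sketch of that adjustment (ScaleRect is a hypothetical helper; the shipping code may look different):

// Convert a rect measured in points into the pixel coordinates
// of a bitmap that carries no scale information.
static CGRect ScaleRect(CGRect rect, CGFloat scale)
{
    return CGRectMake(rect.origin.x * scale,
                      rect.origin.y * scale,
                      rect.size.width * scale,
                      rect.size.height * scale);
}

// foregroundRect was computed in view (point) coordinates;
// backgroundBitmap is the scale-ignorant CGImageRef from step 3.
CGImageRef foregroundPiece = CGImageCreateWithImageInRect(backgroundBitmap, ScaleRect(foregroundRect, displayScale));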

5) blur the foreground and background bitmaps by a set amount. You’d think this would be straightforward, but it’s not. When we blur with our blur operation (it’s a subclass of NSOperation), we pass in the blur radius. Since a retina image has twice the resolution, we need to multiply the blur radius by the scaling factor for the apparent radius to remain the same. So blurRadius gets multiplied by the scaling factor. Cool.
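In code, that’s a single multiplication (scaledBlurRadius is an illustrative name):

// The radius is specified in points; scale it to pixels so the
// blur looks the same at 1x and 2x.
CGFloat scaledBlurRadius = blurRadius * displayScale;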

6) draw the blurred images in the NSView. Now we’re taking our high-resolution images and drawing them into an NSView that doesn’t know they are hi-res. To draw these bitmaps we use CGContextDrawImage() and need to pass in rects to draw our images into. The measurements of these rects are the measurements of the blurred pieces *divided* by the scaling factor. This keeps the high pixel count but brings the drawn size back down to points, which is perfect.
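Something like this (a sketch; blurredForeground stands in for one of the pieces, drawn at the origin for simplicity):

CGContextRef context = (CGContextRef)[[NSGraphicsContext currentContext] graphicsPort];

// The image has retina-sized pixel dimensions, so divide by the
// scale to get a destination rect measured in points.
CGRect destRect = CGRectMake(0.0, 0.0,
                             CGImageGetWidth(blurredForeground) / displayScale,
                             CGImageGetHeight(blurredForeground) / displayScale);
CGContextDrawImage(context, destRect, blurredForeground);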

7) draw everything else. Everything else is just vector drawing commands and drawing text. No big deal, all retina aware! Here’s the finished product:

Step 2: THERE IS NO STEP 2!!!

The new retina MacBook Pros are a major advancement in personal computing. The only other similarly weighty advancement I can think of is the move to SSDs. For programmers, it may take a bit of work to make your apps take advantage of this new display technology, especially if you get down to the pixel level. But the benefits are more than worth it!

Lens•Lab for the Mac, updated for retina displays, has been submitted to the Mac App Store. I expect it to be released within a week.


On the Value of Apple Laptops

June 19, 2012

Three years ago at WWDC, Apple announced the unibody 13″ MacBook Pro. I ordered one within hours of the announcement.

I had been using a white MacBook for a few years and I thought it was a great machine. After the aluminum unibody machines came out I started lusting after those designs. I had had experience with 15″ laptops before and I just thought they were too big. I needed FireWire. All of these needs and wants were distilled perfectly in the 13″ unibody MacBook Pro.

I purchased the lowest-end configuration (2GB of RAM) with my wife’s student discount and immediately upgraded the RAM to 4GB with memory purchased from NewEgg. Over the years I’ve upgraded the RAM again to 8GB ($35) and added an 80GB Intel SSD (X25-M G2). I just replaced the old battery (1,052 charge/discharge cycles) myself with a new one for $66. I’ll be installing Mountain Lion as soon as the golden master is released.

With that new battery, my three-year-old MacBook Pro is better than it ever was new. It’s totally fast enough for me and the work I do on it (which includes software development and SLR photography). It has kept its value so well that it’s currently worth $850, according to the internet. Subtract that from the total cost of $1,450 ($1,100 for the laptop, $35 for the RAM, $65 for the battery, $250 for the SSD) and you get $600. That’s what I’ve spent on computer technology for the past three years: $200 a year.


NSString Concatenation (Category)

June 6, 2012

Hi folks. Just got back from a trip but I thought I would write you and tell you about this thing.

Sometimes, you just want to concatenate a bunch of strings together. There are several ways to do this. You can use stringByAppendingString: (or stringByAppendingFormat:) with NSStrings, or appendString: with NSMutableStrings. You can also add a bunch of strings to an array and join them with componentsJoinedByString:. Some languages overload the “+” operator to allow strings to be added together; since Objective-C is a strict superset of C, we can’t overload that operator.

So, what I’ve done is simply create a category on the NSString class. It contains a single class method called “concatenateStrings” that takes any number of arguments. Any number of NSString or NSMutableString arguments are concatenated together and returned as an NSString.
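Here’s a minimal sketch of what such a category might look like (the example project’s actual implementation may differ; the nil terminator is my assumption about how the variadic list ends):

// NSString+Concatenation.h
#import <Foundation/Foundation.h>

@interface NSString (Concatenation)
// Concatenates a nil-terminated list of strings into a new NSString.
+ (NSString *)concatenateStrings:(NSString *)first, ... NS_REQUIRES_NIL_TERMINATION;
@end

// NSString+Concatenation.m
#import "NSString+Concatenation.h"

@implementation NSString (Concatenation)

+ (NSString *)concatenateStrings:(NSString *)first, ...
{
    NSMutableString *result = [NSMutableString string];
    va_list args;
    va_start(args, first);
    for (NSString *piece = first; piece != nil; piece = va_arg(args, NSString *)) {
        [result appendString:piece];
    }
    va_end(args);
    return [NSString stringWithString:result];
}

@end

Used like so:

NSString *combined = [NSString concatenateStrings:@"foo", @"bar", @"baz", nil];
// combined is @"foobarbaz"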

Get the example project from here.