Simple SVG Parser for iOS

September 29, 2013

I’ve added a simple SVG Parser for iOS on github. Get it here:

Here’s how it works:

1.) Call the class method to get back an array of styled shapes

self.shapes = [JMSVGParser parseFileNamed:@"awesome_tiger"];

2.) In your view’s - (void)drawRect:(CGRect)rect method, tell each shape to draw, like this:

for (JMStyledPath *styledPath in self.shapes) {
    [styledPath drawStyledPath];
}


That’s it! Comments, questions? Leave a note!

WWDC 2013: I’ll be there!

April 26, 2013

Thanks to my awesome employer and a bit of luck, I’ll be heading to WWDC 2013 in June! I was one of those people who was there at 10:00AM PDT but never made it through the ticket purchase process. Later that day, I got a call from Apple that said I would be getting an email with purchase instructions. I got the email, purchased the ticket, and it’s totally activated!

Hanging out with fellow Apple nerds at WWDC has been a dream of mine for over a decade. I’m super excited to learn and hang out with fellow developers.

Do you have any tips or suggestions on how I can make the most of it? Are you going and want to have a beer or two? Drop a comment or message!

Canon EOS M Not-Quite-A-Review

April 7, 2013

I normally talk about software development and computing generally here but I thought I would post a quick note about this new piece of camera gear that I got recently.

It’s a Canon EOS M, their first mirrorless interchangeable-lens digital camera. I hadn’t planned on getting any new camera gear, but I saw this camera with a 22mm ƒ/2 lens on sale for $450, so I grabbed one.

I’m not going to go over the basics (plenty of others have covered those aspects admirably), I’ll just point out a few things I’ve noticed in the week I’ve been playing around with it.

#1: The 22mm ƒ/2 lens it comes with is a really nice lens. It’s a pancake lens, so it doesn’t take up much room at all. On the EOS M’s APS-C sensor, the focal-length equivalent comes out to 35mm, which is a great “walk around and be able to catch everything” kind of lens. The ƒ/2 maximum aperture is quite fast considering its size. Overall I like it and will likely keep it on the camera as my main lens for photography.

#2: Yes, the focusing system does suck that bad. The biggest complaint with this camera is that focusing is slow and unreliable, and I’d have to agree wholeheartedly. I’m used to my 10D and 5D focusing in a quarter second at most (unless it’s REALLY dark), so waiting 2 to 6 seconds for focus to lock (assuming it doesn’t fail) is very aggravating. Especially at wider apertures, getting focus right is really important, and the EOS M is the worst camera I’ve ever used in this respect.

#3: I’ll have to stop shooting in RAW mode. The 10D I’ve used for years produces RAW files of 4-6MB. The 5D just about doubles that. The EOS M doubles that again, so now I’m looking at 20-30MB for each photograph I take with this camera. That’s just unsustainable, especially in the era of solid-state drives. A high-quality JPEG takes up just 3-6MB, and I guess I’m going to have to live with that.

#4: The video from this camera is just amazing ASSUMING YOU FOCUS BEFOREHAND. You’ll get super-high-bitrate (around 45 Mbit/s according to my tests) 1080p video from your favorite lenses. My favorite setting is “Neutral” with just a hint of sharpness added, a touch more saturation, and the Auto Lighting Optimizer, which seems to boost the lows and mids without blowing out the highlights (and more closely matches the gamma of film). Shooting video with my Sigma 10-20mm ƒ/4-5.6 attached is my new favorite thing in the world. BUT, if you need to focus while shooting video, be prepared for frustration.

#5: Using Lens•Lab has really helped out with getting a feel for depth of field when shooting video. It’s a great feeling to use something you’ve made yourself to solve real world problems.

#6: Strapping a normal lens (like the Sigma 10-20mm or the Canon 24-85mm) onto the tiny EOS M body is a bit ridiculous.

All in all, I’d say the EOS M is a much more impressive video camera than it is a photography camera. It’s handy to carry around with the 22mm pancake prime (I’ll definitely use it in places where I don’t want to carry around the bag with the 5D, 10D, and lenses, batteries, etc.) But the focusing problems are really hard to look past.

It’s an odd duck. The tech specs and interchangeable lenses say semipro/pro but the tiny size and abhorrent focusing say “mediocre consumer photography gear.”

I still have a 5D and 10D around for doing “real” photography. But I think the EOS M will find a place in the jacket pocket or glove box or backpack far more often than the more pro level gear.

Check out some photos from the camera here: and some video here:

New App, Etc.

March 8, 2013

My new app has been on the App Store for a couple of weeks. It’s a QR Code Clock. You read that right.

Also, I still have more Android retrospective stuff and a review of the new 13″ Retina MacBook Pro I got! Wowzers!

New App

February 9, 2013

A silly new app has been submitted to Apple’s App Store for review. I’ll tell you about it in a few days.

Dot Notation in Objective-C

February 4, 2013

I’m going to jump right into this contentious issue. There has been a lot written about using dot notation in Objective-C. Before we get into the arguments, let me first show you something from some actual code I’m working on right now:

“Normal” bracket notation Objective-C:

[fetchRequest setSortDescriptors:[NSArray arrayWithObject:sort]];

Dot notation Objective-C 2.0:

fetchRequest.sortDescriptors = @[sort];

Alright, so I used some literals in there as well. But that’s really part of my point. The language evolves over time and just because something has “always been done this way” doesn’t mean it’s necessarily the best way to do it.

The dot notation line is almost half as long and, to my mind, much easier to read. Having been typing Objective-C for a few years now, I can tell you that brackets are tough to manage. Xcode seems to get its “bracket helping” wrong almost half the time. And they’re just visual clutter.

Some say “OK, use dots but just for properties!” Me, I’m getting used to using the dots for everything except (void) methods with no arguments.

Everything else is up for grabs. I guess that’s because I think of everything as a property and/or object to grab. For instance, if I use:

UIColor.blackColor

I immediately think “OK, we’re peeling off a blackColor from the UIColor class.”

Here’s a more involved example. I have a “SensorManager” class that is a singleton. This class basically wraps up the functionality and behavior of the location manager and the motion manager. Here’s some actual code from the app I’m working on:

SensorSample *newSample = SensorSample.new;
newSample.latitude = @(SensorManager.sharedInstance.locationManager.location.coordinate.latitude);

This may make you cringe or it may make you go “ah-ha!”

When I look at it, I think to myself, “OK, we’ve got a new sample that we’re peeling off of the SensorSample class. We’re setting its latitude to a boxed-up NSNumber that we get from the SensorManager’s shared instance’s location manager’s location’s coordinate’s latitude.”

The other way to write this is:

SensorSample *newSample = [[SensorSample alloc] init];
[newSample setLatitude:[NSNumber numberWithDouble:[[[[SensorManager sharedInstance] locationManager] location] coordinate].latitude]];

ARE YOU FUCKING KIDDING ME?!?!? I couldn’t even type that second line without Xcode fucking up the brackets. Only it works fine IF YOU KNOW EXACTLY HOW MANY BRACKETS YOU HAVE TO TYPE FROM THE BEGINNING.

Also, the only time I needed to use the dot was to get the latitude out of the coordinate struct. Are the dot-notation naysayers really saying I have to type all those brackets because of that single dot attached to the struct? Whatever.

Geeze. This post went from casual to rage in a hurry.

A better UIPickerView

February 3, 2013

Working on a new/old project. Thought I would wrap my UIPickerView customizations into a self-contained class. Here it is.

I’ve improved UIPickerView by adding in presentation and interaction functionality that it should frankly have had from the beginning. Here’s what it does:

1.) Added (void)show and (void)hide methods. These make the pickerView pop up from the bottom of the screen and slide back down when asked. When the pickerView pops up, the background of the view controller (or navigation controller, if that’s what we’re presenting in) dims and becomes a dismissal tap target.

2.) Added three delegate methods to notify clients when the pickerView has popped up, when it has disappeared, and when the selection indicator was tapped.

To use it in your view controller, just make your VC conform to the JMPickerViewDelegate protocol, and init it with a delegate and view controller. Easy peasy!

Get the code on GitHub or download the Xcode Project here.

A better SeekBar for Android

January 27, 2013

Still working on Part 2 of my Android development retrospective but I thought I would put this out here real quick.

One of the smaller pain points in developing for Android was the SeekBar object. The SeekBar is a UI widget object that is roughly analogous to the UISlider in Cocoa Touch.

The UISlider has settable minimum and maximum values (in float) and the slider’s current value is expressed as a float.

The SeekBar for Android always has a minimum of zero and the maximum can be set but only to an int. The current progress is always expressed as an int.

This kind of sucks, so I made a class that extends SeekBar and named it BetterSeekBar. You can set the minimum and maximum values as floats and get the current value as a float as well. Internally, it sets the max to an int of 10,000 and calculates the current value from floating-point offsets and ratios.
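The internal mapping described above might look something like this. The 10,000-step resolution is from the post, but the class and method names here are my own guesses, not the actual BetterSeekBar source (and the real class extends android.widget.SeekBar, which I’ve left out so this compiles anywhere):

```java
// Sketch of a float <-> int mapping for a SeekBar-style widget whose
// underlying progress is always an int in 0..10000.
public class BetterSeekBarMath {
    private static final int STEPS = 10000; // internal int resolution

    private final float min;
    private final float max;

    public BetterSeekBarMath(float min, float max) {
        this.min = min;
        this.max = max;
    }

    // Map a float value in [min, max] onto the 0..10000 int progress.
    public int toProgress(float value) {
        return Math.round((value - min) / (max - min) * STEPS);
    }

    // Map the int progress back into a float in [min, max].
    public float toValue(int progress) {
        return min + (progress / (float) STEPS) * (max - min);
    }
}
```

A real subclass would call setMax(STEPS) in its constructor and wrap getProgress()/setProgress(int) with these two conversions.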

You can get the class right from here (it’s 2KB.) I hope you find it useful!

So long, App-A-Week challenge. Hello, App-A-Month challenge!

July 18, 2012

Hello, dear readers. I have some news.

You haven’t heard much from me lately because I recently (like 3 weeks ago) got a really kick-ass job doing mobile app development with a really cool company. That means I have far less time to work on my own iOS projects.

BUT, I still intend to work on them. What I’ll have to do is modify my original challenge to have a release every month instead of every week.

This month’s release is the retina-graphics version of Mac Lens•Lab!

Anyway, thanks for reading!

On the Value of Apple Laptops

June 19, 2012

Three years ago at WWDC, Apple announced the unibody 13″ MacBook Pro. I ordered one within hours of the announcement.

I had been using a white MacBook for a few years and I thought it was a great machine. After the aluminum unibody machines came out I started lusting after those designs. I had had experience with 15″ laptops before and I just thought they were too big. I needed FireWire. All of these needs and wants were distilled perfectly in the 13″ unibody MacBook Pro.

I purchased the lowest-end configuration (2GB of RAM) with my wife’s student discount and immediately upgraded the RAM to 4GB with memory purchased from NewEgg. Over the years I’ve upgraded the RAM again to 8GB ($35) and added an 80GB Intel SSD (X-25M G2). I just replaced the old battery (1,052 charge/discharge cycles) myself with a new one for $66. I’ll be installing Mountain Lion as soon as the golden master is released.

With that new battery, my three-year-old MacBook Pro is better than it ever was new. It’s totally fast enough for me and the work I do on it (which includes software development and SLR photography). It has kept its value so well that it’s currently worth $850, according to the internet. Subtract that from the total cost of $1450 ($1100 for the laptop, $35 for the RAM, $65 for the battery, $250 for the SSD) and you get $600: that’s all I’ve spent on computer technology over the past three years. That’s $200 a year.
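Just to double-check the arithmetic above (all dollar figures are from the post):

```java
public class LaptopCost {
    public static void main(String[] args) {
        // Component costs in dollars, as listed in the post.
        int laptop = 1100, ram = 35, battery = 65, ssd = 250;
        int total = laptop + ram + battery + ssd; // 1450
        int resale = 850;                         // current resale value
        int net = total - resale;                 // 600 net spent
        int perYear = net / 3;                    // 200 per year over three years
        System.out.println(total + " " + net + " " + perYear);
    }
}
```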

