Canon EOS M Not-Quite-A-Review

April 7, 2013

I normally talk about software development and computing generally here, but I thought I’d post a quick note about a new piece of camera gear I picked up recently.

It’s a Canon EOS M, their first mirrorless interchangeable-lens digital camera. I hadn’t planned on getting any new camera gear, but I saw this camera with the 22mm ƒ/2 lens on sale at newegg.com for $450, so I grabbed one.

I’m not going to go over the basics (plenty of others have covered those admirably); I’ll just point out a few things I’ve noticed in the week I’ve been playing around with it.

#1: The 22mm ƒ/2 lens it comes with is a really nice lens. It’s a pancake design, so it doesn’t take up much room at all. On the EOS M’s APS-C sensor its equivalent focal length comes out to about 35mm, which makes for a great “walk around and be able to catch everything” kind of lens. The ƒ/2 maximum aperture is quite fast considering the size. Overall I like it and will likely keep it on the camera as my main photography lens.

#2: Yes, the focusing system does suck that bad. The biggest complaint about this camera is that focusing is slow and unreliable, and I agree wholeheartedly. I’m used to my 10D and 5D locking focus in a quarter of a second at most (unless it’s REALLY dark), so waiting 2 to 6 seconds for focus (assuming it doesn’t fail outright) is very, very aggravating. Accurate focus matters even more at wide apertures, and the EOS M is the worst camera I’ve ever used in this respect.

#3: I’ll have to stop shooting in RAW mode. The 10D I’ve used for years produces RAW files of 4-6MB. The 5D just about doubles that, and the EOS M doubles it again, so now I’m looking at 20-30MB for each photograph I take with this camera. That’s just unsustainable, especially now that I’ve moved to solid state drives, where gigabytes are still pricey. A high-quality JPEG takes up just 3-6MB, and I guess I’m going to have to live with that.

#4: The video from this camera is just amazing, ASSUMING YOU FOCUS BEFOREHAND. You’ll get super-high-bitrate 1080p video (around 45 megabits/sec according to my tests, which works out to roughly 340MB per minute of footage) from your favorite lenses. My favorite setting is “Neutral” with just a hint of sharpness added, a touch more saturation, and Auto Lighting Optimizer on, which seems to boost the lows and mids without blowing out the highlights (and more closely matches the gamma of film). Shooting video with my Sigma 10-20mm ƒ/4-5.6 attached is my new favorite thing in the world. BUT, if you need to focus while you’re shooting video, be prepared for frustration.

#5: Using Lens•Lab has really helped with getting a feel for depth of field when shooting video. It’s a great feeling to use something you’ve made yourself to solve real-world problems. (If you’re curious about the underlying math, there’s a sketch right after this list.)

#6: Strapping a normal-sized lens (like the Sigma 10-20mm or the Canon 24-85mm) to the tiny EOS M body is a bit ridiculous.
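About that depth-of-field math from #5: the heart of any DoF tool is the standard thin-lens formulas. Here’s a minimal sketch in plain C (my own illustration, not Lens•Lab’s actual source; the function and parameter names are made up):

#include <math.h>

// f = focal length (mm), N = f-number, c = circle of confusion (mm),
// s = subject distance (mm). Writes the near and far limits of
// acceptable focus (in mm) into *nearLimit and *farLimit.
static void depthOfField(double f, double N, double c, double s,
                         double *nearLimit, double *farLimit) {
    double H = (f * f) / (N * c) + f;           // hyperfocal distance
    *nearLimit = s * (H - f) / (H + s - 2 * f);
    *farLimit  = (s < H) ? s * (H - f) / (H - s)
                         : INFINITY;            // past H, everything to infinity is sharp
}

For example, the 22mm at ƒ/2 on the EOS M’s APS-C sensor (c ≈ 0.019mm) focused at 2m gives acceptable focus from roughly 1.7m to 2.4m.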

All in all, I’d say the EOS M is a much more impressive video camera than it is a stills camera. It’s handy to carry around with the 22mm pancake prime (I’ll definitely use it in places where I don’t want to lug the bag with the 5D, 10D, lenses, batteries, etc.). But the focusing problems are really hard to look past.

It’s an odd duck. The tech specs and interchangeable lenses say semi-pro/pro, but the tiny size and abhorrent focusing say “mediocre consumer photography gear.”

I still have the 5D and 10D around for doing “real” photography. But I think the EOS M will find a place in the jacket pocket or glove box or backpack far more often than the more pro-level gear.

Check out some photos from the camera here: http://www.flickr.com/photos/jmenter and some video here: http://www.youtube.com/jeffmenter

13″ Retina MacBook Pro (Late 2012) sort of review: That’s a lotta pixels!

March 10, 2013

I kind of want to do a review of the 13″ Retina MacBook Pro I got a few weeks ago. But I want to do it in a way that doesn’t rehash what others have already written. Let’s see how this goes.

There are two aspects of computers that have really not improved over the past two decades: hard drive speeds and display densities. The hard drive problem has been ameliorated by SSDs, which leaves displays as the final piece that needs to play catch-up. And catch up they have! But first things first.

Apple announced the 15″ Retina MacBook Pro models last summer. The announcement filled my heart with joy but I was also a little disappointed for two reasons. One, I’m not a fan of the 15″ form factor (it’s just too damn big). And two, the starting price of $2,200 was way above what I’d ever be willing to pay for a laptop.

When the 13″ Retina MacBook Pro was announced last November, one of those disappointments floated away. But I was still left with the prospect of paying $1,700 for a new laptop. For the record, the most I’ve ever paid for a laptop (of any kind) was $1,600 for an iBook back in late 2001. My normal laptop-purchase M.O. is to get the lowest-end 13″-or-so model and do the upgrades myself.

Recently the price of the 13″ Retina dropped by $200. It’s also worth noting that the educational discount on this model is $100, which means one could get a 13″ Retina MacBook Pro for $1,400. It looks like the planets are aligning nicely!

So, after having had it for a few weeks I can safely say that this is by far the best laptop I have ever had. The screen still blows me away every day and the form factor is almost perfect. Let’s go over the highlights, shall we?

The display: oh my god the display

The star of the show really is the 13.3″ 2560 x 1600 IPS display. It’s the same size as my old 13″ MacBook Pro’s, but there are four times as many pixels, and the color and viewing angles are significantly better. At normal viewing distances it’s nearly impossible to resolve individual pixels.

Here are some observations I’ve made over the past few weeks in no particular order.

There is really something strange about the retina display, and I think I’ve figured out what it is: solid colors (like the solid white background of this page) seem to “disappear” because there is no pixel grain for your eyes to focus on. With a normal low-resolution display, solid blocks of color still carry the grid inherent in the display technology, which gives your eyes something to lock onto. With a retina display there is just solid color and nothing else. This is not to say the display is hard to focus on (quite the opposite); it’s just strange not being able to focus as easily on solid colors.

I thought web browsing was going to be a painful experience (since the web is mostly not retina aware, i.e., images and other in-page UI elements display at poor resolutions), but it has actually been pretty damn good. It helps that all text renders at high resolution. And many more sites are serving up high-resolution assets than I expected. For example: I’m typing this into WordPress and most of the UI elements are being drawn at high resolution. Other sites like Facebook, Flickr, and Imgur are retina aware to a greater or lesser extent.

Apple has done a fantastic job of creating high resolution UI assets. All the normal buttons, sliders, widgets, etc. that you’ve grown to love over the years have been given a high resolution makeover and the results are just amazing.

Apps that don’t use the Cocoa frameworks directly get drawn at the horrible “normal” resolution. This was clearly evident when I installed the Android Development Toolkit (which is based on Eclipse). There is a fix, though, and it’s as easy as modifying a value in the app’s plist. There’s a handy app that does this for you called Retinizer.
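If you’d rather do by hand what Retinizer does for you, the value in question (as I understand the mechanism) is the NSHighResolutionCapable boolean in the app bundle’s Info.plist:

<key>NSHighResolutionCapable</key>
<true/>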

As much of a revelation as browsing and developing in Xcode was, I was totally unprepared for how amazing it would be to load up my photos in Aperture. Viewing and working on photos on a display so dense it barely has to downsample them is incredible. It’s like looking at an actual printed photo, but so much better. It’s truly hard to convey how good it looks.

The way Apple implemented HiDPI display scaling is pretty much mind-blowing. In the Displays preference pane, in addition to the “native” resolution of 1280 x 800, I can choose two scaled resolutions: 1440 x 900 and 1680 x 1050. What this does is render a virtual retina display at either 2880 x 1800 or 3360 x 2100 and scale those pixels to fit the display in real time. (It looks like it’s doing bilinear scaling, which is perfect.) These alternate modes are very handy when running Xcode or tools like Photoshop or Illustrator.

Speaking of Illustrator: if you get a chance, run the retina-aware version of Illustrator on a Mac with a retina display. It’s kind of like these two things were made for each other. If you run it at the HiDPI 1680 x 1050 setting, you’re basically running Illustrator on a virtual 7-megapixel display. Holy cow!

Other various observations

Being able to run multiple external displays is really kind of neat, and it makes my pile of 20″ Dell displays more useful. I have a suspicion that it’s possible to run three external displays in clamshell mode if two of them are DisplayPort, but I haven’t been able to test this hypothesis; I would need either two Thunderbolt displays or two displays on active DisplayPort adapters, plus HDMI.

Thunderbolt is cool, but right now USB 3 is where it’s at. Being able to buy a 2.5″ USB 3 enclosure for $20, slap an SSD in it, and get most of the drive’s performance over USB 3 is very handy. I’ve got my Aperture library on an external USB 3 enclosure with a 250GB Samsung 840 in it. I get 200MB/sec reads and writes, and random access is as speedy as you’d expect. WIN!

The Intel HD 4000 graphics are fine at 2D and merely adequate at 3D. I mostly play games on my Windows machine, but it’s fun to play some Portal occasionally at 2560 x 1600.

The battery life and i5 CPU are both fine. And boring.

The built-in 128GB SSD is quite speedy (we’re talking reads and writes at over 400MB/sec).

The machine boots up from cold boot to desktop in under 10 seconds. Yowzers!

The iPhone/iPad simulator in Xcode is retina aware so basically I can run a simulated retina iPad in landscape mode without any kind of scaling or clipping. Big win!

I don’t miss having an optical drive one little bit.

To sum up, this is just an amazing piece of technology. And it’s really only possible to do something like this when you have a single company that:

  1. Controls the hardware
  2. Controls the software
  3. Cares about quality, design, aesthetics, and user experience

The two pieces of technology that have lagged horribly over the past two decades, hard drives and displays, are finally catching up: SSDs are now mainstream, and high-resolution displays are not only affordable, they’ve been made immediately useful by excellent operating system support.

Traditional PCs still have years of life left. I’m excited that those years are going to be spent with beautiful, functional displays.


New App, Etc.

March 8, 2013

My new app has been on the App Store for a couple of weeks. It’s a QR Code Clock. You read that right.

https://itunes.apple.com/us/app/qrcodeclock/id601461609?mt=8

Also, I still have more Android retrospective stuff coming, plus a review of the new 13″ Retina MacBook Pro I got! Wowzers!


Dot Notation in Objective-C

February 4, 2013

I’m going to jump right into this contentious issue. There has been a lot written about using dot notation in Objective-C. Before we get into the arguments, let me first show you something from some actual code I’m working on right now:

“Normal” bracket notation Objective-C:

[fetchRequest setSortDescriptors:[NSArray arrayWithObject:sort]];

Dot notation Objective-C 2.0:

fetchRequest.sortDescriptors = @[sort];

Alright, so I used some literals in there as well. But that’s really part of my point: the language evolves over time, and just because something has “always been done this way” doesn’t mean it’s necessarily the best way to do it.

The dot notation line is almost half as long and, to my mind, much easier to read. Having been typing Objective-C for a few years now, I can tell you that brackets are tough to manage; Xcode seems to get its “bracket helping” wrong almost half the time. And they’re just visual clutter.

Some say “OK, use dots but just for properties!” Me, I’m getting used to using the dots for everything except (void) methods with no arguments.

Everything else is up for grabs. I guess that’s because I think of everything as a property and/or object to grab. For instance, if I use:

UIColor.blackColor

I immediately think “OK, we’re peeling off a blackColor from the UIColor class.”
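Worth noting, so nobody thinks there’s magic here: the dot is just syntactic sugar for a message send, so these two lines compile to exactly the same thing:

UIColor *a = UIColor.blackColor;   // dot notation
UIColor *b = [UIColor blackColor]; // bracket notation; the same message send at runtime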

Here’s a more involved example. I have a “SensorManager” class that is a singleton. This class basically wraps up the functionality and behavior of the location manager and the motion manager. Here’s some actual code from the app I’m working on:

SensorSample *newSample = SensorSample.new;
newSample.latitude = @(SensorManager.sharedInstance.locationManager.location.coordinate.latitude);

This may make you cringe or it may make you go “ah-ha!”

When I look at it, I think to myself, “OK, we’ve got a new sample that we’re peeling off of the SensorSample class. We’re setting its latitude to a boxed-up NSNumber that we get from the SensorManager’s shared instance’s location manager’s location’s coordinate’s latitude.”

The other way to write this is:

SensorSample *newSample = [[SensorSample alloc] init];
[newSample setLatitude:[NSNumber numberWithDouble:[[[[SensorManager sharedInstance] locationManager] location] coordinate].latitude]];

ARE YOU FUCKING KIDDING ME?!? I couldn’t even type that second line without Xcode fucking up the brackets. It only works IF YOU KNOW EXACTLY HOW MANY BRACKETS YOU HAVE TO TYPE FROM THE BEGINNING.

Also, the only place I actually needed a dot was to get the latitude out of the coordinate struct. Are the dot-notation naysayers really saying I have to type all those brackets because of that single dot attached to a struct? Whatever.
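In case you’re wondering about SensorManager.sharedInstance: it’s just the standard dispatch_once singleton. A minimal sketch of the accessor (the real class does more, obviously):

+ (instancetype)sharedInstance {
    static SensorManager *instance = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        instance = [[self alloc] init]; // created once, lives for the app lifetime
    });
    return instance;
}

The locationManager in the chains above is just a property hanging off that shared instance.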

Geeze. This post went from casual to rage in a hurry.


Ooh!

January 26, 2013

Look what else we’re doing:

[iOS Simulator screenshot]


Lens•Lab 1.1 for iOS is here!

January 25, 2013

I guess we didn’t have to wait that long! Lens•Lab for iOS has been bumped to version 1.1! We’re super excited about this release and hope you like it too!


Lens•Lab for iOS. Submitted, now we wait.

January 21, 2013

Just submitted version 1.1 of Lens•Lab for iOS. If you’re following along you can look forward to these groovy features in a week or so!

 

Lens•Lab, the world’s most advanced yet simple-to-use depth-of-field tool, has been updated! Finally, after almost two long years. Wowsers!

Here’s what’s new: 

• iPhone 5 wideness! 

• Focal length constraints! You can customize the virtual lens’s minimum and maximum focal length. Values from 1mm to “very large number” are supported. 

• More sensor formats! By popular demand, we’ve added APS-H and DX formats. 

• Better imperial measurements! 8.7 feet? What was I thinking? Now we report imperial measurements in feet and inches, down to the nearest 1/4 inch. Metric measurements are unchanged. (There’s a sketch of the rounding math after this list.)

• Better aperture controls! Pick an arbitrary value from ƒ/0.95 to ƒ/64! 

• Precision sliders! The slider controls now have precision built-in. Just slide your finger vertically to make the slider more or less precise. Check out this neato feature! 

• Better interaction! Slide your finger in the left 1/4 of the screen to adjust focal length (up & down) and aperture (left & right). 
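About that quarter-inch rounding: it boils down to something like this (a minimal sketch, not Lens•Lab’s actual source; the function name is made up):

#import <Foundation/Foundation.h>
#include <math.h>

// Format a metric distance as feet and inches, rounded to the nearest 1/4 inch.
static NSString *imperialString(double meters) {
    double totalInches = meters * 39.3701;           // meters -> inches
    int feet = (int)(totalInches / 12.0);
    double inches = totalInches - feet * 12.0;
    inches = round(inches * 4.0) / 4.0;              // snap to nearest 1/4 inch
    if (inches >= 12.0) { feet += 1; inches = 0.0; } // rounding can carry a foot
    return [NSString stringWithFormat:@"%d′ %g″", feet, inches];
}

So a distance that used to read “8.7 feet” now comes out as 8′ 8.5″.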

This is the best version of Lens•Lab yet! We hope you like it and it makes your life better. We appreciate awesome ratings!