We’re still alive/random info…

September 30, 2011

I haven’t made a post in a couple of weeks. I’m still working on updates to Lens•Lab. I had a minor setback when my iPhone 4 died but got a new one thanks to Apple’s out-of-warranty service. w00t!

What’s in my photography gear bag? I don’t know if you wondered, but here it is:

  • Two Canon EOS 10D bodies
  • Sigma 30mm ƒ/1.4 prime
  • Sigma 10-20mm ƒ/4-5.6 wide angle
  • Canon 24-85mm ƒ/3.5-4.5 zoom
  • Canon 430EX flash
  • Canon 430EX II flash
  • Canon ST-E2 wireless transmitter
  • Two 2GB CF cards, one 4GB CF card
  • Extra batteries, etc.

Anyhoo, in the meantime you can check out my photos on Flickr!


Lens•Lab Calculations

September 16, 2011

It occurs to me that some customers and potential customers might be wondering how Lens•Lab performs its calculations. In the spirit of openness, honesty, and transparency, I present to you the methods I use.

First of all, the primary sources I used for the formulas are:

Additionally, I used the following websites to either double-check my calculations or to understand how crop factor influences depth of field:

Here is how I do the calculations:

hyperfocalDistance = ((focalLength * focalLength) / (apertureInFStops * circleOfConfusion)) + focalLength

depthOfFieldNear = ((hyperfocalDistance - focalLength) * focalDistance) / ((hyperfocalDistance + focalDistance) - (2.f * focalLength))

depthOfFieldFar = ((hyperfocalDistance - focalLength) * focalDistance) / (hyperfocalDistance - focalDistance)

depthOfFieldTotal = depthOfFieldFar - depthOfFieldNear

All distances are in mm and circleOfConfusion is set based on the sensor format:

  • Full Frame = 0.029mm
  • APS-H = 0.023mm
  • APS-C = 0.019mm
  • Four Thirds = 0.015mm

I’ve double-checked the calculations Lens•Lab produces against other depth of field calculators, and they match up.

There you have it! Questions or comments? That’s what the box below is for!


September 12, 2011

I’ve had some time to work on the next version of Lens•Lab for iOS devices recently. It’s coming along well and I think you’ll be pleased with the results.

One thing I find myself doing—and this may partly explain why it takes so long for me to produce new software releases—is sort of obsessing over the tiniest of details in the creation of software.

By tiniest of details I mean: how many pixels should I use to line up the controls in the settings panel? What color and/or texture should the background be? At what imperial distance should I start reporting integers instead of %1.1f floating point numbers? When I report fractional imperial numbers, should 1.125 inches be reported as 1″ or 1 1/4″? How should the text fields that control the virtual lens constraints behave? How much input validation should I do when the field is done with editing versus when the panel is dismissed?

But there are also details in the writing of the code itself. These are details that the user will never see, but I write as if they will see them and judge me on them. Am I using typedefs appropriately in dealing internally with distances and aperture values? Do my variable and method names make sense? Should this handy method be generalized and made into its own class?

There are other details which I spend an inordinate amount of time on: does using NSOperation to blur the background incur any performance penalties vs. using a function in the UIView class? What about using GCC vs. Apple’s LLVM 3.0 compiler? To figure these things out, I have to use the performance monitoring tools and take measurements to see what effect these things have on performance.

Since the display of Lens•Lab is updated in realtime, it’s very sensitive to performance. Though most may not notice, there’s a pretty big difference between being able to update the display at 15 fps and 7 fps. If I can gain 3 fps by spending 8 hours tweaking compiler settings or altering how memory is accessed in the blur algorithm I will totally do it.

Why? Why do I obsess over these things? Why do I spend hours trying to make the user experience just a little bit better? Part of it has to do with the platform itself. Apple has set a very high bar with iOS and I want to make sure my app is up to snuff quality-wise. (I think it is and going by the reviews, I think others do as well.)

I think another part of this obsession has to do with trying to find “the ideal.” The perfect (or ideal) Lens•Lab application exists in the Universe, I just have to find it. When I’m engaged in these sorts of activities (obsessing over pixels), I feel like I’m more in tune with the “perfect” Universe. It’s part of my lifelong search for Truth/Beauty/Ideals.

We could go into why I search for these things but I think I’ll leave that for another post. For now, I’ll leave this and get back to work. 🙂