As operating systems mature, I find it more and more difficult to get excited about each new release. Take Apple's OS X for example, the system I'm writing on right now. I remember being very enthusiastic about new versions up to 10.5 or so, when every release would introduce features that I would actually use. After a while though, the system matured, and some time around 10.7 I lost track of which version I was actually running. In fact, I would be hard pressed to name the version I'm using right now, or to say what has changed compared to its predecessor. For the most part, I'm just happy if none of the changes broke my workflow.
The same is happening to mobile OSes as they mature. I happen to be an Android user, and while Android made huge leaps during its earlier development cycles, it has now reached a point where it's becoming harder and harder to tell one release from the previous one. There are some small changes to the icons here and there, but in the grand scheme of things, not much has really changed in how I use my phone over the last three years or so.
Incidentally, the reason in both cases is the same: both companies realized that their users stopped getting excited about new color schemes and care about applications instead. It is thus only natural for the companies to shift their focus to applications too, which means working on OS-level services and the accompanying APIs, and that is exactly what the most recent releases concentrated on.
The side effect is that while the applications bundled with the OS are stagnating, third-party applications can flourish in a wealth of new APIs. Take Google's camera APIs in Lollipop for example, which break with a years-long tradition of designing the API around a handful of use cases and making it just good enough to enable the bundled application. In Lollipop, the bundled native camera application is still horribly limited, but the underlying APIs enable a whole new breed of apps like the very nice Manual Camera, which effectively turns the phone into a fixed-lens shooter with full manual controls and DNG support.
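To give an idea of what that breed of apps builds on: the camera2 API introduced in Lollipop lets any application switch off the automatic exposure machinery and request RAW sensor output directly. Here is a minimal sketch of building such a request; the session and surface plumbing is omitted, and the exposure values are purely illustrative:

```java
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.media.ImageReader;

class ManualRawSketch {
    // Builds a fully manual still-capture request via the camera2 API.
    // `device` is assumed to be an already-open CameraDevice, and
    // `rawReader` an ImageReader created with ImageFormat.RAW_SENSOR.
    static CaptureRequest buildManualRawRequest(CameraDevice device,
                                                ImageReader rawReader)
            throws CameraAccessException {
        CaptureRequest.Builder builder =
                device.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);

        // Turn off auto-exposure so the manual values below take effect.
        builder.set(CaptureRequest.CONTROL_AE_MODE,
                    CameraMetadata.CONTROL_AE_MODE_OFF);

        builder.set(CaptureRequest.SENSOR_SENSITIVITY, 100);           // ISO 100
        builder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, 10_000_000L); // 1/100 s, in ns
        builder.set(CaptureRequest.LENS_FOCUS_DISTANCE, 0.0f);         // focus at infinity

        builder.addTarget(rawReader.getSurface());
        return builder.build();
    }
}
```

The RAW image delivered to the reader, together with its matching CaptureResult, can then be handed to android.hardware.camera2.DngCreator to write out a DNG file, which is essentially what apps like Manual Camera do under the hood.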
Naturally, I started using my phone as my everyday camera, and when I'm not forced to accept the ISP's pre-cooked decisions about the final look of the images, the results can actually be pretty exciting.
Soon enough though, the geek within me started to wonder whether there's a way to further improve the image quality. I quickly found that while Lightroom ships no profiles for the particular lens/sensor combination of the Nexus 5, Adobe actually provides a free Lens Profile Creator as well as a DNG Profile Editor, which can be used to correct for the geometric distortion of the lens and to calibrate the color response of the lens/sensor combination, respectively.
Yesterday, I finally found the time to play around with those, and got out the soft boxes to create the profiles:
I used the same setup for the color profile, only with an X-Rite ColorChecker Classic, and then had Adobe's tools do their thing. The output can be found here:
The results are actually a little less dramatic than I expected. Looking at the distortion only, here's an example of a tiled wall as it comes, uncorrected, straight out of the phone:
Note that there is a pincushion distortion along the upper edge and a barrel distortion along the right edge, but both are surprisingly low already in the uncorrected image. There is some pincushion bending going on in the center of the frame though, which is actually noticeable when shooting things that are more exciting than naked walls.
Either way, when looking at the upper edge for example, the lens profile is doing a pretty decent job of straightening out the warped lines:
It's not perfect, but surely good enough for me.