Blurred frames

A few weeks ago, Google released a standalone camera app for its Android devices, in an effort to maintain a consistent camera experience across the range of handsets. One of the new features is Lens Blur, a clever approach to the problem of shallow depth of field on mobile cameras. I highly recommend reading their blog post about how they managed it.

The key issue is how a single camera with a single lens can capture depth information about a scene. The new HTC One (M8) manages this with two lenses, one of which is devoted entirely to gathering depth information. But of course, for the vast majority of consumer handsets, this isn't an option.

To solve this, the app instructs the user to raise the device slowly after taking the initial shot. During that motion it tracks which objects and features stay in roughly the same position in the frame (distant) and which shift the most (near), then constructs a depth map of the scene from that parallax. Very clever.
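The core geometric idea behind that motion trick can be sketched very simply. This is not Google's actual pipeline (which involves feature tracking and bundle adjustment across many frames); it is a minimal illustration of the underlying triangulation, assuming a known camera translation (baseline) and a focal length expressed in pixels. The feature names and numbers are invented for the example.

```python
# Minimal sketch of parallax-based depth: features that barely move
# between two frames are far away; features that shift a lot are near.
# Depth is inversely proportional to the pixel disparity.

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Triangulate depth for one tracked feature.

    disparity_px: how far the feature moved in the image (pixels)
    focal_length_px: focal length in pixel units (assumed known)
    baseline_m: how far the camera moved between frames (metres)
    """
    if disparity_px <= 0:
        return float("inf")  # no measurable parallax -> effectively at infinity
    return focal_length_px * baseline_m / disparity_px

# Hypothetical tracked features: large disparity -> near, small -> far.
features = {"foreground_flower": 40.0, "midground_person": 10.0, "background_hill": 1.0}
depth_map = {name: depth_from_disparity(d, focal_length_px=1000.0, baseline_m=0.05)
             for name, d in features.items()}
```

Repeat that per pixel (or per tracked feature, then interpolate) and you have the depth map the app needs.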


It is not a replacement for proper lenses on an SLR, but it does produce impressive results in the right conditions. As with many single-camera applications, washed-out or underexposed pixels won't yield as much usable data as clearly lit ones, but in decent light the resulting images hold up remarkably well.
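Once a depth map exists, simulating a shallow depth of field comes down to blurring each pixel in proportion to its distance from a chosen focal plane. Here is a hedged sketch of that last step, using a thin-lens-style falloff; the radius formula and parameters are illustrative assumptions, not the app's actual model.

```python
# Sketch: blur radius grows as a pixel's depth departs from the focal
# plane, mimicking a real lens's circle of confusion. Pixels at the
# focal plane stay sharp; near and far pixels are blurred.

def blur_radius(depth_m, focal_depth_m, strength=10.0, max_radius_px=15.0):
    """Illustrative blur radius based on |1/depth - 1/focal_depth|."""
    r = strength * abs(1.0 / depth_m - 1.0 / focal_depth_m)
    return min(r, max_radius_px)

in_focus = blur_radius(1.25, focal_depth_m=1.25)   # subject on the focal plane
background = blur_radius(50.0, focal_depth_m=1.25)  # distant background
```

In the real app this radius would drive a spatially varying blur kernel over the image, with the user free to re-pick the focal plane after the shot.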

What's more impressive, from our vantage, is that this feature was rolled out to every Android user, for free, without changing a single bit of hardware. It gives them a capability that would otherwise be the headline camera feature of a new handset, simply by using the camera they already have to its fullest potential. Naturally, given what we do at Peekabu, we highly approve.