Why Your Selfies Look Awful & What Can And Is Being Done About It

By Kishore Sawh on March 28th 2018

For the average human, and even the prettiest and most handsome among us, few things are as humbling as the front-facing phone selfie. There’s the ‘you’ that you see in the washroom mirror, and the ‘you’ you see in a selfie, and often they look like different people. “Why is this?” you might wonder, and the answer goes quite a way to explaining why we prefer certain focal lengths for photographing people, and where photo tech is heading.

At the very base of all of this is the matter of lens and perspective distortion. Perspective distortion is a function of your proximity to the subject relative to its surroundings. The closer something within our field of view gets to our faces, the larger it appears in relation to the objects behind it, and the inverse is also true. The distance from the lens and the field of view of that lens are what affect the look of the subject within the image.

A longer lens requires you to be farther away for the same framing, and that tends to make facial features (in this instance) seem more proportionate. Wider lenses have a wider field of view, so you can get closer, and the features closest to the lens appear larger, and thus disproportionate to the rest of the face. This is what’s going on with selfies.
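That "farther away for the same framing" relationship is linear: to keep the subject the same size in the frame, your distance scales with focal length. Here is a minimal sketch of that arithmetic, assuming a full-frame sensor and a thin-lens approximation; the function name and the example numbers are illustrative, not from any camera spec.

```python
def distance_for_framing(focal_mm, subject_height_mm, frame_height_mm=24.0):
    """Approximate camera-to-subject distance (in mm) needed for the subject
    to fill the sensor's height, using the thin-lens magnification
    approximation m ~ focal / distance."""
    return focal_mm * subject_height_mm / frame_height_mm

head = 300.0  # a ~30 cm head-and-shoulders crop (illustrative)

wide = distance_for_framing(32, head)  # 400.0 mm, i.e. ~0.4 m away
tele = distance_for_framing(85, head)  # 1062.5 mm, i.e. ~1.06 m away
```

With the wide lens you end up roughly 2.7x closer for identical framing, which is exactly where the facial distortion comes from.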


[Image: lens compression example]

Your average phone camera uses a rather wide lens in the front, somewhere around a 32mm equivalent for an iPhone (from an actual focal length of 2.87mm), and that’s quite wide, especially given that it’s acceptable to approximate the human eye’s FOV as about 45mm. Given that FOV, and that the distance you can hold the camera from your face tops out at around 2.5 feet for most people, the features closest to the camera will seem quite exaggerated, and that is not an accurate representation of your proportions.

But there’s another reason it really stands out, and it has to do with how differently our brains work from a camera lens. If you have ever kissed anyone, you might notice that as you go in for the kiss their face doesn’t suddenly seem disproportionate – they don’t look alien. That’s because a) the eyes don’t work like cameras and lenses, and b) your brain steps in to correct.

If you remember your anatomy classes, babies see the world upside down for the first little while after birth, but soon the brain learns to correct and flips the image as it processes the data. It does similar magic in the kissing scenario. It’s somewhat accepted that the human brain remembers the proportions of a human face as it looks from about 12-15 feet away, and as your ‘betrothed’ comes within a breath away, your brain is essentially correcting what you see – lying to you.
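The gap between those two distances, the ~2.5 ft selfie and the 12-15 ft "remembered" face, can be put in rough numbers with nothing but the rule that apparent size falls off as 1/distance. This is a back-of-the-envelope sketch; the 4-inch nose-to-ear depth is an assumed, illustrative figure, not anatomical data.

```python
def feature_exaggeration(camera_to_nose_in, nose_to_ear_in=4.0):
    """How much larger a near feature (nose) appears relative to a feature
    a few inches behind it (ears), from perspective alone: the ratio of
    their distances from the camera."""
    return (camera_to_nose_in + nose_to_ear_in) / camera_to_nose_in

# Arm's length selfie: roughly 2.5 ft (30 in) to the ears, 26 in to the nose.
selfie = feature_exaggeration(26.0)   # ~1.15, nose looks ~15% too big

# The 'remembered' viewing distance of ~15 ft (176 in to the nose).
natural = feature_exaggeration(176.0)  # ~1.02, ~2%, effectively unnoticeable
```

A roughly 15% enlarged nose versus the ~2% your brain treats as normal is why the selfie "you" reads as a stranger.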

Lenses and sensors, however, can’t do that. But computers can…

So you can either change the framing of your selfies (which is hard to do given the physical limitations of your arm), or rely on software.

[REWIND: How To Take Studio Portraits With One Light & Basic Gear | No Fuss Tutorial]

There is no doubt in my mind that the future of imaging lies in computational photography, and in this instance software could be used in the near future to assess the selfie you’ve taken and adjust it much like your brain would. It would seem the tech is already on the way. Check out the video below, and, perhaps, skip the nose job.

About

Kishore is, among other things, the Editor-In-Chief at SLR Lounge. A photographer and writer based in Miami, he can often be found at dog parks, and airports in London and Toronto. He is also a tremendous fan of flossing and the happiest guy around when the company’s good.
