Why Smartphone Night Photos Are So Good Now

Photos taken at night on your phone used to look terrible, but recent phones have much-improved night capabilities. Julian Chokkattu, reviews editor at WIRED, explains how smartphone camera technology has gotten so much better.

Released on 03/25/2022

Transcript

Taking photos at night on your phone

used to look terrible.

But if you purchased a new smartphone recently,

you may have noticed that your night photos have improved.

Ah, much better.

You can even take photos of stars.

I'm Julian Chokkattu, reviews editor at Wired,

and I've been reviewing smartphones for over five years.

How has smartphone photography gone from this,

to this beautiful photo?

Before we get into the technology

behind the new night modes,

let's first have a little chat about bad photos.

Take a look at this photo here,

taken on an iPhone 5 around 2014.

A couple elements stand out to me,

like that classic lens flare, or the blur.

No matter how nice or advanced the camera is,

it's always going to need a good source of light.

That's exposure, the amount of light

that reaches your camera sensor.

Right now, this lovely crew has lit me really well.

Let me show you.

[soft music]

If they cut the lights, now I'm back lit and underexposed.

This is the iPhone 3G in low light,

and this is the iPhone 13 Pro in low light.

Let's get the lights back on.

Part of the reason the iPhone 3G looks so underexposed

is because it didn't spend a lot of time taking the photo.

That's shutter speed.

That's the length of time the camera's little door is open,

exposing light onto the camera sensor.

One of the main reasons night mode on your phone

asks you to stay still is that

the longer you have the shutter open,

the more light you can let in,

which will produce a brighter photo.
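
To put rough numbers on that, purely as a hypothetical example: the light gathered scales with how long the shutter stays open, so a quarter-second night exposure collects roughly 30 times more light than a typical 1/125-second daylight shot.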

But here's the thing.

In night mode, during the seconds it's asking you to wait,

it's actually taking more and more photos

to make a composite with machine learning algorithms.

So night mode is a part of the field

of computational photography.
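
To make that concrete, here's a minimal sketch of the stacking idea in Python. It's only an illustration: the stack_burst helper and the simulated frames are made up for the demo, and real night modes also align the frames and use learned models to reject motion, which this skips.

    # Toy illustration: average a burst of noisy, already-aligned frames.
    # Random noise shrinks roughly with the square root of the frame count.
    import numpy as np

    def stack_burst(frames):
        """Average a list of aligned frames into one cleaner image."""
        return np.mean(np.stack(frames, axis=0), axis=0)

    rng = np.random.default_rng(0)
    scene = rng.uniform(0.02, 0.10, size=(480, 640))    # a dim "true" scene
    frames = [scene + rng.normal(0.0, 0.05, scene.shape) for _ in range(8)]

    merged = stack_burst(frames)
    brightened = np.clip(merged * 8.0, 0.0, 1.0)         # simple brightening step
    print("noise in one frame:  ", np.std(frames[0] - scene))
    print("noise after stacking:", np.std(merged - scene))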

I'm going to call up Ramesh Raskar at the MIT Media Lab

to get into the technical element of how it works.

[Ramesh] Hi Julian.

Would you be able to tell me

what exactly is happening when you take a night photo

on a modern-day smartphone?

There are three elements in any photography.

There is capture, there is process,

and then there's display.

And what we have seen over the last 10 years

is there is amazing improvement in all three areas.

So how is the software actually changing

what the photo will look like?

You will hear all these terms, HDR, HDR+, night mode,

smart HDR, but all of them are roughly doing the same thing.

This key idea of so-called deep fusion,

where you're fusing the photos by using machine learning

and computer vision, is really the breakthrough

behind today's low light photography.

Could you explain HDR?

So HDR, traditionally high dynamic range, simply means

whether it's a bright scene or a dark scene,

you can capture that in a single photo.
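
As a toy example of that idea (just an illustrative Python sketch, not any phone's real algorithm, and the fuse_exposures name and weighting are invented for the demo), you could merge a short and a long exposure by favoring whichever pixel is better exposed:

    # Toy exposure fusion: prefer well-exposed pixels from each bracketed shot.
    # Both inputs are float images scaled to [0, 1].
    import numpy as np

    def fuse_exposures(short_exp, long_exp):
        # Weight each pixel by how close it sits to mid-gray (0.5), so
        # highlights come from the short exposure and shadows from the long one.
        w_short = 1.0 - np.abs(short_exp - 0.5)
        w_long = 1.0 - np.abs(long_exp - 0.5)
        return (w_short * short_exp + w_long * long_exp) / (w_short + w_long + 1e-6)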

A smartphone, it has seen millions of photos of a sunset,

or food, or a human face.

It has learned over time what the best ways are

to enhance such a photo: how to reduce the graininess,

how to make it look more vibrant,

and how to choose the right saturation.

Choosing those parameters is basically machine learning

when it comes to photography.

Now let's take a look at this machine learning in action

by comparing some photos.

The one on the left is the iPhone 3G,

so quite a long time ago.

And the one on the right is the iPhone 12.

What are your first thoughts

on what they're doing differently?

So you can see that the previous phones

just gave you a photo from a single instant.

The photo on the right is actually not physically real,

in the sense that different things were happening.

People were bobbing their heads,

and the lights were flashing.

And so the photo's actually composed from multiple instants.

So when you try to fuse these multiple photos,

the light in one photo could be in one direction,

light in the later photo could be in a different direction.

And it's making some clever decisions

to create an illusion, as if this photo was taken

at that single instant.

Here you can also see HDR in effect,

where the audience is completely dark

in the iPhone 3G photo, whereas you can actually see

everyone's heads in the other one.

If an AI is learning how to color correct a night scene

based on what it thinks it should be,

are we moving away from photo realism?

Julian, I think photo realism is dead.

We should just bury it, and it's all about hallucination.

The photo you get today has almost nothing to do

with what the physics of the photo says.

It's all based on what these companies are saying

the photo should look like.

So yeah, I took one of these with the Pixel 6

and one of these with the iPhone 13 Pro Max.

What happened there that would've caused those colors

to be very different between the two photos?

These two companies have decided to give you

a very different photo experience.

The Pixel might have taken 20 photos.

It's also recognizing certain features:

whether there's a sky, whether it's outdoors,

what kind of white balance it has.

There's some automatic beautification also being applied.

So most of the photos we see are hallucinations,

but not the physical representation of the world out there.

These companies are providing us with ways to

control some of that, like turning off

that beautification feature or maybe making it even stronger.

Do you think that's where the compromise will lie,

giving the people who do want to

tailor some of their own shots that control,

and those options to tweak their settings?

The innovations in all these three areas

have actually taken the control away from us.

But in reality, it's not that difficult

for these companies to provide those controls back to us.

They're just making an assumption

that most consumers would like to just take a photo,

click a button, and get something they really

would like to see, whether it matches the reality or not.

I think the thing that we really care about is

you go on a trip, and you reach Paris,

and the Eiffel Tower is in a haze.

And what you would like is to take a photo

with your family with the Eiffel Tower in the back

as if it's a bright sunny day, right?

And that's where as a consumer,

you yourself are willing to separate the physics,

the reality from hallucination,

because if somebody can paste just a bright, sunny photo

of the Eiffel Tower behind your family,

you'll be pretty happy about it.

So we focused on night photography.

Every time we look at the nighttime photos,

those actually do seem to be improving year over year.

But broadly, what would you say are some of those challenges

that are left for photography in general

when it comes to smartphones?

In terms of night mode,

there are lots of challenges right now.

If you want to do something that's high speed,

it's very difficult to capture that at nighttime.

It's also difficult to capture very good color

at nighttime, because nighttime photos rely on burst mode,

and the challenge with burst mode

is that every frame has so-called read noise.

So there's a cost the camera pays

every time it reads out a photo.

But the other technique many companies are using

is just using lots of tiny lenses.

Now some phone companies have five lenses,

and that's one trick to capture just five times more light.
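
To put rough, hypothetical numbers on that trade-off: if every readout adds a fixed dose of read noise, a single three-second exposure pays that cost once, while a burst of ten 0.3-second frames pays it ten times, and even after the frames are merged the combined read noise ends up about the square root of ten, roughly 3.2 times, higher. Five lenses exposing at the same moment, by contrast, gather roughly five times the light while each sensor is read out only once.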

How does that affect the rest of the phone's capabilities?

What can we expect in the future?

Photography or imaging should give us superhuman powers,

so we should be able to see through fog,

we should be able to see through rain,

we should be able to see a butterfly

and see all the spectrums, not just the three colors.

I think we shouldn't just see what we are already experiencing.

It's not about different displays,

but I would like to see a beautiful viewfinder.

If I'm in Paris and as I'm moving my viewfinder,

it should tell me, hey, if I take a picture

of the Eiffel Tower, it's very jaded.

A lot of people are out taking a photo.

But if you keep rolling and there is this tiny statue,

actually not enough people have taken the photo of this.

So I think we're going to see this very interesting progress

in capture, processing, and display.

And I'm very excited about

what photography of tomorrow will look like.

[soft music]

I'm going to show you some of my favorite features

with the iPhone 13 Pro and the Google Pixel 6.

We're doing low light photography, so let's cut the lights.

Let's open up the camera

and see what happens with night mode.

You can see that I'm already in a pretty dark area,

so night mode has been triggered here.

Once you tap it,

you can actually control the length of the exposure.

So if you think that you might need a longer shot,

sometimes that might produce a brighter image.

If I tap on the background, it'll expose for the background

and it will also change the focus there.

So you can actually slide it up and down

to change the brightness, or the shadows in the shot.

Those are just a couple of features

in the camera apps themselves.

All right, let's bring the lights back on.

So we have to talk about tripods.

Tripods are an easy way to up your photo game,

especially at night.

Of course, a large problem of taking photos at night

is the hand shake when you're taking a photo.

Once more, can we cut the lights?

Can I get a volunteer?

So now I'm going to first take a photo without a tripod,

and see how it reacts then.

So you can just basically switch over to Night Sight mode

and tap the photo.

But now if I switch over to a tripod,

it's going to be much more stable.

And if I tap the button, it knows that it's on a tripod,

and you can see it is taking a lot longer to take the photo.

It's taking multiple, multiple images

of different exposures.

Shooting handheld is a problem, because the shutter

is staying open longer to take in as much light as possible.

And that means your hands are shaking,

and that's influencing the shot.

That's what makes it impossible

to take photos of stars without a tripod.

Certain phones like the Pixel 6

let you take photos of the stars

with a dedicated astrophotography mode.

And essentially it's doing what night mode is doing,

but for a much longer period of time,

like two, three, sometimes even five minutes.

And what it really needs is the phone to be on a tripod.

If you're curious about what some of our favorite phones are

for taking photos, or maybe just looking at

other camera gear that might help you take

some of these better photos,

well, we have guides on wired.com.

And as Ramesh said, it's going to be really interesting

to see how our cameras improve in the future,

whether they'll completely decide on their own

exactly what photo you should take,

or if you'll have any control left.

Photo realism is dead.

No, that's dark.

Jesus.

I hope this video helped you understand a little bit more

about night photography, and I hope

you continue going out there taking lots and lots of photos.

[soft music]
