What Is Happening with the iPhone's Camera?

The iPhone's Camera: A Premium Device That Keeps Losing Blind Tests

The debate over smartphone cameras has been running for years, and blind tests have become a popular way to compare them. One question keeps coming up: what exactly is happening with the iPhone's camera?

In years of bracket-format blind tests, the iPhone, supposedly one of the most premium cameras in the smartphone industry, has consistently lost in the first round, which suggests the camera may not perform as well as its reputation implies.

A more scientific version of the test, which gathered more than 20 million votes, told a similar story: the iPhone finished in the middle of the pack. What follows is Marques Brownlee's full breakdown of why.

"WEBVTTKind: captionsLanguage: en(Intro SFX)- Okay, what exactly is happeningwith the iPhone's camera?Like we've done years ofblind smartphone camera testsin bracket format and the iPhone,supposedly one of the premium camerasin the entire smartphone industry,consistently loses in the first round.Then we do a scientificversion with 20 million+ votesand it finishes in the middle of the pack.\"And yet, Marques, you namedit the fourth-time runningbest overall smartphonecamera system in 2022and gave it a trophy.What's up with that?\"A concerning amount of people have startedto notice that the iPhones camerafeels like it's takena few steps back latelyand I agree with them.I think we should takea closer look at this.(relaxed music)So first of all, camerashave come a really long wayto the point where smartphone camerasaren't just cameras anymore.See, back in the day, acamera was basically a sensorthat would travel aroundcovered all the timeand when you wanted to take a photo,you would expose that sensitive bitto the environment around itand it would collectthe light and close it.Then the photo would be a representationof how much light hiteach part of the sensor.The better the sensor, thebetter an image you could get,the more light information, super simple.These days though,it's turned into a wholecomputational event.Your smartphone sensor issampling the environment,not once, but often several timesin rapid succession at different speeds.It's taking that light information,merging exposures together.It's doing tone mapping, noisereduction, HDR processingand putting it all togetherinto what it thinks willbe the best looking image.This, of course,is a very differentdefinition of a picture.So now it's not just abouthaving the best sensorthat gathers the most light information,it's at the point where softwaremakes a much bigger differenceto the way the image looksat the end of the daythan anything else.Like next time you watcha smartphone reveal event, for example,keep an eye on all the newadditions that get madeand just how many ofthem are pure software.So Google basically struckgold when they firststarted using the IMX363 sensorway back in the day withthe Pixel 3's camerabecause they got their softwaretuning with it just rightand it was an instant smash hit.So they kept using that great camera comboin every Pixel since then.The 3, the 3a, the 4, the 4a,the 5, the 5a, and even the Pixel 6a.So year after year of new phones,same sensor, same software tuning combobecause it just worked.If it ain't broke, don't fix it.So when you saw the Pixel 6awin December's scientificblind smartphone camera test,what you saw was a four-year-old sensorand software tuning combothat is still so goodthat in a postage-stamp-sized comparisonof compressed side-by-side imageswhere you can't really judge sharpnessor depth of field too much,basically just appreciating the basics,this combo absolutely nailed the basicsbetter than anyone else.Now, when the Pixel 6came along, stay with me,Google finally updated theirdesign and their brandingand they finally changed to a new sensorwith this new camera system.So they go from thetried-and-true 12 megapixelto this massive new 50 megapixel sensorand it kind of threw a wrench into things.- So it looks to me that thePixel is over sharpening.I think the one on theleft looks too crunchy.- The camera on thePixel 6 does have a habitof making things just look HDR-y.I dunno if there's reallya technical term for that.- And if you look at all the photos,it's clear the Pixel isstill doing Pixel 
Google basically struck gold when they first started using the IMX363 sensor with the Pixel 3's camera, because they got the software tuning just right and it was an instant smash hit. So they kept using that camera combo in every Pixel since: the 3, the 3a, the 4, the 4a, the 5, the 5a, and even the Pixel 6a. Year after year of new phones, same sensor, same software tuning, because it just worked. If it ain't broke, don't fix it. So when the Pixel 6a won December's scientific blind smartphone camera test, what you saw was a four-year-old sensor-and-tuning combo that is still so good that in a postage-stamp-sized comparison of compressed side-by-side images, where you can't really judge sharpness or depth of field and are basically just appreciating the basics, it nailed those basics better than anyone else.

Then the Pixel 6 came along. Google finally updated their design and their branding, and they finally moved to a new sensor with the new camera system, going from the tried-and-true 12-megapixel sensor to a massive new 50-megapixel one. It kind of threw a wrench into things. Other reviewers noticed it too:

- "It looks to me that the Pixel is over-sharpening. I think the one on the left looks too crunchy."
- "The camera on the Pixel 6 does have a habit of making things just look HDR-y. I don't know if there's really a technical term for that."
- "If you look at all the photos, it's clear the Pixel is still doing Pixel things."
- "I think Google's still running all of their camera algorithms at 11, when they don't need to anymore."
- "Right now, new phones with much bigger sensors are still processing like their smaller, older ones."

The basic principle is this: all of that processing had been tuned for an old sensor that wasn't getting much light. Suddenly there was a massive new sensor pulling in way more light information, but the pipeline kept running unchanged. It would still boost sensitivity, then apply noise reduction, because high sensitivity means noise, and then apply sharpening on top of that to make up for the detail the noise reduction ate. Overall, it was simply doing too much, and the photos came out literally overprocessed.
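The mismatch those reviewers are describing is easy to reproduce in miniature. Below is a toy denoise-then-sharpen chain with aggressive, made-up settings of the kind you might tune for a small, noisy sensor, applied unchanged to a much cleaner signal; none of this is Google's or Apple's actual code.

```python
# Toy illustration of the over-processing trap: a fixed NR -> sharpen chain
# applied to both a noisy "small sensor" frame and a clean "big sensor" frame.
import numpy as np

def box_blur(img, k=3):
    """Crude separable box blur, standing in for real noise reduction."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    out = sum(p[i:i + img.shape[0], pad:pad + img.shape[1]] for i in range(k)) / k
    p = np.pad(out, pad, mode="edge")
    return sum(p[pad:pad + img.shape[0], i:i + img.shape[1]] for i in range(k)) / k

def legacy_pipeline(img, denoise=1.0, sharpen=1.5):
    """Heavy NR, then heavy unsharp masking to win back the detail NR ate."""
    denoised = img + denoise * (box_blur(img) - img)   # pull pixels toward the blur
    detail = denoised - box_blur(denoised)             # high-frequency residue
    return np.clip(denoised + sharpen * detail, 0.0, 1.0)

rng = np.random.default_rng(1)
# Ground truth: hard-edged diagonal stripes, where over-sharpening shows as halos.
truth = np.where((np.indices((64, 64)).sum(axis=0) % 16) < 8, 0.8, 0.2)
small_sensor = np.clip(truth + rng.normal(0.0, 0.08, truth.shape), 0.0, 1.0)
big_sensor = np.clip(truth + rng.normal(0.0, 0.01, truth.shape), 0.0, 1.0)

# Identical aggressive settings on both inputs. The cleaner the input, the
# worse the trade: the chain removes noise that isn't there and distorts
# edges anyway.
for name, img in [("small noisy sensor", small_sensor),
                  ("big clean sensor", big_sensor)]:
    before = np.abs(img - truth).mean()
    after = np.abs(legacy_pipeline(img) - truth).mean()
    print(f"{name}: error before {before:.4f}, after {after:.4f}")
```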
So this fancy new phone came out with a new camera system, but you could argue, legitimately, that the older Pixels still took better-looking photos. Google had to go back to the drawing board and make adjustments to the software to dial in the new sensor. It took a while, but now, with the Pixel 7 out a full year later using the same huge 50-megapixel sensor, they're back on track. And would you look at that: the Pixel 7 finished right behind the Pixel 6a in the blind camera test.

So when I see iPhone 14 Pro photos looking a little inconsistent and a little overprocessed right now, I see a lot of the same things Google just went through with the Pixel, because the iPhone story runs along the same lines. Apple used a small 12-megapixel sensor for years and years. The 13 Pro's sensor got a little bigger, but the iPhone 14 Pro is the first time they've jumped to a dramatically larger 48-megapixel sensor. And guess what? Some iPhone photos this year look a little too processed. It's nothing extreme, but it's real, and they will have to work on it. I suspect that by the time we get to the iPhone 15 Pro a year later, they'll have some new software they've been working on. We already have Deep Fusion and pixel binning and all that; I bet there's one new word they use on stage to explain some software improvement to the camera. Either way, I think this will keep improving with software updates over time, they'll get it dialed in, and it'll be fine.

But that's only half my theory. It doesn't explain why all the previous 12-megapixel iPhones also lost in the first round of all those bracket-style tests. That's a separate issue I'm a little more curious about, because, as you might recall, all of our testing photos have been photos of me. That was on purpose. We specifically designed the tests to have as many potential factors for judging a photo as possible. If it were just a picture of a figurine in front of a white wall, the winner would probably be whichever image is brighter, or maybe whichever has a better gold color. Put the figurine in front of some background falloff and now you're judging both color and background blur. Add a sky and now you're also testing dynamic range and HDR. So our latest photo tested a lot: two different skin tones, two different colored shirts, textures for sharpness, the sky for dynamic range, short-range falloff on the left, and long-range falloff on the right. With all those factors, whichever image people pick as the winner should, ideally, be closest to the best overall photo. I also wanted the pictures to be of a human, because most of the important pictures people take, the ones they really care about, are of other humans.

As it turns out, using my own face as the subject revealed a lot about how different smartphones handle a picture of a human face. As I've already mentioned, these cameras are so much software now that the photo you get when you hit the shutter button isn't so much reality as the computer's best interpretation of what it thinks you want reality to look like. Each company makes different choices and optimizations to bend its pictures in different ways. They used to be more transparent about it: some phones would literally detect that you were taking a landscape photo and pump up the greens in the grass, or detect a sky and pump up the blues to make it look nicer. I did a whole video on smartphone cameras versus reality that I'll link below the Like button if you want to check it out. The point is, when you snap a photo on your phone, you're not necessarily getting back a capture of what was really in front of you.

The iPhone's particular habit is that when you take a photo, it likes to identify faces and evenly light them. It tries every time. That feels like a pretty innocent thing. If you asked people what should look good in a photo and said, "I'll evenly light all the faces in it," that sounds fine, and a lot of the time it looks fine. But it's subtle. In a photo where the light is clearly coming from one side, the Pixel's camera shows a shadow on the right side of my face. With the iPhone, it's almost like someone walked up and added a nice, subtle bounce fill. Sometimes, though, it looks off. Take the low-light photo from our blind camera test: on the left is the Pixel 7, which looks like all the other top performers; on the right is the iPhone 14 Pro, which finished in the middle of the pack. It might be hard at first to see why it looks so weird, but notice how it completely removed the shadow from half of my face. I am clearly being lit from a source to my side; that is part of reality. In the iPhone's reality, you cannot tell, at least from my face, where the light is coming from. Every once in a while you get weird stuff like this, and it all comes back to software making choices.
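As a sketch of the kind of decision being made here, you could imagine the "bounce fill" as detecting a face and lifting its darker half toward its brighter half. To be clear, this is not Apple's actual algorithm; the face box, the half-split, and the fill rule are all invented for illustration.

```python
# Invented "evenly light the face" rule: raise the darker half of a face
# region toward the brighter half's mean. Purely illustrative.
import numpy as np

def even_face_fill(luma, face_box, strength=0.7):
    """Lift the darker half of a face box toward the brighter half's mean."""
    luma = luma.copy()
    top, bottom, left, right = face_box
    face = luma[top:bottom, left:right]      # view into the copy
    mid = (right - left) // 2
    halves = [face[:, :mid], face[:, mid:]]
    means = [h.mean() for h in halves]
    dark = int(means[0] > means[1])          # index of the darker half
    gap = means[1 - dark] - means[dark]
    halves[dark] += strength * gap           # the software "bounce fill"
    return np.clip(luma, 0.0, 1.0)

# A side-lit face: left half bright, right half in shadow.
luma = np.full((100, 100), 0.25)
luma[30:70, 30:50] = 0.70                    # lit side of the face
luma[30:70, 50:70] = 0.30                    # shadowed side
filled = even_face_fill(luma, (30, 70, 30, 70))
print("shadowed side before vs. after:",
      luma[30:70, 50:70].mean(), round(float(filled[30:70, 50:70].mean()), 3))
```

The point isn't the arithmetic; it's that the depth of that shadow becomes a parameter the software picks, rather than something the light in the room decides.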
The other half of this is skin tones. You've heard me say for a few years running that I mostly prefer photos from the Pixel's camera, and in lots of our tests with me as the sample subject, you can tell it looks really good. It turns out Google has spent the past few years on something in the Pixel camera called Real Tone. It doesn't get much attention, but it's making a real difference here. Historically, a genuine problem with film cameras was that they were calibrated for lighter skin tones, so people with darker skin tones were typically underexposed. Fast-forward to today: smartphone cameras are software, so they can all adjust for a variety of skin tones, but they do so to varying degrees. You might have noticed that a lot of phones sold in China simply brighten faces across the board, because that's what many people in that region prefer in photos. Google goes the extra mile and trains its camera software on data sets with a wide variety of skin tones so it can represent them all correctly; that's what it calls Real Tone. Apple's cameras, from what I've observed, simply like to evenly light faces across the board and don't necessarily apply the different white balances and exposures needed to accurately represent different skin tones, when I think they totally could. So this turns out to be a big part of what we were observing: the Pixel, and the other phones that accurately represent my skin tone, finished higher in the blind voting because they do that really well, and it's something people genuinely weighed when voting. I haven't said this much, but it's one of the earliest reasons I really liked RED cameras. Obviously the 8K is great and the color science is great, but the way they render my skin tone accurately, compared with a lot of the Sonys, ARRIs, and Canons I've tried, is one of the things that really drew me to them.

All of this software is why photo comparisons between modern smartphones are so hard. Plenty of channels do a really good job with side-by-side photo tests, but even as you try to pick one phone over another, you've probably noticed that you might prefer how one renders landscapes, how a different one renders your own skin tone, and how yet another renders photos of your pet. I'm sure Apple will defend everything its current cameras are doing, as it typically does. But I'm also sure they're working on tuning these new cameras, dialing them in, and getting things better by the iPhone 15 and 15 Pro, and I'll be keeping an eye on that.

So, back to the original question, because we can't leave it unanswered: "All right, Marques, you like the Pixel's photos, and the Pixel 6a won the scientific blind camera test, but you still gave the trophy for best overall camera system to the iPhone 14 Pro, the very phone this whole video has been about. Why?" If you've been listening carefully, you already have the answer. That scientific test measured one specific thing: overall exposure and color in small, postage-stamp-sized, compressed comparisons across a bunch of factors. Sharpness and detail, given all that compression, weren't tested. Autofocus speed and reliability weren't tested. The open-close time of the camera app, and how quickly and reliably you can get a shot, weren't tested. And video wasn't tested at all: microphone quality, video quality, autofocus speed and reliability there, file formats, sharpness, HDR, none of it. Maybe someday we'll test all of that. Until then, the lesson is that the pretty pictures coming from the Pixel, or whatever phone is in your pocket, are partly photons, but primarily processing.

Thanks for watching. Catch you guys in the next one. Peace.