Channel: Should higher ISOs really be preferred (all other things being equal)? - Photography Stack Exchange

Should higher ISOs really be preferred (all other things being equal)?


My understanding of the ISO setting on digital cameras is that, unlike on film cameras, changing the ISO does not cause any physical change in the camera. Rather, it simply tells the camera to multiply the analog voltages it reads from the sensor by a constant, which increases the brightness of each pixel in the output JPEG image. And since RAW files store the actual voltages read, before any brightness adjustment takes place, the values in the RAW file should be the same regardless of the ISO setting. Thus, if you're shooting in RAW-only mode, the ISO setting does absolutely nothing (which would also mean that cameras advertising higher ISOs are pure marketing gimmickry).
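To make that mental model concrete, here is a toy sketch of it in Python (all numbers are hypothetical, and `render_jpeg_value` is an invented illustration, not any camera's actual pipeline): the RAW reading is fixed by the light hitting the sensor, and ISO is just a multiplier applied afterwards when rendering the JPEG.

```python
import numpy as np

# Hypothetical raw sensor readings, as fractions of full-scale voltage.
# Under this model they are identical at every ISO setting.
sensor_voltages = np.array([0.01, 0.05, 0.20])

def render_jpeg_value(voltage, iso, base_iso=100):
    """Scale the raw reading by the ISO ratio and clip to the 8-bit JPEG range."""
    gain = iso / base_iso
    return np.clip(voltage * gain * 255, 0, 255).round()

print(render_jpeg_value(sensor_voltages, 100))   # dim rendering
print(render_jpeg_value(sensor_voltages, 1600))  # 16x brighter; top value clips to 255
```

If this model were right, pushing an ISO 100 RAW by 16x in software would give exactly the same result as shooting at ISO 1600.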

However, this highly-voted post contradicts that. It shows the following image:
Noise at ISO settings
and claims that the ISO setting does affect the RAW output! † It also advises: "to minimise noise get as much light into the camera as possible then use the highest ISO you can without overexposing."

If my understanding is correct, all shots with the same aperture and shutter speed should produce identical RAW data, regardless of the ISO setting. But if the explanation in that post is correct, a RAW image taken at the "wrong" ISO can't be corrected in software without introducing extra noise. (I found this thread online in which several 'experts' argue back and forth over which understanding is correct, but never reach a conclusion.)


To figure out which understanding is correct, I shot the same scene at various ISOs and shutter speeds, in RAW+JPEG mode. I then loaded the RAW files into Photoshop and applied auto-correct within "Camera Raw" (i.e. before the JPEG conversion).

These were the results (click to enlarge):

RAW images w/ PS Auto-Correct
(All shots taken with a Sony a390 DSLR, aperture f/5.6, 18-55mm zoom lens set to 55mm.)

And for comparison, here are the JPEGs the camera produced for those same shots (no Photoshop correction applied):

Out-of-camera JPEG images (no correction)

It appears we're both wrong (what!?). The ISO setting definitely did make a major difference to the final RAW image, but even when it caused underexposure, the lowest ISO setting still produced the least noise!

I assume that, to understand why, I need to know exactly how the ISO setting works in DSLRs - could someone please explain? Is the sensor somehow made physically more sensitive, or is it simply digital (or possibly analog) amplification of the voltage signal? Or does it work differently in different cameras (mine is a rather low-tier DSLR)? If the sensor doesn't physically become more sensitive, why does the ISO setting affect the RAW image? And why did an underexposed ISO 100 image end up with less noise (after Photoshop correction) than the same shot at the same aperture/shutter taken at a correctly-exposed ISO 3200?
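One commonly cited model (which may or may not describe my camera) is that photon shot noise arises on the sensor, analog ISO gain is applied next, and some "downstream" read noise is added after the gain stage, at readout/digitization. Under that model, raising ISO in camera shrinks the read noise relative to the signal, while a digital push in software amplifies it. A minimal simulation of that model, with all numbers hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical sensor parameters: a dim patch averaging 50 photoelectrons
# per pixel, and 10 e- equivalent of downstream read noise added AFTER
# the analog gain stage.
mean_electrons = 50
read_noise = 10.0
n_pixels = 200_000
gain = 8.0  # e.g. ISO 800 relative to a base of ISO 100

# Photon shot noise: Poisson-distributed electron counts.
electrons = rng.poisson(mean_electrons, n_pixels)

# Case A: analog gain applied before the downstream noise (high in-camera ISO).
high_iso = gain * electrons + rng.normal(0, read_noise, n_pixels)

# Case B: unity gain, downstream noise, then an 8x digital push in software.
pushed = gain * (electrons + rng.normal(0, read_noise, n_pixels))

def snr(x):
    """Signal-to-noise ratio of a uniform patch: mean over standard deviation."""
    return x.mean() / x.std()

print(f"SNR, gain before read noise: {snr(high_iso):.2f}")
print(f"SNR, digital push afterward: {snr(pushed):.2f}")
```

In this sketch the high-ISO case wins, which matches the linked post's advice but not my experiment - so either my camera adds most of its noise before the gain stage (making it nearly "ISO-invariant"), or this model doesn't apply to it.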


† (At least, I think that's what he's saying. The post is ambiguous as to whether the auto-correction was applied to the RAW or the JPEG file. I'm assuming it was the RAW, as doing it to the JPEG would just be stupid - he'd be amplifying the compression/quantization noise rather than the camera noise, which would make the entire post wrong.)

