There has been a lot of buzz recently about whether ISO is meaningful or not in a DSLR world, and coming from a background in electronics I'd like to add my own perspective. To explain how ISO impacts an image we have to look at the way the DSLR sensor converts light.
A DSLR sensor consists of millions of elements, each capable of converting light into an analog voltage output. This output, which varies with light intensity, is then fed to an analog-to-digital (A/D) converter to produce a digital value for further processing.
Let's look deeper at a single element. As light strikes it, the element produces an output voltage in proportion to the intensity of that light. As with any imperfect system, a component of electrical noise is introduced into this voltage. In most circumstances the voltage created by noise will be minor compared to that created by light, but as the light intensity falls, the noise makes up an increasing proportion of the total signal.
To visualize this, let's say that on a scale of 0 to 1000 the brightest light the sensor can capture is 1000, and in the absence of light there is an average sensor output (noise) of 1. In a strong-light scenario the signal-to-noise ratio is 1000:1, the noise being insignificant. However, in a low-light scenario our comparable light might drop to 10, giving us a signal-to-noise ratio of 10:1. Obviously in this case noise becomes a more significant part of the image. So now we understand that the lower the light, the greater the impact of the noise component.
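The arithmetic above can be sketched in a few lines of Python. The 0-1000 scale and the noise floor of 1 are the illustrative numbers from this example, not real sensor specifications:

```python
def snr(signal, noise_floor=1.0):
    """Ratio of light-generated signal to the average noise output."""
    return signal / noise_floor

# Strong light: the signal dwarfs the noise.
bright = snr(1000)   # 1000.0

# Low light: the noise is now a tenth of the signal.
dim = snr(10)        # 10.0

print(bright, dim)
```

The same noise floor that was negligible in bright light becomes a visible fraction of the image as the light-generated signal shrinks.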
So what does this have to do with ISO? As a reminder, in the film days ISO referred to the sensitivity of the film itself. In the DSLR world the term is a throwback to those days and does not really carry the same meaning. Changing the ISO on a DSLR does not change the sensitivity of the sensor in any way; it simply tells the processor how much the signal output should be multiplied (i.e. to replicate the effect of a faster film).
Taking our low-light example above, it does not matter whether we shoot the image at ISO 100, 200 or even 3200; the signal output from the sensor is the same. If we assume that the baseline for this example is ISO 100, what actually happens when we shoot at ISO 200 is that a multiplication factor of 2 is applied to the sensor output. What is important to understand is that this factor is applied to the noise as well; the more we have to amplify the signal to get it within an acceptable range, the more prominent the noise becomes.
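This can be made concrete with a minimal sketch, assuming a simple linear gain model (real camera pipelines are more complicated, but the principle holds). Because the gain multiplies signal and noise alike, raising ISO never improves the signal-to-noise ratio of the low-light capture:

```python
def amplify(signal, noise, iso, base_iso=100):
    """Apply ISO as a pure output multiplier relative to a base ISO."""
    gain = iso / base_iso
    return signal * gain, noise * gain

# Low-light sensor output from the earlier example: signal 10, noise 1.
s200, n200 = amplify(10.0, 1.0, 200)     # (20.0, 2.0)
s3200, n3200 = amplify(10.0, 1.0, 3200)  # (320.0, 32.0)

print(s200 / n200)    # 10.0 - same ratio as at base ISO
print(s3200 / n3200)  # 10.0
```

The output is brighter at higher ISO, but the noise has been scaled up by exactly the same factor, so the 10:1 ratio captured by the sensor is all you ever get.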
As pointed out in recent debates, image quality should not differ whether you shoot at a high ISO or instead raise the exposure in post-processing to achieve the same result (assuming that any noise introduced by the post-processing itself is so minimal it can be discounted, which appears to be the case).
So if ISO is somewhat irrelevant, why should we consider it when shooting? The theory is all well and good, but we still need to see the histogram and the image on the LCD display. Both require output at a usable level, so adjusting ISO (the output multiplier) to get within the desired range makes perfect sense.
At the end of the day the old story remains true: more and more noise will become visible as you increase ISO (or push in post-processing), and for the highest-quality image you need the highest-quality light.