Every pixel in a photo has a certain level of brightness, ranging from black to white. These pixel values serve as the input to the computer monitor. Due to technical limitations, CRT monitors output these values in a nonlinear way:
Output = Input ^ Gamma
Most CRT monitors have a gamma of 2.5 (when unadjusted), which means that a pixel with a brightness of 0.5 will be displayed with a brightness of only about 0.18 (0.5^2.5). Gamma encoding was originally developed to compensate for this input–output characteristic of a cathode ray tube (CRT) display. The electron-gun current, and thus light intensity, varies nonlinearly with the applied anode voltage. Altering the input signal by gamma compression cancels this nonlinearity, so that the output picture has the intended luminance.
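The relationship above can be sketched in a few lines of Python. This is an illustration, not production color-management code; the function names and the gamma value of 2.5 follow the text.

```python
# Pixel values are normalized to 0.0 (black) .. 1.0 (white).

DISPLAY_GAMMA = 2.5  # typical unadjusted CRT, per the text


def display_output(value, gamma=DISPLAY_GAMMA):
    """Brightness the monitor actually emits: Output = Input ^ Gamma."""
    return value ** gamma


def gamma_encode(value, gamma=DISPLAY_GAMMA):
    """Pre-compress the signal with the inverse exponent (1/gamma)
    so the display's nonlinearity cancels out."""
    return value ** (1.0 / gamma)


mid_gray = 0.5
print(round(display_output(mid_gray), 2))                # 0.18 - far darker than intended
print(round(display_output(gamma_encode(mid_gray)), 2))  # 0.5  - nonlinearity cancelled
```

Encoding with the exponent 1/2.5 and then displaying with the exponent 2.5 returns the original value, which is exactly the compensation the paragraph describes.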
LCDs, in particular those on notebooks, tend to have rather irregularly shaped output curves. Calibration via software or hardware ensures that the monitor outputs the image based on a predetermined gamma curve. This is typically 2.2 for Windows, which is approximately the inverse of the response of human vision. The sRGB and Adobe RGB color spaces are likewise based on a gamma of approximately 2.2.
A monitor with a gamma of 1.0 would respond in a linear way (Output = Input), and images created on a system with a gamma of 2.2 would appear “flat” and overly bright on it.
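A short numeric sketch makes the mismatch concrete. The scene brightness of 0.2 here is an arbitrary example value; the encoding follows the simple power-law model used throughout this text (sRGB's actual piecewise curve is only approximated by gamma 2.2).

```python
# The same gamma-2.2-encoded pixel shown on a calibrated (gamma 2.2)
# monitor versus a linear (gamma 1.0) monitor.

ENCODING_GAMMA = 2.2


def encode(linear_value, gamma=ENCODING_GAMMA):
    """Store the value gamma-compressed, as an image file would."""
    return linear_value ** (1.0 / gamma)


scene_brightness = 0.2               # intensity the image is meant to convey
encoded = encode(scene_brightness)   # ~0.48 is what gets stored

on_calibrated = encoded ** 2.2       # gamma-2.2 monitor: back to 0.2, as intended
on_linear = encoded ** 1.0           # gamma-1.0 monitor: ~0.48, much too bright

print(round(on_calibrated, 2), round(on_linear, 2))  # 0.2 0.48
```

On the linear monitor every midtone is pushed upward like this, which is why the image reads as washed-out and “flat”.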