The goal of this study was to compare a cathode-ray-tube (CRT) digital display with film using task-dependent image-quality assessment methods, specifically contrast-detail analysis. Human observers performed a simple detection task, detecting a pillbox target in a uniform Poisson field, using either film or a digital display based on a CRT monitor. Observers performed equally well on film and on the CRT when the window settings of the digital display were established subjectively by a radiologist. Changing the window settings of the digital display to match the average background luminance of a film-illuminator combination decreased the luminance contrast of the targets and reduced observer performance, although these two effects were probably not causally related. The "gold standard" film had lower luminance contrast than the CRT-displayed images, yet observer performance was never lower for film than for the CRT; we therefore concluded that luminance contrast was not a limiting factor for observer performance in this study. The luminance characteristics of the CRT monitor drifted fairly rapidly after calibration: over a period of six months the gamma of the display increased from 1.82 to 2.42 and the maximum luminance decreased from 319 to 228 cd/m². Low luminance output showed a larger percentage decrease (approximately 85%) than high luminance output (approximately 29%) over the same period. These observations suggest that standard window settings should be reviewed periodically to ensure that the display is used optimally. No special look-up-table processing, such as perceptual linearization, was used.
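The reported drift figures are mutually consistent under a standard power-law (gamma) model of CRT response, L(v) = L_max (v / v_max)^gamma, although the abstract does not state how luminance was characterized, so the model and the specific low drive level below are assumptions for illustration only. The sketch evaluates the reported calibration values at full drive and at a hypothetical low drive level to show why the relative luminance loss is much larger at the dark end of the scale.

```python
# Illustrative sketch (not from the study): a power-law (gamma) model of CRT
# luminance, L(v) = L_max * (v / v_max) ** gamma, evaluated with the calibration
# values quoted in the abstract. The low drive level (v = 20) is hypothetical.

def luminance(v, l_max, gamma, v_max=255):
    """Luminance (cd/m^2) predicted by a simple gamma model at drive level v."""
    return l_max * (v / v_max) ** gamma

# Display characteristics reported at calibration and six months later.
initial = {"l_max": 319.0, "gamma": 1.82}
later   = {"l_max": 228.0, "gamma": 2.42}

def percent_decrease(v):
    """Percentage drop in luminance at drive level v over the six-month period."""
    l0 = luminance(v, **initial)
    l1 = luminance(v, **later)
    return 100.0 * (l0 - l1) / l0

# At full drive the decrease is set by L_max alone: (319 - 228) / 319, about 29%.
print(f"high-luminance decrease: {percent_decrease(255):.0f}%")

# At a low (hypothetical) drive level the steeper gamma compounds the loss of
# maximum luminance, giving a much larger relative decrease, consistent with
# the roughly 85% figure reported for low luminance output.
print(f"low-luminance decrease:  {percent_decrease(20):.0f}%")
```

Under these assumptions the model reproduces the pattern described above: the combination of a higher gamma and a lower maximum luminance penalizes dark image regions far more than bright ones, which is one reason periodic review of window settings is advisable.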