Assessing medical image quality, and how changes in image appearance affect observer performance, is critical, but such assessment can be expensive and time-consuming. Could an animal observer, the pigeon, with well-known visual skills and a documented ability to discriminate complex visual stimuli, serve as a surrogate for the human observer? Using sets of whole-slide pathology images (WSI) and mammographic images, we trained pigeons (in cohorts of 4) to detect and/or classify lesions in medical images. Standard training methods were used: a chamber equipped with a 15-inch display and a resistive touchscreen presented the images and recorded responses (pecks), and food pellets were dispensed for correct responses. The pigeons readily learned to distinguish benign from malignant breast cancer histopathology in WSI (mean percent correct rose from 50% to 85% over 15 days) and generalized readily from 4X to 10X and 20X magnifications; to detect microcalcifications (mean percent correct rose from 50% to over 85% over 25 days); to distinguish benign from malignant breast masses (3 of 4 birds learned this task, reaching roughly 80% and 60% correct over 10 days); and to ignore compression artifacts in WSI (performance averaged 95% correct on uncompressed slides versus 92% and 90% correct at 15:1 and 27:1 compression). Pigeon models may help us better understand medical image perception and may be useful in quality assessment by serving as surrogate observers for certain types of studies.