This poster is published under an open license. Please read the disclaimer for further details.
Keywords: Radiation physics, Breast, Mammography, Observer performance, Screening, Physics
E. Salvagnini1, H. Bosmans1, C. Van Ongeval1, A. van Steen1, K. Michielsen1, L. Cockmartin1, L. Struelens2, N. Marshall1; 1Leuven/BE, 2Mol/BE
Aims and objectives
Theoretical indices of detectability, such as the signal-difference-to-noise ratio (SDNR) and the threshold gold thickness obtained from contrast-detail (c-d) analysis, show a decrease in object detectability/visibility as compressed breast thickness increases. This is a consequence of the automatic exposure control (AEC) setup of most digital mammography systems, which are currently programmed to maintain a constant pixel value at the detector as a function of thickness [1,2]. While simple AEC tests, for example imaging a thin square of aluminium on a uniform background, show a reduction in object detectability, it is not clear whether clinical data follow the same trend.
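As an illustration of the first of these indices, a minimal sketch of an SDNR calculation is given below. It assumes the common definition SDNR = |mean(signal ROI) − mean(background ROI)| / std(background ROI); the region sizes and pixel values are purely synthetic and do not correspond to any measurement in this study.

```python
import numpy as np

def sdnr(roi_signal, roi_background):
    """Signal-difference-to-noise ratio: difference of the mean pixel
    values in the signal and background ROIs, divided by the standard
    deviation of the background ROI (a common, assumed definition)."""
    return abs(roi_signal.mean() - roi_background.mean()) / roi_background.std()

# Synthetic example: a detail ROI slightly brighter than the background.
rng = np.random.default_rng(0)
background = rng.normal(loc=100.0, scale=5.0, size=(50, 50))
signal = rng.normal(loc=110.0, scale=5.0, size=(50, 50))
print(f"SDNR = {sdnr(signal, background):.2f}")
```

Because the AEC holds the detector pixel value roughly constant while a thicker breast attenuates and scatters more of the beam, the signal difference in the numerator shrinks with thickness, which is the decrease in detectability referred to above.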
Given the importance of optimizing lesion detectability for all women invited for breast screening, the main aim of this study was to investigate lesion detectability as a function of compressed breast thickness by simulating lesions into real patient data (mammograms).