Objective image quality assessment (IQA) aims to rate the visual quality of an image automatically, which plays an important role in various applications, such as monitoring users’ quality of experience [1], supporting adaptive image/video transmission and compression [2], and measuring the diagnostic quality of medical images [3]. Based on the availability of the reference image, IQA methods can be classified into full-reference IQA (FR-IQA) and no-reference IQA (NR-IQA): FR-IQA methods assume full access to the reference image, whereas NR-IQA algorithms require no reference information [4]. In many practical applications, however, reference images are unavailable. Thus, NR-IQA, also called blind IQA (BIQA), has a broader range of applications than FR-IQA.
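As a toy illustration of this distinction (not taken from the paper), the snippet below compares PSNR, a full-reference metric that needs the pristine reference image, with a crude no-reference cue (variance of the Laplacian, a common sharpness proxy) that scores the distorted image alone. Both helper functions are hypothetical and written only for this example; they are not the metrics used in the article.

```python
# Toy FR vs. NR illustration: PSNR requires the reference image,
# while a blind sharpness proxy operates on the distorted image only.
import numpy as np


def psnr(reference: np.ndarray, distorted: np.ndarray, peak: float = 255.0) -> float:
    """Full-reference score: compares the distorted image against its reference."""
    mse = np.mean((reference.astype(np.float64) - distorted.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)


def laplacian_variance(image: np.ndarray) -> float:
    """No-reference cue: blur/sharpness estimated from the image itself."""
    img = image.astype(np.float64)
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(lap.var())


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ref = rng.integers(0, 256, size=(128, 128)).astype(np.uint8)
    noisy = np.clip(ref + rng.normal(0, 10, ref.shape), 0, 255).astype(np.uint8)
    print(f"FR (PSNR): {psnr(ref, noisy):.2f} dB")        # needs the reference
    print(f"NR (Laplacian var): {laplacian_variance(noisy):.1f}")  # distorted image only
```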
Abstract:
Image quality assessment (IQA) for medical images is challenging because the images are used for diagnostic purposes. Traditional convolutional-neural-network-based IQA models usually assess image quality on a global scale, which is unsuitable for inferring the local quality of medical images from the perspective of diagnostic attention. To alleviate this problem, we design and introduce a region-of-interest (ROI)-guided attention mechanism for medical IQA and propose a novel dual attention IQA model. In addition, we have constructed a diagnosis-oriented medical image dataset containing 1132 images, all subjectively rated by experienced radiologists, to enable effective training of the proposed model. Our experimental results show that the proposed ROI-guided dual attention IQA model reflects radiologists’ attention for diagnostic purposes well. Compared with state-of-the-art IQA models, the output of the proposed model is significantly more consistent with the subjective evaluations of experienced radiologists.
Published in: IEEE MultiMedia (Volume: 30, Issue: 4, Oct.-Dec. 2023)
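The abstract above describes the ROI-guided dual attention mechanism only at a high level. The following is a minimal, hypothetical PyTorch sketch of one way an ROI mask could modulate a spatial attention branch alongside a global channel branch; the module name, parameters, and fusion rule are assumptions made for illustration and are not the authors’ actual architecture.

```python
# Hedged sketch: an ROI-guided dual attention block (illustrative only,
# not the architecture from the paper).
import torch
import torch.nn as nn
import torch.nn.functional as F


class ROIGuidedDualAttention(nn.Module):
    """Combines a global channel-attention branch with a spatial branch
    whose attention map is emphasized inside a diagnostic ROI mask."""

    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        # Global branch: squeeze-and-excitation style channel attention.
        self.channel_fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )
        # Local branch: 1x1 conv producing a spatial attention map.
        self.spatial_conv = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, feats: torch.Tensor, roi_mask: torch.Tensor) -> torch.Tensor:
        # feats: (B, C, H, W); roi_mask: (B, 1, h, w) with values in [0, 1].
        b, c, h, w = feats.shape

        # Channel (global) attention from spatially pooled features.
        squeezed = feats.mean(dim=(2, 3))                       # (B, C)
        channel_w = self.channel_fc(squeezed).view(b, c, 1, 1)

        # Spatial (local) attention, biased toward the ROI.
        spatial_w = torch.sigmoid(self.spatial_conv(feats))     # (B, 1, H, W)
        roi = F.interpolate(roi_mask, size=(h, w),
                            mode="bilinear", align_corners=False)
        spatial_w = spatial_w * (1.0 + roi)                     # emphasize ROI regions

        return feats * channel_w * spatial_w


if __name__ == "__main__":
    block = ROIGuidedDualAttention(channels=64)
    feats = torch.randn(2, 64, 32, 32)
    roi = torch.rand(2, 1, 256, 256)        # e.g., a radiologist-marked ROI mask
    print(block(feats, roi).shape)          # torch.Size([2, 64, 32, 32])
```

In this sketch the ROI simply scales up the learned spatial attention inside the marked region; a quality regressor could then pool the reweighted features to produce a score, but how the actual model fuses the two branches is described only in the full paper.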