Underwater image clarifying based on human visual colour constancy using double‐opponency
Published in: | CAAI Transactions on Intelligence Technology, 2024-06, Vol. 9 (3), p. 632-648 |
Main Authors: | , , , , |
Format: | Article |
Language: | English |
Subjects: | |
Summary: | Underwater images often show biased colours and reduced contrast because light is absorbed and scattered as it propagates through water, and such degraded images cannot meet the needs of underwater operations. The main problems with classic underwater image restoration and enhancement methods are their long computation times and the often unsatisfactory colour or contrast of the resulting images. Instead of using a complicated physical model of underwater imaging degradation, we propose a new method that deals with underwater images by imitating the colour constancy mechanism of human vision using double‐opponency. Firstly, the original image is converted to the LMS space. Then the signals are linearly combined, and Gaussian convolutions are performed to imitate the function of receptive fields (RFs). Next, two RFs of different sizes work together to constitute the double‐opponency response. Finally, the underwater light is estimated and used to correct the colours in the image; further contrast stretching on the luminance is optional. Experiments show that the proposed method obtains clarified underwater images of higher quality while incurring significantly less computation time than other typical previously published methods. |
ISSN: | 2468-2322 |
DOI: | 10.1049/cit2.12260 |
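
The summary above outlines a concrete four-step pipeline: conversion to LMS space, linear opponent combination with Gaussian receptive fields, a double-opponency response from two RF sizes, and light estimation for colour correction. The Python sketch below is one plausible reading of those steps, not the paper's implementation: the RGB-to-LMS matrix, the opponent weights, the two Gaussian sigmas, and the max-response light-estimation rule are all illustrative assumptions.

```python
# Minimal sketch of a double-opponency colour-constancy pipeline as described
# in the summary. All numeric parameters below are assumptions for
# illustration and will differ from the published method.

import numpy as np
from scipy.ndimage import gaussian_filter

# Hypothetical RGB -> LMS transform (a Hunt-Pointer-Estevez-like matrix).
RGB2LMS = np.array([[0.3811, 0.5783, 0.0402],
                    [0.1967, 0.7244, 0.0782],
                    [0.0241, 0.1288, 0.8444]])

def clarify(img_rgb, sigma_center=1.0, sigma_surround=4.0):
    """img_rgb: float image in [0, 1], shape (H, W, 3)."""
    # Step 1: convert the image to LMS space.
    lms = img_rgb @ RGB2LMS.T

    # Step 2: linearly combine the L, M, S signals into opponent channels
    # (weights are illustrative, not the paper's).
    l, m, s = lms[..., 0], lms[..., 1], lms[..., 2]
    opp = np.stack([l - m,                 # red-green opponent
                    s - 0.5 * (l + m),     # blue-yellow opponent
                    l + m + s], axis=-1)   # luminance

    # Steps 2-3: two Gaussian convolutions of different sizes imitate a small
    # (centre) and a large (surround) receptive field; their difference forms
    # the double-opponency response.
    center = gaussian_filter(opp, sigma=(sigma_center, sigma_center, 0))
    surround = gaussian_filter(opp, sigma=(sigma_surround, sigma_surround, 0))
    double_opp = center - surround

    # Step 4: estimate the underwater light from the pixel with the strongest
    # double-opponency response (a max-pooling rule, assumed here) and divide
    # it out per channel in LMS space.
    weight = np.abs(double_opp).sum(axis=-1)
    idx = np.unravel_index(np.argmax(weight), weight.shape)
    light = lms[idx]                       # estimated illuminant, shape (3,)
    corrected = lms / np.maximum(light, 1e-6)

    # Back to RGB; the optional contrast stretching on luminance is omitted.
    out = corrected @ np.linalg.inv(RGB2LMS).T
    return np.clip(out, 0.0, 1.0)
```

The per-channel division by the estimated light is a standard von-Kries-style correction; the published method may estimate and apply the underwater light differently.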