Reference-based super-resolution for remote sensing images using a hybrid edge-aware loss function

Authors

  • Rajalaxmi Padhy Odisha University of Technology and Research, Bhubaneswar, Odisha, India. https://orcid.org/0009-0001-9672-979X
  • Sanjit Kumar Dash Odisha University of Technology and Research, Bhubaneswar, Odisha, India. https://orcid.org/0000-0003-4244-7591
  • Mohammed Altaf Ahmed Department of Computer Engineering, College of Computer Engineering and Sciences, Prince Sattam bin Abdulaziz University, Al-Kharj, Saudi Arabia. https://orcid.org/0000-0003-0355-7835
  • Sultan Alqahtani Department of Computer Engineering, College of Computer Engineering and Sciences, Prince Sattam bin Abdulaziz University, Al-Kharj, Saudi Arabia.

DOI:

https://doi.org/10.18488/76.v13i2.4929

Abstract

Acquiring high-resolution images is crucial for accurate remote sensing analysis; however, such data are often limited by sensor constraints, atmospheric conditions, and acquisition costs. Reference-based super-resolution (RefSR) addresses this limitation by using auxiliary high-resolution images, but domain mismatches caused by changes in illumination, viewpoint, and sensor characteristics severely limit its performance, often resulting in blurred edges and structural distortions. To overcome these problems, this paper presents a reference-based super-resolution framework that integrates a hybrid edge-aware loss function into a domain-adaptive transfer super-resolution architecture. The proposed method first employs grayscale transformation for domain matching, followed by Whitening and Coloring Transform and Phase Replacement for efficient domain adaptation and texture alignment. To supervise edges and overall structure more closely during image reconstruction, the proposed method combines Sobel and Laplacian edge constraints in a new hybrid loss function. Experiments on the DIV2K dataset at a 4× scaling factor show that the proposed method consistently outperforms the baseline DATSR model, with substantial improvements in PSNR and SSIM and visually sharper, more structurally coherent images. Qualitative analyses further verify that the super-resolved images preserve edges better, with less boundary blurring. This approach serves as an efficient and computationally feasible solution for improving image quality under domain mismatch, making it suitable for high-resolution remote sensing applications such as urban monitoring, environmental analysis, and industrial innovation in line with sustainable development goals.
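The hybrid edge-aware loss described above can be sketched as follows. This is a minimal illustrative implementation, not the paper's code: the 3×3 Sobel and Laplacian kernels are the standard textbook choices, and the weighting terms `alpha` and `beta` are assumed hyperparameters whose values the abstract does not specify.

```python
import numpy as np

# Standard 3x3 edge kernels (assumed; the paper may use different variants).
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
SOBEL_Y = SOBEL_X.T
LAPLACIAN = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], dtype=np.float64)

def filter2d(img, kernel):
    """Valid-mode 2-D cross-correlation of a single-channel image."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def hybrid_edge_loss(sr, hr, alpha=0.5, beta=0.5):
    """L1 distance between Sobel gradient magnitudes and Laplacian responses
    of the super-resolved (sr) and ground-truth (hr) images, combined with
    assumed weights alpha and beta."""
    sobel_sr = np.hypot(filter2d(sr, SOBEL_X), filter2d(sr, SOBEL_Y))
    sobel_hr = np.hypot(filter2d(hr, SOBEL_X), filter2d(hr, SOBEL_Y))
    lap_sr = filter2d(sr, LAPLACIAN)
    lap_hr = filter2d(hr, LAPLACIAN)
    return (alpha * np.mean(np.abs(sobel_sr - sobel_hr))
            + beta * np.mean(np.abs(lap_sr - lap_hr)))
```

In practice such a term would be added to a pixel-wise reconstruction loss and computed on GPU tensors during training; the NumPy version here only illustrates the edge-map comparison itself.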

Keywords:

Domain adaptation module, Domain matching module, Feature aggregation, Hybrid loss function, Industry and Innovation, Remote sensing, SDG, Super-resolution image.


Published

2026-04-23

How to Cite

Padhy, R., Dash, S. K., Ahmed, M. A., & Alqahtani, S. (2026). Reference-based super-resolution for remote sensing images using a hybrid edge-aware loss function. Review of Computer Engineering Research, 13(2), 22–36. https://doi.org/10.18488/76.v13i2.4929