Recent deep-learning-based face hallucination methods learn the mapping between low-resolution (LR) and high-resolution (HR) facial patterns by exploring facial structure priors. However, maintaining structural consistency across face images reconstructed at different scales remains a challenging problem. In this letter, we propose a novel multi-scale structure prior learning (MSPL) method for face hallucination. First, we propose a multi-scale structure prior block (MSPB). Because high-frequency information is lost in the LR space, we process the input image in three ascending-scale spaces, mapping it into higher-dimensional spaces to extract multi-scale structural prior information. The feature maps are then restored to their original size by downsampling, and finally the multi-scale information is fused to recover the feature channels. On this basis, we propose a local detail attention module (LDAM) that focuses on the local texture information of faces. Extensive face hallucination experiments on a public face dataset (LFW) verify the effectiveness of our method.
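The multi-scale pipeline the abstract describes — upscale the input into several ascending-scale spaces, extract features there, downsample back to the original size, then fuse the branches — can be sketched as below. This is a minimal shape-level illustration, not the authors' implementation: nearest-neighbour upsampling and average pooling stand in for the paper's learned up/down layers, the averaging fusion is a placeholder for the paper's channel-recovery fusion, and the function names (`mspb`, `upsample`, `downsample`) are hypothetical.

```python
import numpy as np

def upsample(x, s):
    # Nearest-neighbour upsampling by factor s (stand-in for a learned
    # upscaling layer mapping the image into a larger-scale space).
    return x.repeat(s, axis=0).repeat(s, axis=1)

def downsample(x, s):
    # Average-pool by factor s to recover the original feature-map size.
    h, w = x.shape[0] // s, x.shape[1] // s
    return x[:h * s, :w * s].reshape(h, s, w, s).mean(axis=(1, 3))

def mspb(x, scales=(2, 3, 4)):
    # Process the input at three ascending scales, recover each branch's
    # size, then fuse the multi-scale information (here: simple averaging).
    branches = [downsample(upsample(x, s), s) for s in scales]
    return np.mean(branches, axis=0)

lr = np.random.rand(16, 16)   # toy single-channel LR "image"
out = mspb(lr)
assert out.shape == lr.shape  # each branch returns to the input size
```

The point of the sketch is only the data flow: every branch leaves the LR space, operates at a different scale, and is brought back to a common size before fusion, so the fused output remains aligned with the input.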
Yuexi YAO
Wuhan Institute of Technology
Tao LU
Wuhan Institute of Technology
Kanghui ZHAO
Wuhan Institute of Technology
Yanduo ZHANG
Wuhan Institute of Technology
Yu WANG
Wuhan Institute of Technology
Yuexi YAO, Tao LU, Kanghui ZHAO, Yanduo ZHANG, Yu WANG, "Face Hallucination via Multi-Scale Structure Prior Learning" in IEICE TRANSACTIONS on Fundamentals,
vol. E106-A, no. 1, pp. 92-96, January 2023, doi: 10.1587/transfun.2022EAL2039.
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/transfun.2022EAL2039/_p
@ARTICLE{e106-a_1_92,
author={Yuexi YAO and Tao LU and Kanghui ZHAO and Yanduo ZHANG and Yu WANG},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Face Hallucination via Multi-Scale Structure Prior Learning},
year={2023},
volume={E106-A},
number={1},
pages={92-96},
keywords={},
doi={10.1587/transfun.2022EAL2039},
ISSN={1745-1337},
month={January},}
TY - JOUR
TI - Face Hallucination via Multi-Scale Structure Prior Learning
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 92
EP - 96
AU - Yuexi YAO
AU - Tao LU
AU - Kanghui ZHAO
AU - Yanduo ZHANG
AU - Yu WANG
PY - 2023
DO - 10.1587/transfun.2022EAL2039
JO - IEICE TRANSACTIONS on Fundamentals
SN - 1745-1337
VL - E106-A
IS - 1
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - January 2023
ER -