Figure: Architecture of the Semi-Cycled Generative Adversarial Network (SCGAN) for unsupervised face super-resolution.
We establish two independent degradation branches in the forward and backward cycle-consistent reconstruction processes, respectively, while the two processes share the same restoration branch. Our Semi-Cycled Generative Adversarial Network (SCGAN) alleviates the adverse effects of the domain gap between real-world LR face images and synthetic ones, and achieves accurate and robust face SR through the shared restoration branch, which is regularized by both the forward and backward cycle-consistent learning processes.
Semi-Cycled Generative Adversarial Networks for Real-World Face Super-Resolution
H Hou, J Xu, Y Hou, X Hu, B Wei, D Shen
IEEE Transactions on Image Processing
[Arxiv] [Paper] [Project Page]
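To make the semi-cycled design described above concrete, here is a minimal, hypothetical PyTorch sketch of the idea: two independent degradation branches and a single shared restoration branch regularized by both cycle-consistency losses. The module definitions, the x4 scale factor, and the loss terms are illustrative assumptions, not the actual code in this repository (adversarial losses are omitted entirely).

```python
import torch
import torch.nn as nn

def tiny_generator(scale):
    """Stand-in for a real generator: conv -> resize -> conv.
    scale > 1 restores (LR -> HR); scale < 1 degrades (HR -> LR)."""
    return nn.Sequential(
        nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(inplace=True),
        nn.Upsample(scale_factor=scale, mode='bilinear', align_corners=False),
        nn.Conv2d(64, 3, 3, padding=1),
    )

# Two independent degradation branches and one shared restoration branch.
degrade_fwd = tiny_generator(0.25)  # forward cycle:  real LR -> SR -> LR
degrade_bwd = tiny_generator(0.25)  # backward cycle: HR -> fake LR -> HR
restore     = tiny_generator(4.0)   # shared x4 SR generator

l1 = nn.L1Loss()
real_lr = torch.rand(1, 3, 32, 32)    # a real-world LR face
real_hr = torch.rand(1, 3, 128, 128)  # an HR face (unpaired)

# Forward cycle: super-resolve the real LR face, then re-degrade it.
sr = restore(real_lr)
loss_fwd = l1(degrade_fwd(sr), real_lr)

# Backward cycle: degrade the HR face, then restore it.
fake_lr = degrade_bwd(real_hr)
loss_bwd = l1(restore(fake_lr), real_hr)

# Both cycles regularize the same `restore` network.
(loss_fwd + loss_bwd).backward()
```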
Clone this repo.
git clone https://github.com/HaoHou-98/SCGAN.git
cd SCGAN/
Please install the dependencies by running
pip install -r requirements.txt
The prepared test set and training set can be directly downloaded here. After unzipping, put the imgs_test and imgs_train folders in the root directory.
The pre-trained model can be directly downloaded here. After unzipping, put the pretrained_model folder in the root directory.
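As a quick sanity check before running anything, this small snippet (not part of the repository) verifies that the unzipped folders sit in the expected places; the folder names are taken from the steps above.

```python
from pathlib import Path

# Folder names from the download steps above.
for folder in ["imgs_test", "imgs_train", "pretrained_model"]:
    status = "ok" if Path(folder).is_dir() else "MISSING"
    print(f"{folder:17s} {status}")
```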
Once the dataset and the pre-trained model are prepared, the results can be obtained with the pre-trained model as follows.
- Inference:
python test.py
- The results are saved at ./test_results/.
To train a new model, put your own high-resolution and low-resolution face images into ./imgs_train/HIGH and ./imgs_train/LOW, respectively, and then run
python train.py
The models are saved at ./train/models.
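If you only have high-resolution faces, one way to populate ./imgs_train/LOW for a quick start is bicubic downsampling, as in the hedged sketch below. Note that SCGAN specifically targets real-world LR degradations, so synthetically downsampled images are only a stand-in; the x4 factor and PNG-only globbing are assumptions.

```python
from pathlib import Path
from PIL import Image

hr_dir = Path("imgs_train/HIGH")
lr_dir = Path("imgs_train/LOW")
lr_dir.mkdir(parents=True, exist_ok=True)

# Bicubic x4 downsampling; the scale factor is an assumption.
for hr_path in sorted(hr_dir.glob("*.png")):
    img = Image.open(hr_path).convert("RGB")
    lr = img.resize((img.width // 4, img.height // 4), Image.BICUBIC)
    lr.save(lr_dir / hr_path.name)
```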
Will be released soon.
If you use this code for your research, please cite our paper.
@ARTICLE{10036448,
author={Hou, Hao and Xu, Jun and Hou, Yingkun and Hu, Xiaotao and Wei, Benzheng and Shen, Dinggang},
journal={IEEE Transactions on Image Processing},
title={Semi-Cycled Generative Adversarial Networks for Real-World Face Super-Resolution},
year={2023},
volume={32},
number={},
pages={1184-1199},
doi={10.1109/TIP.2023.3240845}}
The code is released for academic research use only.