Commit b238d8d: "Read me first", authored by yihuacheng on Dec 18, 2019 (changes to README.md).


## Introduction
This is the README file for the official code associated with the ECCV 2018 paper, "Appearance-Based Gaze Estimation via Evaluation-Guided Asymmetric Regression".<br>

Our academic paper, which describes ARE-Net in detail and provides full results, can be found here: \[[PAPER](http://openaccess.thecvf.com/content_ECCV_2018/papers/Yihua_Cheng_Appearance-Based_Gaze_Estimation_ECCV_2018_paper.pdf)\].<br>


## Usage
We also ask that you cite the associated paper if you make use of this dataset; the BibTeX entry follows:<br>
```
@inproceedings{eccv2018_are,
Author = {Yihua Cheng and Feng Lu and Xucong Zhang},
Title = {Appearance-Based Gaze Estimation via Evaluation-Guided Asymmetric Regression},
Booktitle = {European Conference on Computer Vision (ECCV)},
Year = {2018}
}
```

## Environment
To use this code, make sure the following libraries are installed first.<br>
```
Python>=3
Tensorflow-GPU>=1.10
numpy, os, math, etc., which can be found at the head of the code.
```

## Code
You need to modify **config.yaml** first, especially the *data/label* and *data/root* params.<br>
*data/label* represents the path of the label file.<br>
*data/root* represents the path of the image files.<br>
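For reference, the data section of **config.yaml** might look like the sketch below. The keys mirror the *data/label* and *data/root* names above; the exact structure and the paths are hypothetical placeholders, so adjust them to your own setup.<br>
```
data:
  label: /path/to/label/files   # path of the label file(s)
  root: /path/to/image/files    # root path of the image files
```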

An example label file is in the **data** folder. Each line in the label file is constructed as:<br>
```
p00/left/1.bmp p00/right/1.bmp p00/day08/0069.bmp -0.244513310176,0.0520949295694,-0.968245505778 ... ...
```
Our code reads image data from `os.path.join(data/root, "p00/left/1.bmp")` and reads the ground-truth gaze directions from the remaining fields of each line.<br>
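For illustration, here is a minimal sketch (our own, not the repository's parser) of how one label line can be split into the two eye-image paths, the face-image path, and the gaze ground truth; `parse_label_line` and the sample root path are hypothetical names:<br>
```python
import os

def parse_label_line(line, data_root="/path/to/images"):
    """Split one label-file line into image paths and the 3D gaze vector.

    Layout assumed from the example line above:
    left_eye right_eye face gx,gy,gz ...
    """
    fields = line.strip().split()
    left = os.path.join(data_root, fields[0])    # e.g. .../p00/left/1.bmp
    right = os.path.join(data_root, fields[1])   # e.g. .../p00/right/1.bmp
    face = os.path.join(data_root, fields[2])    # e.g. .../p00/day08/0069.bmp
    gaze = [float(v) for v in fields[3].split(",")]  # ground-truth gaze direction
    return left, right, face, gaze

line = "p00/left/1.bmp p00/right/1.bmp p00/day08/0069.bmp -0.244513310176,0.0520949295694,-0.968245505778"
left, right, face, gaze = parse_label_line(line)
```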

## Run the code
We provide two optional args, -m and -n.<br>
-m represents the running mode. We use 1 for train mode, 2 for predict mode, and 3 for evaluate mode.<br>

You can train the model with 1, get predicted results with 2, and get model accuracy with 3.<br>

For example, you can run the code like:<br>
```
python main.py -m 13
```
to train and evaluate model together.<br>
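The value 13 is just the digits 1 and 3 combined. A sketch of how such a combined mode argument could be parsed (our illustration only; `parse_modes` is a hypothetical helper, not necessarily what **main.py** does):<br>
```python
import argparse

MODES = {1: "train", 2: "predict", 3: "evaluate"}

def parse_modes(mode_arg):
    """Turn a combined mode string such as '13' into the list [1, 3]."""
    modes = [int(ch) for ch in str(mode_arg)]
    unknown = [m for m in modes if m not in MODES]
    if unknown:
        raise ValueError(f"unknown mode(s): {unknown}")
    return modes

parser = argparse.ArgumentParser()
parser.add_argument("-m", default="1", help="running mode(s): 1, 2, 3, or a combination such as 13")
args = parser.parse_args(["-m", "13"])  # simulate `python main.py -m 13`
print([MODES[m] for m in parse_modes(args.m)])  # ['train', 'evaluate']
```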

-n represents the index of the test file in the 'leave-one-person-out' strategy.<br>
For example, *data/label* provides 15 label files. Use
```
python main.py -m 13 -n 0
```



to train and evaluate the model with the first person (p00.label) as the test file.<br>
Note that we add a loop in **main.py** to perform `leave-one-person-out` automatically. You can delete it for your individual usage.<br>
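That leave-one-person-out loop can be sketched as below; this is a simplified illustration, and `train_and_evaluate` is a hypothetical stand-in for the real training/evaluation routine:<br>
```python
# Assumed setup: 15 label files, one per person, as described above.
label_files = [f"p{i:02d}.label" for i in range(15)]  # p00.label ... p14.label

def train_and_evaluate(train_files, test_file):
    """Hypothetical placeholder for the real train/evaluate routine."""
    return {"test": test_file, "num_train": len(train_files)}

# Leave-one-person-out: each person serves as the test set exactly once.
results = []
for n, test_file in enumerate(label_files):
    train_files = label_files[:n] + label_files[n + 1:]  # the other 14 people
    results.append(train_and_evaluate(train_files, test_file))

print(results[0])  # {'test': 'p00.label', 'num_train': 14}
```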
