Quantization-aware training and CNN model converter for the FPGA-based Intuitus hardware accelerator.
The quantization-aware training is based on https://github.com/SpursLipu/YOLOv3v4-ModelCompression-MultidatasetTraining-Multibackbone.git.
- Training and testing of multiple Yolov3 and Yolov4 implementations
- Quantization-aware training
- Testing the expected FPGA behaviour using a Torch model with GPU-accelerated computations
- Converting Torch models to Keras (see the weight-layout sketch after this list)
- Post-training quantization using the TensorFlow quantizer (see the sketch after this list)
- Translation of quantized Torch or Keras models into FPGA-interpretable commands
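The Torch-to-Keras conversion mainly comes down to re-laying-out the weights. The repo's converter script is not reproduced here; the sketch below only illustrates the core idea for a plain conv layer: PyTorch stores convolution kernels as (out, in, kh, kw), while Keras Conv2D expects (kh, kw, in, out).

```python
# Illustrative only -- not the repo's converter. Shows the OIHW -> HWIO kernel
# transpose needed when copying a PyTorch conv layer into a Keras Conv2D layer.
import numpy as np
import tensorflow as tf
import torch

torch_conv = torch.nn.Conv2d(3, 16, kernel_size=3, padding=1, bias=True)
keras_conv = tf.keras.layers.Conv2D(16, 3, padding="same")
keras_conv.build((None, None, None, 3))          # NHWC input with 3 channels

w = torch_conv.weight.detach().numpy().transpose(2, 3, 1, 0)   # (out,in,kh,kw) -> (kh,kw,in,out)
b = torch_conv.bias.detach().numpy()
keras_conv.set_weights([w, b])

x = np.random.rand(1, 32, 32, 3).astype(np.float32)
y_keras = keras_conv(x).numpy()                                # NHWC output
y_torch = torch_conv(torch.from_numpy(x).permute(0, 3, 1, 2))  # NCHW input
print(np.allclose(y_keras, y_torch.permute(0, 2, 3, 1).detach().numpy(), atol=1e-5))  # True
```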
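For the post-training quantization feature, a minimal sketch using the standard TensorFlow Lite quantizer follows. The model file name and the random calibration data are placeholders, and the options the repo actually uses may differ.

```python
# Minimal post-training quantization sketch with the TensorFlow (TFLite) quantizer.
# "yolo_keras_model.h5" and the random calibration tensors are placeholders.
import numpy as np
import tensorflow as tf

keras_model = tf.keras.models.load_model("yolo_keras_model.h5")

def representative_data_gen():
    # Feed a handful of calibration samples (NHWC, float32, training preprocessing).
    for _ in range(100):
        yield [np.random.rand(1, 416, 416, 3).astype(np.float32)]  # replace with real images

converter = tf.lite.TFLiteConverter.from_keras_model(keras_model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data_gen
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

with open("yolo_int8.tflite", "wb") as f:
    f.write(converter.convert())   # fully integer-quantized model
```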
Supported layers (a small composition sketch follows the list):
- conv2d (kernel sizes: [1x1, 3x3, 5x5]; strides: [1, 2])
- inplace maxpool2d (stride 2 only)
- maxpool2d
- upsample
- concat
- split
- inplace yolo layer
- fully connected
- inverse bottleneck
- residual
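The layer set above is enough to express Yolov3-tiny-style networks. Below is a small PyTorch sketch (my own illustration, not code from the repo) that composes an upsample-and-concat block from only the listed operations; the module name and channel counts are made up.

```python
# Illustrative only -- a block built purely from the layer types listed above,
# roughly in the style of a Yolov3-tiny upsample + concat path.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyHead(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(256, 128, kernel_size=3, stride=1, padding=1)  # 3x3, stride 1
        self.pool = nn.MaxPool2d(kernel_size=2, stride=2)                     # maxpool2d, stride 2
        self.conv2 = nn.Conv2d(128, 128, kernel_size=1, stride=1)             # 1x1
        self.conv3 = nn.Conv2d(128 + 256, 255, kernel_size=1, stride=1)       # head after concat

    def forward(self, x, skip):
        x = F.relu(self.conv1(x))                              # ReLU activation (needed for the FPGA)
        x = self.pool(x)
        x = F.relu(self.conv2(x))
        x = F.interpolate(x, scale_factor=2, mode="nearest")   # upsample
        x = torch.cat([x, skip], dim=1)                        # concat with a skip branch
        return self.conv3(x)

head = TinyHead()
out = head(torch.randn(1, 256, 26, 26), torch.randn(1, 256, 26, 26))
print(out.shape)   # torch.Size([1, 255, 26, 26])
```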
Install the converter package in editable mode:
cd Intuitus-converter
pip install -e .
Workflow:
- Select a pretrained model and download the pretrained weights
- Use train_torch_yolo.py for quantization-aware training
- Fuse the batch-normalization layers and retrain until convergence if necessary (set quantize = 0, FPGA = False); a folding sketch follows this list
- Change the activation to ReLU by editing the cfg file (necessary for the FPGA) and retrain until convergence (keep quantize = 0)
- Set quantize to 1 and retrain until convergence
- Use torch_convert_postscale.py to fuse the quantization scaling into a single scale (shift); see the sketch after this list
- Set quantize to 2 and test whether the mAP has changed due to scale fusing (it should not have changed)
- Set FPGA = True to test which mAP can be achieved using the Intuitus hardware accelerator
- Generate commands for the Intuitus hardware accelerator using generate_intuitus_from_torch.py
- Test the results on hardware by following the steps in the related projects listed below
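For the batch-normalization fusing step, a minimal sketch of the standard conv + BN folding math is shown below (my own illustration, not the repo's fusing routine): the BN statistics are absorbed into the convolution weights, so the fused layer produces the same output without a separate BN operation.

```python
# Standard conv + batch-norm folding: w' = w * gamma / sqrt(var + eps),
#                                     b' = (b - mean) * gamma / sqrt(var + eps) + beta
import torch
import torch.nn as nn

torch.manual_seed(0)
conv = nn.Conv2d(8, 16, kernel_size=3, padding=1, bias=True).eval()
bn = nn.BatchNorm2d(16).eval()
with torch.no_grad():                     # give the BN layer some non-trivial statistics
    bn.running_mean.uniform_(-0.5, 0.5)
    bn.running_var.uniform_(0.5, 1.5)
    bn.weight.uniform_(0.8, 1.2)
    bn.bias.uniform_(-0.1, 0.1)

fused = nn.Conv2d(8, 16, kernel_size=3, padding=1, bias=True).eval()
with torch.no_grad():
    scale = bn.weight / torch.sqrt(bn.running_var + bn.eps)
    fused.weight.copy_(conv.weight * scale.reshape(-1, 1, 1, 1))
    fused.bias.copy_((conv.bias - bn.running_mean) * scale + bn.bias)

x = torch.randn(1, 8, 32, 32)
print(torch.allclose(bn(conv(x)), fused(x), atol=1e-5))   # True: identical up to float error
```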
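For the post-scale fusing step, I'm assuming that "fuse quantization scaling to a single scale (shift)" means collapsing the chain of per-tensor scales into one power-of-two factor that the hardware applies as a bit shift. Under that assumption, the sketch below illustrates why fusing does not change the result when the individual scales are powers of two.

```python
# Illustrative assumption: the per-tensor scales are powers of two, so the combined
# requantization scale collapses exactly into a single bit shift.
import numpy as np

s_in, s_w, s_out = 2.0**-4, 2.0**-6, 2.0**-5      # input, weight and output scales
acc = np.arange(-512, 512)                        # integer accumulator values of a conv layer

combined = s_in * s_w / s_out                     # 2**-5, the fused requantization scale
ref = np.round(acc * combined).astype(np.int32)   # reference: multiply by the float scale

shift = 5                                         # the same rescaling as a single shift
fused = np.round(acc / 2**shift).astype(np.int32) # what the FPGA can do with shift logic

print(np.array_equal(ref, fused))                 # True: fusing loses nothing here
```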
Related projects:

Project | Link |
---|---|
Intuitus Interface | https://github.com/LukiBa/Intuitus-intf.git |
Yolov3-tiny example application for the Zybo-Z7-20 board | https://github.com/LukiBa/zybo_yolo.git |
Intuitus device driver | link to the kernel module coming soon (contact the author for sources) |
Vivado example project | https://github.com/LukiBa/zybo_yolo_vivado.git |
Intuitus FPGA IP | encrypted trial version coming soon (contact the author) |
Author: Lukas Baischer, lukas_baischer@gmx.at