Final Project for COMS W4995: DeepDiffChrome

DeepDiff: Deep-learning for predicting Differential gene expression from histone modifications

@article{ArDeepDiff18,
author = {Sekhon, Arshdeep and Singh, Ritambhara and Qi, Yanjun},
title = {DeepDiff: DEEP-learning for predicting DIFFerential gene expression from histone modifications},
journal = {Bioinformatics},
volume = {34},
number = {17},
pages = {i891-i900},
year = {2018},
doi = {10.1093/bioinformatics/bty612},
URL = {http://dx.doi.org/10.1093/bioinformatics/bty612},
eprint = {/oup/backfile/content_public/journal/bioinformatics/34/17/10.1093_bioinformatics_bty612/2/bty612.pdf}
}

Training the Model

To train, validate, and test the model for cell types "Cell1" and "Cell2":

      python train.py --cell_1=Cell1 --cell_2=Cell2 --model_name=raw_d --epochs=120 --lr=0.0001 --data_root=data/ --save_root=Results/

Other Options

  1. To specify the DeepDiff variation (see the combined example after this list):
    --model_name=
          raw_d: difference of HMs
          raw_c: concatenation of HMs
          raw: raw features, i.e. both the difference and the concatenation of HMs
          raw_aux: raw features plus auxiliary cell-type-specific prediction features
          aux: auxiliary cell-type-specific prediction features
          aux_siamese: auxiliary cell-type-specific prediction features with a siamese auxiliary
          raw_aux_siamese: raw features plus auxiliary cell-type-specific prediction features with a siamese auxiliary

  2. To save attention maps:
          --save_attention_maps: saves the Level II attention values to a .txt file

  3. To change the RNN size:
          --bin_rnn_size=32
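
These options can be combined with the training command above. For example, a run that trains the raw_aux variant, saves the Level II attention maps, and sets the bin-level RNN size might look like the following (the cell type names are placeholders):

      python train.py --cell_1=Cell1 --cell_2=Cell2 --model_name=raw_aux --epochs=120 --lr=0.0001 --bin_rnn_size=32 --save_attention_maps --data_root=data/ --save_root=Results/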

Testing

To run testing only, using a saved model:

      python train.py --test_on_saved_model --model_name=raw_d --data_root=data/ --save_root=Results/
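
When loading a saved model, the architecture options (for example --model_name and --bin_rnn_size) presumably have to match the values used during training so that the checkpoint can be restored correctly; this is an assumption, not something the original instructions state. Under that assumption, testing a model that was trained as the raw_aux variant with --bin_rnn_size=32 would look like:

      python train.py --test_on_saved_model --model_name=raw_aux --bin_rnn_size=32 --data_root=data/ --save_root=Results/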

Note: Due to file size limits, we could not upload our training data or the model checkpoints.
