Releases · jbrry/Irish-BERT
BERT, ELECTRA and RoBERTa training with WikiBERT upstream processing (as submitted to LREC)
The v0.1.0 release of Irish-BERT supports training BERT and ELECTRA models and uses the upstream preprocessing pipeline wiki-bert-pipeline, with optional OpusFilter filtering. This release also supports some trial experiments on training a RoBERTa model using JAX/Flax; see scripts/jax-flax/.
To use the v0.1.0 release of Irish-BERT, please also use the corresponding versions of wiki-bert-pipeline and OpusFilter, listed below:

- wiki-bert-pipeline: https://github.com/jbrry/wiki-bert-pipeline/releases/tag/v0.1.0
- OpusFilter: https://github.com/jbrry/OpusFilter/releases/tag/1.1.0
This release contains a snapshot of the code for the models that were described in gaBERT — an Irish Language Model.