The following examples are IPU-ready versions of the original 🤗 Transformers examples. Here is the list of the supported tasks:
| Task | Example datasets |
|---|---|
| language-modeling | WikiText-2 |
| multiple-choice | SWAG |
| question-answering | SQuAD |
| summarization | XSUM |
| text-classification | GLUE |
| translation | WMT |
| audio-classification | SUPERB KS |
| image-classification | CIFAR-10 |
| speech-pretraining | LibriSpeech ASR |
For each example, you will need to install the requirements before being able to run it:

```bash
cd <example-folder>
pip install -r requirements.txt
```
Compared to 🤗 Transformers, one extra argument you will need to pass to all of these examples is `--ipu_config_name`, which specifies the compilation and parallelization information for a given model.
You can find an IPU configuration for each of the model architectures we support on the 🤗 Hub, under the [Graphcore organization](https://huggingface.co/Graphcore). For instance, for `bert-base-uncased` you can use `Graphcore/bert-base-uncased`.
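Putting this together, here is a sketch of a fine-tuning run for the question-answering example. It assumes the script name (`run_qa.py`) and all flags other than `--ipu_config_name` mirror the standard 🤗 Transformers question-answering example; check each example's own README for the exact interface:

```bash
cd question-answering
pip install -r requirements.txt

# Hedged sketch: --ipu_config_name is the IPU-specific addition; the other
# flags are assumed to follow the standard Transformers run_qa.py arguments.
python run_qa.py \
  --model_name_or_path bert-base-uncased \
  --ipu_config_name Graphcore/bert-base-uncased \
  --dataset_name squad \
  --do_train \
  --do_eval \
  --output_dir ./squad_bert_ipu
```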