# Examples

The following examples are IPU-ready versions of the original 🤗 Transformers examples.

## Table of Tasks

Here is the list of examples:

| Task                 | Example datasets |
|----------------------|------------------|
| language-modeling    | WikiText-2       |
| multiple-choice      | SWAG             |
| question-answering   | SQuAD            |
| summarization        | XSUM             |
| text-classification  | GLUE             |
| translation          | WMT              |
| audio-classification | SUPERB KS        |
| image-classification | CIFAR-10         |
| speech-pretraining   | LibriSpeech ASR  |

## Tips

### Requirements

For each example, install its requirements before running it:

```bash
cd <example-folder>
pip install -r requirements.txt
```

### Finding the right IPUConfig

Compared to the original 🤗 Transformers examples, one extra argument that you will need to pass to all of the examples is `--ipu_config_name`, which specifies the compilation and parallelization information for a given model. You can find an `IPUConfig` for each model architecture we support on the 🤗 Hub, under the Graphcore organization. For instance, for `bert-base-uncased` you can use `Graphcore/bert-base-uncased`.
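
As an illustration, a run of the question-answering example could look like the sketch below. The script name `run_qa.py` and the standard arguments (`--model_name_or_path`, `--dataset_name`, `--output_dir`, etc.) are assumed here based on the usual 🤗 Transformers example scripts; check the example folder's own README for the exact arguments it supports. Only `--ipu_config_name` is specific to the IPU-ready versions.

```bash
# Hypothetical invocation of the question-answering example.
# Argument names follow the usual transformers example scripts and may differ per example.
cd question-answering
pip install -r requirements.txt

python run_qa.py \
  --model_name_or_path bert-base-uncased \
  --ipu_config_name Graphcore/bert-base-uncased \
  --dataset_name squad \
  --do_train \
  --do_eval \
  --output_dir ./output/squad-bert-base
```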