Fix --bf16 option support for Neuron after PR #22300
This PR fixes the "RuntimeError: No CUDA GPUs are available" error raised
when running with the --bf16 option on Neuron.

Related PRs:
#20684
#22300
jeffhataws committed Mar 23, 2023
1 parent 0dcb46e commit fd81746
Showing 1 changed file with 6 additions and 1 deletion.
7 changes: 6 additions & 1 deletion src/transformers/trainer.py
@@ -585,7 +585,12 @@ def __init__(

        if args.fp16 or args.bf16:
            if args.half_precision_backend == "auto":
-                if args.device == torch.device("cpu"):
+                if is_torch_neuroncore_available():
+                    if args.fp16:
+                        raise ValueError("Tried to use `fp16` but this option is not yet supported on Neuron.")
+                    else:
+                        args.half_precision_backend = "cpu_amp"
+                elif args.device == torch.device("cpu"):
                    if args.fp16:
                        raise ValueError("Tried to use `fp16` but it is not supported on cpu")
                    elif _is_native_cpu_amp_available:
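The backend-selection logic this commit changes can be sketched as a standalone function. This is an illustrative simplification, not the actual Trainer code: the function name `select_half_precision_backend`, the `neuron_available` flag (standing in for `is_torch_neuroncore_available()`), and the string `device` argument are all hypothetical.

    # Hedged sketch of the half-precision backend selection after this fix.
    # All names here are illustrative; the real logic lives in
    # transformers.Trainer.__init__ in src/transformers/trainer.py.
    def select_half_precision_backend(fp16: bool, bf16: bool,
                                      neuron_available: bool,
                                      device: str) -> str:
        """Pick a mixed-precision backend, checking Neuron before the CPU path."""
        if not (fp16 or bf16):
            return "none"
        # Neuron is checked first so bf16 no longer falls through to a
        # CUDA path and triggers "No CUDA GPUs are available".
        if neuron_available:
            if fp16:
                raise ValueError(
                    "Tried to use `fp16` but this option is not yet supported on Neuron."
                )
            return "cpu_amp"
        if device == "cpu":
            if fp16:
                raise ValueError("Tried to use `fp16` but it is not supported on cpu")
            return "cpu_amp"
        return "cuda_amp"

The key design point is ordering: the Neuron check must come before the generic `args.device == torch.device("cpu")` branch, because a Neuron device is not a CPU device, so bf16 previously fell through to the CUDA-backed default.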
