RuntimeError: self and mat2 must have the same dtype, but got Float and BFloat16
when training with torch_compile
#35382
System Info

- `transformers` version: 4.48.0.dev0

Who can help?

@ArthurZucker @muellerzr @SunMarc
Information

Tasks

- An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)

Reproduction
When I try continual pretraining of ModernBERT with an MLM objective and the `torch_compile` flag of my `TrainingArguments` set to `True`, I get the error below. This does not occur when fine-tuning for a classification task. I am using `bfloat16` mixed precision.

Expected behavior
The training works.
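For context, the error indicates that a matrix multiplication inside the compiled graph received operands of different dtypes (e.g. `float32` weights against `bfloat16` activations), which plain PyTorch refuses outside of an autocast region. A standalone sketch, unrelated to the `Trainer` internals, that triggers the same class of error:

```python
import torch

# Two operands with mismatched dtypes, as can happen when bf16 mixed
# precision interacts badly with a compiled module.
a = torch.randn(2, 4, dtype=torch.float32)
b = torch.randn(4, 3, dtype=torch.bfloat16)

try:
    torch.matmul(a, b)
except RuntimeError as e:
    # Raises a dtype-mismatch RuntimeError similar to the one reported above.
    print(e)

# Under autocast, the same matmul succeeds because both operands are
# cast to bfloat16 before the kernel runs.
with torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    out = torch.matmul(a, b)
print(out.dtype)  # torch.bfloat16
```

This suggests the compiled MLM forward pass is executing a matmul outside the autocast context that bf16 mixed precision normally provides.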