tokenizer should be replaced with processing_class in Seq2SeqTrainer? #35446
System Info

transformers version: 4.47.1

Who can help?
@amyeroberts @ArthurZucker
Information

Tasks

examples folder (such as GLUE/SQuAD, ...)

Reproduction
In the trainer_seq2seq.py file there are still calls to self.tokenizer, which produce the deprecation warning "Trainer.tokenizer is now deprecated. You should use Trainer.processing_class instead."

Expected behavior
I believe self.tokenizer should be replaced with self.processing_class.
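To illustrate why the change silences the warning, here is a minimal, hypothetical sketch (MiniTrainer is a stand-in, not the actual Transformers source) of the deprecation pattern involved: the old tokenizer attribute is a property that emits a FutureWarning and forwards to processing_class, so internal code that still reads self.tokenizer triggers the warning on every access, while reading self.processing_class directly does not.

```python
import warnings


class MiniTrainer:
    """Hypothetical stand-in for Trainer, showing the deprecation pattern."""

    def __init__(self, processing_class=None):
        self.processing_class = processing_class

    @property
    def tokenizer(self):
        # Accessing the deprecated attribute warns, then forwards to the new one.
        warnings.warn(
            "Trainer.tokenizer is now deprecated. "
            "You should use Trainer.processing_class instead.",
            FutureWarning,
        )
        return self.processing_class


trainer = MiniTrainer(processing_class="my-tokenizer")

# Old access path: emits the deprecation warning.
with warnings.catch_warnings(record=True) as caught_old:
    warnings.simplefilter("always")
    _ = trainer.tokenizer
print(len(caught_old))  # 1 warning captured

# Proposed access path: no warning.
with warnings.catch_warnings(record=True) as caught_new:
    warnings.simplefilter("always")
    _ = trainer.processing_class
print(len(caught_new))  # 0 warnings captured
```

This is why replacing internal self.tokenizer reads with self.processing_class removes the warning without changing behavior: the property already returns the same object.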
Is it okay for me to make a PR for this issue? 😄