Chat template not being used despite passing --apply_chat_template option
#2579
Labels: bug (Something isn't working.)
I am having an issue when evaluating several instruction-tuned models using the LM Evaluation Harness with the --apply_chat_template option. I am passing this option and I have ensured that the models have chat templates available in their tokenizer configurations, but when I check the results JSON file, the chat_template field is null, which, as I understand it, means the chat template was not used in the evaluation that produced those results.

For example, I am evaluating the utter-project/EuroLLM-1.7B-Instruct model, which has a chat template defined in its tokenizer_config.json. I'm using a local clone of this model and I've confirmed the chat_template field is there.

I ran evaluation on the catcola task, but in the results file the chat template is null.
I'm attaching a zip file with the results file and the two log files from the Slurm job, with stdout in chat_template_test.out and stderr in chat_template_test.err. There doesn't seem to be anything about chat template issues in the logs.

Would appreciate any help solving this issue!
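For reference, this is how I'm reading the field out of the results file (a one-liner sketch; `results.json` stands in for the actual timestamped results file name):

```shell
# Print the top-level "chat_template" field of the results JSON.
python -c "import json,sys; print(json.load(open(sys.argv[1])).get('chat_template'))" results.json
```

In my case this prints `None`, matching the null value I see in the file.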
chat_template_test.zip