row_dict.get("extra_info", {}).get("apply_chat_template", True)
if apply_chat_template:
prompt_with_chat_template = self.tokenizer.apply_chat_template(chat, add_generation_prompt=True, tokenize=False)
else:
assert isinstance(chat, str), "If not applying chat_template, the prompt inside the dataentry should be a string"
prompt_with_chat_template = chat
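For context, here is a hedged illustration of the two shapes `chat` can take in that branch; the values below are made up for illustration and are not taken from verl's code or dataset:

```python
# Illustrative only: what `chat` looks like in each branch of the snippet above.
# When apply_chat_template is True (the default), `chat` is the message list
# stored under the dataset's "prompt" key:
chat = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is 2 + 2?"},
]

# When extra_info.apply_chat_template is False, `chat` must already be a fully
# templated string, which is exactly what the assert enforces:
chat = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nWhat is 2 + 2?<|im_end|>\n"
    "<|im_start|>assistant\n"
)
```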
In the training-data preprocessing there is a make_prefix step that hand-builds the <|im_start|>system-style prefix. But as far as I can tell, as long as the training data already provides

    "prompt": [
        {"role": "system", "content": system},
        {"role": "user", "content": question},
    ],

verl will render the prompt through the Qwen-Instruct tokenizer's apply_chat_template and produce that prefix itself, so building the prefix manually in make_prefix shouldn't be necessary. Am I misunderstanding something?