Prompt template in plain text?

#15
by apepkuss79 - opened

We'd like to support this model in LlamaEdge. We have converted the model to GGUF and published it at second-state/EXAONE-3.0-7.8B-Instruct-GGUF. Could you please provide the prompt template in plain text? Thanks a lot!

LG AI Research org

Hello, apepkuss79.

You can find the chat template of EXAONE in tokenizer_config.json.

This is the chat template up to the 2nd turn.

[|system|]system_prompt_text[|endofturn|]\n[|user|]user_1st_turn_text\n[|assistant|]assistant_1st_turn_text[|endofturn|]\n[|user|]user_2nd_turn_text\n[|assistant|]assistant_2nd_turn_text[|endofturn|]

This is an example up to the 2nd turn.

[|system|]You are EXAONE model from LG AI Research, a helpful assistant.[|endofturn|]
[|user|]Hello.
[|assistant|]Hello! I'm EXAONE 3.0, an advanced language model developed by LG AI Research.[|endofturn|]
[|user|]Who are you?
[|assistant|]I'm EXAONE 3.0, a large-scale multimodal AI model designed to understand and generate human-like text.[|endofturn|]

Have a good day!

Is it necessary to set a reverse prompt to stop the generation? Thanks!

LG AI Research org

You just need to set the stop token to '[|endofturn|]', which is the special end-of-turn token of EXAONE.
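If a runtime does not support stop tokens directly and you need to truncate the output yourself, a simple Python sketch (the helper name is hypothetical, not a LlamaEdge API):

```python
def truncate_at_stop(text, stop="[|endofturn|]"):
    """Cut generated text at the first occurrence of the stop token.
    Returns the text unchanged if the stop token is absent."""
    idx = text.find(stop)
    return text if idx == -1 else text[:idx]
```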
