Reflection not using correct system prompt

#3 opened by philschmid (HF staff)

It looks like the default system prompt you use is not working correctly with the Reflection 70B model.

It should be the following to get the Reflection thoughts:

You are a world-class AI system, capable of complex reasoning and reflection. Reason through the query inside <thinking> tags, and then provide your final response inside <output> tags. If you detect that you made a mistake in your reasoning at any point, correct yourself inside <reflection> tags.
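
For reference, here is a minimal sketch of passing that system prompt through the model's chat template with transformers; the model ID (mattshumer/Reflection-Llama-3.1-70B) and the example query are assumptions for illustration, not taken from this thread:

```python
from transformers import AutoTokenizer

# System prompt quoted above, required to elicit the <thinking>/<reflection>/<output> structure.
REFLECTION_SYSTEM_PROMPT = (
    "You are a world-class AI system, capable of complex reasoning and reflection. "
    "Reason through the query inside <thinking> tags, and then provide your final "
    "response inside <output> tags. If you detect that you made a mistake in your "
    "reasoning at any point, correct yourself inside <reflection> tags."
)

# Model ID is an assumption here; substitute the checkpoint you are serving.
tokenizer = AutoTokenizer.from_pretrained("mattshumer/Reflection-Llama-3.1-70B")

messages = [
    {"role": "system", "content": REFLECTION_SYSTEM_PROMPT},
    {"role": "user", "content": "What is 2 + 2?"},  # illustrative query
]

# Render the prompt string the model expects; generation itself is omitted.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```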

Featherless Serverless LLM org

Updated it. I think there's also an issue with the tokenizer that they're fixing. Thanks.

m8than changed discussion status to closed
