What is the version of the HuggingChat?

#39
by aledane - opened

Hi,

I would like to understand the configuration used in HuggingChat, since it gives me results that are very different from what I am getting locally, and I'd like to reproduce them.

Thank you.

aledane changed discussion title from What is the version of the chat? to What is the version of the HuggingChat?
Mistral AI_ org

Could you be more specific? How different are they?

Yes, exactly. I want to understand whether, compared to the standard model we find on this page, HuggingChat uses a particular set of parameters, whether everything is left at its defaults, or whether there is an intermediate step (like a fine-tuning). Thanks.

Mistral AI_ org

Sorry, I might not have been clear. I meant to ask what exactly differs between the answers the model in HuggingChat gives and this one? Because it shouldn't be a fine-tuned version or anything; they should be the same, I believe.

OK, let me explain my use case at a high level.

I am trying to use this model as a binary classifier, so I am using a prompt that makes the LLM take a text and assign it to one of two classes.
I used the model with the default parameters (via the Hugging Face pipeline) for the "text-generation" task.
However, HuggingChat gives different answers from my model deployed on AWS SageMaker (I tried a few texts and the classification is often different).
For this reason, I would like to understand whether the model deployed in HuggingChat has some particular configuration that differs from the one you get when using the Hugging Face "pipeline" API.

I hope that is a bit clearer.

Mistral AI_ org

I see; as far as I know, it should be the same model. If you want deterministic results, however (the same output every time you give it the same input), you should disable sampling (or set the temperature close to 0). And just to be sure, are you using the prompt format provided? It would be really helpful if I could see some code to understand what's going on.
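For instance, here is a minimal sketch of the relevant generation settings (kwarg names assumed from the standard transformers text-generation / `generate` API; not HuggingChat's exact configuration):

```python
# Sketch: generation kwargs for reproducible output with a
# transformers text-generation pipeline. Parameter names are
# assumed from the standard `generate` API.
deterministic_kwargs = {
    "do_sample": False,    # greedy decoding: same input -> same output
    "max_new_tokens": 64,
}

# With sampling enabled, temperature / top_p / top_k take effect,
# and outputs will vary from run to run:
sampling_kwargs = {
    "do_sample": True,
    "temperature": 0.1,
    "top_p": 0.95,
}
```

These dicts would be passed as keyword arguments to the pipeline call; the key point is that temperature only matters when `do_sample` is `True`.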

Mistral AI_ org

For reference, you can inspect the source code of certain Spaces yourself on Hugging Face. For example, https://huggingface.co/spaces/huggingchat/chat-ui-template has a lot of information about the parameters used in https://huggingface.co/spaces/huggingchat/chat-ui-template/tree/main/defaults .

I cannot share a link, unfortunately, since I am working as a consultant.
Anyway, the configuration for now is just the following (I noticed that I need to set max_new_tokens, otherwise the answer is too short):

                            "torch_dtype":torch.bfloat16,
                            "return_full_text" : True,
                            "device_map" : "auto",
                            "max_new_tokens" : 64

and for the prompt format, I am using this:

                "model_type" : "hugging-face",
                "prompt_format" : "<s>[INST] {instructions}\nInput:\n{input}\n[/INST]"

where instructions is the prompt where I ask to classify in two classes.
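For illustration, the template above can be filled with Python's `str.format` (the field names `{instructions}` and `{input}` come from the config; the instruction text here is a made-up example):

```python
# Sketch: rendering the prompt_format template shown above.
prompt_format = "<s>[INST] {instructions}\nInput:\n{input}\n[/INST]"

prompt = prompt_format.format(
    instructions="Classify the following text as A or B. Answer with one word.",
    input="Some text to classify.",
)
```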
Then I have some Python classes that call the model, but basically I am doing this:

# The pipeline must be created before it is wrapped:
self.pipeline_llm = pipeline(
    task=self.model_task,
    model=self.model_name,
    **self.model_params,
)
self.hf_pipeline_llm = HuggingFacePipeline(pipeline=self.pipeline_llm)

Mistral AI_ org

I see, I just found the parameters you asked for by the way:
MODELS=[
  {
    "name": "mistralai/Mistral-7B-Instruct-v0.1",
    "displayName": "mistralai/Mistral-7B-Instruct-v0.1",
    "description": "Mistral 7B is a new Apache 2.0 model, released by Mistral AI that outperforms Llama2 13B in benchmarks.",
    "websiteUrl": "https://mistral.ai/news/announcing-mistral-7b/",
    "preprompt": "",
    "chatPromptTemplate": "<s>{{#each messages}}{{#ifUser}}[INST] {{#if @first}}{{#if @root.preprompt}}{{@root.preprompt}}\n{{/if}}{{/if}}{{content}} [/INST]{{/ifUser}}{{#ifAssistant}}{{content}}</s>{{/ifAssistant}}{{/each}}",
    "parameters": {
      "temperature": 0.1,
      "top_p": 0.95,
      "repetition_penalty": 1.2,
      "top_k": 50,
      "truncate": 3072,
      "max_new_tokens": 1024,
      "stop": ["</s>"]
    },
    "promptExamples": [
      { "title": "Write an email from bullet list", "prompt": "As a restaurant owner, write a professional email to the supplier to get these products every week: \n\n- Wine (x10)\n- Eggs (x24)\n- Bread (x12)" },
      { "title": "Code a snake game", "prompt": "Code a basic snake game in python, give explanations for each step." },
      { "title": "Assist in a task", "prompt": "How do I make a delicious lemon cheesecake?" }
    ]
  }
]

You can find it here: https://huggingface.co/spaces/huggingchat/chat-ui/blob/main/.env

Maybe this will help you out !
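As a sketch, those HuggingChat "parameters" could be carried over to a local transformers pipeline like this (kwarg names assumed to match the text-generation pipeline; "truncate" and "stop" are handled by the text-generation-inference server rather than by these kwargs, so they are left out):

```python
# Sketch: HuggingChat-like sampling parameters for a local
# transformers pipeline (values copied from the .env above).
# do_sample=True is an assumption: temperature/top_p/top_k
# only take effect when sampling is enabled.
huggingchat_like_params = {
    "do_sample": True,
    "temperature": 0.1,
    "top_p": 0.95,
    "repetition_penalty": 1.2,
    "top_k": 50,
    "max_new_tokens": 1024,
}

# pipe = pipeline("text-generation",
#                 model="mistralai/Mistral-7B-Instruct-v0.1",
#                 **huggingchat_like_params)
```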

Mistral AI_ org
β€’
edited Jan 15

Also, I noticed your prompt format is wrong.

The example provided in the README is:
text = "<s>[INST] What is your favourite condiment? [/INST]"
"Well, I'm quite partial to a good squeeze of fresh lemon juice. It adds just the right amount of zesty flavour to whatever I'm cooking up in the kitchen!</s> "
"[INST] Do you have mayonnaise recipes? [/INST]"


As it suggests, the <s>...</s> wrapper is only required around past turns of the conversation; the instruction you are sending now does not need the closing </s>. To keep it simple, if you do not care about past exchanges with the bot, you can do this:
"[INST] {instructions}\nInput:\n{input}\n[/INST]"

But I recommend you do something like this instead:
"<s>[INST] I will give you a description of a person, and I want you to respond with a JSON. For example: "An 18-year-old woman" [/INST]{"age": "18", "gender": "woman"}</s>[INST] {new sentence} [/INST]"

Thank you very much for both comments!
A few questions:

Regarding the parameters and the .env link: do you by chance also have that configuration for v0.2? This one is for v0.1, and I cannot manage to find the right repo.

Concerning the prompt format: in your last example, do you suggest using the example as a few-shot demonstration and putting the text I want to classify in the "{new sentence}" part?

Mistral AI_ org

Exactly, it gives it more context of what it should do. So you put the instruction, then an example and the bot's response yourself, so it knows what to do next time. Give it a shot.

Mistral AI_ org

For v0.2, here you go: https://huggingface.co/spaces/huggingchat/chat-ui/blob/main/.env.template

It contains the parameters for all of the models.

Thanks a lot for everything!
