Batch inference via huggingface.

#2
by anki08 - opened

Is it possible to send the input as a batch of question-answer pairs, instead of one question and one answer at a time, to the Hugging Face model?

OpenSafetyLab org

Sorry for the delay. It should work the same as standard batch inference; we only provide the simplest way to use the model here.

You can also refer to our repo for more efficient inference: https://github.com/OpenSafetyLab/SALAD-BENCH
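For readers looking for a concrete starting point, here is a minimal sketch of batched inference with the standard `transformers` causal-LM API. The model id, prompt template, and batch size below are assumptions for illustration, not the official SALAD-BENCH evaluation template; see the repo above for the supported pipeline.

```python
def chunked(items, size):
    """Split a list into consecutive batches of at most `size` items."""
    return [items[i:i + size] for i in range(0, len(items), size)]

def batch_evaluate(qa_pairs, batch_size=8):
    """Run batched generation over (question, answer) pairs.

    Sketch only: model id and prompt format are placeholders, not the
    official SALAD-BENCH judge template.
    """
    # Heavy imports deferred so the chunking helper stays lightweight.
    import torch
    from transformers import AutoTokenizer, AutoModelForCausalLM

    model_id = "OpenSafetyLab/MD-Judge-v0.1"  # assumed model id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    # Decoder-only models should be left-padded for batched generation.
    tokenizer.padding_side = "left"
    if tokenizer.pad_token is None:
        tokenizer.pad_token = tokenizer.eos_token

    results = []
    for batch in chunked(qa_pairs, batch_size):
        prompts = [f"Question: {q}\nAnswer: {a}" for q, a in batch]
        inputs = tokenizer(prompts, return_tensors="pt", padding=True)
        inputs = inputs.to(model.device)
        with torch.no_grad():
            generated = model.generate(**inputs, max_new_tokens=32)
        # Drop the prompt tokens so only the generated text is decoded.
        new_tokens = generated[:, inputs["input_ids"].shape[1]:]
        results.extend(tokenizer.batch_decode(new_tokens, skip_special_tokens=True))
    return results
```

The key points are left padding (so generation starts from aligned positions) and slicing off the prompt before decoding; throughput can then be tuned by adjusting `batch_size` to fit GPU memory.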
