Can the sequence length be extended in SentenceTransformers?

#10
by zilch42 - opened

Hi there,

Can the sequence length be extended in SentenceTransformers or only in the Transformers implementation?

If so, is it as simple as setting model.max_seq_length = 8192?

Nomic AI org

If you want to change it to something other than 8192, it seems you can update the attribute: https://github.com/UKPLab/sentence-transformers/blob/master/sentence_transformers/SentenceTransformer.py#L1242-L1253

If you want 8192, it should already be configured.
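
As a minimal sketch of what that looks like in practice (the model id and the `trust_remote_code=True` flag below are assumptions, not taken from this thread; substitute the model you are actually using):

```python
from sentence_transformers import SentenceTransformer

# Placeholder model id; replace with the model this discussion refers to.
model = SentenceTransformer("nomic-ai/nomic-embed-text-v1", trust_remote_code=True)

# max_seq_length is a property on SentenceTransformer that delegates to the
# first (Transformer) module, so you can read or update it directly.
print(model.max_seq_length)  # should already report 8192 if preconfigured

# Setting it changes the truncation length used when encoding.
model.max_seq_length = 2048

embeddings = model.encode(["A long document that would otherwise be truncated ..."])
print(embeddings.shape)
```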

zpn changed discussion status to closed
