
t5-small-dirac4.0-epochs30

This model is a fine-tuned version of t5-small on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0585
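Since the card does not yet document usage, here is a minimal, hedged loading sketch. The model id comes from this card; the pipeline task (`text2text-generation`) and the input format are assumptions, because the training dataset and intended task are not documented.

```python
MODEL_ID = "Jawaker/t5-small-dirac4.0-epochs30"

def generate(text: str) -> str:
    """Run the fine-tuned checkpoint on one input string.

    The task is assumed to be text-to-text generation (the T5 default);
    the import is done lazily because the first call downloads weights.
    """
    from transformers import pipeline

    generator = pipeline("text2text-generation", model=MODEL_ID)
    return generator(text)[0]["generated_text"]
```

Whether inputs need a task prefix (as many T5 fine-tunes do) is unknown from the card.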

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 4e-05
  • train_batch_size: 256
  • eval_batch_size: 256
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 30
  • mixed_precision_training: Native AMP
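The hyperparameters above can be expressed as a Transformers `TrainingArguments` sketch. The output directory is a placeholder, and `per_device_train_batch_size=256` assumes single-device training, since the card only reports a total batch size.

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters; not the author's actual script.
args = TrainingArguments(
    output_dir="t5-small-dirac4.0-epochs30",  # placeholder path
    learning_rate=4e-5,
    per_device_train_batch_size=256,  # assumes a single device
    per_device_eval_batch_size=256,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    fp16=True,  # "Native AMP" mixed-precision training
)
```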

Training results

Training Loss  Epoch    Step   Validation Loss
0.0843         2.5575   2000   0.0800
0.0747         5.1151   4000   0.0707
0.0712         7.6726   6000   0.0681
0.0664         10.2302  8000   0.0655
0.0652         12.7877  10000  0.0624
0.0636         15.3453  12000  0.0603
0.0625         17.9028  14000  0.0602
0.0614         20.4604  16000  0.0602
0.0611         23.0179  18000  0.0591
0.0603         25.5754  20000  0.0585
0.0600         28.1330  22000  0.0585
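The logged steps and epochs imply the schedule's granularity. This back-of-the-envelope check derives steps per epoch from the first table row and, with the reported batch size of 256, an approximate training-set size; these are estimates, not values stated on the card.

```python
# First logged row of the results table: 2000 steps at epoch 2.5575.
steps, epochs = 2000, 2.5575
steps_per_epoch = steps / epochs          # ~782 optimization steps per epoch
approx_train_examples = steps_per_epoch * 256  # batch size from the card

print(round(steps_per_epoch))        # ~782
print(round(approx_train_examples))  # roughly 200k training examples
```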

Framework versions

  • Transformers 4.43.4
  • Pytorch 2.4.0+cu121
  • Datasets 3.0.0
  • Tokenizers 0.19.1
Model size

60.5M parameters (Safetensors, F32 tensors)

Model tree for Jawaker/t5-small-dirac4.0-epochs30

Base model: google-t5/t5-small