---
language:
  - en
license: odc-by
task_categories:
  - text2text-generation
dataset_info:
  - config_name: Commercial-Flan-Collection-SNI
    features:
      - name: inputs
        dtype: string
      - name: targets
        dtype: string
      - name: user_parent
        dtype: string
      - name: assistant_parent
        dtype: int64
      - name: sha256
        dtype: string
    splits:
      - name: train
        num_bytes: 831464242.212652
        num_examples: 270970
    download_size: 495528717
    dataset_size: 831464242.212652
  - config_name: all
    features:
      - name: inputs
        dtype: string
      - name: targets
        dtype: string
      - name: sha256
        dtype: string
    splits:
      - name: train
        num_bytes: 12929219074
        num_examples: 5422366
    download_size: 7993530510
    dataset_size: 12929219074
  - config_name: flan
    features:
      - name: inputs
        dtype: string
      - name: targets
        dtype: string
      - name: task
        dtype: string
      - name: sha256
        dtype: string
    splits:
      - name: train
        num_bytes: 895686276.9988031
        num_examples: 633211
    download_size: 552537280
    dataset_size: 895686276.9988031
  - config_name: flan-v2
    features:
      - name: inputs
        dtype: string
      - name: targets
        dtype: string
      - name: task_name
        dtype: string
      - name: task_source
        dtype: string
      - name: template_type
        dtype: string
      - name: template_idx
        dtype: int64
      - name: sha256
        dtype: string
    splits:
      - name: train
        num_bytes: 11345947198.707159
        num_examples: 4708060
    download_size: 6892653722
    dataset_size: 11345947198.707159
  - config_name: niv2_submix_original
    features:
      - name: inputs
        dtype: string
      - name: targets
        dtype: string
      - name: task_source
        dtype: string
      - name: task_name
        dtype: string
      - name: template_type
        dtype: string
      - name: sha256
        dtype: string
    splits:
      - name: train
        num_bytes: 2259110556.8104787
        num_examples: 849729
    download_size: 1406102153
    dataset_size: 2259110556.8104787
  - config_name: t0_submix_original
    features:
      - name: inputs
        dtype: string
      - name: targets
        dtype: string
      - name: task_source
        dtype: string
      - name: task_name
        dtype: string
      - name: template_type
        dtype: string
      - name: sha256
        dtype: string
    splits:
      - name: train
        num_bytes: 771692762.3727111
        num_examples: 305764
    download_size: 464190712
    dataset_size: 771692762.3727111
configs:
  - config_name: Commercial-Flan-Collection-SNI
    data_files:
      - split: train
        path: Commercial-Flan-Collection-SNI/train-*
  - config_name: all
    data_files:
      - split: train
        path: all/train-*
  - config_name: flan
    data_files:
      - split: train
        path: flan/train-*
  - config_name: flan-v2
    data_files:
      - split: train
        path: flan-v2/train-*
  - config_name: niv2_submix_original
    data_files:
      - split: train
        path: niv2_submix_original/train-*
  - config_name: t0_submix_original
    data_files:
      - split: train
        path: t0_submix_original/train-*
tags:
  - flan
---

# flan subsets: deduped

See the `all` config for the aggregated & deduplicated dataset.

- All configs/subsets have the columns `inputs` and `targets`.
- Deduplicated on `inputs`.
- Filtered for English (`en`) whenever the text is longer than 5 characters.
  - Rows with an empty value (fewer than 1 character) in either column are dropped.
- `clean-text` applied to both columns.
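The per-row cleaning and filtering above can be sketched roughly as below. This is a minimal illustration, not the exact pipeline used for this dataset: `looks_english` is a crude placeholder for whatever language-ID tool was actually used, and `clean` stands in for the `clean-text` library.

```python
# Sketch of the row filtering described above (illustrative only).

def clean(text: str) -> str:
    """Stand-in for clean-text: collapse whitespace."""
    return " ".join(text.split())

def looks_english(text: str) -> bool:
    """Placeholder language check; the real run used a proper lang-ID tool."""
    return text.isascii()  # crude heuristic for illustration only

def filter_rows(rows):
    seen = set()  # exact dedupe on the `inputs` column
    for row in rows:
        inputs = clean(row["inputs"])
        targets = clean(row["targets"])
        if not inputs or not targets:
            continue  # drop rows with an empty column
        if len(inputs) > 5 and not looks_english(inputs):
            continue  # language filter only applies past 5 chars
        if inputs in seen:
            continue  # duplicate inputs
        seen.add(inputs)
        yield {"inputs": inputs, "targets": targets}
```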

## dedup command

Deduplicated with `text_dedup`:

```sh
python -m text_dedup.minhash \
  --path $ds_name \
  --name $dataset_config \
  --split $data_split \
  --cache_dir "./cache" \
  --output $out_dir \
  --column $text_column \
  --ngram 3 --threshold 0.6 \
  --hash_func xxh3 --hash_bits 32 --num_perm 192 \
  --batch_size 50000
```
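A toy version of what the MinHash step computes, using the same parameters as the command above (3-gram shingles, 192 permutations, Jaccard threshold 0.6). The real run uses the `text_dedup` package with 32-bit xxh3 hashes plus LSH banding; this self-contained sketch substitutes Python's built-in `hash` salted per permutation.

```python
# Toy MinHash near-duplicate check (illustrative; not the text_dedup implementation).
import random

NUM_PERM = 192   # --num_perm 192
NGRAM = 3        # --ngram 3
THRESHOLD = 0.6  # --threshold 0.6

rng = random.Random(0)
SALTS = [rng.getrandbits(32) for _ in range(NUM_PERM)]

def shingles(text: str) -> set:
    """Word 3-gram shingles of a document."""
    words = text.split()
    if len(words) < NGRAM:
        return {tuple(words)}
    return {tuple(words[i:i + NGRAM]) for i in range(len(words) - NGRAM + 1)}

def minhash(text: str) -> list:
    """192-element signature: per salt, the minimum 32-bit shingle hash."""
    sh = shingles(text)
    return [min(hash((salt, s)) & 0xFFFFFFFF for s in sh) for salt in SALTS]

def estimated_jaccard(a: str, b: str) -> float:
    """Fraction of matching signature slots approximates Jaccard similarity."""
    ha, hb = minhash(a), minhash(b)
    return sum(x == y for x, y in zip(ha, hb)) / NUM_PERM

def is_duplicate(a: str, b: str) -> bool:
    return estimated_jaccard(a, b) >= THRESHOLD
```

Two rows whose estimated shingle overlap reaches 0.6 are treated as near-duplicates and only one is kept.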