2024-03-15 19:43:39,790 INFO [train_char.py:832] (1/2) Training started
2024-03-15 19:43:39,790 INFO [train_char.py:842] (1/2) Device: cuda:1
2024-03-15 19:43:39,816 INFO [lexicon.py:168] (1/2) Loading pre-compiled data/zh-HK/lang_char/Linv.pt
2024-03-15 19:43:39,825 INFO [train_char.py:856] (1/2) {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 50, 'reset_interval': 200, 'valid_interval': 3000, 'feature_dim': 80, 'subsampling_factor': 4, 'warm_step': 2000, 'env_info': {'k2-version': '1.24.4', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '2989b0b1186fa6022932804f5b39fbb2781ebf42', 'k2-git-date': 'Fri Nov 24 11:34:10 2023', 'lhotse-version': '1.22.0.dev+git.d8ed1bbb.dirty', 'torch-version': '1.11.0+cu102', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.9', 'icefall-git-branch': 'dev/cv-zipformer', 'icefall-git-sha1': '6993183d-clean', 'icefall-git-date': 'Fri Mar 15 19:31:35 2024', 'icefall-path': '/star-home/jinzengrui/lib/miniconda3/envs/dev39/lib/python3.9/site-packages/icefall-1.0-py3.9.egg', 'k2-path': '/star-home/jinzengrui/lib/miniconda3/envs/dev39/lib/python3.9/site-packages/k2-1.24.4.dev20231207+cuda10.2.torch1.11.0-py3.9-linux-x86_64.egg/k2/__init__.py', 'lhotse-path': '/star-home/jinzengrui/lib/miniconda3/envs/dev39/lib/python3.9/site-packages/lhotse-1.22.0.dev0+git.d8ed1bbb.dirty-py3.9.egg/lhotse/__init__.py', 'hostname': 'de-74279-k2-train-2-1207150844-f49d8c4f4-c49d5', 'IP address': '10.177.22.19'}, 'world_size': 2, 'master_port': 12354, 'tensorboard': True, 'num_epochs': 50, 'start_epoch': 1, 'start_batch': 0, 'exp_dir': PosixPath('zipformer/exp_val'), 'lang_dir': 'data/zh-HK/lang_char', 'use_validated_set': True, 'use_invalidated_set': False, 'base_lr': 0.045, 'lr_batches': 7500, 'lr_epochs': 3.5, 'ref_duration': 600, 'context_size': 1, 'prune_range': 5, 'lm_scale': 0.25, 'am_scale': 0.0, 'simple_loss_scale': 0.5, 'ctc_loss_scale': 0.2, 'seed': 42, 'print_diagnostics': False, 'inf_check': False, 'save_every_n': 4000, 'keep_last_k': 30, 'average_period': 200, 'use_fp16': True, 'num_encoder_layers': '2,2,3,4,3,2', 'downsampling_factor': '1,2,4,8,4,2', 'feedforward_dim': '512,768,1024,1536,1024,768', 'num_heads': '4,4,4,8,4,4', 'encoder_dim': '192,256,384,512,384,256', 'query_head_dim': '32', 'value_head_dim': '12', 'pos_head_dim': '4', 'pos_dim': 48, 'encoder_unmasked_dim': '192,192,256,256,256,192', 'cnn_module_kernel': '31,31,15,15,15,31', 'decoder_dim': 512, 'joiner_dim': 512, 'causal': False, 'chunk_size': '16,32,64,-1', 'left_context_frames': '64,128,256,-1', 'use_transducer': True, 'use_ctc': False, 'language': 'zh-HK', 'cv_manifest_dir': PosixPath('data/zh-HK/fbank'), 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 1000, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': True, 'num_workers': 2, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'input_strategy': 'PrecomputedFeatures', 'blank_id': 0, 'vocab_size': 3904}
2024-03-15 19:43:39,825 INFO [train_char.py:858] (1/2) About to create model
2024-03-15 19:43:40,429 INFO [train_char.py:862] (1/2) Number of model parameters: 72526519
2024-03-15 19:43:45,498 INFO [train_char.py:877] (1/2) Using DDP
2024-03-15 19:43:45,792 INFO [asr_datamodule.py:414] (1/2) About to get validated cuts (with dev/test removed)
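
The parameter count reported above (72526519) is the standard PyTorch idiom; a minimal sketch, assuming `model` is the Zipformer transducer that train_char.py builds:

```python
import torch

def num_parameters(model: torch.nn.Module) -> int:
    # Sum of element counts over all parameter tensors; this is what
    # produces the "Number of model parameters: 72526519" line above.
    return sum(p.numel() for p in model.parameters())
```
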
2024-03-15 19:43:45,812 INFO [asr_datamodule.py:229] (1/2) Enable MUSAN
2024-03-15 19:43:45,812 INFO [asr_datamodule.py:230] (1/2) About to get Musan cuts
2024-03-15 19:43:48,110 INFO [asr_datamodule.py:254] (1/2) Enable SpecAugment
2024-03-15 19:43:48,111 INFO [asr_datamodule.py:255] (1/2) Time warp factor: 80
2024-03-15 19:43:48,111 INFO [asr_datamodule.py:265] (1/2) Num frame mask: 10
2024-03-15 19:43:48,111 INFO [asr_datamodule.py:278] (1/2) About to create train dataset
2024-03-15 19:43:48,111 INFO [asr_datamodule.py:305] (1/2) Using DynamicBucketingSampler.
2024-03-15 19:43:48,923 INFO [asr_datamodule.py:322] (1/2) About to create train dataloader
2024-03-15 19:43:48,923 INFO [asr_datamodule.py:430] (1/2) About to get dev cuts
2024-03-15 19:43:48,925 INFO [asr_datamodule.py:353] (1/2) About to create dev dataset
2024-03-15 19:43:49,282 INFO [asr_datamodule.py:370] (1/2) About to create dev dataloader
2024-03-15 19:43:49,282 INFO [train_char.py:779] (1/2) Sanity check -- see if any of the batches in epoch 1 would cause OOM.
2024-03-15 19:45:08,628 INFO [train_char.py:807] (1/2) Maximum memory allocated so far is 23198MB
2024-03-15 19:45:10,449 INFO [train_char.py:807] (1/2) Maximum memory allocated so far is 23198MB
2024-03-15 19:45:12,753 INFO [train_char.py:807] (1/2) Maximum memory allocated so far is 23198MB
2024-03-15 19:45:14,881 INFO [train_char.py:807] (1/2) Maximum memory allocated so far is 23198MB
2024-03-15 19:45:17,492 INFO [train_char.py:807] (1/2) Maximum memory allocated so far is 23198MB
2024-03-15 19:45:20,153 INFO [train_char.py:807] (1/2) Maximum memory allocated so far is 23198MB
2024-03-15 19:46:04,276 INFO [train_char.py:689] (1/2) Epoch 1, batch 0, loss[loss=9.745, simple_loss=8.855, pruned_loss=8.879, over 24364.00 frames. ], tot_loss[loss=9.745, simple_loss=8.855, pruned_loss=8.879, over 24364.00 frames. ], batch size: 158, lr: 2.25e-02, grad_scale: 1.0
2024-03-15 19:46:04,276 INFO [train_char.py:712] (1/2) Computing validation loss
2024-03-15 19:46:17,626 INFO [train_char.py:721] (1/2) Epoch 1, validation: loss=9.614, simple_loss=8.743, pruned_loss=8.698, over 657665.00 frames.
2024-03-15 19:46:17,626 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 23277MB
2024-03-15 19:46:21,183 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module1.whiten.whitening_limit, batch_count=0.0, ans=7.5
2024-03-15 19:46:23,845 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_skip_rate, batch_count=0.0, ans=0.2
2024-03-15 19:46:25,990 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.whiten, num_groups=1, num_channels=192, metric=6.46 vs. limit=4.0
2024-03-15 19:46:30,211 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=144, metric=5.04 vs. limit=5.0
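
The dataset lines above correspond to lhotse's data pipeline. A minimal sketch using lhotse's public API, with argument values taken from the config dump ('max_duration': 1000, 'num_buckets': 30, 'spec_aug_time_warp_factor': 80, 'num_workers': 2); `cuts_train` is assumed to be the validated CutSet loaded above, and the MUSAN mixing is omitted. The "Maximum memory allocated" lines during the OOM sanity check report torch.cuda.max_memory_allocated() while the largest batches of epoch 1 are pushed through the model.

```python
from torch.utils.data import DataLoader
from lhotse.dataset import (DynamicBucketingSampler,
                            K2SpeechRecognitionDataset, SpecAugment)

train = K2SpeechRecognitionDataset(
    input_transforms=[SpecAugment(time_warp_factor=80, num_frame_masks=10)],
    return_cuts=True,
)
sampler = DynamicBucketingSampler(
    cuts_train,           # assumed: the validated CutSet from above
    max_duration=1000.0,  # seconds of audio per batch
    num_buckets=30,
    shuffle=True,
    drop_last=True,
)
# lhotse samplers yield whole batches, hence batch_size=None.
train_dl = DataLoader(train, sampler=sampler, batch_size=None, num_workers=2)
```
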
2024-03-15 19:46:32,375 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=33.333333333333336, ans=0.4984375
2024-03-15 19:46:36,388 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.043e+03 5.585e+03 5.856e+03 6.424e+03 6.791e+03, threshold=2.342e+04, percent-clipped=0.0
2024-03-15 19:46:39,683 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.balancer1.prob, batch_count=33.333333333333336, ans=0.4984375
2024-03-15 19:46:51,379 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 1.692e+03 4.037e+03 5.585e+03 6.394e+03 6.867e+03, threshold=2.234e+04, percent-clipped=0.0
2024-03-15 19:47:15,875 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=256, metric=253.64 vs. limit=7.575
2024-03-15 19:47:23,595 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=12.73 vs. limit=3.02
2024-03-15 19:47:24,899 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.whiten, num_groups=1, num_channels=384, metric=123.76 vs. limit=4.053333333333334
2024-03-15 19:47:27,249 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.372e+02 2.141e+03 3.502e+03 6.193e+03 2.448e+04, threshold=1.401e+04, percent-clipped=2.5
2024-03-15 19:47:32,534 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=327.83 vs. limit=5.066666666666666
2024-03-15 19:47:34,131 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.whiten, num_groups=1, num_channels=512, metric=16.79 vs. limit=4.053333333333334
2024-03-15 19:47:37,854 INFO [train_char.py:689] (1/2) Epoch 1, batch 50, loss[loss=0.8874, simple_loss=0.7905, pruned_loss=0.8699, over 24387.00 frames. ], tot_loss[loss=3.371, simple_loss=3.113, pruned_loss=2.543, over 1086188.70 frames. ], batch size: 158, lr: 2.48e-02, grad_scale: 0.25
2024-03-15 19:47:49,347 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.whiten, num_groups=1, num_channels=512, metric=26.83 vs. limit=4.066666666666666
2024-03-15 19:47:50,880 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_module2.whiten.whitening_limit, batch_count=166.66666666666666, ans=7.5625
2024-03-15 19:47:57,119 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=438.76 vs. limit=7.575
2024-03-15 19:47:58,552 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=287.87 vs. limit=7.575
2024-03-15 19:47:59,731 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.bypass.skip_rate, batch_count=200.0, ans=0.5
2024-03-15 19:48:00,174 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.nonlin_attention.whiten1.whitening_limit, batch_count=200.0, ans=5.05
2024-03-15 19:48:04,953 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=256, metric=101.04 vs. limit=7.575
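
The WARNING lines from optim.py track the distribution of recent gradient norms (min/25%/50%/75%/max) and clip to a threshold derived from that history; percent-clipped is the fraction of recent batches whose norm exceeded the threshold. A generic sketch of the idea (ScaledAdam's built-in clipping differs in detail; the window size and the median rule here are assumptions):

```python
import torch

recent_norms = []  # running window of recent gradient norms

def clip_by_history(model, clipping_scale=2.0, window=200):
    # Current total gradient norm over all parameters.
    total = torch.norm(torch.stack(
        [p.grad.norm() for p in model.parameters() if p.grad is not None]))
    recent_norms.append(float(total))
    del recent_norms[:-window]
    # Threshold proportional to the median of the recent history.
    threshold = clipping_scale * sorted(recent_norms)[len(recent_norms) // 2]
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=threshold)
    return float(total), threshold
```
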
2024-03-15 19:48:14,543 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.ff2_skip_rate, batch_count=233.33333333333334, ans=0.09475
2024-03-15 19:48:17,678 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.attention_skip_rate, batch_count=233.33333333333334, ans=0.19125
2024-03-15 19:48:19,150 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.ff3_skip_rate, batch_count=233.33333333333334, ans=0.09475
2024-03-15 19:48:28,551 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=512, metric=371.42 vs. limit=7.7
2024-03-15 19:48:29,472 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module2.balancer2.prob, batch_count=266.6666666666667, ans=0.4875
2024-03-15 19:48:32,441 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.ff3_skip_rate, batch_count=266.6666666666667, ans=0.094
2024-03-15 19:48:40,140 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.self_attn1.whiten, num_groups=1, num_channels=384, metric=89.64 vs. limit=7.725
2024-03-15 19:48:41,552 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.self_attn1.whiten, num_groups=1, num_channels=384, metric=102.85 vs. limit=7.725
2024-03-15 19:48:44,346 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.conv_module1.whiten, num_groups=1, num_channels=256, metric=90.00 vs. limit=7.6125
2024-03-15 19:48:49,915 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_module1.balancer2.min_positive, batch_count=300.0, ans=0.098125
2024-03-15 19:48:49,995 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=300.0, ans=0.4859375
2024-03-15 19:48:51,374 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_module1.balancer1.min_positive, batch_count=333.3333333333333, ans=0.04895833333333333
2024-03-15 19:48:52,187 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=256, metric=232.21 vs. limit=7.625
2024-03-15 19:48:52,518 INFO [train_char.py:689] (1/2) Epoch 1, batch 100, loss[loss=0.6887, simple_loss=0.5838, pruned_loss=0.8228, over 24412.00 frames. ], tot_loss[loss=1.918, simple_loss=1.755, pruned_loss=1.538, over 1906160.01 frames. ], batch size: 158, lr: 2.70e-02, grad_scale: 0.5
2024-03-15 19:48:54,752 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.conv_module2.whiten, num_groups=1, num_channels=512, metric=103.01 vs. limit=7.625
2024-03-15 19:48:56,896 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 3.613e+01 5.979e+01 1.394e+02 2.944e+03 2.448e+04, threshold=2.787e+02, percent-clipped=0.0
2024-03-15 19:49:07,573 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.1.encoder.layers.1.self_attn_weights, loss-sum=0.000e+00
2024-03-15 19:49:09,858 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.whiten, num_groups=1, num_channels=192, metric=5.35 vs. limit=4.1466666666666665
2024-03-15 19:49:12,350 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=305.06 vs. limit=7.6375
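
A ScheduledFloat is a value interpolated piecewise-linearly in the training batch count, which is why the same name logs a slowly drifting `ans` as `batch_count` grows. A sketch (the breakpoints are assumptions; with [(0.0, 0.5), (8000.0, 0.125)] it reproduces the balancer prob ans=0.4859375 at batch_count=300 seen above):

```python
def scheduled_float(batch_count, schedule):
    # schedule: list of (batch_count, value) breakpoints, sorted by count.
    x0, y0 = schedule[0]
    if batch_count <= x0:
        return y0
    for x1, y1 in schedule[1:]:
        if batch_count <= x1:
            # Linear interpolation inside the current segment.
            return y0 + (y1 - y0) * (batch_count - x0) / (x1 - x0)
        x0, y0 = x1, y1
    return y0  # past the last breakpoint

print(scheduled_float(300.0, [(0.0, 0.5), (8000.0, 0.125)]))  # 0.4859375
```
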
2024-03-15 19:49:16,344 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module2.balancer1.prob, batch_count=366.6666666666667, ans=0.4828125
2024-03-15 19:49:19,346 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_module2.balancer1.min_positive, batch_count=366.6666666666667, ans=0.04885416666666667
2024-03-15 19:49:19,702 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.conv_module2.whiten, num_groups=1, num_channels=384, metric=51.75 vs. limit=7.6375
2024-03-15 19:49:24,427 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=256, metric=232.53 vs. limit=7.65
2024-03-15 19:49:25,577 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=226.43 vs. limit=7.65
2024-03-15 19:49:40,463 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=289.87 vs. limit=7.6625
2024-03-15 19:49:47,948 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=6.65 vs. limit=5.108333333333333
2024-03-15 19:49:56,363 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=384, metric=11.38 vs. limit=5.116666666666666
2024-03-15 19:50:01,353 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.conv_module1.whiten, num_groups=1, num_channels=384, metric=22.75 vs. limit=7.675
2024-03-15 19:50:08,477 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_module2.balancer2.prob, batch_count=466.6666666666667, ans=0.478125
2024-03-15 19:50:09,031 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.conv_module2.whiten, num_groups=1, num_channels=384, metric=39.49 vs. limit=7.675
2024-03-15 19:50:09,092 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.whiten, num_groups=1, num_channels=384, metric=6.23 vs. limit=4.1866666666666665
2024-03-15 19:50:11,292 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module2.balancer1.min_positive, batch_count=466.6666666666667, ans=0.04854166666666667
2024-03-15 19:50:13,885 INFO [train_char.py:689] (1/2) Epoch 1, batch 150, loss[loss=0.5696, simple_loss=0.5007, pruned_loss=0.5229, over 24027.00 frames. ], tot_loss[loss=1.386, simple_loss=1.253, pruned_loss=1.184, over 2546694.59 frames. ], batch size: 381, lr: 2.93e-02, grad_scale: 0.5
2024-03-15 19:50:16,286 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=186.98 vs. limit=5.25
2024-03-15 19:50:17,404 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=282.76 vs. limit=7.6875
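
Each "Whitening: ... metric=X vs. limit=Y" line compares a whiteness statistic of a module's output against its current limit (the limit is itself a ScheduledFloat, which is why it creeps upward over the log). A natural reading of the metric is the ratio mean(eig^2)/mean(eig)^2 of the feature covariance eigenvalues, which equals 1.0 exactly when the covariance is a multiple of the identity; a sketch under that assumption:

```python
import torch

def whitening_metric(x: torch.Tensor) -> torch.Tensor:
    # x: (frames, channels). Returns 1.0 iff channel covariance ~ c * I;
    # larger values mean the channels are farther from "white".
    x = x - x.mean(dim=0)
    cov = (x.t() @ x) / x.shape[0]
    eigs = torch.linalg.eigvalsh(cov)
    return (eigs ** 2).mean() / eigs.mean() ** 2
```
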
2024-03-15 19:50:21,418 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module2.balancer1.min_positive, batch_count=500.0, ans=0.0484375
2024-03-15 19:50:27,123 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.balancer1.prob, batch_count=533.3333333333334, ans=0.475
2024-03-15 19:50:27,671 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.feed_forward2.out_whiten, num_groups=1, num_channels=512, metric=436.08 vs. limit=7.7
2024-03-15 19:50:31,595 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_module2.balancer1.prob, batch_count=533.3333333333334, ans=0.475
2024-03-15 19:50:42,887 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=192, metric=170.82 vs. limit=7.7125
2024-03-15 19:50:45,543 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=512, metric=119.29 vs. limit=7.7125
2024-03-15 19:50:48,400 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=512, metric=485.20 vs. limit=7.7125
2024-03-15 19:51:00,246 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=21.48 vs. limit=5.3
2024-03-15 19:51:00,276 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.whiten, num_groups=1, num_channels=512, metric=8.10 vs. limit=4.24
2024-03-15 19:51:00,350 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=512, metric=77.88 vs. limit=5.3
2024-03-15 19:51:06,546 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=303.08 vs. limit=7.725
2024-03-15 19:51:21,313 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.nonlin_attention.whiten2.whitening_limit, batch_count=633.3333333333334, ans=5.316666666666666
2024-03-15 19:51:24,353 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=192, metric=86.94 vs. limit=5.316666666666666
2024-03-15 19:51:26,476 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.attention_skip_rate, batch_count=666.6666666666666, ans=0.17500000000000002
2024-03-15 19:51:27,432 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=192, metric=110.41 vs. limit=7.75
2024-03-15 19:51:28,246 INFO [train_char.py:689] (1/2) Epoch 1, batch 200, loss[loss=0.6847, simple_loss=0.5823, pruned_loss=0.6912, over 24086.00 frames. ], tot_loss[loss=1.108, simple_loss=0.9902, pruned_loss=0.9886, over 3044430.64 frames. ], batch size: 223, lr: 3.15e-02, grad_scale: 1.0
2024-03-15 19:51:29,893 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_skip_rate, batch_count=666.6666666666666, ans=0.17500000000000002
2024-03-15 19:51:32,004 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.feed_forward2.out_whiten, num_groups=1, num_channels=512, metric=491.89 vs. limit=7.75
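
The lr column ramps 2.25e-02 -> 4.49e-02 over the first 500 batches and then decays in later epochs, consistent with an Eden-style schedule on top of base_lr=0.045 from the config (a sketch; warmup_batches=500 and the exact epoch bookkeeping are assumptions, so the reproduction is approximate):

```python
def eden_lr(base_lr, batch, epoch,
            lr_batches=7500.0, lr_epochs=3.5, warmup_batches=500.0):
    # Decay smoothly in both batch count and epoch count, with a linear
    # warmup from 0.5x to 1.0x over the first warmup_batches batches.
    batch_factor = ((batch ** 2 + lr_batches ** 2) / lr_batches ** 2) ** -0.25
    epoch_factor = ((epoch ** 2 + lr_epochs ** 2) / lr_epochs ** 2) ** -0.25
    warmup = min(1.0, 0.5 + 0.5 * batch / warmup_batches)
    return base_lr * batch_factor * epoch_factor * warmup

print(eden_lr(0.045, batch=0, epoch=0))    # 0.0225   ~ "lr: 2.25e-02"
print(eden_lr(0.045, batch=500, epoch=0))  # ~0.0449  ~ "lr: 4.49e-02"
```
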
2024-03-15 19:51:32,583 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 4.353e+01 5.721e+01 6.735e+01 8.044e+01 1.996e+02, threshold=1.347e+02, percent-clipped=0.0
2024-03-15 19:51:42,258 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=142.33 vs. limit=5.35
2024-03-15 19:51:43,002 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.bypass.skip_rate, batch_count=700.0, ans=0.5
2024-03-15 19:51:46,650 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=113.48 vs. limit=7.7625
2024-03-15 19:51:58,318 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=314.18 vs. limit=7.775
2024-03-15 19:52:06,383 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_skip_rate, batch_count=733.3333333333334, ans=0.17250000000000001
2024-03-15 19:52:07,107 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=271.03 vs. limit=7.775
2024-03-15 19:52:09,335 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module1.balancer1.max_abs, batch_count=733.3333333333334, ans=5.458333333333333
2024-03-15 19:52:18,478 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.conv_module2.whiten, num_groups=1, num_channels=384, metric=22.80 vs. limit=7.7875
2024-03-15 19:52:35,620 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=23.72 vs. limit=7.8
2024-03-15 19:52:40,118 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.whiten, num_groups=1, num_channels=384, metric=6.47 vs. limit=4.333333333333333
2024-03-15 19:52:40,984 INFO [train_char.py:689] (1/2) Epoch 1, batch 250, loss[loss=0.4406, simple_loss=0.3629, pruned_loss=0.4716, over 24247.00 frames. ], tot_loss[loss=0.9305, simple_loss=0.8232, pruned_loss=0.8505, over 3439893.48 frames. ], batch size: 122, lr: 3.38e-02, grad_scale: 1.0
2024-03-15 19:52:49,521 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_module2.balancer1.max_abs, batch_count=833.3333333333334, ans=5.520833333333333
2024-03-15 19:52:58,327 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.feed_forward2.out_whiten, num_groups=1, num_channels=256, metric=194.34 vs. limit=7.825
2024-03-15 19:53:04,027 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=512, metric=238.18 vs. limit=7.825
2024-03-15 19:53:04,081 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.whiten, num_groups=1, num_channels=384, metric=6.14 vs. limit=4.346666666666667
2024-03-15 19:53:05,400 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.self_attn1.whiten, num_groups=1, num_channels=384, metric=17.67 vs. limit=8.15
2024-03-15 19:53:05,536 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=119.12 vs. limit=7.825
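
The balancer parameters in these lines (min_positive, max_abs, prob, ...) belong to icefall's Balancer modules, which monitor per-channel activation statistics and, with probability `prob`, nudge gradients when a statistic leaves its configured range. A sketch of the statistics being constrained (the gradient-correction mechanism itself is omitted; this is my reading of the logged names, not the module's code):

```python
import torch

def channel_stats(x: torch.Tensor, channel_dim: int = -1):
    # Fraction of positive activations per channel (bounded by
    # min_positive/max_positive) and mean absolute value per channel
    # (bounded by min_abs/max_abs).
    dims = [d for d in range(x.dim()) if d != channel_dim % x.dim()]
    frac_positive = (x > 0).float().mean(dim=dims)
    mean_abs = x.abs().mean(dim=dims)
    return frac_positive, mean_abs
```
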
2024-03-15 19:53:08,194 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.self_attn2.whiten.whitening_limit, batch_count=866.6666666666666, ans=8.15
2024-03-15 19:53:08,425 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=121.17 vs. limit=7.825
2024-03-15 19:53:14,876 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_module1.balancer1.max_abs, batch_count=900.0, ans=5.5625
2024-03-15 19:53:15,566 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward1.out_whiten.whitening_limit, batch_count=900.0, ans=7.8375
2024-03-15 19:53:18,469 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=9.19 vs. limit=5.225
2024-03-15 19:53:25,334 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.whiten, num_groups=1, num_channels=384, metric=6.42 vs. limit=4.36
2024-03-15 19:53:25,384 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.conv_module1.whiten, num_groups=1, num_channels=384, metric=22.99 vs. limit=7.8375
2024-03-15 19:53:35,111 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=933.3333333333334, ans=0.45625
2024-03-15 19:53:38,126 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.ff3_skip_rate, batch_count=933.3333333333334, ans=0.079
2024-03-15 19:53:44,171 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=149.68 vs. limit=7.8625
2024-03-15 19:53:45,686 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.conv_module1.whiten, num_groups=1, num_channels=512, metric=19.66 vs. limit=7.8625
2024-03-15 19:53:46,893 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=256, metric=16.63 vs. limit=7.8625
2024-03-15 19:53:55,229 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward2.hidden_balancer.prob, batch_count=966.6666666666666, ans=0.4546875
2024-03-15 19:53:55,331 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.nonlin_attention.balancer.min_positive, batch_count=966.6666666666666, ans=0.24033333333333334
2024-03-15 19:53:57,965 INFO [train_char.py:689] (1/2) Epoch 1, batch 300, loss[loss=0.4439, simple_loss=0.3687, pruned_loss=0.4372, over 24370.00 frames. ], tot_loss[loss=0.8122, simple_loss=0.7119, pruned_loss=0.7508, over 3750023.44 frames. ], batch size: 158, lr: 3.60e-02, grad_scale: 2.0
2024-03-15 19:54:01,508 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.nonlin_attention.whiten1.whitening_limit, batch_count=1000.0, ans=5.25
2024-03-15 19:54:02,222 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.426e+01 8.503e+01 1.019e+02 1.204e+02 1.857e+02, threshold=2.037e+02, percent-clipped=13.0
2024-03-15 19:54:02,549 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=1000.0, ans=0.29
2024-03-15 19:54:04,550 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.conv_module1.whiten, num_groups=1, num_channels=384, metric=8.54 vs. limit=7.875
2024-03-15 19:54:14,423 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.conv_module1.whiten, num_groups=1, num_channels=384, metric=21.58 vs. limit=7.8875
2024-03-15 19:54:15,482 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.attention_skip_rate, batch_count=1033.3333333333333, ans=0.16125
2024-03-15 19:54:25,701 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.ff2_skip_rate, batch_count=1066.6666666666667, ans=0.076
2024-03-15 19:54:29,276 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=512, metric=38.14 vs. limit=8.3
2024-03-15 19:54:37,941 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=384, metric=16.11 vs. limit=7.9
2024-03-15 19:54:42,388 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=192, metric=12.30 vs. limit=5.275
2024-03-15 19:54:43,868 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=192, metric=120.92 vs. limit=7.9125
2024-03-15 19:54:45,983 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_module1.balancer1.min_positive, batch_count=1100.0, ans=0.0465625
2024-03-15 19:54:46,984 INFO [scaling.py:1023] (1/2) Whitening: name=encoder_embed.out_whiten, num_groups=1, num_channels=192, metric=20.55 vs. limit=4.22
2024-03-15 19:54:49,535 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=18.74 vs. limit=5.275
2024-03-15 19:55:10,369 INFO [train_char.py:689] (1/2) Epoch 1, batch 350, loss[loss=0.5187, simple_loss=0.4294, pruned_loss=0.4924, over 24087.00 frames. ], tot_loss[loss=0.73, simple_loss=0.6343, pruned_loss=0.6774, over 3992395.24 frames. ], batch size: 188, lr: 3.83e-02, grad_scale: 2.0
2024-03-15 19:55:17,650 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=256, metric=10.15 vs. limit=8.375
2024-03-15 19:55:19,079 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.self_attn2.whiten, num_groups=1, num_channels=256, metric=13.41 vs. limit=8.375
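
The per-batch loss decomposes into the simple and pruned RNN-T terms with warm-up weighting, as in icefall's pruned-transducer recipes: the pruned term is ramped in and the simple term ramped down over warm_step=2000 batches (simple_loss_scale=0.5 from the config). At batch 0 this gives 1.0*8.855 + 0.1*8.879 ~ 9.74, matching the logged loss=9.745 to display precision. A sketch:

```python
def combine_losses(simple_loss, pruned_loss, batch_idx_train,
                   warm_step=2000, simple_loss_scale=0.5):
    # Warm-up weighting between the simple and pruned transducer losses.
    if batch_idx_train >= warm_step:
        s, p = simple_loss_scale, 1.0
    else:
        frac = batch_idx_train / warm_step
        s = 1.0 - frac * (1.0 - simple_loss_scale)  # 1.0 -> 0.5
        p = 0.1 + 0.9 * frac                        # 0.1 -> 1.0
    return s * simple_loss + p * pruned_loss
```
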
2024-03-15 19:55:20,146 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward3.out_whiten.whitening_limit, batch_count=1166.6666666666667, ans=7.9375
2024-03-15 19:55:20,440 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=82.66 vs. limit=5.583333333333333
2024-03-15 19:55:20,583 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=256, metric=57.20 vs. limit=7.9375
2024-03-15 19:55:24,464 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=256, metric=10.78 vs. limit=7.9375
2024-03-15 19:55:28,695 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.attention_skip_rate, batch_count=1200.0, ans=0.155
2024-03-15 19:55:32,815 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.bypass.scale_min, batch_count=1200.0, ans=0.858
2024-03-15 19:55:35,274 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=192, metric=148.62 vs. limit=7.95
2024-03-15 19:55:40,074 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module1.balancer2.prob, batch_count=1200.0, ans=0.44375
2024-03-15 19:55:41,993 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.conv_module2.whiten, num_groups=1, num_channels=512, metric=46.46 vs. limit=7.95
2024-03-15 19:55:47,733 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=384, metric=6.57 vs. limit=5.308333333333334
2024-03-15 19:55:53,391 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.self_attn1.whiten.whitening_limit, batch_count=1233.3333333333333, ans=8.425
2024-03-15 19:55:57,860 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=256, metric=38.99 vs. limit=7.975
2024-03-15 19:55:59,357 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=148.12 vs. limit=5.633333333333334
2024-03-15 19:56:05,212 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.conv_module2.whiten, num_groups=1, num_channels=256, metric=12.19 vs. limit=7.975
2024-03-15 19:56:09,434 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.self_attn2.whiten, num_groups=1, num_channels=512, metric=53.29 vs. limit=8.45
2024-03-15 19:56:10,969 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=256, metric=70.58 vs. limit=7.975
2024-03-15 19:56:13,941 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=256, metric=8.62 vs. limit=8.475
2024-03-15 19:56:15,196 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.nonlin_attention.whiten1, num_groups=1, num_channels=384, metric=7.30 vs. limit=5.325
2024-03-15 19:56:16,082 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_module1.balancer2.min_abs, batch_count=1300.0, ans=0.2195
2024-03-15 19:56:16,083 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=1300.0, ans=0.4390625
2024-03-15 19:56:28,192 INFO [train_char.py:689] (1/2) Epoch 1, batch 400, loss[loss=0.5129, simple_loss=0.4262, pruned_loss=0.4613, over 24158.00 frames. ], tot_loss[loss=0.6715, simple_loss=0.5788, pruned_loss=0.6209, over 4180120.29 frames. ], batch size: 279, lr: 4.05e-02, grad_scale: 4.0
2024-03-15 19:56:32,514 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.902e+01 9.628e+01 1.167e+02 1.405e+02 3.100e+02, threshold=2.334e+02, percent-clipped=3.0
2024-03-15 19:56:43,412 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.conv_module1.whiten, num_groups=1, num_channels=384, metric=12.21 vs. limit=8.0125
2024-03-15 19:56:46,429 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=256, metric=26.26 vs. limit=8.0125
2024-03-15 19:56:48,390 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.conv.5.prob, batch_count=1366.6666666666667, ans=0.4359375
2024-03-15 19:57:05,884 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.whiten, num_groups=1, num_channels=512, metric=5.65 vs. limit=4.5600000000000005
2024-03-15 19:57:11,431 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=256, metric=32.22 vs. limit=8.025
2024-03-15 19:57:13,418 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=192, metric=152.83 vs. limit=8.0375
2024-03-15 19:57:21,821 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.conv_module2.whiten, num_groups=1, num_channels=384, metric=27.27 vs. limit=8.0375
2024-03-15 19:57:31,390 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.ff3_skip_rate, batch_count=1466.6666666666667, ans=0.067
2024-03-15 19:57:42,598 INFO [train_char.py:689] (1/2) Epoch 1, batch 450, loss[loss=0.4129, simple_loss=0.3447, pruned_loss=0.3536, over 24127.00 frames. ], tot_loss[loss=0.63, simple_loss=0.5389, pruned_loss=0.5779, over 4326181.88 frames. ], batch size: 362, lr: 4.28e-02, grad_scale: 4.0
2024-03-15 19:57:42,819 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=1500.0, ans=0.285
2024-03-15 19:57:43,376 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=31.93 vs. limit=8.0625
2024-03-15 19:57:44,837 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.whiten, num_groups=1, num_channels=512, metric=6.31 vs. limit=4.6
2024-03-15 19:57:44,855 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=512, metric=57.66 vs. limit=8.0625
2024-03-15 19:57:45,999 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.feed_forward2.out_whiten, num_groups=1, num_channels=256, metric=57.06 vs. limit=8.0625
2024-03-15 19:57:52,834 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=1500.0, ans=0.4296875
2024-03-15 19:58:03,897 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=34.03 vs. limit=8.075
2024-03-15 19:58:06,151 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.conv_module2.whiten, num_groups=1, num_channels=512, metric=42.15 vs. limit=8.075
2024-03-15 19:58:09,078 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=36.68 vs. limit=8.075
2024-03-15 19:58:24,141 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_module2.balancer2.prob, batch_count=1566.6666666666667, ans=0.4265625
2024-03-15 19:58:26,093 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.conv_module1.whiten, num_groups=1, num_channels=512, metric=31.85 vs. limit=8.1
2024-03-15 19:58:30,459 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=256, metric=9.92 vs. limit=8.7
2024-03-15 19:58:39,009 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=7.51 vs. limit=5.4
2024-03-15 19:58:41,856 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=512, metric=46.88 vs. limit=8.1125
2024-03-15 19:58:54,806 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn2.whiten, num_groups=1, num_channels=512, metric=19.40 vs. limit=8.75
2024-03-15 19:58:55,419 INFO [train_char.py:689] (1/2) Epoch 1, batch 500, loss[loss=0.5479, simple_loss=0.4536, pruned_loss=0.4644, over 24128.00 frames. ], tot_loss[loss=0.6001, simple_loss=0.5099, pruned_loss=0.5438, over 4439785.68 frames. ], batch size: 251, lr: 4.49e-02, grad_scale: 8.0
2024-03-15 19:58:56,486 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.self_attn1.whiten, num_groups=1, num_channels=384, metric=29.37 vs. limit=8.75
2024-03-15 19:59:00,226 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.666e+01 9.891e+01 1.166e+02 1.626e+02 2.774e+02, threshold=2.332e+02, percent-clipped=3.0
2024-03-15 19:59:04,254 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module1.whiten.whitening_limit, batch_count=1666.6666666666667, ans=8.125
2024-03-15 19:59:52,123 INFO [train_char.py:689] (1/2) Epoch 2, batch 0, loss[loss=0.5018, simple_loss=0.4142, pruned_loss=0.4269, over 24063.00 frames. ], tot_loss[loss=0.5018, simple_loss=0.4142, pruned_loss=0.4269, over 24063.00 frames. ], batch size: 236, lr: 4.41e-02, grad_scale: 16.0
2024-03-15 19:59:52,123 INFO [train_char.py:712] (1/2) Computing validation loss
2024-03-15 20:00:04,650 INFO [train_char.py:721] (1/2) Epoch 2, validation: loss=0.4496, simple_loss=0.3723, pruned_loss=0.379, over 657665.00 frames.
2024-03-15 20:00:04,651 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25080MB
2024-03-15 20:00:14,152 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=384, metric=13.90 vs. limit=8.13375
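
The grad_scale column (1.0, dropping to 0.25 early on, then doubling roughly every 100 clean batches up to 16.0 here) is the behaviour of dynamic loss scaling under 'use_fp16': True: the scale is halved when gradients overflow and grown after a fixed interval without overflow. A sketch with torch.cuda.amp (init_scale and growth_interval are assumptions inferred from the log; the loop names are hypothetical):

```python
import torch

scaler = torch.cuda.amp.GradScaler(init_scale=1.0, growth_factor=2.0,
                                   backoff_factor=0.5, growth_interval=100)
for features, targets in train_dl:  # hypothetical loop
    optimizer.zero_grad()
    with torch.cuda.amp.autocast():
        loss = compute_loss(model, features, targets)  # hypothetical helper
    scaler.scale(loss).backward()
    scaler.step(optimizer)   # skipped internally if grads contain inf/nan
    scaler.update()          # backoff on overflow, growth otherwise
```
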
2024-03-15 20:00:24,027 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_skip_rate, batch_count=1723.3333333333333, ans=0.13537500000000002
2024-03-15 20:00:33,219 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module1.balancer2.prob, batch_count=1756.6666666666667, ans=0.41765625
2024-03-15 20:00:33,490 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=7.52 vs. limit=5.878333333333334
2024-03-15 20:00:34,531 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.bypass_mid.scale_min, batch_count=1756.6666666666667, ans=0.8385166666666667
2024-03-15 20:00:38,502 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.conv_module1.whiten, num_groups=1, num_channels=192, metric=8.91 vs. limit=8.15875
2024-03-15 20:00:39,623 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=384, metric=34.87 vs. limit=8.817499999999999
2024-03-15 20:00:44,222 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=512, metric=6.90 vs. limit=5.878333333333334
2024-03-15 20:00:47,203 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_whiten.whitening_limit, batch_count=1756.6666666666667, ans=8.15875
2024-03-15 20:00:53,027 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module2.balancer2.min_positive, batch_count=1790.0, ans=0.0888125
2024-03-15 20:01:02,484 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.conv_module2.whiten, num_groups=1, num_channels=256, metric=11.10 vs. limit=8.17125
2024-03-15 20:01:18,811 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.bypass.scale_min, batch_count=1856.6666666666667, ans=0.8350166666666667
2024-03-15 20:01:19,892 INFO [train_char.py:689] (1/2) Epoch 2, batch 50, loss[loss=0.4199, simple_loss=0.3441, pruned_loss=0.3526, over 24215.00 frames. ], tot_loss[loss=0.4649, simple_loss=0.3834, pruned_loss=0.39, over 1085782.33 frames. ], batch size: 134, lr: 4.41e-02, grad_scale: 16.0
2024-03-15 20:01:20,134 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=1856.6666666666667, ans=0.41296875
2024-03-15 20:01:20,801 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.conv_module2.whiten, num_groups=1, num_channels=512, metric=25.77 vs. limit=8.19625
2024-03-15 20:01:29,578 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=26.38 vs. limit=8.19625
2024-03-15 20:01:34,174 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.conv_module1.whiten, num_groups=1, num_channels=384, metric=12.76 vs. limit=8.20875
2024-03-15 20:01:35,714 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=105.42 vs. limit=8.20875
2024-03-15 20:01:40,775 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward1.hidden_balancer.prob, batch_count=1890.0, ans=0.41140625
2024-03-15 20:01:42,598 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.nonlin_attention.whiten1.whitening_limit, batch_count=1890.0, ans=5.4725
2024-03-15 20:01:57,412 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.attention_skip_rate, batch_count=1923.3333333333333, ans=0.12787500000000002
2024-03-15 20:02:20,085 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=10.56 vs. limit=8.23375
2024-03-15 20:02:21,446 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=384, metric=22.12 vs. limit=8.9675
2024-03-15 20:02:28,972 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.conv_module1.whiten, num_groups=1, num_channels=384, metric=12.32 vs. limit=8.24625
2024-03-15 20:02:29,876 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=1990.0, ans=0.2801
2024-03-15 20:02:36,407 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.conv_module2.whiten, num_groups=1, num_channels=256, metric=9.88 vs. limit=8.24625
2024-03-15 20:02:37,220 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.545e+01 1.311e+02 1.573e+02 1.999e+02 7.233e+02, threshold=3.146e+02, percent-clipped=17.0
2024-03-15 20:02:37,504 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module2.balancer1.min_positive, batch_count=1990.0, ans=0.04378125
2024-03-15 20:02:38,272 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.feed_forward2.out_whiten, num_groups=1, num_channels=256, metric=12.31 vs. limit=8.24625
2024-03-15 20:02:40,494 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=1990.0, ans=0.2801
2024-03-15 20:02:43,175 INFO [train_char.py:689] (1/2) Epoch 2, batch 100, loss[loss=0.5065, simple_loss=0.4194, pruned_loss=0.4038, over 24166.00 frames. ], tot_loss[loss=0.4557, simple_loss=0.3761, pruned_loss=0.3758, over 1908420.72 frames. ], batch size: 188, lr: 4.41e-02, grad_scale: 16.0
2024-03-15 20:02:43,346 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=2023.3333333333333, ans=0.27976666666666666
2024-03-15 20:02:49,263 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=2023.3333333333333, ans=0.27976666666666666
2024-03-15 20:02:52,737 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.whiten, num_groups=1, num_channels=512, metric=5.71 vs. limit=4.809333333333333
2024-03-15 20:02:56,958 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.self_attn2.whiten, num_groups=1, num_channels=256, metric=22.30 vs. limit=9.0425
2024-03-15 20:03:01,603 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.nonlin_attention.whiten1, num_groups=1, num_channels=384, metric=6.29 vs. limit=5.514166666666666
2024-03-15 20:03:05,368 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module2.balancer2.prob, batch_count=2056.6666666666665, ans=0.40359375
2024-03-15 20:03:08,559 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=18.26 vs. limit=8.27125
2024-03-15 20:03:09,758 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=256, metric=10.20 vs. limit=8.27125
2024-03-15 20:03:31,244 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=61.43 vs. limit=8.29625
2024-03-15 20:03:39,487 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=2156.6666666666665, ans=0.23041666666666666
2024-03-15 20:03:54,524 INFO [train_char.py:689] (1/2) Epoch 2, batch 150, loss[loss=0.4074, simple_loss=0.3421, pruned_loss=0.3069, over 24363.00 frames. ], tot_loss[loss=0.4585, simple_loss=0.3792, pruned_loss=0.3702, over 2553635.50 frames. ], batch size: 158, lr: 4.40e-02, grad_scale: 16.0
2024-03-15 20:04:02,428 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=384, metric=20.27 vs. limit=9.1425
2024-03-15 20:04:15,123 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=384, metric=19.19 vs. limit=9.1675
2024-03-15 20:04:20,418 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.ff3_skip_rate, batch_count=2223.3333333333335, ans=0.04997499999999999
2024-03-15 20:04:21,077 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=512, metric=20.02 vs. limit=8.33375
2024-03-15 20:04:21,854 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=2256.6666666666665, ans=0.2179166666666667
2024-03-15 20:04:39,193 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.attention_skip_rate, batch_count=2290.0, ans=0.11412499999999999
2024-03-15 20:04:58,956 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward3.hidden_balancer.prob, batch_count=2323.3333333333335, ans=0.39109375
2024-03-15 20:05:00,127 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.330e+01 1.356e+02 1.673e+02 2.299e+02 9.080e+02, threshold=3.347e+02, percent-clipped=8.0
2024-03-15 20:05:02,575 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=10.24 vs. limit=8.37125
2024-03-15 20:05:09,553 INFO [train_char.py:689] (1/2) Epoch 2, batch 200, loss[loss=0.3996, simple_loss=0.3362, pruned_loss=0.2934, over 24168.00 frames. ], tot_loss[loss=0.4446, simple_loss=0.3693, pruned_loss=0.3503, over 3052278.69 frames. ], batch size: 362, lr: 4.40e-02, grad_scale: 16.0
2024-03-15 20:05:26,957 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.1.encoder.layers.0.self_attn_weights, loss-sum=6.781e+00
2024-03-15 20:05:28,721 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.whiten, num_groups=1, num_channels=384, metric=4.87 vs. limit=4.9559999999999995
2024-03-15 20:05:47,484 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=11.25 vs. limit=8.40875
2024-03-15 20:05:49,671 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module2.balancer2.prob, batch_count=2423.3333333333335, ans=0.38640625
2024-03-15 20:06:18,262 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module2.balancer1.prob, batch_count=2490.0, ans=0.38328125
2024-03-15 20:06:18,771 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.whiten, num_groups=1, num_channels=512, metric=5.92 vs. limit=4.996
2024-03-15 20:06:19,714 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_module2.balancer2.prob, batch_count=2490.0, ans=0.38328125
2024-03-15 20:06:21,407 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.self_attn2.whiten, num_groups=1, num_channels=256, metric=13.35 vs. limit=9.3675
2024-03-15 20:06:23,636 INFO [train_char.py:689] (1/2) Epoch 2, batch 250, loss[loss=0.3246, simple_loss=0.285, pruned_loss=0.2122, over 24296.00 frames. ], tot_loss[loss=0.4315, simple_loss=0.3613, pruned_loss=0.3296, over 3447053.59 frames. ], batch size: 140, lr: 4.40e-02, grad_scale: 16.0
2024-03-15 20:06:24,248 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.self_attn2.whiten, num_groups=1, num_channels=256, metric=12.34 vs. limit=9.3925
2024-03-15 20:06:24,799 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=192, metric=9.13 vs. limit=8.44625
2024-03-15 20:06:26,823 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.balancer2.prob, batch_count=2523.3333333333335, ans=0.38171875
2024-03-15 20:06:28,191 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.attention_skip_rate, batch_count=2523.3333333333335, ans=0.105375
2024-03-15 20:06:31,594 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward2.out_whiten.whitening_limit, batch_count=2523.3333333333335, ans=8.44625
2024-03-15 20:06:45,161 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module1.balancer2.min_positive, batch_count=2556.6666666666665, ans=0.08402083333333334
2024-03-15 20:06:49,330 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_module1.balancer2.prob, batch_count=2556.6666666666665, ans=0.38015625
2024-03-15 20:06:56,639 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.self_attn_weights.pos_emb_skip_rate, batch_count=2590.0, ans=0.17625000000000002
2024-03-15 20:06:59,374 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.ff3_skip_rate, batch_count=2590.0, ans=0.041725
2024-03-15 20:07:13,937 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=13.68 vs. limit=8.48375
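
tot_loss is not a plain cumulative average: the "over N frames" count saturates near 200x the per-batch frame count, which fits a decaying sum with time constant reset_interval=200 from the config; the printed loss values are then the frame-normalized ratios of these decayed sums. A sketch of that bookkeeping (my reconstruction from the logged frame counts):

```python
def update_tot(tot: dict, cur: dict, reset_interval: int = 200) -> dict:
    # Decay the running totals and add the current batch's statistics;
    # e.g. keys "loss" and "frames", printed as tot["loss"] / tot["frames"].
    decay = 1.0 - 1.0 / reset_interval
    return {k: tot.get(k, 0.0) * decay + cur.get(k, 0.0)
            for k in set(tot) | set(cur)}
```
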
2024-03-15 20:07:21,852 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_skip_rate, batch_count=2656.6666666666665, ans=0.10037499999999999
2024-03-15 20:07:28,753 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.293e+01 1.310e+02 1.524e+02 2.043e+02 4.110e+02, threshold=3.049e+02, percent-clipped=1.0
2024-03-15 20:07:34,197 INFO [train_char.py:689] (1/2) Epoch 2, batch 300, loss[loss=0.3661, simple_loss=0.3177, pruned_loss=0.2429, over 24201.00 frames. ], tot_loss[loss=0.4109, simple_loss=0.3478, pruned_loss=0.3029, over 3756107.07 frames. ], batch size: 328, lr: 4.40e-02, grad_scale: 16.0
2024-03-15 20:08:05,533 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=2756.6666666666665, ans=0.2724333333333333
2024-03-15 20:08:12,198 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.ff3_skip_rate, batch_count=2756.6666666666665, ans=0.037975
2024-03-15 20:08:12,212 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_module2.balancer1.max_abs, batch_count=2756.6666666666665, ans=6.722916666666666
2024-03-15 20:08:24,655 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=2790.0, ans=0.2721
2024-03-15 20:08:30,942 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=256, metric=8.90 vs. limit=8.54625
2024-03-15 20:08:45,749 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=256, metric=9.85 vs. limit=9.6175
2024-03-15 20:08:47,723 INFO [train_char.py:689] (1/2) Epoch 2, batch 350, loss[loss=0.3045, simple_loss=0.2689, pruned_loss=0.1918, over 24179.00 frames. ], tot_loss[loss=0.3917, simple_loss=0.3352, pruned_loss=0.2789, over 3993974.22 frames. ], batch size: 344, lr: 4.40e-02, grad_scale: 16.0
2024-03-15 20:08:53,919 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=14.83 vs. limit=8.57125
2024-03-15 20:09:01,863 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=6.42 vs. limit=5.7225
2024-03-15 20:09:08,207 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.ff2_skip_rate, batch_count=2890.0, ans=0.034975000000000006
2024-03-15 20:09:14,801 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=2923.3333333333335, ans=0.27076666666666666
2024-03-15 20:09:15,357 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=9.99 vs. limit=8.59625
2024-03-15 20:09:17,592 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.3.encoder.layers.0.self_attn_weights, loss-sum=7.207e+01
2024-03-15 20:09:19,456 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=512, metric=8.65 vs. limit=8.59625
2024-03-15 20:09:22,256 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=10.49 vs. limit=8.59625
limit=8.59625 2024-03-15 20:09:43,152 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module1.balancer2.prob, batch_count=2990.0, ans=0.35984375 2024-03-15 20:09:52,603 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 9.804e+01 1.410e+02 1.649e+02 2.083e+02 3.278e+02, threshold=3.297e+02, percent-clipped=3.0 2024-03-15 20:09:58,110 INFO [train_char.py:689] (1/2) Epoch 2, batch 400, loss[loss=0.3251, simple_loss=0.2927, pruned_loss=0.1944, over 24131.00 frames. ], tot_loss[loss=0.3747, simple_loss=0.3247, pruned_loss=0.2569, over 4182464.41 frames. ], batch size: 279, lr: 4.40e-02, grad_scale: 32.0 2024-03-15 20:10:01,088 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_module2.balancer1.min_positive, batch_count=3023.3333333333335, ans=0.040552083333333336 2024-03-15 20:10:13,163 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module1.balancer1.prob, batch_count=3056.6666666666665, ans=0.35671875 2024-03-15 20:10:13,721 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.nonlin_attention.whiten1, num_groups=1, num_channels=384, metric=8.20 vs. limit=5.764166666666666 2024-03-15 20:10:34,428 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.self_attn2.whiten, num_groups=1, num_channels=384, metric=11.52 vs. limit=9.817499999999999 2024-03-15 20:11:03,785 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.bypass.scale_min, batch_count=3156.6666666666665, ans=0.7895166666666666 2024-03-15 20:11:06,219 INFO [train_char.py:689] (1/2) Epoch 2, batch 450, loss[loss=0.2549, simple_loss=0.2476, pruned_loss=0.1253, over 24372.00 frames. ], tot_loss[loss=0.355, simple_loss=0.3118, pruned_loss=0.2343, over 4328623.14 frames. ], batch size: 172, lr: 4.39e-02, grad_scale: 16.0 2024-03-15 20:11:13,169 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=3190.0, ans=0.2681 2024-03-15 20:11:17,105 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.ff2_skip_rate, batch_count=3190.0, ans=0.028225 2024-03-15 20:11:35,697 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_skip_rate, batch_count=3256.6666666666665, ans=0.077875 2024-03-15 20:11:37,333 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.feed_forward2.out_whiten, num_groups=1, num_channels=256, metric=9.85 vs. limit=8.72125 2024-03-15 20:11:44,088 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=11.50 vs. limit=8.72125 2024-03-15 20:11:45,593 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=14.96 vs. limit=8.73375 2024-03-15 20:11:52,468 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=11.01 vs. 
limit=6.645 2024-03-15 20:12:02,267 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.hidden_balancer.prob, batch_count=3323.3333333333335, ans=0.34421875 2024-03-15 20:12:06,709 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_skip_rate, batch_count=3323.3333333333335, ans=0.07537499999999998 2024-03-15 20:12:10,473 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 1.201e+02 1.533e+02 1.846e+02 2.254e+02 4.152e+02, threshold=3.692e+02, percent-clipped=4.0 2024-03-15 20:12:14,493 INFO [train_char.py:689] (1/2) Epoch 2, batch 500, loss[loss=0.2867, simple_loss=0.2657, pruned_loss=0.1589, over 24111.00 frames. ], tot_loss[loss=0.3398, simple_loss=0.3026, pruned_loss=0.2159, over 4439295.84 frames. ], batch size: 279, lr: 4.39e-02, grad_scale: 16.0 2024-03-15 20:12:14,692 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_module2.balancer1.max_abs, batch_count=3356.6666666666665, ans=7.097916666666666 2024-03-15 20:12:16,006 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=3356.6666666666665, ans=0.2664333333333333 2024-03-15 20:12:21,243 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=3356.6666666666665, ans=0.34265625 2024-03-15 20:13:10,346 INFO [train_char.py:689] (1/2) Epoch 3, batch 0, loss[loss=0.2514, simple_loss=0.2298, pruned_loss=0.1435, over 24002.00 frames. ], tot_loss[loss=0.2514, simple_loss=0.2298, pruned_loss=0.1435, over 24002.00 frames. ], batch size: 381, lr: 4.17e-02, grad_scale: 32.0 2024-03-15 20:13:10,346 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-15 20:13:22,073 INFO [train_char.py:721] (1/2) Epoch 3, validation: loss=0.1923, simple_loss=0.1987, pruned_loss=0.07892, over 657665.00 frames. 2024-03-15 20:13:22,073 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-15 20:13:25,141 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module2.balancer2.prob, batch_count=3380.0, ans=0.3415625 2024-03-15 20:13:41,287 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=384, metric=10.94 vs. limit=10.06 2024-03-15 20:13:44,862 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module2.balancer2.prob, batch_count=3413.3333333333335, ans=0.33999999999999997 2024-03-15 20:13:54,234 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward2.out_whiten.whitening_limit, batch_count=3446.6666666666665, ans=8.7925 2024-03-15 20:14:06,196 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_skip_rate, batch_count=3446.6666666666665, ans=0.07075000000000001 2024-03-15 20:14:13,230 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.balancer1.prob, batch_count=3480.0, ans=0.33687500000000004 2024-03-15 20:14:17,925 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.whiten, num_groups=1, num_channels=384, metric=5.65 vs. 
limit=5.3919999999999995 2024-03-15 20:14:18,817 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module1.balancer2.min_positive, batch_count=3480.0, ans=0.07825000000000001 2024-03-15 20:14:20,254 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass.skip_rate, batch_count=3480.0, ans=0.04949747468305833 2024-03-15 20:14:23,514 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=8.69 vs. limit=8.817499999999999 2024-03-15 20:14:39,179 INFO [train_char.py:689] (1/2) Epoch 3, batch 50, loss[loss=0.2784, simple_loss=0.2675, pruned_loss=0.1417, over 24072.00 frames. ], tot_loss[loss=0.2419, simple_loss=0.2308, pruned_loss=0.1254, over 1080149.61 frames. ], batch size: 236, lr: 4.17e-02, grad_scale: 16.0 2024-03-15 20:14:48,120 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=3546.6666666666665, ans=0.33375 2024-03-15 20:14:57,449 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.whiten, num_groups=1, num_channels=384, metric=5.83 vs. limit=5.432 2024-03-15 20:14:59,639 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.nonlin_attention.balancer.max_positive, batch_count=3580.0, ans=0.7857999999999999 2024-03-15 20:15:03,708 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_module1.balancer2.min_abs, batch_count=3580.0, ans=0.25370000000000004 2024-03-15 20:15:37,031 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 9.786e+01 1.576e+02 1.794e+02 2.249e+02 5.227e+02, threshold=3.588e+02, percent-clipped=2.0 2024-03-15 20:15:39,067 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.self_attn1.whiten, num_groups=1, num_channels=512, metric=10.82 vs. limit=10.26 2024-03-15 20:15:49,463 INFO [train_char.py:689] (1/2) Epoch 3, batch 100, loss[loss=0.2112, simple_loss=0.2132, pruned_loss=0.09538, over 24447.00 frames. ], tot_loss[loss=0.2283, simple_loss=0.2217, pruned_loss=0.1135, over 1911331.60 frames. ], batch size: 165, lr: 4.17e-02, grad_scale: 16.0 2024-03-15 20:15:58,726 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.self_attn2.whiten, num_groups=1, num_channels=512, metric=13.89 vs. limit=10.285 2024-03-15 20:16:01,634 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.whiten, num_groups=1, num_channels=384, metric=4.89 vs. limit=5.485333333333333 2024-03-15 20:16:02,915 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=3.31 vs. 
limit=3.5620000000000003 2024-03-15 20:16:03,714 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.ff2_skip_rate, batch_count=3746.6666666666665, ans=0.015699999999999992 2024-03-15 20:16:05,356 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.bypass.skip_rate, batch_count=3746.6666666666665, ans=0.09899494936611666 2024-03-15 20:16:09,724 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.balancer1.prob, batch_count=3746.6666666666665, ans=0.32437499999999997 2024-03-15 20:16:17,998 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=3780.0, ans=0.3228125 2024-03-15 20:16:22,025 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=3780.0, ans=0.3228125 2024-03-15 20:16:28,016 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=512, metric=7.64 vs. limit=6.89 2024-03-15 20:16:47,463 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=512, metric=7.24 vs. limit=6.906666666666666 2024-03-15 20:17:02,994 INFO [train_char.py:689] (1/2) Epoch 3, batch 150, loss[loss=0.2483, simple_loss=0.2378, pruned_loss=0.1277, over 24235.00 frames. ], tot_loss[loss=0.2271, simple_loss=0.2218, pruned_loss=0.1115, over 2557753.84 frames. ], batch size: 328, lr: 4.17e-02, grad_scale: 16.0 2024-03-15 20:17:10,060 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=3880.0, ans=0.015000000000000013 2024-03-15 20:17:10,563 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.self_attn2.whiten, num_groups=1, num_channels=512, metric=14.14 vs. limit=10.41 2024-03-15 20:17:28,723 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.ff3_skip_rate, batch_count=3913.3333333333335, ans=0.011949999999999988 2024-03-15 20:17:48,083 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.self_attn1.whiten.whitening_limit, batch_count=3980.0, ans=10.485 2024-03-15 20:17:55,281 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=192, metric=7.11 vs. limit=6.99 2024-03-15 20:18:02,891 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 1.004e+02 1.483e+02 1.805e+02 2.218e+02 4.449e+02, threshold=3.610e+02, percent-clipped=2.0 2024-03-15 20:18:05,312 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=256, metric=8.55 vs. limit=9.004999999999999 2024-03-15 20:18:15,107 INFO [train_char.py:689] (1/2) Epoch 3, batch 200, loss[loss=0.1547, simple_loss=0.17, pruned_loss=0.05627, over 24212.00 frames. ], tot_loss[loss=0.2241, simple_loss=0.2205, pruned_loss=0.1085, over 3054011.10 frames. ], batch size: 122, lr: 4.17e-02, grad_scale: 16.0 2024-03-15 20:18:48,575 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.self_attn1.whiten, num_groups=1, num_channels=384, metric=11.03 vs. 
limit=10.584999999999999 2024-03-15 20:18:50,736 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.bypass.scale_min, batch_count=4113.333333333333, ans=0.7560333333333333 2024-03-15 20:18:53,226 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.ff3_skip_rate, batch_count=4113.333333333333, ans=0.00997536231884058 2024-03-15 20:19:01,563 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.bypass.scale_min, batch_count=4146.666666666667, ans=0.7548666666666667 2024-03-15 20:19:02,827 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module2.balancer1.min_positive, batch_count=4146.666666666667, ans=0.03704166666666667 2024-03-15 20:19:07,499 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=9.25 vs. limit=9.055 2024-03-15 20:19:19,685 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.self_attn1.whiten, num_groups=1, num_channels=512, metric=11.18 vs. limit=10.635 2024-03-15 20:19:23,023 INFO [train_char.py:689] (1/2) Epoch 3, batch 250, loss[loss=0.1546, simple_loss=0.1706, pruned_loss=0.05665, over 24400.00 frames. ], tot_loss[loss=0.2191, simple_loss=0.2178, pruned_loss=0.104, over 3442357.82 frames. ], batch size: 135, lr: 4.16e-02, grad_scale: 16.0 2024-03-15 20:19:25,192 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=6.30 vs. limit=6.053333333333333 2024-03-15 20:19:34,103 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass.skip_rate, batch_count=4213.333333333333, ans=0.04949747468305833 2024-03-15 20:19:49,753 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_skip_rate, batch_count=4246.666666666667, ans=0.04897222222222222 2024-03-15 20:19:50,989 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=4280.0, ans=0.299375 2024-03-15 20:19:56,698 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.self_attn2.whiten, num_groups=1, num_channels=512, metric=11.17 vs. limit=10.71 2024-03-15 20:20:19,390 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module1.balancer1.max_abs, batch_count=4313.333333333333, ans=7.695833333333333 2024-03-15 20:20:23,329 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 9.012e+01 1.394e+02 1.554e+02 1.862e+02 4.696e+02, threshold=3.109e+02, percent-clipped=1.0 2024-03-15 20:20:23,656 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module1.balancer1.min_positive, batch_count=4346.666666666667, ans=0.03641666666666667 2024-03-15 20:20:24,953 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=4346.666666666667, ans=0.25653333333333334 2024-03-15 20:20:35,101 INFO [train_char.py:689] (1/2) Epoch 3, batch 300, loss[loss=0.2294, simple_loss=0.2292, pruned_loss=0.1092, over 24156.00 frames. ], tot_loss[loss=0.2162, simple_loss=0.2163, pruned_loss=0.1015, over 3749753.31 frames. 
], batch size: 311, lr: 4.16e-02, grad_scale: 16.0 2024-03-15 20:21:04,458 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_skip_rate, batch_count=4446.666666666667, ans=0.04813888888888889 2024-03-15 20:21:04,578 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_skip_rate, batch_count=4446.666666666667, ans=0.04813888888888889 2024-03-15 20:21:19,860 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=3.67 vs. limit=3.672 2024-03-15 20:21:20,571 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward2.hidden_balancer.prob, batch_count=4480.0, ans=0.29000000000000004 2024-03-15 20:21:20,602 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_module2.balancer2.prob, batch_count=4480.0, ans=0.29000000000000004 2024-03-15 20:21:41,895 INFO [train_char.py:689] (1/2) Epoch 3, batch 350, loss[loss=0.1527, simple_loss=0.1727, pruned_loss=0.05455, over 24294.00 frames. ], tot_loss[loss=0.2148, simple_loss=0.2166, pruned_loss=0.0997, over 3991071.01 frames. ], batch size: 146, lr: 4.16e-02, grad_scale: 16.0 2024-03-15 20:21:50,756 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module1.balancer2.prob, batch_count=4546.666666666667, ans=0.286875 2024-03-15 20:21:53,412 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.attention_skip_rate, batch_count=4546.666666666667, ans=0.04772222222222222 2024-03-15 20:22:18,334 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=11.27 vs. limit=9.23 2024-03-15 20:22:30,005 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.attention_skip_rate, batch_count=4646.666666666667, ans=0.04730555555555556 2024-03-15 20:22:40,655 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 1.064e+02 1.434e+02 1.778e+02 2.135e+02 4.568e+02, threshold=3.555e+02, percent-clipped=3.0 2024-03-15 20:22:46,337 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_skip_rate, batch_count=4680.0, ans=0.04716666666666667 2024-03-15 20:22:51,198 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=192, metric=9.55 vs. limit=7.34 2024-03-15 20:22:52,746 INFO [train_char.py:689] (1/2) Epoch 3, batch 400, loss[loss=0.2419, simple_loss=0.2479, pruned_loss=0.1109, over 24146.00 frames. ], tot_loss[loss=0.2135, simple_loss=0.2164, pruned_loss=0.09842, over 4179489.11 frames. ], batch size: 223, lr: 4.16e-02, grad_scale: 32.0 2024-03-15 20:23:05,384 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.conv_module1.whiten, num_groups=1, num_channels=512, metric=6.97 vs. 
limit=9.28 2024-03-15 20:23:16,947 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.convnext.out_balancer.prob, batch_count=4746.666666666667, ans=0.27749999999999997 2024-03-15 20:23:22,198 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=4780.0, ans=0.2759375 2024-03-15 20:23:53,585 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.bypass_mid.scale_min, batch_count=4846.666666666667, ans=0.7303666666666667 2024-03-15 20:23:56,272 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module1.balancer2.prob, batch_count=4846.666666666667, ans=0.2728125 2024-03-15 20:24:02,678 INFO [train_char.py:689] (1/2) Epoch 3, batch 450, loss[loss=0.2163, simple_loss=0.228, pruned_loss=0.09465, over 24094.00 frames. ], tot_loss[loss=0.2126, simple_loss=0.2174, pruned_loss=0.09695, over 4324021.63 frames. ], batch size: 188, lr: 4.15e-02, grad_scale: 32.0 2024-03-15 20:24:36,649 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.ff3_skip_rate, batch_count=4946.666666666667, ans=0.009794202898550725 2024-03-15 20:24:44,222 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_module1.balancer1.prob, batch_count=4980.0, ans=0.26656250000000004 2024-03-15 20:24:48,181 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.ff3_skip_rate, batch_count=4980.0, ans=0.00978695652173913 2024-03-15 20:24:58,414 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 1.102e+02 1.409e+02 1.576e+02 1.756e+02 2.496e+02, threshold=3.151e+02, percent-clipped=0.0 2024-03-15 20:25:10,091 INFO [train_char.py:689] (1/2) Epoch 3, batch 500, loss[loss=0.2324, simple_loss=0.2424, pruned_loss=0.1047, over 24060.00 frames. ], tot_loss[loss=0.2113, simple_loss=0.2184, pruned_loss=0.09493, over 4437467.15 frames. ], batch size: 199, lr: 4.15e-02, grad_scale: 32.0 2024-03-15 20:25:10,358 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=5046.666666666667, ans=0.24953333333333333 2024-03-15 20:26:07,966 INFO [train_char.py:689] (1/2) Epoch 4, batch 0, loss[loss=0.2008, simple_loss=0.2064, pruned_loss=0.09295, over 24145.00 frames. ], tot_loss[loss=0.2008, simple_loss=0.2064, pruned_loss=0.09295, over 24145.00 frames. ], batch size: 344, lr: 3.88e-02, grad_scale: 32.0 2024-03-15 20:26:07,967 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-15 20:26:20,001 INFO [train_char.py:721] (1/2) Epoch 4, validation: loss=0.1315, simple_loss=0.1639, pruned_loss=0.03821, over 657665.00 frames. 2024-03-15 20:26:20,001 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-15 20:26:24,630 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=256, metric=12.96 vs. limit=11.3025 2024-03-15 20:26:28,741 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.whiten, num_groups=1, num_channels=384, metric=5.44 vs. limit=6.0280000000000005 2024-03-15 20:26:47,811 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=9.08 vs. 
limit=9.41375 2024-03-15 20:26:50,287 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.bypass.scale_min, batch_count=5136.666666666667, ans=0.7202166666666667 2024-03-15 20:26:50,621 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=384, metric=12.62 vs. limit=11.3525 2024-03-15 20:26:56,332 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.nonlin_attention.whiten1, num_groups=1, num_channels=384, metric=6.42 vs. limit=6.284166666666667 2024-03-15 20:27:07,114 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass.skip_rate, batch_count=5170.0, ans=0.04949747468305833 2024-03-15 20:27:11,081 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=5170.0, ans=0.25765625000000003 2024-03-15 20:27:19,738 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.3.encoder.layers.3.self_attn_weights, loss-sum=0.000e+00 2024-03-15 20:27:22,802 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=256, metric=12.01 vs. limit=11.4025 2024-03-15 20:27:33,155 INFO [train_char.py:689] (1/2) Epoch 4, batch 50, loss[loss=0.1666, simple_loss=0.1867, pruned_loss=0.06601, over 24255.00 frames. ], tot_loss[loss=0.1827, simple_loss=0.1976, pruned_loss=0.07731, over 1088700.38 frames. ], batch size: 116, lr: 3.88e-02, grad_scale: 32.0 2024-03-15 20:27:49,573 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_skip_rate, batch_count=5270.0, ans=0.044708333333333336 2024-03-15 20:27:53,622 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_skip_rate, batch_count=5270.0, ans=0.044708333333333336 2024-03-15 20:28:11,111 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module2.balancer1.prob, batch_count=5303.333333333333, ans=0.25140625 2024-03-15 20:28:12,813 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.self_attn2.whiten, num_groups=1, num_channels=512, metric=12.84 vs. limit=11.4775 2024-03-15 20:28:16,922 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.bypass.skip_rate, batch_count=5336.666666666667, ans=0.09899494936611666 2024-03-15 20:28:22,008 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 1.159e+02 1.441e+02 1.598e+02 1.918e+02 4.201e+02, threshold=3.197e+02, percent-clipped=3.0 2024-03-15 20:28:35,470 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=5370.0, ans=0.2463 2024-03-15 20:28:42,261 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module2.balancer2.min_positive, batch_count=5403.333333333333, ans=0.06622916666666667 2024-03-15 20:28:42,331 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.balancer2.prob, batch_count=5403.333333333333, ans=0.24671875 2024-03-15 20:28:43,237 INFO [train_char.py:689] (1/2) Epoch 4, batch 100, loss[loss=0.2079, simple_loss=0.2295, pruned_loss=0.08613, over 24104.00 frames. ], tot_loss[loss=0.1806, simple_loss=0.197, pruned_loss=0.07568, over 1912580.80 frames. 
], batch size: 236, lr: 3.88e-02, grad_scale: 32.0 2024-03-15 20:28:53,599 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward1.out_proj.dropout_p, batch_count=5403.333333333333, ans=0.24596666666666667 2024-03-15 20:28:57,497 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module1.balancer1.max_abs, batch_count=5403.333333333333, ans=8.377083333333333 2024-03-15 20:29:16,415 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_skip_rate, batch_count=5470.0, ans=0.043875000000000004 2024-03-15 20:29:34,331 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.nonlin_attention.whiten2, num_groups=1, num_channels=512, metric=9.27 vs. limit=7.751666666666667 2024-03-15 20:29:36,469 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_skip_rate, batch_count=5503.333333333333, ans=0.043736111111111114 2024-03-15 20:29:54,672 INFO [train_char.py:689] (1/2) Epoch 4, batch 150, loss[loss=0.2249, simple_loss=0.2426, pruned_loss=0.09823, over 24247.00 frames. ], tot_loss[loss=0.1781, simple_loss=0.1958, pruned_loss=0.07407, over 2556502.95 frames. ], batch size: 212, lr: 3.87e-02, grad_scale: 32.0 2024-03-15 20:30:03,728 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=144, metric=6.37 vs. limit=6.3925 2024-03-15 20:30:04,308 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=5570.0, ans=0.2443 2024-03-15 20:30:12,510 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=5603.333333333333, ans=0.24396666666666667 2024-03-15 20:30:12,905 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.self_attn1.whiten, num_groups=1, num_channels=512, metric=11.94 vs. limit=11.7025 2024-03-15 20:30:17,921 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=5603.333333333333, ans=0.24396666666666667 2024-03-15 20:30:22,031 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.balancer1.prob, batch_count=5636.666666666667, ans=0.23578125 2024-03-15 20:30:26,096 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass_mid.scale_min, batch_count=5636.666666666667, ans=0.7027166666666667 2024-03-15 20:30:40,117 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 1.045e+02 1.375e+02 1.537e+02 1.686e+02 3.623e+02, threshold=3.073e+02, percent-clipped=1.0 2024-03-15 20:30:45,741 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_module1.balancer1.prob, batch_count=5670.0, ans=0.23421874999999998 2024-03-15 20:30:45,757 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=5670.0, ans=0.23421874999999998 2024-03-15 20:30:46,201 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=14.94 vs. 
limit=9.62625 2024-03-15 20:30:59,555 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.ff2_skip_rate, batch_count=5703.333333333333, ans=0.009629710144927537 2024-03-15 20:30:59,991 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=5.33 vs. limit=7.851666666666667 2024-03-15 20:31:04,437 INFO [train_char.py:689] (1/2) Epoch 4, batch 200, loss[loss=0.1899, simple_loss=0.2033, pruned_loss=0.08469, over 24222.00 frames. ], tot_loss[loss=0.1737, simple_loss=0.1931, pruned_loss=0.07135, over 3061280.45 frames. ], batch size: 328, lr: 3.87e-02, grad_scale: 32.0 2024-03-15 20:31:22,101 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=5770.0, ans=0.0 2024-03-15 20:31:24,696 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=5770.0, ans=0.22953125000000002 2024-03-15 20:31:44,141 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.whiten_keys.whitening_limit, batch_count=5836.666666666667, ans=3.8754999999999997 2024-03-15 20:32:14,777 INFO [train_char.py:689] (1/2) Epoch 4, batch 250, loss[loss=0.17, simple_loss=0.1915, pruned_loss=0.07059, over 21456.00 frames. ], tot_loss[loss=0.1735, simple_loss=0.1939, pruned_loss=0.07128, over 3445927.52 frames. ], batch size: 85, lr: 3.87e-02, grad_scale: 32.0 2024-03-15 20:32:24,809 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=512, metric=12.53 vs. limit=11.9275 2024-03-15 20:32:27,523 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.whiten.whitening_limit, batch_count=5936.666666666667, ans=6.374666666666666 2024-03-15 20:32:36,608 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=10.73 vs. limit=9.72625 2024-03-15 20:32:59,463 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 1.052e+02 1.330e+02 1.542e+02 1.707e+02 5.716e+02, threshold=3.084e+02, percent-clipped=1.0 2024-03-15 20:33:12,798 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module2.balancer1.prob, batch_count=6036.666666666667, ans=0.21703125 2024-03-15 20:33:15,445 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.bypass_mid.scale_min, batch_count=6036.666666666667, ans=0.6887166666666666 2024-03-15 20:33:20,531 INFO [train_char.py:689] (1/2) Epoch 4, batch 300, loss[loss=0.1269, simple_loss=0.1629, pruned_loss=0.04152, over 24363.00 frames. ], tot_loss[loss=0.1708, simple_loss=0.193, pruned_loss=0.06947, over 3753132.20 frames. ], batch size: 158, lr: 3.87e-02, grad_scale: 32.0 2024-03-15 20:33:49,146 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.attention_skip_rate, batch_count=6136.666666666667, ans=0.04109722222222222 2024-03-15 20:33:53,107 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_module2.balancer1.prob, batch_count=6136.666666666667, ans=0.21234375 2024-03-15 20:34:10,200 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=8.03 vs. 
limit=8.085 2024-03-15 20:34:17,567 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=6203.333333333333, ans=0.0 2024-03-15 20:34:17,627 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=6203.333333333333, ans=0.0 2024-03-15 20:34:18,909 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module1.balancer2.prob, batch_count=6203.333333333333, ans=0.20921875 2024-03-15 20:34:29,788 INFO [train_char.py:689] (1/2) Epoch 4, batch 350, loss[loss=0.19, simple_loss=0.2124, pruned_loss=0.08169, over 24209.00 frames. ], tot_loss[loss=0.1689, simple_loss=0.1925, pruned_loss=0.06831, over 3994047.51 frames. ], batch size: 311, lr: 3.86e-02, grad_scale: 32.0 2024-03-15 20:34:31,378 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.balancer1.prob, batch_count=6236.666666666667, ans=0.20765625 2024-03-15 20:34:43,122 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_skip_rate, batch_count=6270.0, ans=0.04054166666666667 2024-03-15 20:34:44,392 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward2.hidden_balancer.prob, batch_count=6270.0, ans=0.20609375000000002 2024-03-15 20:34:47,163 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.balancer1.prob, batch_count=6270.0, ans=0.20609375000000002 2024-03-15 20:35:09,478 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=8.37 vs. limit=9.87625 2024-03-15 20:35:10,414 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module1.balancer1.max_abs, batch_count=6336.666666666667, ans=8.960416666666667 2024-03-15 20:35:13,928 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 9.936e+01 1.387e+02 1.543e+02 1.746e+02 2.964e+02, threshold=3.085e+02, percent-clipped=0.0 2024-03-15 20:35:23,803 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=6.27 vs. limit=6.5925 2024-03-15 20:35:35,589 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.self_attn2.whiten, num_groups=1, num_channels=384, metric=11.85 vs. limit=12.3025 2024-03-15 20:35:36,170 INFO [train_char.py:689] (1/2) Epoch 4, batch 400, loss[loss=0.1824, simple_loss=0.2124, pruned_loss=0.0747, over 24207.00 frames. ], tot_loss[loss=0.1689, simple_loss=0.194, pruned_loss=0.06826, over 4179913.75 frames. ], batch size: 212, lr: 3.86e-02, grad_scale: 32.0 2024-03-15 20:35:53,355 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.balancer1.prob, batch_count=6436.666666666667, ans=0.19828125000000002 2024-03-15 20:36:00,448 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.01 vs. limit=3.9655 2024-03-15 20:36:09,371 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=9.58 vs. 
limit=9.92625 2024-03-15 20:36:11,511 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=6470.0, ans=0.19671875 2024-03-15 20:36:15,920 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=384, metric=6.79 vs. limit=6.6258333333333335 2024-03-15 20:36:43,143 INFO [train_char.py:689] (1/2) Epoch 4, batch 450, loss[loss=0.1793, simple_loss=0.2057, pruned_loss=0.07598, over 24090.00 frames. ], tot_loss[loss=0.1697, simple_loss=0.1958, pruned_loss=0.06874, over 4325139.26 frames. ], batch size: 279, lr: 3.86e-02, grad_scale: 32.0 2024-03-15 20:36:52,313 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward2.hidden_balancer.prob, batch_count=6570.0, ans=0.19203125 2024-03-15 20:36:57,375 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.ff2_skip_rate, batch_count=6603.333333333333, ans=0.009434057971014494 2024-03-15 20:37:02,554 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.self_attn1.whiten, num_groups=1, num_channels=512, metric=13.02 vs. limit=12.4525 2024-03-15 20:37:27,416 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 1.128e+02 1.382e+02 1.515e+02 1.771e+02 3.341e+02, threshold=3.031e+02, percent-clipped=1.0 2024-03-15 20:37:34,580 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module1.balancer2.prob, batch_count=6703.333333333333, ans=0.18578125 2024-03-15 20:37:35,835 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.attention_skip_rate, batch_count=6703.333333333333, ans=0.03873611111111112 2024-03-15 20:37:39,598 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_module1.balancer2.prob, batch_count=6703.333333333333, ans=0.18578125 2024-03-15 20:37:46,395 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=384, metric=12.15 vs. limit=12.5275 2024-03-15 20:37:48,133 INFO [train_char.py:689] (1/2) Epoch 4, batch 500, loss[loss=0.1716, simple_loss=0.2066, pruned_loss=0.06828, over 24224.00 frames. ], tot_loss[loss=0.17, simple_loss=0.1981, pruned_loss=0.06864, over 4437944.59 frames. ], batch size: 224, lr: 3.85e-02, grad_scale: 32.0 2024-03-15 20:38:43,777 INFO [train_char.py:689] (1/2) Epoch 5, batch 0, loss[loss=0.1488, simple_loss=0.1647, pruned_loss=0.06644, over 23915.00 frames. ], tot_loss[loss=0.1488, simple_loss=0.1647, pruned_loss=0.06644, over 23915.00 frames. ], batch size: 407, lr: 3.59e-02, grad_scale: 32.0 2024-03-15 20:38:43,777 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-15 20:38:56,466 INFO [train_char.py:721] (1/2) Epoch 5, validation: loss=0.1042, simple_loss=0.1552, pruned_loss=0.02658, over 657665.00 frames. 2024-03-15 20:38:56,467 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-15 20:39:06,159 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.ff3_skip_rate, batch_count=6760.0, ans=0.0094 2024-03-15 20:39:28,623 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=6.71 vs. 
limit=6.706666666666667 2024-03-15 20:39:32,311 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=13.33 vs. limit=10.06 2024-03-15 20:39:37,102 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward3.hidden_balancer.prob, batch_count=6826.666666666667, ans=0.18 2024-03-15 20:39:42,504 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.balancer.max_positive, batch_count=6860.0, ans=0.8186 2024-03-15 20:40:06,291 INFO [train_char.py:689] (1/2) Epoch 5, batch 50, loss[loss=0.1541, simple_loss=0.1706, pruned_loss=0.06885, over 23958.00 frames. ], tot_loss[loss=0.1502, simple_loss=0.1851, pruned_loss=0.05766, over 1093091.65 frames. ], batch size: 407, lr: 3.58e-02, grad_scale: 32.0 2024-03-15 20:40:26,231 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.self_attn1.whiten, num_groups=1, num_channels=512, metric=12.66 vs. limit=12.719999999999999 2024-03-15 20:40:47,747 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 1.107e+02 1.334e+02 1.501e+02 1.668e+02 3.121e+02, threshold=3.001e+02, percent-clipped=1.0 2024-03-15 20:40:55,017 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=12.58 vs. limit=10.135 2024-03-15 20:40:56,236 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_module1.whiten.whitening_limit, batch_count=7026.666666666667, ans=10.135 2024-03-15 20:40:58,311 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.self_attn_weights.pos_emb_skip_rate, batch_count=7026.666666666667, ans=0.0 2024-03-15 20:41:09,117 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=8.83 vs. limit=10.1475 2024-03-15 20:41:17,663 INFO [train_char.py:689] (1/2) Epoch 5, batch 100, loss[loss=0.116, simple_loss=0.1519, pruned_loss=0.04002, over 24375.00 frames. ], tot_loss[loss=0.1487, simple_loss=0.1831, pruned_loss=0.05721, over 1917043.79 frames. ], batch size: 129, lr: 3.58e-02, grad_scale: 32.0 2024-03-15 20:41:28,546 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.self_attn2.whiten, num_groups=1, num_channels=512, metric=14.53 vs. limit=12.82 2024-03-15 20:41:36,264 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=384, metric=12.49 vs. limit=12.844999999999999 2024-03-15 20:41:37,135 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.ff3_skip_rate, batch_count=7126.666666666667, ans=0.009320289855072463 2024-03-15 20:41:39,921 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=7126.666666666667, ans=0.16593750000000002 2024-03-15 20:41:47,542 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_module2.balancer2.prob, batch_count=7160.0, ans=0.164375 2024-03-15 20:41:58,345 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.conv_module1.whiten, num_groups=1, num_channels=384, metric=5.44 vs. 
limit=10.185 2024-03-15 20:42:00,563 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.bypass_mid.scale_min, batch_count=7193.333333333333, ans=0.6482333333333334 2024-03-15 20:42:07,019 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=7193.333333333333, ans=0.22806666666666667 2024-03-15 20:42:26,267 INFO [train_char.py:689] (1/2) Epoch 5, batch 150, loss[loss=0.1709, simple_loss=0.2044, pruned_loss=0.06873, over 24161.00 frames. ], tot_loss[loss=0.1483, simple_loss=0.1821, pruned_loss=0.05719, over 2555115.48 frames. ], batch size: 279, lr: 3.58e-02, grad_scale: 32.0 2024-03-15 20:42:44,740 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module2.balancer1.prob, batch_count=7293.333333333333, ans=0.15812500000000002 2024-03-15 20:42:45,131 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.nonlin_attention.whiten1, num_groups=1, num_channels=384, metric=6.92 vs. limit=6.823333333333333 2024-03-15 20:42:47,197 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.out_combiner.scale_min, batch_count=7293.333333333333, ans=0.6447333333333334 2024-03-15 20:42:48,536 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=7293.333333333333, ans=0.15812500000000002 2024-03-15 20:43:01,618 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 9.239e+01 1.296e+02 1.474e+02 1.745e+02 2.581e+02, threshold=2.947e+02, percent-clipped=0.0 2024-03-15 20:43:03,829 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=9.15 vs. limit=8.663333333333334 2024-03-15 20:43:08,547 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_skip_rate, batch_count=7360.0, ans=0.036000000000000004 2024-03-15 20:43:26,618 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=3.90 vs. limit=4.109 2024-03-15 20:43:32,180 INFO [train_char.py:689] (1/2) Epoch 5, batch 200, loss[loss=0.1815, simple_loss=0.2152, pruned_loss=0.07388, over 24154.00 frames. ], tot_loss[loss=0.1461, simple_loss=0.1806, pruned_loss=0.05581, over 3061227.40 frames. ], batch size: 279, lr: 3.58e-02, grad_scale: 32.0 2024-03-15 20:43:40,997 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.whiten, num_groups=1, num_channels=384, metric=5.88 vs. limit=6.970666666666666 2024-03-15 20:43:41,761 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward2.hidden_balancer.prob, batch_count=7426.666666666667, ans=0.15187499999999998 2024-03-15 20:43:48,445 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=6.97 vs. limit=6.865 2024-03-15 20:43:54,478 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.balancer1.prob, batch_count=7460.0, ans=0.15031250000000002 2024-03-15 20:43:57,531 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=10.71 vs. 
limit=10.2975 2024-03-15 20:44:12,077 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=256, metric=13.48 vs. limit=13.120000000000001 2024-03-15 20:44:28,847 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.convnext.out_balancer.prob, batch_count=7560.0, ans=0.145625 2024-03-15 20:44:34,758 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.whiten, num_groups=1, num_channels=192, metric=4.12 vs. limit=7.024 2024-03-15 20:44:40,044 INFO [train_char.py:689] (1/2) Epoch 5, batch 250, loss[loss=0.1015, simple_loss=0.1437, pruned_loss=0.0296, over 24247.00 frames. ], tot_loss[loss=0.1467, simple_loss=0.1817, pruned_loss=0.05587, over 3454416.25 frames. ], batch size: 134, lr: 3.57e-02, grad_scale: 32.0 2024-03-15 20:45:17,200 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 1.039e+02 1.273e+02 1.431e+02 1.641e+02 2.139e+02, threshold=2.863e+02, percent-clipped=0.0 2024-03-15 20:45:46,789 INFO [train_char.py:689] (1/2) Epoch 5, batch 300, loss[loss=0.175, simple_loss=0.2109, pruned_loss=0.06962, over 24175.00 frames. ], tot_loss[loss=0.1466, simple_loss=0.1821, pruned_loss=0.05552, over 3761365.96 frames. ], batch size: 199, lr: 3.57e-02, grad_scale: 32.0 2024-03-15 20:45:57,517 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=6.26 vs. limit=6.9399999999999995 2024-03-15 20:46:33,784 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=384, metric=12.80 vs. limit=13.395 2024-03-15 20:46:54,026 INFO [train_char.py:689] (1/2) Epoch 5, batch 350, loss[loss=0.1717, simple_loss=0.2109, pruned_loss=0.06625, over 24227.00 frames. ], tot_loss[loss=0.1477, simple_loss=0.1833, pruned_loss=0.05607, over 3995305.53 frames. ], batch size: 212, lr: 3.57e-02, grad_scale: 32.0 2024-03-15 20:47:01,973 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.ff3_skip_rate, batch_count=7926.666666666667, ans=0.009146376811594203 2024-03-15 20:47:07,136 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module2.balancer2.prob, batch_count=7960.0, ans=0.12687500000000002 2024-03-15 20:47:12,325 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_module2.balancer2.prob, batch_count=7960.0, ans=0.12687500000000002 2024-03-15 20:47:16,080 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module2.balancer1.prob, batch_count=7960.0, ans=0.12687500000000002 2024-03-15 20:47:29,122 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 1.080e+02 1.281e+02 1.438e+02 1.669e+02 2.808e+02, threshold=2.877e+02, percent-clipped=0.0 2024-03-15 20:47:35,753 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=8026.666666666667, ans=0.0 2024-03-15 20:47:37,512 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=4.08 vs. limit=4.204 2024-03-15 20:47:59,583 INFO [train_char.py:689] (1/2) Epoch 5, batch 400, loss[loss=0.1623, simple_loss=0.2037, pruned_loss=0.06046, over 24125.00 frames. ], tot_loss[loss=0.1477, simple_loss=0.1843, pruned_loss=0.05557, over 4175047.93 frames. 
], batch size: 279, lr: 3.56e-02, grad_scale: 32.0 2024-03-15 20:48:22,135 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward1.out_proj.dropout_p, batch_count=8126.666666666667, ans=0.21873333333333334 2024-03-15 20:48:37,754 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.conv_module1.whiten, num_groups=1, num_channels=384, metric=3.49 vs. limit=10.5725 2024-03-15 20:48:48,317 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=8193.333333333334, ans=0.125 2024-03-15 20:48:59,350 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_module1.balancer2.prob, batch_count=8226.666666666666, ans=0.125 2024-03-15 20:49:04,035 INFO [train_char.py:689] (1/2) Epoch 5, batch 450, loss[loss=0.155, simple_loss=0.1966, pruned_loss=0.0567, over 24036.00 frames. ], tot_loss[loss=0.1482, simple_loss=0.1857, pruned_loss=0.05534, over 4322720.73 frames. ], batch size: 199, lr: 3.56e-02, grad_scale: 32.0 2024-03-15 20:49:05,406 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward2.hidden_balancer.prob, batch_count=8260.0, ans=0.125 2024-03-15 20:49:07,211 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=384, metric=7.19 vs. limit=7.0649999999999995 2024-03-15 20:49:07,947 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.bypass_mid.scale_min, batch_count=8260.0, ans=0.6109 2024-03-15 20:49:21,405 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_skip_rate, batch_count=8293.333333333334, ans=0.03211111111111111 2024-03-15 20:49:26,675 INFO [scaling.py:1023] (1/2) Whitening: name=encoder_embed.out_whiten, num_groups=1, num_channels=192, metric=5.55 vs. limit=5.658666666666667 2024-03-15 20:49:38,030 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 1.036e+02 1.311e+02 1.413e+02 1.606e+02 2.821e+02, threshold=2.826e+02, percent-clipped=0.0 2024-03-15 20:49:43,881 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.whiten, num_groups=1, num_channels=256, metric=4.73 vs. limit=7.343999999999999 2024-03-15 20:49:53,765 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.16 vs. limit=4.259 2024-03-15 20:49:58,086 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.self_attn_weights.whiten_keys.whitening_limit, batch_count=8393.333333333334, ans=4.259 2024-03-15 20:49:59,641 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.conv_module2.whiten, num_groups=1, num_channels=192, metric=4.00 vs. limit=10.6475 2024-03-15 20:50:01,402 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.balancer1.prob, batch_count=8393.333333333334, ans=0.125 2024-03-15 20:50:07,393 INFO [train_char.py:689] (1/2) Epoch 5, batch 500, loss[loss=0.1527, simple_loss=0.1973, pruned_loss=0.05404, over 24043.00 frames. ], tot_loss[loss=0.1477, simple_loss=0.186, pruned_loss=0.05467, over 4437223.53 frames. ], batch size: 199, lr: 3.55e-02, grad_scale: 32.0 2024-03-15 20:51:05,262 INFO [train_char.py:689] (1/2) Epoch 6, batch 0, loss[loss=0.1183, simple_loss=0.1619, pruned_loss=0.0373, over 24446.00 frames. 
], tot_loss[loss=0.1183, simple_loss=0.1619, pruned_loss=0.0373, over 24446.00 frames. ], batch size: 165, lr: 3.32e-02, grad_scale: 32.0 2024-03-15 20:51:05,263 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-15 20:51:16,937 INFO [train_char.py:721] (1/2) Epoch 6, validation: loss=0.09386, simple_loss=0.1462, pruned_loss=0.02078, over 657665.00 frames. 2024-03-15 20:51:16,938 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-15 20:51:31,112 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.whiten, num_groups=1, num_channels=192, metric=3.80 vs. limit=7.3933333333333335 2024-03-15 20:51:31,683 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_skip_rate, batch_count=8483.333333333334, ans=0.03131944444444444 2024-03-15 20:51:37,314 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.nonlin_attention.whiten2, num_groups=1, num_channels=512, metric=5.72 vs. limit=9.241666666666667 2024-03-15 20:51:51,445 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module1.balancer2.prob, batch_count=8516.666666666666, ans=0.125 2024-03-15 20:52:05,128 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.conv_module2.whiten, num_groups=1, num_channels=512, metric=5.17 vs. limit=10.70625 2024-03-15 20:52:10,688 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=10.59 vs. limit=10.71875 2024-03-15 20:52:12,708 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=8583.333333333334, ans=0.21416666666666667 2024-03-15 20:52:22,880 INFO [train_char.py:689] (1/2) Epoch 6, batch 50, loss[loss=0.1096, simple_loss=0.1584, pruned_loss=0.03036, over 24414.00 frames. ], tot_loss[loss=0.1346, simple_loss=0.1725, pruned_loss=0.04837, over 1090206.16 frames. ], batch size: 158, lr: 3.31e-02, grad_scale: 32.0 2024-03-15 20:52:34,282 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=9.03 vs. limit=9.308333333333334 2024-03-15 20:52:56,039 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=144, metric=7.04 vs. limit=7.1625 2024-03-15 20:52:57,569 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.674e+01 1.220e+02 1.395e+02 1.528e+02 2.413e+02, threshold=2.789e+02, percent-clipped=0.0 2024-03-15 20:53:00,573 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_module2.whiten.whitening_limit, batch_count=8683.333333333334, ans=10.756250000000001 2024-03-15 20:53:07,539 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.ff2_skip_rate, batch_count=8683.333333333334, ans=0.008981884057971014 2024-03-15 20:53:16,625 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_module1.balancer2.prob, batch_count=8716.666666666666, ans=0.125 2024-03-15 20:53:29,685 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=384, metric=13.73 vs. 
limit=14.0625 2024-03-15 20:53:35,211 INFO [train_char.py:689] (1/2) Epoch 6, batch 100, loss[loss=0.1627, simple_loss=0.2063, pruned_loss=0.05961, over 24064.00 frames. ], tot_loss[loss=0.1352, simple_loss=0.1748, pruned_loss=0.04778, over 1919702.37 frames. ], batch size: 199, lr: 3.31e-02, grad_scale: 32.0 2024-03-15 20:53:51,329 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.self_attn2.whiten, num_groups=1, num_channels=384, metric=13.54 vs. limit=14.1125 2024-03-15 20:53:51,395 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=11.13 vs. limit=10.80625 2024-03-15 20:54:39,058 INFO [train_char.py:689] (1/2) Epoch 6, batch 150, loss[loss=0.1139, simple_loss=0.1585, pruned_loss=0.03464, over 24389.00 frames. ], tot_loss[loss=0.1352, simple_loss=0.1752, pruned_loss=0.04761, over 2558150.26 frames. ], batch size: 152, lr: 3.31e-02, grad_scale: 32.0 2024-03-15 20:54:52,036 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.balancer1.prob, batch_count=8983.333333333334, ans=0.125 2024-03-15 20:54:59,723 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.0.layers.1.self_attn_weights, loss-sum=0.000e+00 2024-03-15 20:55:04,791 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 9.003e+01 1.266e+02 1.394e+02 1.550e+02 2.577e+02, threshold=2.787e+02, percent-clipped=0.0 2024-03-15 20:55:36,943 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.self_attn_weights.pos_emb_skip_rate, batch_count=9083.333333333334, ans=0.0 2024-03-15 20:55:50,072 INFO [train_char.py:689] (1/2) Epoch 6, batch 200, loss[loss=0.09754, simple_loss=0.1397, pruned_loss=0.02769, over 24337.00 frames. ], tot_loss[loss=0.1331, simple_loss=0.1736, pruned_loss=0.04631, over 3060068.63 frames. ], batch size: 146, lr: 3.30e-02, grad_scale: 32.0 2024-03-15 20:55:59,923 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.self_attn_weights.whiten_keys.whitening_limit, batch_count=9116.666666666666, ans=4.3675 2024-03-15 20:56:01,965 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.ff2_skip_rate, batch_count=9150.0, ans=0.008880434782608696 2024-03-15 20:56:21,170 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=9183.333333333334, ans=0.125 2024-03-15 20:56:41,214 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.nonlin_attention.balancer.min_positive, batch_count=9250.0, ans=0.1575 2024-03-15 20:56:50,043 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward3.hidden_balancer.prob, batch_count=9250.0, ans=0.125 2024-03-15 20:56:53,708 INFO [train_char.py:689] (1/2) Epoch 6, batch 250, loss[loss=0.1185, simple_loss=0.1507, pruned_loss=0.04317, over 23929.00 frames. ], tot_loss[loss=0.1334, simple_loss=0.1736, pruned_loss=0.04661, over 3440823.06 frames. ], batch size: 407, lr: 3.30e-02, grad_scale: 32.0 2024-03-15 20:56:59,118 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.balancer.prob, batch_count=9283.333333333334, ans=0.125 2024-03-15 20:57:02,008 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=6.65 vs. 
2024-03-15 20:57:11,626 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=9316.666666666666, ans=0.20683333333333334
2024-03-15 20:57:15,737 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.balancer2.prob, batch_count=9316.666666666666, ans=0.125
2024-03-15 20:57:19,276 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.505e+01 1.206e+02 1.322e+02 1.448e+02 1.971e+02, threshold=2.645e+02, percent-clipped=0.0
2024-03-15 20:57:20,206 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=192, metric=9.06 vs. limit=11.00625
2024-03-15 20:57:32,067 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module1.balancer2.prob, batch_count=9383.333333333334, ans=0.125
2024-03-15 20:57:39,473 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward1.hidden_balancer.prob, batch_count=9383.333333333334, ans=0.125
2024-03-15 20:57:42,021 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module1.balancer2.min_positive, batch_count=9383.333333333334, ans=0.05
2024-03-15 20:57:44,582 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_proj.dropout_p, batch_count=9416.666666666666, ans=0.20583333333333334
2024-03-15 20:57:47,014 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_skip_rate, batch_count=9416.666666666666, ans=0.02743055555555556
2024-03-15 20:57:56,800 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.balancer_ff3.min_abs, batch_count=9416.666666666666, ans=0.2
2024-03-15 20:57:59,161 INFO [train_char.py:689] (1/2) Epoch 6, batch 300, loss[loss=0.1025, simple_loss=0.1457, pruned_loss=0.02967, over 24293.00 frames. ], tot_loss[loss=0.1322, simple_loss=0.1728, pruned_loss=0.04586, over 3746692.12 frames. ], batch size: 146, lr: 3.30e-02, grad_scale: 32.0
2024-03-15 20:58:08,795 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.self_attn_weights.pos_emb_skip_rate, batch_count=9450.0, ans=0.0
2024-03-15 20:58:16,101 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module1.balancer2.prob, batch_count=9483.333333333334, ans=0.125
2024-03-15 20:58:18,591 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module1.balancer1.prob, batch_count=9483.333333333334, ans=0.125
2024-03-15 20:58:24,186 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=384, metric=15.57 vs. limit=14.6125
2024-03-15 20:58:33,374 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.bypass.skip_rate, batch_count=9516.666666666666, ans=0.07
2024-03-15 20:58:47,816 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=9.42 vs. limit=9.775
2024-03-15 20:58:56,367 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.nonlin_attention.whiten2, num_groups=1, num_channels=512, metric=5.76 vs. limit=9.791666666666668
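The WARNING [optim.py:487] lines summarize the distribution of recent gradient norms: the five numbers are quantiles (min, 25%, median, 75%, max), threshold is the clipping cutoff, and percent-clipped is the fraction of recent batches whose gradient norm exceeded it. The logged values are consistent with threshold = Clipping_scale * median (2 * 1.322e+02 ≈ 2.645e+02 in the warning above). The real bookkeeping lives in icefall's optim.py; the following is only an illustrative sketch of median-based clipping under that assumption:

import collections
import torch

class QuartileClipper:
    """Track recent grad norms; clip to clipping_scale * running median."""
    def __init__(self, clipping_scale: float = 2.0, window: int = 128):
        self.scale = clipping_scale
        self.norms = collections.deque(maxlen=window)

    def clip_(self, params) -> None:
        grads = [p.grad for p in params if p.grad is not None]
        norm = torch.sqrt(sum((g ** 2).sum() for g in grads)).item()
        self.norms.append(norm)
        q = torch.quantile(torch.tensor(list(self.norms)),
                           torch.tensor([0.0, 0.25, 0.5, 0.75, 1.0]))
        threshold = self.scale * q[2].item()  # assumed: scale * median
        if norm > threshold:
            for g in grads:
                g.mul_(threshold / norm)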
2024-03-15 20:58:57,272 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module1.balancer2.prob, batch_count=9583.333333333334, ans=0.125
2024-03-15 20:59:02,185 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_skip_rate, batch_count=9616.666666666666, ans=0.026597222222222227
2024-03-15 20:59:03,058 INFO [train_char.py:689] (1/2) Epoch 6, batch 350, loss[loss=0.1287, simple_loss=0.1648, pruned_loss=0.04628, over 24281.00 frames. ], tot_loss[loss=0.1323, simple_loss=0.1732, pruned_loss=0.04573, over 3982343.00 frames. ], batch size: 328, lr: 3.29e-02, grad_scale: 32.0
2024-03-15 20:59:10,429 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=9616.666666666666, ans=0.125
2024-03-15 20:59:27,800 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 9.602e+01 1.250e+02 1.405e+02 1.526e+02 4.340e+02, threshold=2.811e+02, percent-clipped=2.0
2024-03-15 20:59:30,604 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_proj.dropout_p, batch_count=9683.333333333334, ans=0.20316666666666666
2024-03-15 20:59:36,153 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=7.56 vs. limit=9.841666666666667
2024-03-15 20:59:57,818 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=13.44 vs. limit=11.15625
2024-03-15 20:59:57,835 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.self_attn1.whiten.whitening_limit, batch_count=9750.0, ans=14.8125
2024-03-15 20:59:58,646 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_skip_rate, batch_count=9750.0, ans=0.026041666666666668
2024-03-15 21:00:03,021 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=192, metric=9.41 vs. limit=9.875
2024-03-15 21:00:04,147 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.self_attn2.whiten, num_groups=1, num_channels=384, metric=14.36 vs. limit=14.8125
2024-03-15 21:00:08,422 INFO [train_char.py:689] (1/2) Epoch 6, batch 400, loss[loss=0.1225, simple_loss=0.1503, pruned_loss=0.04736, over 23993.00 frames. ], tot_loss[loss=0.1334, simple_loss=0.1745, pruned_loss=0.04616, over 4170824.88 frames. ], batch size: 381, lr: 3.29e-02, grad_scale: 32.0
2024-03-15 21:00:11,577 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=11.73 vs. limit=11.16875
2024-03-15 21:00:41,434 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_module1.balancer1.prob, batch_count=9850.0, ans=0.125
2024-03-15 21:00:47,854 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=9.51 vs. limit=11.20625
2024-03-15 21:01:06,065 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=8.56 vs. limit=11.21875
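The ScheduledFloat lines show hyperparameters that are functions of batch_count rather than constants. The dropout values above are consistent with simple linear interpolation: feed_forward1.out_proj.dropout_p follows 0.3 - 1e-5 * batch_count in this region (0.3 - 1e-5 * 9683.33 = 0.20317, matching ans=0.20316666666666666). A minimal piecewise-linear schedule sketch; the breakpoints below are hypothetical values chosen to reproduce that slope, and the real ScheduledFloat in scaling.py supports more than this:

def scheduled_float(batch_count: float, schedule) -> float:
    """Piecewise-linear interpolation over (batch_count, value) breakpoints."""
    x0, y0 = schedule[0]
    if batch_count <= x0:
        return y0
    for x1, y1 in schedule[1:]:
        if batch_count <= x1:
            t = (batch_count - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
        x0, y0 = x1, y1
    return y0  # clamp at the last breakpoint

# Hypothetical schedule reproducing the dropout_p values in this region:
dropout_schedule = [(0.0, 0.3), (20000.0, 0.1)]
print(scheduled_float(9683.333, dropout_schedule))  # ~0.20317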
2024-03-15 21:01:12,948 INFO [train_char.py:689] (1/2) Epoch 6, batch 450, loss[loss=0.1445, simple_loss=0.1934, pruned_loss=0.04778, over 24135.00 frames. ], tot_loss[loss=0.1334, simple_loss=0.1749, pruned_loss=0.04596, over 4318591.16 frames. ], batch size: 251, lr: 3.28e-02, grad_scale: 32.0
2024-03-15 21:01:25,956 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.ff2_skip_rate, batch_count=9983.333333333334, ans=0.008699275362318841
2024-03-15 21:01:38,551 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.947e+01 1.180e+02 1.287e+02 1.488e+02 2.013e+02, threshold=2.574e+02, percent-clipped=0.0
2024-03-15 21:02:04,452 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.nonlin_attention.balancer.max_positive, batch_count=10083.333333333334, ans=0.8508333333333333
2024-03-15 21:02:11,831 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.ff3_skip_rate, batch_count=10083.333333333334, ans=0.008677536231884058
2024-03-15 21:02:16,580 INFO [train_char.py:689] (1/2) Epoch 6, batch 500, loss[loss=0.1509, simple_loss=0.1996, pruned_loss=0.05114, over 24112.00 frames. ], tot_loss[loss=0.1343, simple_loss=0.1769, pruned_loss=0.04586, over 4432543.55 frames. ], batch size: 223, lr: 3.28e-02, grad_scale: 32.0
2024-03-15 21:02:18,090 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.attention_skip_rate, batch_count=10116.666666666666, ans=0.024513888888888894
2024-03-15 21:03:13,381 INFO [train_char.py:689] (1/2) Epoch 7, batch 0, loss[loss=0.1349, simple_loss=0.1763, pruned_loss=0.04669, over 24100.00 frames. ], tot_loss[loss=0.1349, simple_loss=0.1763, pruned_loss=0.04669, over 24100.00 frames. ], batch size: 279, lr: 3.07e-02, grad_scale: 64.0
2024-03-15 21:03:13,382 INFO [train_char.py:712] (1/2) Computing validation loss
2024-03-15 21:03:25,713 INFO [train_char.py:721] (1/2) Epoch 7, validation: loss=0.08738, simple_loss=0.1405, pruned_loss=0.01713, over 657665.00 frames.
2024-03-15 21:03:25,714 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB
2024-03-15 21:03:30,028 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.ff2_skip_rate, batch_count=10140.0, ans=0.008665217391304348
2024-03-15 21:03:43,111 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_module1.balancer2.prob, batch_count=10173.333333333334, ans=0.125
2024-03-15 21:04:25,845 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.06 vs. limit=4.541
2024-03-15 21:04:38,224 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.attention_skip_rate, batch_count=10306.666666666666, ans=0.023722222222222228
2024-03-15 21:04:39,258 INFO [train_char.py:689] (1/2) Epoch 7, batch 50, loss[loss=0.09879, simple_loss=0.1484, pruned_loss=0.02458, over 24470.00 frames. ], tot_loss[loss=0.1244, simple_loss=0.1667, pruned_loss=0.04103, over 1074701.15 frames. ], batch size: 165, lr: 3.07e-02, grad_scale: 64.0
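grad_scale is the loss-scaling factor used for mixed-precision (fp16) training. It doubles from 32.0 to 64.0 at the epoch 7 restart above, and later in the log it returns to 32.0, which is the usual behaviour of a dynamic scaler that grows the scale after a run of overflow-free steps and halves it when gradients overflow. PyTorch's torch.cuda.amp.GradScaler implements the real logic; a minimal sketch of that update rule:

class ToyGradScaler:
    """Dynamic loss scaling: double after `growth_interval` good steps,
    halve immediately on overflow (inf/nan in gradients)."""
    def __init__(self, init_scale: float = 32.0, growth_interval: int = 2000):
        self.scale = init_scale
        self.growth_interval = growth_interval
        self._good_steps = 0

    def update(self, found_overflow: bool) -> None:
        if found_overflow:
            self.scale /= 2.0      # back off; the optimizer step is skipped
            self._good_steps = 0
        else:
            self._good_steps += 1
            if self._good_steps % self.growth_interval == 0:
                self.scale *= 2.0  # e.g. 32.0 -> 64.0, as seen above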
2024-03-15 21:04:43,578 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=10306.666666666666, ans=0.125
2024-03-15 21:04:56,561 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.529e+01 1.149e+02 1.269e+02 1.425e+02 2.118e+02, threshold=2.537e+02, percent-clipped=0.0
2024-03-15 21:05:26,164 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=11.22 vs. limit=11.4025
2024-03-15 21:05:28,191 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.balancer2.prob, batch_count=10406.666666666666, ans=0.125
2024-03-15 21:05:44,150 INFO [train_char.py:689] (1/2) Epoch 7, batch 100, loss[loss=0.1294, simple_loss=0.1775, pruned_loss=0.04067, over 24150.00 frames. ], tot_loss[loss=0.1225, simple_loss=0.1645, pruned_loss=0.04025, over 1897290.97 frames. ], batch size: 188, lr: 3.07e-02, grad_scale: 64.0
2024-03-15 21:05:45,067 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.self_attn2.whiten, num_groups=1, num_channels=192, metric=12.74 vs. limit=15.355
2024-03-15 21:05:47,379 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.whiten, num_groups=1, num_channels=384, metric=5.00 vs. limit=8.189333333333334
2024-03-15 21:05:54,062 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=192, metric=9.14 vs. limit=11.4275
2024-03-15 21:05:57,161 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_skip_rate, batch_count=10506.666666666666, ans=0.022888888888888893
2024-03-15 21:06:07,331 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.ff3_skip_rate, batch_count=10506.666666666666, ans=0.008585507246376812
2024-03-15 21:06:27,311 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=10573.333333333334, ans=0.19426666666666664
2024-03-15 21:06:48,073 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=14.15 vs. limit=11.4775
2024-03-15 21:06:52,539 INFO [train_char.py:689] (1/2) Epoch 7, batch 150, loss[loss=0.1395, simple_loss=0.1897, pruned_loss=0.04468, over 24006.00 frames. ], tot_loss[loss=0.121, simple_loss=0.164, pruned_loss=0.03901, over 2546292.92 frames. ], batch size: 199, lr: 3.06e-02, grad_scale: 64.0
2024-03-15 21:07:13,147 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.543e+01 1.182e+02 1.309e+02 1.499e+02 2.178e+02, threshold=2.618e+02, percent-clipped=0.0
2024-03-15 21:07:14,740 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=10673.333333333334, ans=0.19326666666666667
2024-03-15 21:07:40,447 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.balancer_ff2.min_abs, batch_count=10740.0, ans=0.1
2024-03-15 21:07:48,182 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.balancer1.prob, batch_count=10773.333333333334, ans=0.125
2024-03-15 21:08:00,624 INFO [train_char.py:689] (1/2) Epoch 7, batch 200, loss[loss=0.1486, simple_loss=0.199, pruned_loss=0.04908, over 24195.00 frames. ], tot_loss[loss=0.1207, simple_loss=0.1637, pruned_loss=0.03884, over 3051290.14 frames. ], batch size: 224, lr: 3.06e-02, grad_scale: 64.0
2024-03-15 21:08:12,482 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.balancer1.prob, batch_count=10840.0, ans=0.125
2024-03-15 21:08:44,077 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=10906.666666666666, ans=0.125
2024-03-15 21:08:51,410 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.ff3_skip_rate, batch_count=10940.0, ans=0.008491304347826087
2024-03-15 21:08:58,906 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward1.out_proj.dropout_p, batch_count=10940.0, ans=0.1906
2024-03-15 21:09:01,608 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module1.balancer1.prob, batch_count=10940.0, ans=0.125
2024-03-15 21:09:03,860 INFO [train_char.py:689] (1/2) Epoch 7, batch 250, loss[loss=0.1394, simple_loss=0.189, pruned_loss=0.04493, over 24092.00 frames. ], tot_loss[loss=0.1217, simple_loss=0.1655, pruned_loss=0.03897, over 3440775.60 frames. ], batch size: 279, lr: 3.06e-02, grad_scale: 64.0
2024-03-15 21:09:23,131 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 9.070e+01 1.140e+02 1.283e+02 1.415e+02 2.671e+02, threshold=2.567e+02, percent-clipped=1.0
2024-03-15 21:09:39,127 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=512, metric=15.46 vs. limit=15.78
2024-03-15 21:09:59,309 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.attention_skip_rate, batch_count=11106.666666666666, ans=0.020388888888888894
2024-03-15 21:10:00,678 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=11106.666666666666, ans=0.18893333333333334
2024-03-15 21:10:13,177 INFO [train_char.py:689] (1/2) Epoch 7, batch 300, loss[loss=0.1381, simple_loss=0.1839, pruned_loss=0.04609, over 24115.00 frames. ], tot_loss[loss=0.1215, simple_loss=0.1658, pruned_loss=0.03865, over 3749992.59 frames. ], batch size: 279, lr: 3.05e-02, grad_scale: 64.0
2024-03-15 21:10:34,695 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.conv.5.prob, batch_count=11173.333333333334, ans=0.125
2024-03-15 21:10:36,051 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_module1.balancer2.prob, batch_count=11173.333333333334, ans=0.125
2024-03-15 21:10:58,874 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=9.90 vs. limit=11.715
2024-03-15 21:11:16,068 INFO [train_char.py:689] (1/2) Epoch 7, batch 350, loss[loss=0.09785, simple_loss=0.1452, pruned_loss=0.02526, over 24315.00 frames. ], tot_loss[loss=0.1217, simple_loss=0.1659, pruned_loss=0.03873, over 3990753.01 frames. ], batch size: 140, lr: 3.05e-02, grad_scale: 64.0
2024-03-15 21:11:16,377 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.ff2_skip_rate, batch_count=11306.666666666666, ans=0.00841159420289855
2024-03-15 21:11:23,364 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass.scale_min, batch_count=11306.666666666666, ans=0.5042666666666666
2024-03-15 21:11:34,732 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.866e+01 1.129e+02 1.284e+02 1.477e+02 2.549e+02, threshold=2.568e+02, percent-clipped=0.0
2024-03-15 21:11:51,484 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=10.99 vs. limit=10.686666666666667
2024-03-15 21:11:56,555 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=384, metric=16.11 vs. limit=16.03
2024-03-15 21:12:23,034 INFO [train_char.py:689] (1/2) Epoch 7, batch 400, loss[loss=0.1308, simple_loss=0.1739, pruned_loss=0.04386, over 24210.00 frames. ], tot_loss[loss=0.1222, simple_loss=0.1669, pruned_loss=0.03877, over 4179047.45 frames. ], batch size: 311, lr: 3.05e-02, grad_scale: 64.0
2024-03-15 21:12:25,823 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_skip_rate, batch_count=11473.333333333334, ans=0.018861111111111106
2024-03-15 21:12:46,927 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=11506.666666666666, ans=0.125
2024-03-15 21:13:08,201 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.ff2_skip_rate, batch_count=11573.333333333334, ans=0.008353623188405797
2024-03-15 21:13:17,898 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.2.encoder.layers.1.self_attn_weights, loss-sum=0.000e+00
2024-03-15 21:13:19,144 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.2.encoder.layers.1.self_attn_weights, loss-sum=0.000e+00
2024-03-15 21:13:27,879 INFO [train_char.py:689] (1/2) Epoch 7, batch 450, loss[loss=0.1313, simple_loss=0.1842, pruned_loss=0.03923, over 24098.00 frames. ], tot_loss[loss=0.1232, simple_loss=0.1683, pruned_loss=0.03903, over 4325529.40 frames. ], batch size: 223, lr: 3.04e-02, grad_scale: 64.0
2024-03-15 21:13:29,485 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass_mid.scale_min, batch_count=11640.0, ans=0.4926000000000001
2024-03-15 21:13:35,363 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass_mid.scale_min, batch_count=11640.0, ans=0.4926000000000001
2024-03-15 21:13:44,129 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.510e+01 1.138e+02 1.275e+02 1.402e+02 1.953e+02, threshold=2.551e+02, percent-clipped=0.0
2024-03-15 21:13:45,640 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=11673.333333333334, ans=0.125
2024-03-15 21:13:46,205 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.feed_forward2.out_whiten, num_groups=1, num_channels=512, metric=11.68 vs. limit=11.877500000000001
2024-03-15 21:14:00,811 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=144, metric=7.70 vs. limit=7.926666666666666
2024-03-15 21:14:03,792 INFO [scaling.py:1023] (1/2) Whitening: name=encoder_embed.convnext.out_whiten, num_groups=1, num_channels=128, metric=4.37 vs. limit=5.0
2024-03-15 21:14:05,497 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=11740.0, ans=0.0
2024-03-15 21:14:12,735 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=11740.0, ans=0.18259999999999998
2024-03-15 21:14:24,431 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.whiten, num_groups=1, num_channels=512, metric=6.91 vs. limit=8.709333333333333
2024-03-15 21:14:30,194 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.balancer2.prob, batch_count=11806.666666666666, ans=0.125
2024-03-15 21:14:31,178 INFO [train_char.py:689] (1/2) Epoch 7, batch 500, loss[loss=0.1371, simple_loss=0.1828, pruned_loss=0.04564, over 24133.00 frames. ], tot_loss[loss=0.1238, simple_loss=0.1696, pruned_loss=0.03903, over 4439208.79 frames. ], batch size: 279, lr: 3.04e-02, grad_scale: 64.0
2024-03-15 21:14:31,370 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_module2.balancer1.prob, batch_count=11806.666666666666, ans=0.125
2024-03-15 21:15:28,113 INFO [train_char.py:689] (1/2) Epoch 8, batch 0, loss[loss=0.09929, simple_loss=0.1502, pruned_loss=0.02417, over 24360.00 frames. ], tot_loss[loss=0.09929, simple_loss=0.1502, pruned_loss=0.02417, over 24360.00 frames. ], batch size: 129, lr: 2.86e-02, grad_scale: 64.0
2024-03-15 21:15:28,113 INFO [train_char.py:712] (1/2) Computing validation loss
2024-03-15 21:15:37,352 INFO [zipformer.py:1858] (1/2) name=encoder.encoders.2.encoder.layers.1.self_attn_weights, attn_weights_entropy = tensor([3.5245, 3.0505, 2.7547, 3.3823], device='cuda:1')
2024-03-15 21:15:41,599 INFO [train_char.py:721] (1/2) Epoch 8, validation: loss=0.08204, simple_loss=0.1358, pruned_loss=0.01415, over 657665.00 frames.
2024-03-15 21:15:41,600 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB
2024-03-15 21:16:05,648 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward2.hidden_balancer.prob, batch_count=11863.333333333334, ans=0.125
2024-03-15 21:16:52,360 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=10.34 vs. limit=10.998333333333333
2024-03-15 21:16:52,631 INFO [train_char.py:689] (1/2) Epoch 8, batch 50, loss[loss=0.1097, simple_loss=0.1628, pruned_loss=0.02836, over 24145.00 frames. ], tot_loss[loss=0.1104, simple_loss=0.1544, pruned_loss=0.03321, over 1080537.33 frames. ], batch size: 188, lr: 2.86e-02, grad_scale: 64.0
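The [zipformer.py:1858] diagnostic above prints the entropy of self-attention weights, one value per head, during validation; values near log(num_keys) would mean near-uniform attention, while small values mean sharp, peaked attention. A sketch of that statistic, assuming weights of shape (num_heads, num_queries, num_keys) normalized over the key axis:

import torch

def attn_weights_entropy(attn: torch.Tensor, eps: float = 1e-20) -> torch.Tensor:
    # attn: (num_heads, num_queries, num_keys), rows sum to 1.
    ent = -(attn * (attn + eps).log()).sum(dim=-1)  # entropy per (head, query)
    return ent.mean(dim=-1)                          # average over queries

# A 4-head attention module would yield a 4-element tensor like the
# tensor([3.5245, 3.0505, 2.7547, 3.3823]) printed above.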
2024-03-15 21:16:52,972 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module1.balancer2.min_positive, batch_count=11996.666666666666, ans=0.05
2024-03-15 21:17:01,126 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.530e+01 1.124e+02 1.230e+02 1.388e+02 1.986e+02, threshold=2.460e+02, percent-clipped=0.0
2024-03-15 21:17:04,034 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=11996.666666666666, ans=0.0
2024-03-15 21:17:58,118 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=256, metric=18.58 vs. limit=12.04875
2024-03-15 21:18:05,210 INFO [train_char.py:689] (1/2) Epoch 8, batch 100, loss[loss=0.1028, simple_loss=0.1504, pruned_loss=0.02757, over 24377.00 frames. ], tot_loss[loss=0.1143, simple_loss=0.1594, pruned_loss=0.03457, over 1903085.91 frames. ], batch size: 158, lr: 2.85e-02, grad_scale: 64.0
2024-03-15 21:18:06,774 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass.scale_min, batch_count=12163.333333333334, ans=0.47428333333333333
2024-03-15 21:18:07,260 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=384, metric=9.10 vs. limit=8.040833333333333
2024-03-15 21:18:21,247 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=7.70 vs. limit=11.098333333333333
2024-03-15 21:18:27,415 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.5.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00
2024-03-15 21:18:37,766 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=12230.0, ans=0.17769999999999997
2024-03-15 21:18:58,716 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=12263.333333333334, ans=0.125
2024-03-15 21:19:00,456 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.conv_module1.whiten, num_groups=1, num_channels=384, metric=3.04 vs. limit=12.11125
2024-03-15 21:19:06,591 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.nonlin_attention.balancer.min_positive, batch_count=12296.666666666666, ans=0.12703333333333333
2024-03-15 21:19:09,161 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.balancer2.prob, batch_count=12296.666666666666, ans=0.125
2024-03-15 21:19:11,786 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.0.layers.1.self_attn_weights, loss-sum=0.000e+00
2024-03-15 21:19:14,093 INFO [train_char.py:689] (1/2) Epoch 8, batch 150, loss[loss=0.1283, simple_loss=0.1737, pruned_loss=0.04148, over 24104.00 frames. ], tot_loss[loss=0.1131, simple_loss=0.1579, pruned_loss=0.03409, over 2545841.65 frames. ], batch size: 279, lr: 2.85e-02, grad_scale: 64.0
2024-03-15 21:19:14,327 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module2.balancer1.prob, batch_count=12330.0, ans=0.125
2024-03-15 21:19:21,561 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.886e+01 1.136e+02 1.338e+02 1.602e+02 2.558e+02, threshold=2.676e+02, percent-clipped=2.0
2024-03-15 21:19:28,291 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_module1.balancer1.prob, batch_count=12363.333333333334, ans=0.125
2024-03-15 21:19:36,232 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module1.balancer2.prob, batch_count=12363.333333333334, ans=0.125
2024-03-15 21:19:46,369 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.balancer1.prob, batch_count=12396.666666666666, ans=0.125
2024-03-15 21:20:00,453 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_module2.balancer2.prob, batch_count=12430.0, ans=0.125
2024-03-15 21:20:10,875 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_proj.dropout_p, batch_count=12463.333333333334, ans=0.17536666666666667
2024-03-15 21:20:15,801 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass.scale_min, batch_count=12463.333333333334, ans=0.4637833333333333
2024-03-15 21:20:18,063 INFO [train_char.py:689] (1/2) Epoch 8, batch 200, loss[loss=0.09446, simple_loss=0.1424, pruned_loss=0.02324, over 24363.00 frames. ], tot_loss[loss=0.114, simple_loss=0.1595, pruned_loss=0.03428, over 3050157.58 frames. ], batch size: 152, lr: 2.85e-02, grad_scale: 64.0
2024-03-15 21:20:40,103 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.attention_skip_rate, batch_count=12530.0, ans=0.014458333333333337
2024-03-15 21:20:40,673 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=10.44 vs. limit=11.265
2024-03-15 21:21:01,399 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward1.hidden_balancer.prob, batch_count=12596.666666666666, ans=0.125
2024-03-15 21:21:06,435 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_skip_rate, batch_count=12596.666666666666, ans=0.014180555555555557
2024-03-15 21:21:25,057 INFO [train_char.py:689] (1/2) Epoch 8, batch 250, loss[loss=0.1167, simple_loss=0.1623, pruned_loss=0.03553, over 24195.00 frames. ], tot_loss[loss=0.1144, simple_loss=0.1608, pruned_loss=0.03403, over 3447119.52 frames. ], batch size: 311, lr: 2.84e-02, grad_scale: 64.0
2024-03-15 21:21:33,126 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.876e+01 1.081e+02 1.224e+02 1.402e+02 2.503e+02, threshold=2.448e+02, percent-clipped=0.0
2024-03-15 21:22:14,777 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_module1.balancer1.prob, batch_count=12763.333333333334, ans=0.125
2024-03-15 21:22:25,303 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module1.balancer1.prob, batch_count=12796.666666666666, ans=0.125
2024-03-15 21:22:32,680 INFO [train_char.py:689] (1/2) Epoch 8, batch 300, loss[loss=0.1151, simple_loss=0.1633, pruned_loss=0.03344, over 24417.00 frames. ], tot_loss[loss=0.1149, simple_loss=0.1617, pruned_loss=0.0341, over 3754657.16 frames. ], batch size: 165, lr: 2.84e-02, grad_scale: 64.0
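The lr field decays slowly within an epoch (2.85e-02 down to 2.84e-02 across a few hundred batches here) and takes a larger step down whenever a new epoch starts (3.04e-02 to 2.86e-02 at the epoch 8 restart above, and 2.63e-02 to 2.50e-02 at the epoch 10 restart later). That pattern is characteristic of a scheduler that multiplies a base rate by separate batch-count and epoch decay factors. The exact formula is not recoverable from the log alone, so the following is only an illustrative sketch with an assumed inverse-fourth-root form and hypothetical constants; it does not exactly reproduce the logged values:

def learning_rate(base_lr: float, batch: int, epoch: int,
                  lr_batches: float = 7500.0, lr_epochs: float = 3.5) -> float:
    # Assumed form: smooth decay in both the batch index and the epoch index.
    batch_factor = ((batch ** 2 + lr_batches ** 2) / lr_batches ** 2) ** -0.25
    epoch_factor = ((epoch ** 2 + lr_epochs ** 2) / lr_epochs ** 2) ** -0.25
    return base_lr * batch_factor * epoch_factor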
2024-03-15 21:23:03,706 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=10.43 vs. limit=12.33625
2024-03-15 21:23:05,729 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.ff2_skip_rate, batch_count=12896.666666666666, ans=0.008065942028985508
2024-03-15 21:23:23,524 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.ff3_skip_rate, batch_count=12930.0, ans=0.008058695652173913
2024-03-15 21:23:26,026 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward1.out_proj.dropout_p, batch_count=12963.333333333334, ans=0.17036666666666667
2024-03-15 21:23:38,236 INFO [train_char.py:689] (1/2) Epoch 8, batch 350, loss[loss=0.1254, simple_loss=0.1689, pruned_loss=0.04097, over 24264.00 frames. ], tot_loss[loss=0.1146, simple_loss=0.1614, pruned_loss=0.03391, over 3994096.74 frames. ], batch size: 296, lr: 2.83e-02, grad_scale: 64.0
2024-03-15 21:23:40,146 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=4.84 vs. limit=4.9495000000000005
2024-03-15 21:23:45,442 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=144, metric=7.28 vs. limit=8.249166666666667
2024-03-15 21:23:45,775 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.133e+01 1.142e+02 1.443e+02 1.715e+02 2.690e+02, threshold=2.886e+02, percent-clipped=3.0
2024-03-15 21:24:13,110 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_module2.balancer1.prob, batch_count=13063.333333333334, ans=0.125
2024-03-15 21:24:13,140 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.self_attn_weights.pos_emb_skip_rate, batch_count=13063.333333333334, ans=0.0
2024-03-15 21:24:22,398 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.self_attn_weights.whiten_keys.whitening_limit, batch_count=13096.666666666666, ans=4.9645
2024-03-15 21:24:30,782 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=13130.0, ans=0.1687
2024-03-15 21:24:44,912 INFO [train_char.py:689] (1/2) Epoch 8, batch 400, loss[loss=0.09896, simple_loss=0.1461, pruned_loss=0.0259, over 24390.00 frames. ], tot_loss[loss=0.1143, simple_loss=0.161, pruned_loss=0.0338, over 4182826.85 frames. ], batch size: 158, lr: 2.83e-02, grad_scale: 64.0
2024-03-15 21:24:50,261 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module1.balancer2.prob, batch_count=13163.333333333334, ans=0.125
2024-03-15 21:25:07,770 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward3.hidden_balancer.prob, batch_count=13196.666666666666, ans=0.125
2024-03-15 21:25:23,982 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.attention_skip_rate, batch_count=13263.333333333334, ans=0.011402777777777776
2024-03-15 21:25:25,340 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=13263.333333333334, ans=0.0
2024-03-15 21:25:27,767 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_module2.balancer1.prob, batch_count=13263.333333333334, ans=0.125
2024-03-15 21:25:32,023 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=512, metric=5.01 vs. limit=12.473749999999999
2024-03-15 21:25:41,805 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=384, metric=16.00 vs. limit=17.4725
2024-03-15 21:25:49,709 INFO [train_char.py:689] (1/2) Epoch 8, batch 450, loss[loss=0.1204, simple_loss=0.1715, pruned_loss=0.03464, over 24164.00 frames. ], tot_loss[loss=0.1158, simple_loss=0.1629, pruned_loss=0.03436, over 4327702.68 frames. ], batch size: 188, lr: 2.83e-02, grad_scale: 64.0
2024-03-15 21:26:00,097 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.686e+01 1.193e+02 1.318e+02 1.623e+02 2.694e+02, threshold=2.637e+02, percent-clipped=0.0
2024-03-15 21:26:15,572 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.bypass.scale_min, batch_count=13363.333333333334, ans=0.43228333333333335
2024-03-15 21:26:26,404 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.bypass.scale_min, batch_count=13396.666666666666, ans=0.4311166666666667
2024-03-15 21:26:33,881 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=13430.0, ans=0.125
2024-03-15 21:26:53,239 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=13463.333333333334, ans=0.16536666666666666
2024-03-15 21:26:56,755 INFO [train_char.py:689] (1/2) Epoch 8, batch 500, loss[loss=0.1112, simple_loss=0.164, pruned_loss=0.02921, over 24132.00 frames. ], tot_loss[loss=0.117, simple_loss=0.1647, pruned_loss=0.0346, over 4438377.55 frames. ], batch size: 188, lr: 2.82e-02, grad_scale: 64.0
2024-03-15 21:26:56,980 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.bypass_mid.scale_min, batch_count=13496.666666666666, ans=0.4276166666666667
2024-03-15 21:26:57,017 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=13496.666666666666, ans=0.125
2024-03-15 21:27:51,733 INFO [train_char.py:689] (1/2) Epoch 9, batch 0, loss[loss=0.1052, simple_loss=0.1466, pruned_loss=0.03196, over 24247.00 frames. ], tot_loss[loss=0.1052, simple_loss=0.1466, pruned_loss=0.03196, over 24247.00 frames. ], batch size: 328, lr: 2.67e-02, grad_scale: 64.0
2024-03-15 21:27:51,733 INFO [train_char.py:712] (1/2) Computing validation loss
2024-03-15 21:28:04,359 INFO [train_char.py:721] (1/2) Epoch 9, validation: loss=0.07904, simple_loss=0.134, pruned_loss=0.01202, over 657665.00 frames.
2024-03-15 21:28:04,360 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB
2024-03-15 21:28:08,429 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module2.balancer2.prob, batch_count=13520.0, ans=0.125
2024-03-15 21:28:16,743 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=256, metric=10.37 vs. limit=12.5825
2024-03-15 21:28:24,333 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=13553.333333333334, ans=0.16446666666666665
2024-03-15 21:28:28,284 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.ff3_skip_rate, batch_count=13553.333333333334, ans=0.007923188405797102
2024-03-15 21:28:40,338 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=13586.666666666666, ans=0.125
2024-03-15 21:28:40,366 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_skip_rate, batch_count=13586.666666666666, ans=0.01005555555555556
2024-03-15 21:28:42,840 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.bypass_mid.scale_min, batch_count=13620.0, ans=0.4233
2024-03-15 21:28:55,937 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=5.14 vs. limit=5.043
2024-03-15 21:29:11,359 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 9.371e+01 1.148e+02 1.267e+02 1.499e+02 2.749e+02, threshold=2.533e+02, percent-clipped=1.0
2024-03-15 21:29:12,758 INFO [train_char.py:689] (1/2) Epoch 9, batch 50, loss[loss=0.1116, simple_loss=0.1578, pruned_loss=0.03272, over 24214.00 frames. ], tot_loss[loss=0.1053, simple_loss=0.151, pruned_loss=0.02977, over 1087419.63 frames. ], batch size: 311, lr: 2.67e-02, grad_scale: 64.0
2024-03-15 21:29:21,154 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module2.balancer2.prob, batch_count=13686.666666666666, ans=0.125
2024-03-15 21:29:26,746 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.whiten.whitening_limit, batch_count=13720.0, ans=9.488
2024-03-15 21:29:51,440 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.out_combiner.scale_min, batch_count=13753.333333333334, ans=0.4186333333333333
2024-03-15 21:29:56,584 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.balancer.prob, batch_count=13786.666666666666, ans=0.125
2024-03-15 21:30:01,819 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=13786.666666666666, ans=0.125
2024-03-15 21:30:12,239 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=13820.0, ans=0.125
2024-03-15 21:30:17,231 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.ff2_skip_rate, batch_count=13820.0, ans=0.007865217391304347
2024-03-15 21:30:17,349 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module2.balancer2.prob, batch_count=13820.0, ans=0.125
2024-03-15 21:30:22,107 INFO [train_char.py:689] (1/2) Epoch 9, batch 100, loss[loss=0.1101, simple_loss=0.1602, pruned_loss=0.02999, over 24368.00 frames. ], tot_loss[loss=0.1075, simple_loss=0.1556, pruned_loss=0.02968, over 1914916.16 frames. ], batch size: 152, lr: 2.66e-02, grad_scale: 64.0
2024-03-15 21:30:25,319 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.whiten, num_groups=1, num_channels=512, metric=6.84 vs. limit=9.541333333333334
2024-03-15 21:30:27,328 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.ff2_skip_rate, batch_count=13853.333333333334, ans=0.007857971014492753
2024-03-15 21:30:32,608 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.3.encoder.layers.3.self_attn_weights, loss-sum=0.000e+00
2024-03-15 21:30:57,098 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.attention_skip_rate, batch_count=13920.0, ans=0.00866666666666667
2024-03-15 21:31:09,808 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.bypass_mid.scale_min, batch_count=13953.333333333334, ans=0.41163333333333335
2024-03-15 21:31:28,546 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.163e+01 1.095e+02 1.330e+02 1.584e+02 2.657e+02, threshold=2.661e+02, percent-clipped=2.0
2024-03-15 21:31:29,875 INFO [train_char.py:689] (1/2) Epoch 9, batch 150, loss[loss=0.1057, simple_loss=0.159, pruned_loss=0.02617, over 24437.00 frames. ], tot_loss[loss=0.1077, simple_loss=0.1557, pruned_loss=0.02986, over 2558405.11 frames. ], batch size: 165, lr: 2.66e-02, grad_scale: 64.0
2024-03-15 21:31:51,706 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=384, metric=6.51 vs. limit=12.77
2024-03-15 21:31:58,998 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=14086.666666666666, ans=0.0
2024-03-15 21:32:31,334 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.bypass.scale_min, batch_count=14153.333333333334, ans=0.40463333333333334
2024-03-15 21:32:32,893 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=256, metric=11.71 vs. limit=12.82
2024-03-15 21:32:33,644 INFO [train_char.py:689] (1/2) Epoch 9, batch 200, loss[loss=0.08981, simple_loss=0.1326, pruned_loss=0.02352, over 23997.00 frames. ], tot_loss[loss=0.1067, simple_loss=0.1545, pruned_loss=0.0294, over 3053443.90 frames. ], batch size: 381, lr: 2.66e-02, grad_scale: 64.0
2024-03-15 21:32:35,671 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=256, metric=8.82 vs. limit=12.82
2024-03-15 21:32:51,298 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward3.out_whiten.whitening_limit, batch_count=14220.0, ans=12.8325
2024-03-15 21:32:53,939 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.51 vs. limit=5.133
2024-03-15 21:32:57,363 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=14220.0, ans=0.125
2024-03-15 21:33:06,192 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=14253.333333333334, ans=0.125
2024-03-15 21:33:28,503 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.whiten, num_groups=1, num_channels=384, metric=5.07 vs. limit=9.728
2024-03-15 21:33:36,528 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.whiten, num_groups=1, num_channels=256, metric=5.63 vs. limit=9.728
2024-03-15 21:33:39,560 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.662e+01 1.181e+02 1.353e+02 1.596e+02 2.674e+02, threshold=2.706e+02, percent-clipped=1.0
2024-03-15 21:33:40,904 INFO [train_char.py:689] (1/2) Epoch 9, batch 250, loss[loss=0.09629, simple_loss=0.1463, pruned_loss=0.02313, over 23912.00 frames. ], tot_loss[loss=0.1067, simple_loss=0.1543, pruned_loss=0.02952, over 3441500.05 frames. ], batch size: 107, lr: 2.65e-02, grad_scale: 64.0
2024-03-15 21:33:49,822 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_module1.balancer2.prob, batch_count=14353.333333333334, ans=0.125
2024-03-15 21:33:53,102 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.86 vs. limit=5.1579999999999995
2024-03-15 21:33:55,057 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=14386.666666666666, ans=0.15613333333333335
2024-03-15 21:34:15,996 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.ff2_skip_rate, batch_count=14420.0, ans=0.007734782608695652
2024-03-15 21:34:28,703 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward2.out_whiten, num_groups=1, num_channels=512, metric=11.13 vs. limit=12.92
2024-03-15 21:34:46,644 INFO [train_char.py:689] (1/2) Epoch 9, batch 300, loss[loss=0.1167, simple_loss=0.1649, pruned_loss=0.03424, over 24212.00 frames. ], tot_loss[loss=0.1074, simple_loss=0.1554, pruned_loss=0.02975, over 3745719.96 frames. ], batch size: 296, lr: 2.65e-02, grad_scale: 64.0
2024-03-15 21:34:49,432 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.bypass.scale_min, batch_count=14520.0, ans=0.39180000000000004
2024-03-15 21:35:00,755 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.ff2_skip_rate, batch_count=14553.333333333334, ans=0.007705797101449275
2024-03-15 21:35:28,094 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=256, metric=8.77 vs. limit=12.9825
2024-03-15 21:35:37,631 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=14620.0, ans=0.125
2024-03-15 21:35:38,848 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.balancer2.prob, batch_count=14653.333333333334, ans=0.125
2024-03-15 21:35:45,742 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=14653.333333333334, ans=0.125
2024-03-15 21:35:45,756 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=14653.333333333334, ans=0.15346666666666667
2024-03-15 21:35:47,009 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.nonlin_attention.balancer.min_positive, batch_count=14653.333333333334, ans=0.10346666666666665
2024-03-15 21:35:51,566 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.453e+01 1.119e+02 1.274e+02 1.597e+02 2.203e+02, threshold=2.548e+02, percent-clipped=0.0
2024-03-15 21:35:52,840 INFO [train_char.py:689] (1/2) Epoch 9, batch 350, loss[loss=0.1247, simple_loss=0.167, pruned_loss=0.04124, over 24206.00 frames. ], tot_loss[loss=0.1079, simple_loss=0.1563, pruned_loss=0.02976, over 3988369.57 frames. ], batch size: 311, lr: 2.65e-02, grad_scale: 64.0
2024-03-15 21:36:04,460 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=12.61 vs. limit=13.02
2024-03-15 21:36:31,475 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module2.balancer1.prob, batch_count=14786.666666666666, ans=0.125
2024-03-15 21:36:57,657 INFO [train_char.py:689] (1/2) Epoch 9, batch 400, loss[loss=0.1171, simple_loss=0.1672, pruned_loss=0.03351, over 24346.00 frames. ], tot_loss[loss=0.1081, simple_loss=0.1565, pruned_loss=0.0298, over 4176166.56 frames. ], batch size: 180, lr: 2.64e-02, grad_scale: 64.0
2024-03-15 21:37:08,467 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.bypass.skip_rate, batch_count=14853.333333333334, ans=0.04949747468305833
2024-03-15 21:37:10,954 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=14886.666666666666, ans=0.15113333333333334
2024-03-15 21:37:22,307 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_module1.balancer1.max_abs, batch_count=14886.666666666666, ans=10.0
2024-03-15 21:37:27,324 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.nonlin_attention.balancer.prob, batch_count=14920.0, ans=0.125
2024-03-15 21:37:36,256 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.balancer2.prob, batch_count=14920.0, ans=0.125
2024-03-15 21:38:02,042 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.080e+01 1.138e+02 1.304e+02 1.631e+02 3.170e+02, threshold=2.609e+02, percent-clipped=2.0
2024-03-15 21:38:02,338 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_skip_rate, batch_count=15020.0, ans=0.004083333333333335
2024-03-15 21:38:03,329 INFO [train_char.py:689] (1/2) Epoch 9, batch 450, loss[loss=0.09519, simple_loss=0.1512, pruned_loss=0.01958, over 24334.00 frames. ], tot_loss[loss=0.1088, simple_loss=0.1574, pruned_loss=0.03015, over 4323712.75 frames. ], batch size: 158, lr: 2.64e-02, grad_scale: 64.0
2024-03-15 21:38:13,413 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_module1.balancer1.min_positive, batch_count=15020.0, ans=0.025
2024-03-15 21:38:25,669 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.out_proj.dropout_p, batch_count=15053.333333333334, ans=0.14946666666666666
2024-03-15 21:38:25,763 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.3.encoder.layers.2.self_attn_weights, loss-sum=0.000e+00
2024-03-15 21:38:26,955 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward1.out_proj.dropout_p, batch_count=15053.333333333334, ans=0.14946666666666666
2024-03-15 21:38:47,076 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.conv_module2.whiten, num_groups=1, num_channels=512, metric=4.60 vs. limit=13.17
2024-03-15 21:38:59,699 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.feed_forward2.out_whiten, num_groups=1, num_channels=256, metric=10.14 vs. limit=13.182500000000001
2024-03-15 21:39:08,084 INFO [train_char.py:689] (1/2) Epoch 9, batch 500, loss[loss=0.1209, simple_loss=0.1698, pruned_loss=0.03595, over 24101.00 frames. ], tot_loss[loss=0.1106, simple_loss=0.1601, pruned_loss=0.03058, over 4435884.81 frames. ], batch size: 279, lr: 2.63e-02, grad_scale: 64.0
2024-03-15 21:40:06,722 INFO [train_char.py:689] (1/2) Epoch 10, batch 0, loss[loss=0.096, simple_loss=0.1513, pruned_loss=0.02034, over 24253.00 frames. ], tot_loss[loss=0.096, simple_loss=0.1513, pruned_loss=0.02034, over 24253.00 frames. ], batch size: 134, lr: 2.50e-02, grad_scale: 64.0
2024-03-15 21:40:06,723 INFO [train_char.py:712] (1/2) Computing validation loss
2024-03-15 21:40:19,992 INFO [train_char.py:721] (1/2) Epoch 10, validation: loss=0.07731, simple_loss=0.1327, pruned_loss=0.01097, over 657665.00 frames.
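The per-epoch validation losses recorded so far fall monotonically: 0.09386 (epoch 6), 0.08738 (epoch 7), 0.08204 (epoch 8), 0.07904 (epoch 9), 0.07731 (epoch 10), and 0.07449 at the epoch 11 restart below. A small helper for pulling that series out of a log like this one (a sketch; the regex assumes the exact "Epoch N, validation: loss=..." line format used here):

import re

VAL_RE = re.compile(r"Epoch (\d+), validation: loss=([\d.]+)")

def validation_curve(log_text: str):
    """Return [(epoch, loss), ...] from train_char.py validation lines."""
    return [(int(e), float(l)) for e, l in VAL_RE.findall(log_text)]

# e.g. validation_curve(open("train.log").read())
# -> [(6, 0.09386), (7, 0.08738), (8, 0.08204), (9, 0.07904), (10, 0.07731), ...]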
2024-03-15 21:40:19,993 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB
2024-03-15 21:40:27,020 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_skip_rate, batch_count=15210.0, ans=0.003291666666666672
2024-03-15 21:40:37,087 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module2.balancer1.prob, batch_count=15243.333333333334, ans=0.125
2024-03-15 21:40:44,018 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=15243.333333333334, ans=0.125
2024-03-15 21:41:11,057 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=7.86 vs. limit=8.8275
2024-03-15 21:41:18,776 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 1.020e+02 1.181e+02 1.394e+02 1.702e+02 3.976e+02, threshold=2.787e+02, percent-clipped=2.0
2024-03-15 21:41:29,311 INFO [train_char.py:689] (1/2) Epoch 10, batch 50, loss[loss=0.1065, simple_loss=0.1552, pruned_loss=0.0289, over 24224.00 frames. ], tot_loss[loss=0.09868, simple_loss=0.1451, pruned_loss=0.02613, over 1080485.87 frames. ], batch size: 328, lr: 2.50e-02, grad_scale: 64.0
2024-03-15 21:41:32,672 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=13.49 vs. limit=13.26625
2024-03-15 21:41:43,901 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward3.hidden_balancer.prob, batch_count=15410.0, ans=0.125
2024-03-15 21:41:59,970 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_skip_rate, batch_count=15443.333333333334, ans=0.0023194444444444434
2024-03-15 21:42:03,868 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_module2.balancer2.prob, batch_count=15443.333333333334, ans=0.125
2024-03-15 21:42:18,000 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.self_attn_weights.pos_emb_skip_rate, batch_count=15476.666666666666, ans=0.0
2024-03-15 21:42:21,808 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.balancer2.prob, batch_count=15476.666666666666, ans=0.125
2024-03-15 21:42:35,517 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=512, metric=4.87 vs. limit=13.31625
2024-03-15 21:42:42,180 INFO [train_char.py:689] (1/2) Epoch 10, batch 100, loss[loss=0.1211, simple_loss=0.1757, pruned_loss=0.03329, over 24138.00 frames. ], tot_loss[loss=0.1012, simple_loss=0.1481, pruned_loss=0.02708, over 1909967.87 frames. ], batch size: 223, lr: 2.50e-02, grad_scale: 64.0
2024-03-15 21:42:53,039 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=256, metric=13.40 vs. limit=13.32875
2024-03-15 21:42:58,479 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=11.10 vs. limit=12.788333333333334
2024-03-15 21:43:00,522 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.bypass.scale_min, batch_count=15576.666666666666, ans=0.35481666666666667
2024-03-15 21:43:05,521 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.1.encoder.layers.1.self_attn_weights, loss-sum=0.000e+00
2024-03-15 21:43:31,022 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module1.balancer1.prob, batch_count=15643.333333333334, ans=0.125
2024-03-15 21:43:35,746 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.673e+01 1.133e+02 1.390e+02 1.768e+02 3.155e+02, threshold=2.779e+02, percent-clipped=3.0
2024-03-15 21:43:46,333 INFO [train_char.py:689] (1/2) Epoch 10, batch 150, loss[loss=0.07595, simple_loss=0.1208, pruned_loss=0.01554, over 24218.00 frames. ], tot_loss[loss=0.1003, simple_loss=0.1479, pruned_loss=0.02635, over 2549759.37 frames. ], batch size: 122, lr: 2.49e-02, grad_scale: 64.0
2024-03-15 21:43:49,799 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.self_attn2.whiten, num_groups=1, num_channels=384, metric=19.14 vs. limit=19.2825
2024-03-15 21:43:57,999 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.out_combiner.scale_min, batch_count=15743.333333333334, ans=0.3489833333333333
2024-03-15 21:44:03,084 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_skip_rate, batch_count=15743.333333333334, ans=0.0010694444444444423
2024-03-15 21:44:03,487 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=256, metric=7.20 vs. limit=13.403749999999999
2024-03-15 21:44:52,407 INFO [train_char.py:689] (1/2) Epoch 10, batch 200, loss[loss=0.09352, simple_loss=0.1397, pruned_loss=0.02366, over 24191.00 frames. ], tot_loss[loss=0.1004, simple_loss=0.1487, pruned_loss=0.02607, over 3052496.72 frames. ], batch size: 344, lr: 2.49e-02, grad_scale: 64.0
2024-03-15 21:44:57,915 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=15876.666666666666, ans=0.125
2024-03-15 21:45:13,031 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.balancer2.prob, batch_count=15910.0, ans=0.125
2024-03-15 21:45:15,637 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.bypass.scale_min, batch_count=15910.0, ans=0.34315000000000007
2024-03-15 21:45:23,406 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.nonlin_attention.whiten2, num_groups=1, num_channels=512, metric=4.86 vs. limit=12.971666666666668
limit=12.971666666666668 2024-03-15 21:45:25,650 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.ff3_skip_rate, batch_count=15943.333333333334, ans=0.007403623188405798 2024-03-15 21:45:32,888 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module2.balancer1.prob, batch_count=15976.666666666666, ans=0.125 2024-03-15 21:45:35,326 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_skip_rate, batch_count=15976.666666666666, ans=9.722222222222077e-05 2024-03-15 21:45:50,708 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.242e+01 1.189e+02 1.412e+02 1.879e+02 4.191e+02, threshold=2.824e+02, percent-clipped=6.0 2024-03-15 21:45:58,592 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_skip_rate, batch_count=16043.333333333334, ans=0.0 2024-03-15 21:45:59,541 INFO [train_char.py:689] (1/2) Epoch 10, batch 250, loss[loss=0.07695, simple_loss=0.1212, pruned_loss=0.01634, over 24229.00 frames. ], tot_loss[loss=0.1, simple_loss=0.1483, pruned_loss=0.02584, over 3443617.27 frames. ], batch size: 134, lr: 2.49e-02, grad_scale: 32.0 2024-03-15 21:46:07,309 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.bypass_mid.scale_min, batch_count=16043.333333333334, ans=0.33848333333333336 2024-03-15 21:46:15,894 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=16076.666666666666, ans=0.125 2024-03-15 21:46:58,470 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.ff2_skip_rate, batch_count=16176.666666666666, ans=0.007352898550724638 2024-03-15 21:46:58,516 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.balancer2.prob, batch_count=16176.666666666666, ans=0.125 2024-03-15 21:47:01,805 INFO [train_char.py:689] (1/2) Epoch 10, batch 300, loss[loss=0.107, simple_loss=0.1592, pruned_loss=0.02741, over 24167.00 frames. ], tot_loss[loss=0.1002, simple_loss=0.149, pruned_loss=0.02568, over 3752732.69 frames. ], batch size: 188, lr: 2.48e-02, grad_scale: 32.0 2024-03-15 21:47:03,939 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.self_attn2.whiten, num_groups=1, num_channels=384, metric=19.00 vs. limit=19.6575 2024-03-15 21:47:58,002 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.attention_skip_rate, batch_count=16343.333333333334, ans=0.0 2024-03-15 21:48:02,754 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.345e+01 1.095e+02 1.275e+02 1.585e+02 3.583e+02, threshold=2.551e+02, percent-clipped=3.0 2024-03-15 21:48:11,652 INFO [train_char.py:689] (1/2) Epoch 10, batch 350, loss[loss=0.09104, simple_loss=0.1305, pruned_loss=0.02578, over 24007.00 frames. ], tot_loss[loss=0.1012, simple_loss=0.1504, pruned_loss=0.02598, over 3986199.69 frames. ], batch size: 381, lr: 2.48e-02, grad_scale: 32.0 2024-03-15 21:48:52,495 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=256, metric=9.08 vs. limit=13.67875 2024-03-15 21:49:16,154 INFO [train_char.py:689] (1/2) Epoch 10, batch 400, loss[loss=0.08616, simple_loss=0.1269, pruned_loss=0.02273, over 23968.00 frames. ], tot_loss[loss=0.1015, simple_loss=0.1508, pruned_loss=0.02606, over 4173729.58 frames. 
], batch size: 407, lr: 2.47e-02, grad_scale: 32.0 2024-03-15 21:49:43,603 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.ff2_skip_rate, batch_count=16610.0, ans=0.007258695652173913 2024-03-15 21:49:49,754 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.nonlin_attention.balancer.prob, batch_count=16610.0, ans=0.125 2024-03-15 21:49:50,995 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward2.hidden_balancer.prob, batch_count=16610.0, ans=0.125 2024-03-15 21:49:56,117 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.3.encoder.layers.2.self_attn_weights, loss-sum=0.000e+00 2024-03-15 21:50:12,484 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 9.049e+01 1.160e+02 1.425e+02 1.732e+02 3.861e+02, threshold=2.849e+02, percent-clipped=2.0 2024-03-15 21:50:21,542 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass.scale_min, batch_count=16710.0, ans=0.31515000000000004 2024-03-15 21:50:22,503 INFO [train_char.py:689] (1/2) Epoch 10, batch 450, loss[loss=0.1169, simple_loss=0.1752, pruned_loss=0.02934, over 24150.00 frames. ], tot_loss[loss=0.1026, simple_loss=0.1523, pruned_loss=0.02642, over 4321273.20 frames. ], batch size: 223, lr: 2.47e-02, grad_scale: 32.0 2024-03-15 21:50:34,985 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.ff2_skip_rate, batch_count=16743.333333333332, ans=0.007229710144927537 2024-03-15 21:50:40,378 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=384, metric=14.12 vs. limit=20.057499999999997 2024-03-15 21:51:05,457 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.balancer2.prob, batch_count=16810.0, ans=0.125 2024-03-15 21:51:12,964 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.balancer2.prob, batch_count=16843.333333333332, ans=0.125 2024-03-15 21:51:24,436 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=192, metric=9.70 vs. limit=13.421666666666665 2024-03-15 21:51:25,903 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.self_attn2.whiten.whitening_limit, batch_count=16843.333333333332, ans=20.1325 2024-03-15 21:51:27,821 INFO [train_char.py:689] (1/2) Epoch 10, batch 500, loss[loss=0.1013, simple_loss=0.1575, pruned_loss=0.0225, over 24118.00 frames. ], tot_loss[loss=0.1043, simple_loss=0.1548, pruned_loss=0.02686, over 4434672.61 frames. ], batch size: 188, lr: 2.47e-02, grad_scale: 32.0 2024-03-15 21:52:28,944 INFO [train_char.py:689] (1/2) Epoch 11, batch 0, loss[loss=0.08493, simple_loss=0.1386, pruned_loss=0.01562, over 24328.00 frames. ], tot_loss[loss=0.08493, simple_loss=0.1386, pruned_loss=0.01562, over 24328.00 frames. ], batch size: 146, lr: 2.35e-02, grad_scale: 32.0 2024-03-15 21:52:28,944 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-15 21:52:42,715 INFO [train_char.py:721] (1/2) Epoch 11, validation: loss=0.07449, simple_loss=0.1293, pruned_loss=0.009864, over 657665.00 frames. 
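Note on the recurring WARNING lines from optim.py:487: they summarize the recent distribution of gradient norms (min, 25th, 50th, 75th percentile, max) together with the clipping threshold currently in force; in the records above the threshold tracks twice the median (e.g. threshold=2.787e+02 against a median of 1.394e+02 with Clipping_scale=2.0), and percent-clipped reports how many recent batches exceeded it. A minimal sketch of how such a quartile-based threshold could be derived and applied follows; the function name and the exact rule are assumptions for illustration, not icefall's actual optim.py code.

    import torch

    def summarize_and_clip(grad_norms: torch.Tensor, clipping_scale: float = 2.0):
        # grad_norms: 1-D float tensor of recent per-batch gradient norms.
        # Five-number summary, matching the "grad-norm quartiles" printout.
        q = torch.quantile(grad_norms, torch.tensor([0.0, 0.25, 0.5, 0.75, 1.0]))
        # Assumed rule: threshold = clipping_scale * median, which is
        # consistent with the numbers logged above.
        threshold = clipping_scale * float(q[2])
        percent_clipped = 100.0 * (grad_norms > threshold).float().mean().item()
        clipped = grad_norms.clamp(max=threshold)
        return q, threshold, percent_clipped, clipped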
2024-03-15 21:52:42,716 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB
2024-03-15 21:53:12,531 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=16966.666666666668, ans=0.0
2024-03-15 21:53:23,476 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_module2.balancer2.prob, batch_count=17000.0, ans=0.125
2024-03-15 21:53:24,937 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.attention_skip_rate, batch_count=17000.0, ans=0.0
2024-03-15 21:53:31,281 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.751e+01 1.149e+02 1.354e+02 1.675e+02 2.945e+02, threshold=2.709e+02, percent-clipped=2.0
2024-03-15 21:53:31,583 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.convnext.hidden_balancer.prob, batch_count=17000.0, ans=0.125
2024-03-15 21:53:33,165 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_skip_rate, batch_count=17000.0, ans=0.0
2024-03-15 21:53:39,885 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.ff3_skip_rate, batch_count=17033.333333333332, ans=0.007166666666666667
2024-03-15 21:53:50,578 INFO [train_char.py:689] (1/2) Epoch 11, batch 50, loss[loss=0.08596, simple_loss=0.1211, pruned_loss=0.02543, over 23789.00 frames. ], tot_loss[loss=0.09511, simple_loss=0.1446, pruned_loss=0.02281, over 1088064.24 frames. ], batch size: 439, lr: 2.35e-02, grad_scale: 32.0
2024-03-15 21:54:10,683 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.balancer.min_positive, batch_count=17066.666666666668, ans=0.07933333333333331
2024-03-15 21:54:10,722 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.balancer2.prob, batch_count=17066.666666666668, ans=0.125
2024-03-15 21:54:10,790 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.5.encoder.layers.1.self_attn_weights, loss-sum=0.000e+00
2024-03-15 21:54:12,107 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward3.hidden_balancer.prob, batch_count=17100.0, ans=0.125
2024-03-15 21:54:20,647 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.81 vs. limit=5.5649999999999995
2024-03-15 21:54:29,316 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.bypass.scale_min, batch_count=17133.333333333332, ans=0.30033333333333345
2024-03-15 21:54:55,152 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_proj.dropout_p, batch_count=17200.0, ans=0.128
2024-03-15 21:55:05,050 INFO [train_char.py:689] (1/2) Epoch 11, batch 100, loss[loss=0.08874, simple_loss=0.1271, pruned_loss=0.0252, over 23816.00 frames. ], tot_loss[loss=0.09632, simple_loss=0.1449, pruned_loss=0.02385, over 1914843.08 frames. ], batch size: 439, lr: 2.35e-02, grad_scale: 32.0
2024-03-15 21:55:11,861 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.ff3_skip_rate, batch_count=17233.333333333332, ans=0.0071231884057971016
2024-03-15 21:55:28,514 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=17266.666666666668, ans=0.12733333333333333
2024-03-15 21:55:51,851 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.443e+01 1.069e+02 1.293e+02 1.619e+02 2.642e+02, threshold=2.585e+02, percent-clipped=0.0
2024-03-15 21:55:56,235 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_skip_rate, batch_count=17366.666666666668, ans=0.0
2024-03-15 21:56:04,649 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=256, metric=15.54 vs. limit=20.525000000000002
2024-03-15 21:56:10,420 INFO [train_char.py:689] (1/2) Epoch 11, batch 150, loss[loss=0.0977, simple_loss=0.1493, pruned_loss=0.02306, over 24431.00 frames. ], tot_loss[loss=0.09801, simple_loss=0.1478, pruned_loss=0.02411, over 2555085.43 frames. ], batch size: 165, lr: 2.34e-02, grad_scale: 32.0
2024-03-15 21:56:15,774 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.balancer2.prob, batch_count=17400.0, ans=0.125
2024-03-15 21:56:30,146 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.1.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00
2024-03-15 21:57:22,209 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module2.balancer2.min_abs, batch_count=17533.333333333332, ans=0.46299999999999997
2024-03-15 21:57:24,587 INFO [train_char.py:689] (1/2) Epoch 11, batch 200, loss[loss=0.1055, simple_loss=0.1584, pruned_loss=0.02636, over 24314.00 frames. ], tot_loss[loss=0.09757, simple_loss=0.1473, pruned_loss=0.02393, over 3060674.38 frames. ], batch size: 297, lr: 2.34e-02, grad_scale: 32.0
2024-03-15 21:57:41,850 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=256, metric=7.21 vs. limit=14.1
2024-03-15 21:57:44,308 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=256, metric=9.87 vs. limit=14.1
2024-03-15 21:57:59,406 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.ff3_skip_rate, batch_count=17633.333333333332, ans=0.007036231884057971
2024-03-15 21:58:09,058 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.conv_module1.whiten, num_groups=1, num_channels=192, metric=5.20 vs. limit=14.125
2024-03-15 21:58:10,565 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.157e+01 1.128e+02 1.430e+02 1.869e+02 3.901e+02, threshold=2.860e+02, percent-clipped=2.0
2024-03-15 21:58:28,751 INFO [train_char.py:689] (1/2) Epoch 11, batch 250, loss[loss=0.07564, simple_loss=0.1249, pruned_loss=0.01321, over 24329.00 frames. ], tot_loss[loss=0.09624, simple_loss=0.1458, pruned_loss=0.02333, over 3445990.12 frames. ], batch size: 129, lr: 2.34e-02, grad_scale: 32.0
2024-03-15 21:58:57,702 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=17800.0, ans=0.12200000000000003
2024-03-15 21:58:59,128 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.bypass.skip_rate, batch_count=17800.0, ans=0.09899494936611666
2024-03-15 21:59:02,236 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=16.99 vs. limit=14.175
2024-03-15 21:59:33,015 INFO [train_char.py:689] (1/2) Epoch 11, batch 300, loss[loss=0.1206, simple_loss=0.1724, pruned_loss=0.03442, over 24130.00 frames. ], tot_loss[loss=0.0969, simple_loss=0.1464, pruned_loss=0.02369, over 3751822.66 frames. ], batch size: 279, lr: 2.33e-02, grad_scale: 32.0
2024-03-15 21:59:42,619 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.bypass.scale_min, batch_count=17900.0, ans=0.2735000000000001
2024-03-15 21:59:50,059 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=384, metric=5.53 vs. limit=14.2125
2024-03-15 22:00:19,303 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.59 vs. limit=5.7
2024-03-15 22:00:25,141 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.582e+01 1.169e+02 1.362e+02 1.542e+02 2.807e+02, threshold=2.725e+02, percent-clipped=0.0
2024-03-15 22:00:32,407 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=4.50 vs. limit=14.016666666666666
2024-03-15 22:00:43,050 INFO [train_char.py:689] (1/2) Epoch 11, batch 350, loss[loss=0.1178, simple_loss=0.1694, pruned_loss=0.03307, over 24140.00 frames. ], tot_loss[loss=0.0984, simple_loss=0.1485, pruned_loss=0.02415, over 3988239.66 frames. ], batch size: 279, lr: 2.33e-02, grad_scale: 32.0
2024-03-15 22:01:01,812 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=192, metric=6.23 vs. limit=9.525
2024-03-15 22:01:06,269 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=18100.0, ans=0.125
2024-03-15 22:01:08,763 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=18133.333333333332, ans=0.125
2024-03-15 22:01:23,328 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.whiten, num_groups=1, num_channels=512, metric=8.65 vs. limit=11.266666666666667
2024-03-15 22:01:45,329 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.conv_module1.whiten, num_groups=1, num_channels=256, metric=3.08 vs. limit=14.325
2024-03-15 22:01:50,968 INFO [train_char.py:689] (1/2) Epoch 11, batch 400, loss[loss=0.09504, simple_loss=0.1441, pruned_loss=0.02301, over 24229.00 frames. ], tot_loss[loss=0.09963, simple_loss=0.1505, pruned_loss=0.02437, over 4176957.02 frames. ], batch size: 328, lr: 2.32e-02, grad_scale: 32.0
2024-03-15 22:01:56,041 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_skip_rate, batch_count=18233.333333333332, ans=0.0
2024-03-15 22:02:03,804 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.convnext.layerdrop_rate, batch_count=18266.666666666668, ans=0.03103333333333333
2024-03-15 22:02:11,928 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.self_attn1.whiten.whitening_limit, batch_count=18266.666666666668, ans=21.2
2024-03-15 22:02:20,628 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.bypass.scale_min, batch_count=18300.0, ans=0.25950000000000006
2024-03-15 22:02:36,912 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 9.007e+01 1.120e+02 1.360e+02 1.680e+02 3.491e+02, threshold=2.721e+02, percent-clipped=4.0
2024-03-15 22:02:42,934 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=6.74 vs. limit=14.183333333333334
2024-03-15 22:02:48,836 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_skip_rate, batch_count=18366.666666666668, ans=0.0
2024-03-15 22:02:57,087 INFO [train_char.py:689] (1/2) Epoch 11, batch 450, loss[loss=0.09825, simple_loss=0.1468, pruned_loss=0.02486, over 24235.00 frames. ], tot_loss[loss=0.09999, simple_loss=0.1513, pruned_loss=0.02433, over 4323881.51 frames. ], batch size: 328, lr: 2.32e-02, grad_scale: 32.0
2024-03-15 22:03:20,101 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass.skip_rate, batch_count=18433.333333333332, ans=0.04949747468305833
2024-03-15 22:03:23,112 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.self_attn2.whiten, num_groups=1, num_channels=512, metric=12.89 vs. limit=21.35
2024-03-15 22:03:31,510 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=18466.666666666668, ans=0.125
2024-03-15 22:03:57,247 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=5.26 vs. limit=5.779999999999999
2024-03-15 22:03:59,216 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_skip_rate, batch_count=18533.333333333332, ans=0.0
2024-03-15 22:04:01,550 INFO [train_char.py:689] (1/2) Epoch 11, batch 500, loss[loss=0.1094, simple_loss=0.1635, pruned_loss=0.02764, over 24111.00 frames. ], tot_loss[loss=0.1011, simple_loss=0.153, pruned_loss=0.02462, over 4436321.73 frames. ], batch size: 251, lr: 2.32e-02, grad_scale: 32.0
2024-03-15 22:04:04,862 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.self_attn1.whiten, num_groups=1, num_channels=512, metric=15.75 vs. limit=21.425
2024-03-15 22:05:03,110 INFO [train_char.py:689] (1/2) Epoch 12, batch 0, loss[loss=0.08046, simple_loss=0.1335, pruned_loss=0.01371, over 24291.00 frames. ], tot_loss[loss=0.08046, simple_loss=0.1335, pruned_loss=0.01371, over 24291.00 frames. ], batch size: 140, lr: 2.22e-02, grad_scale: 32.0
2024-03-15 22:05:03,111 INFO [train_char.py:712] (1/2) Computing validation loss
2024-03-15 22:05:11,881 INFO [zipformer.py:1858] (1/2) name=encoder.encoders.0.layers.1.self_attn_weights, attn_weights_entropy = tensor([4.7471, 4.8187, 4.8473, 4.5771], device='cuda:1')
2024-03-15 22:05:16,703 INFO [train_char.py:721] (1/2) Epoch 12, validation: loss=0.07356, simple_loss=0.1286, pruned_loss=0.009258, over 657665.00 frames.
2024-03-15 22:05:16,704 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB
2024-03-15 22:05:28,079 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=6.39 vs. limit=9.6475
2024-03-15 22:05:35,636 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module1.balancer2.prob, batch_count=18623.333333333332, ans=0.125
2024-03-15 22:05:40,444 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.conv_module2.whiten, num_groups=1, num_channels=384, metric=4.25 vs. limit=14.48375
2024-03-15 22:05:48,369 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_skip_rate, batch_count=18656.666666666668, ans=0.0
2024-03-15 22:06:01,479 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.627e+01 1.040e+02 1.285e+02 1.688e+02 2.682e+02, threshold=2.570e+02, percent-clipped=0.0
2024-03-15 22:06:13,980 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.bypass.skip_rate, batch_count=18690.0, ans=0.07
2024-03-15 22:06:15,383 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_module1.balancer1.prob, batch_count=18723.333333333332, ans=0.125
2024-03-15 22:06:25,500 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_skip_rate, batch_count=18723.333333333332, ans=0.0
2024-03-15 22:06:34,479 INFO [train_char.py:689] (1/2) Epoch 12, batch 50, loss[loss=0.09785, simple_loss=0.1484, pruned_loss=0.02363, over 24205.00 frames. ], tot_loss[loss=0.09295, simple_loss=0.1418, pruned_loss=0.02203, over 1090155.33 frames. ], batch size: 311, lr: 2.22e-02, grad_scale: 16.0
2024-03-15 22:06:45,645 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.balancer2.prob, batch_count=18756.666666666668, ans=0.125
2024-03-15 22:06:55,263 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.ff2_skip_rate, batch_count=18790.0, ans=0.0067847826086956525
2024-03-15 22:06:57,864 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=18790.0, ans=0.125
2024-03-15 22:07:08,225 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.0.layers.1.self_attn_weights, loss-sum=0.000e+00
2024-03-15 22:07:18,484 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.balancer2.prob, batch_count=18856.666666666668, ans=0.125
2024-03-15 22:07:37,940 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module1.balancer1.max_abs, batch_count=18890.0, ans=10.0
2024-03-15 22:07:40,385 INFO [train_char.py:689] (1/2) Epoch 12, batch 100, loss[loss=0.08836, simple_loss=0.1282, pruned_loss=0.02428, over 23830.00 frames. ], tot_loss[loss=0.09347, simple_loss=0.1435, pruned_loss=0.02172, over 1919157.40 frames. ], batch size: 439, lr: 2.21e-02, grad_scale: 16.0
2024-03-15 22:07:48,195 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module2.balancer2.min_abs, batch_count=18923.333333333332, ans=0.48384999999999995
2024-03-15 22:07:54,865 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.bypass.scale_min, batch_count=18956.666666666668, ans=0.2365166666666667
2024-03-15 22:08:02,471 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=18956.666666666668, ans=0.11043333333333333
2024-03-15 22:08:06,469 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=18990.0, ans=0.0
2024-03-15 22:08:19,136 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.176e+01 1.103e+02 1.316e+02 1.837e+02 3.115e+02, threshold=2.631e+02, percent-clipped=6.0
2024-03-15 22:08:28,790 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward3.hidden_balancer.prob, batch_count=19023.333333333332, ans=0.125
2024-03-15 22:08:49,993 INFO [train_char.py:689] (1/2) Epoch 12, batch 150, loss[loss=0.1127, simple_loss=0.1718, pruned_loss=0.02681, over 24127.00 frames. ], tot_loss[loss=0.09197, simple_loss=0.142, pruned_loss=0.02094, over 2563840.84 frames. ], batch size: 236, lr: 2.21e-02, grad_scale: 16.0
2024-03-15 22:09:28,502 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=10.84 vs. limit=14.578333333333335
2024-03-15 22:09:30,432 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_module2.balancer1.max_abs, batch_count=19156.666666666668, ans=10.0
2024-03-15 22:09:40,867 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward3.hidden_balancer.prob, batch_count=19190.0, ans=0.125
2024-03-15 22:09:47,245 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module1.balancer1.prob, batch_count=19223.333333333332, ans=0.125
2024-03-15 22:09:58,942 INFO [train_char.py:689] (1/2) Epoch 12, batch 200, loss[loss=0.1038, simple_loss=0.1604, pruned_loss=0.02363, over 24232.00 frames. ], tot_loss[loss=0.09348, simple_loss=0.1441, pruned_loss=0.02142, over 3055656.42 frames. ], batch size: 189, lr: 2.21e-02, grad_scale: 16.0
2024-03-15 22:10:13,325 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=19290.0, ans=0.125
2024-03-15 22:10:19,749 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.attention_skip_rate, batch_count=19290.0, ans=0.0
2024-03-15 22:10:20,452 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=192, metric=13.14 vs. limit=14.73375
2024-03-15 22:10:25,976 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=19323.333333333332, ans=0.10676666666666668
2024-03-15 22:10:37,611 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.260e+01 1.092e+02 1.355e+02 1.626e+02 2.369e+02, threshold=2.710e+02, percent-clipped=0.0
2024-03-15 22:11:03,400 INFO [train_char.py:689] (1/2) Epoch 12, batch 250, loss[loss=0.1005, simple_loss=0.1568, pruned_loss=0.02211, over 24146.00 frames. ], tot_loss[loss=0.09282, simple_loss=0.1431, pruned_loss=0.02127, over 3441947.72 frames. ], batch size: 188, lr: 2.20e-02, grad_scale: 16.0
2024-03-15 22:11:58,484 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=5.23 vs. limit=5.9285
2024-03-15 22:12:14,009 INFO [train_char.py:689] (1/2) Epoch 12, batch 300, loss[loss=0.1014, simple_loss=0.1522, pruned_loss=0.02532, over 24226.00 frames. ], tot_loss[loss=0.09337, simple_loss=0.144, pruned_loss=0.02137, over 3746640.11 frames. ], batch size: 311, lr: 2.20e-02, grad_scale: 16.0
2024-03-15 22:12:19,131 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_module1.balancer1.prob, batch_count=19590.0, ans=0.125
2024-03-15 22:12:22,244 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=256, metric=14.05 vs. limit=14.846250000000001
2024-03-15 22:12:51,974 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.720e+01 1.174e+02 1.515e+02 1.923e+02 3.546e+02, threshold=3.029e+02, percent-clipped=9.0
2024-03-15 22:13:05,021 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_module1.balancer2.min_abs, batch_count=19723.333333333332, ans=0.49584999999999996
2024-03-15 22:13:14,011 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=19723.333333333332, ans=0.125
2024-03-15 22:13:17,466 INFO [train_char.py:689] (1/2) Epoch 12, batch 350, loss[loss=0.1127, simple_loss=0.1707, pruned_loss=0.02733, over 24095.00 frames. ], tot_loss[loss=0.09413, simple_loss=0.1449, pruned_loss=0.02168, over 3987473.16 frames. ], batch size: 236, lr: 2.19e-02, grad_scale: 16.0
2024-03-15 22:13:31,567 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=19790.0, ans=0.125
2024-03-15 22:14:16,907 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=19890.0, ans=0.125
2024-03-15 22:14:25,148 INFO [train_char.py:689] (1/2) Epoch 12, batch 400, loss[loss=0.09731, simple_loss=0.1541, pruned_loss=0.02025, over 24143.00 frames. ], tot_loss[loss=0.09529, simple_loss=0.1465, pruned_loss=0.02202, over 4175736.05 frames. ], batch size: 188, lr: 2.19e-02, grad_scale: 32.0
2024-03-15 22:14:30,585 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.bypass.scale_min, batch_count=19923.333333333332, ans=0.20268333333333344
2024-03-15 22:15:06,270 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.057e+01 1.097e+02 1.316e+02 1.719e+02 2.765e+02, threshold=2.632e+02, percent-clipped=0.0
2024-03-15 22:15:12,358 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.self_attn1.whiten, num_groups=1, num_channels=192, metric=13.37 vs. limit=22.5
2024-03-15 22:15:27,850 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.balancer2.prob, batch_count=20056.666666666668, ans=0.125
2024-03-15 22:15:31,378 INFO [train_char.py:689] (1/2) Epoch 12, batch 450, loss[loss=0.1072, simple_loss=0.1623, pruned_loss=0.02604, over 24018.00 frames. ], tot_loss[loss=0.09631, simple_loss=0.1481, pruned_loss=0.02225, over 4322566.61 frames. ], batch size: 199, lr: 2.19e-02, grad_scale: 16.0
2024-03-15 22:15:35,761 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=5.79 vs. limit=6.0
2024-03-15 22:15:36,436 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.convnext.out_balancer.prob, batch_count=20090.0, ans=0.125
2024-03-15 22:15:44,696 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.conv_module1.whiten, num_groups=1, num_channels=256, metric=6.81 vs. limit=15.0
2024-03-15 22:15:47,762 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=20123.333333333332, ans=0.0
2024-03-15 22:16:35,457 INFO [train_char.py:689] (1/2) Epoch 12, batch 500, loss[loss=0.09875, simple_loss=0.153, pruned_loss=0.02225, over 24141.00 frames. ], tot_loss[loss=0.0972, simple_loss=0.1497, pruned_loss=0.02237, over 4436532.96 frames. ], batch size: 279, lr: 2.18e-02, grad_scale: 16.0
2024-03-15 22:17:36,118 INFO [train_char.py:689] (1/2) Epoch 13, batch 0, loss[loss=0.08987, simple_loss=0.1426, pruned_loss=0.01857, over 24373.00 frames. ], tot_loss[loss=0.08987, simple_loss=0.1426, pruned_loss=0.01857, over 24373.00 frames. ], batch size: 158, lr: 2.10e-02, grad_scale: 32.0
2024-03-15 22:17:36,118 INFO [train_char.py:712] (1/2) Computing validation loss
2024-03-15 22:17:45,814 INFO [zipformer.py:1858] (1/2) name=encoder.encoders.2.encoder.layers.2.self_attn_weights, attn_weights_entropy = tensor([2.8009, 2.9283, 3.4938, 3.2977], device='cuda:1')
2024-03-15 22:17:54,657 INFO [train_char.py:721] (1/2) Epoch 13, validation: loss=0.0715, simple_loss=0.1264, pruned_loss=0.008321, over 657665.00 frames.
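Note on the ScheduledFloat lines from scaling.py:214: each reports a hyperparameter whose current value ("ans") is a function of batch_count, so regularizers such as conv_skip_rate or dropout_p relax as training progresses (compare encoder.encoders.0.layers.1.conv_skip_rate at batch_count=15210.0, ans=0.003291..., with batch_count=15443.3, ans=0.002319..., earlier in the log). A piecewise-linear schedule in that spirit can be sketched as follows; the class and the example knot points are assumptions for illustration, not icefall's scaling.py implementation.

    from bisect import bisect_right

    class PiecewiseLinear:
        """Value interpolated linearly between (batch_count, value) knots."""

        def __init__(self, *points):
            # points: (batch_count, value) pairs, sorted by batch_count.
            self.x = [p[0] for p in points]
            self.y = [p[1] for p in points]

        def __call__(self, batch_count: float) -> float:
            if batch_count <= self.x[0]:
                return self.y[0]
            if batch_count >= self.x[-1]:
                return self.y[-1]
            i = bisect_right(self.x, batch_count)
            x0, x1 = self.x[i - 1], self.x[i]
            y0, y1 = self.y[i - 1], self.y[i]
            return y0 + (y1 - y0) * (batch_count - x0) / (x1 - x0)

    # Hypothetical example: a skip-rate decaying from 0.025 to 0.0 over 20k batches.
    conv_skip_rate = PiecewiseLinear((0.0, 0.025), (20000.0, 0.0))
    print(conv_skip_rate(15443.33))  # ~0.0057: the value shrinks as batch_count grows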
2024-03-15 22:17:54,658 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-15 22:17:56,388 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward2.hidden_balancer.prob, batch_count=20280.0, ans=0.125 2024-03-15 22:17:56,397 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module1.balancer1.min_positive, batch_count=20280.0, ans=0.025 2024-03-15 22:18:11,244 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module1.balancer1.prob, batch_count=20313.333333333332, ans=0.125 2024-03-15 22:18:19,428 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.ff3_skip_rate, batch_count=20313.333333333332, ans=0.006453623188405797 2024-03-15 22:18:27,333 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.294e+01 1.085e+02 1.296e+02 1.642e+02 3.122e+02, threshold=2.592e+02, percent-clipped=1.0 2024-03-15 22:18:29,038 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=20346.666666666668, ans=0.1 2024-03-15 22:18:30,300 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module1.balancer2.prob, batch_count=20346.666666666668, ans=0.125 2024-03-15 22:18:45,710 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=20380.0, ans=0.1 2024-03-15 22:18:48,778 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=12.66 vs. limit=15.0 2024-03-15 22:19:04,162 INFO [train_char.py:689] (1/2) Epoch 13, batch 50, loss[loss=0.09107, simple_loss=0.1366, pruned_loss=0.02277, over 24222.00 frames. ], tot_loss[loss=0.09124, simple_loss=0.1417, pruned_loss=0.02042, over 1086041.04 frames. ], batch size: 328, lr: 2.09e-02, grad_scale: 32.0 2024-03-15 22:19:05,985 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module2.balancer2.prob, batch_count=20446.666666666668, ans=0.125 2024-03-15 22:19:30,264 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward3.hidden_balancer.prob, batch_count=20513.333333333332, ans=0.125 2024-03-15 22:20:11,618 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=20580.0, ans=0.125 2024-03-15 22:20:16,622 INFO [train_char.py:689] (1/2) Epoch 13, batch 100, loss[loss=0.08525, simple_loss=0.135, pruned_loss=0.01776, over 24156.00 frames. ], tot_loss[loss=0.0898, simple_loss=0.1402, pruned_loss=0.01969, over 1914390.42 frames. ], batch size: 344, lr: 2.09e-02, grad_scale: 32.0 2024-03-15 22:20:17,420 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=9.96 vs. 
limit=15.0 2024-03-15 22:20:24,438 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.conv.2.prob, batch_count=20613.333333333332, ans=0.125 2024-03-15 22:20:25,889 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=20613.333333333332, ans=0.125 2024-03-15 22:20:25,926 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module2.balancer1.prob, batch_count=20613.333333333332, ans=0.125 2024-03-15 22:20:49,099 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.262e+01 1.069e+02 1.330e+02 1.855e+02 3.421e+02, threshold=2.660e+02, percent-clipped=6.0 2024-03-15 22:21:06,513 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module1.balancer2.min_positive, batch_count=20713.333333333332, ans=0.05 2024-03-15 22:21:06,616 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module2.balancer2.prob, batch_count=20713.333333333332, ans=0.125 2024-03-15 22:21:07,162 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=512, metric=22.66 vs. limit=22.5 2024-03-15 22:21:09,657 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=256, metric=11.47 vs. limit=15.0 2024-03-15 22:21:14,963 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=256, metric=14.56 vs. limit=15.0 2024-03-15 22:21:21,393 INFO [train_char.py:689] (1/2) Epoch 13, batch 150, loss[loss=0.1062, simple_loss=0.1678, pruned_loss=0.02231, over 24217.00 frames. ], tot_loss[loss=0.08894, simple_loss=0.1388, pruned_loss=0.01955, over 2553182.92 frames. ], batch size: 224, lr: 2.09e-02, grad_scale: 16.0 2024-03-15 22:21:24,135 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module2.balancer2.min_abs, batch_count=20780.0, ans=0.5 2024-03-15 22:21:25,406 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.bypass_mid.scale_min, batch_count=20780.0, ans=0.2 2024-03-15 22:21:37,644 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=20813.333333333332, ans=0.125 2024-03-15 22:21:47,488 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=5.13 vs. limit=6.0 2024-03-15 22:22:30,654 INFO [train_char.py:689] (1/2) Epoch 13, batch 200, loss[loss=0.06459, simple_loss=0.1135, pruned_loss=0.007817, over 24306.00 frames. ], tot_loss[loss=0.09018, simple_loss=0.1409, pruned_loss=0.01972, over 3055496.39 frames. 
], batch size: 146, lr: 2.08e-02, grad_scale: 16.0 2024-03-15 22:22:39,845 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.balancer1.prob, batch_count=20946.666666666668, ans=0.125 2024-03-15 22:22:49,634 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.ff3_skip_rate, batch_count=20980.0, ans=0.006308695652173913 2024-03-15 22:22:54,747 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=21013.333333333332, ans=0.125 2024-03-15 22:23:06,058 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.679e+01 1.109e+02 1.340e+02 1.722e+02 4.051e+02, threshold=2.681e+02, percent-clipped=3.0 2024-03-15 22:23:08,882 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.bypass.skip_rate, batch_count=21013.333333333332, ans=0.07 2024-03-15 22:23:38,087 INFO [train_char.py:689] (1/2) Epoch 13, batch 250, loss[loss=0.09615, simple_loss=0.1488, pruned_loss=0.02177, over 24245.00 frames. ], tot_loss[loss=0.09018, simple_loss=0.1405, pruned_loss=0.01994, over 3448809.14 frames. ], batch size: 296, lr: 2.08e-02, grad_scale: 16.0 2024-03-15 22:23:57,527 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_module2.balancer2.prob, batch_count=21146.666666666668, ans=0.125 2024-03-15 22:24:03,912 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.ff2_skip_rate, batch_count=21180.0, ans=0.006265217391304348 2024-03-15 22:24:07,750 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_skip_rate, batch_count=21180.0, ans=0.0 2024-03-15 22:24:33,953 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.ff2_skip_rate, batch_count=21246.666666666668, ans=0.00625072463768116 2024-03-15 22:24:41,307 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module2.balancer2.min_positive, batch_count=21246.666666666668, ans=0.05 2024-03-15 22:24:44,884 INFO [train_char.py:689] (1/2) Epoch 13, batch 300, loss[loss=0.09707, simple_loss=0.1508, pruned_loss=0.02168, over 24219.00 frames. ], tot_loss[loss=0.09127, simple_loss=0.1421, pruned_loss=0.02023, over 3754206.55 frames. ], batch size: 311, lr: 2.08e-02, grad_scale: 16.0 2024-03-15 22:25:02,962 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=21313.333333333332, ans=0.0 2024-03-15 22:25:03,115 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_module1.balancer2.prob, batch_count=21313.333333333332, ans=0.125 2024-03-15 22:25:14,167 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=14.97 vs. limit=15.0 2024-03-15 22:25:19,872 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.169e+01 1.103e+02 1.320e+02 1.684e+02 3.519e+02, threshold=2.641e+02, percent-clipped=4.0 2024-03-15 22:25:21,432 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=21346.666666666668, ans=0.125 2024-03-15 22:25:51,552 INFO [train_char.py:689] (1/2) Epoch 13, batch 350, loss[loss=0.1029, simple_loss=0.1643, pruned_loss=0.02071, over 24080.00 frames. 
], tot_loss[loss=0.0919, simple_loss=0.1431, pruned_loss=0.02035, over 3992138.21 frames. ], batch size: 251, lr: 2.07e-02, grad_scale: 16.0 2024-03-15 22:25:55,716 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.balancer2.prob, batch_count=21446.666666666668, ans=0.125 2024-03-15 22:25:58,287 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=21446.666666666668, ans=0.0 2024-03-15 22:26:20,768 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.bypass.skip_rate, batch_count=21513.333333333332, ans=0.035 2024-03-15 22:26:55,086 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.balancer2.prob, batch_count=21580.0, ans=0.125 2024-03-15 22:26:58,606 INFO [train_char.py:689] (1/2) Epoch 13, batch 400, loss[loss=0.07303, simple_loss=0.1267, pruned_loss=0.009661, over 24320.00 frames. ], tot_loss[loss=0.09239, simple_loss=0.1441, pruned_loss=0.02034, over 4180879.95 frames. ], batch size: 140, lr: 2.07e-02, grad_scale: 32.0 2024-03-15 22:27:11,705 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.self_attn1.whiten, num_groups=1, num_channels=512, metric=14.89 vs. limit=22.5 2024-03-15 22:27:18,906 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=21646.666666666668, ans=0.125 2024-03-15 22:27:21,356 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_skip_rate, batch_count=21646.666666666668, ans=0.0 2024-03-15 22:27:25,287 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=21680.0, ans=0.1 2024-03-15 22:27:29,070 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward2.hidden_balancer.prob, batch_count=21680.0, ans=0.125 2024-03-15 22:27:29,991 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.006e+01 1.047e+02 1.201e+02 1.533e+02 2.664e+02, threshold=2.402e+02, percent-clipped=3.0 2024-03-15 22:27:30,280 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.bypass.scale_min, batch_count=21680.0, ans=0.2 2024-03-15 22:27:32,760 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward1.hidden_balancer.prob, batch_count=21680.0, ans=0.125 2024-03-15 22:27:34,118 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward3.hidden_balancer.prob, batch_count=21680.0, ans=0.125 2024-03-15 22:27:57,672 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.bypass.scale_min, batch_count=21746.666666666668, ans=0.2 2024-03-15 22:28:02,328 INFO [train_char.py:689] (1/2) Epoch 13, batch 450, loss[loss=0.0939, simple_loss=0.1511, pruned_loss=0.01836, over 24144.00 frames. ], tot_loss[loss=0.09292, simple_loss=0.1451, pruned_loss=0.02037, over 4327832.16 frames. 
], batch size: 279, lr: 2.07e-02, grad_scale: 32.0 2024-03-15 22:28:04,982 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward2.hidden_balancer.prob, batch_count=21780.0, ans=0.125 2024-03-15 22:28:41,286 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_module2.balancer1.prob, batch_count=21880.0, ans=0.125 2024-03-15 22:28:43,832 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_skip_rate, batch_count=21880.0, ans=0.0 2024-03-15 22:28:52,504 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=512, metric=4.15 vs. limit=15.0 2024-03-15 22:28:54,464 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.ff3_skip_rate, batch_count=21913.333333333332, ans=0.006105797101449276 2024-03-15 22:29:08,188 INFO [train_char.py:689] (1/2) Epoch 13, batch 500, loss[loss=0.1032, simple_loss=0.1596, pruned_loss=0.02339, over 24084.00 frames. ], tot_loss[loss=0.09409, simple_loss=0.1471, pruned_loss=0.02053, over 4438193.69 frames. ], batch size: 236, lr: 2.06e-02, grad_scale: 32.0 2024-03-15 22:30:09,794 INFO [train_char.py:689] (1/2) Epoch 14, batch 0, loss[loss=0.06021, simple_loss=0.08998, pruned_loss=0.01522, over 22346.00 frames. ], tot_loss[loss=0.06021, simple_loss=0.08998, pruned_loss=0.01522, over 22346.00 frames. ], batch size: 483, lr: 1.99e-02, grad_scale: 32.0 2024-03-15 22:30:09,794 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-15 22:30:22,538 INFO [zipformer.py:1858] (1/2) name=encoder.encoders.3.encoder.layers.0.self_attn_weights, attn_weights_entropy = tensor([2.8193, 2.6506, 2.3611, 2.3875, 1.8530, 2.3082, 2.5095, 2.3175], device='cuda:1') 2024-03-15 22:30:23,551 INFO [train_char.py:721] (1/2) Epoch 14, validation: loss=0.07044, simple_loss=0.125, pruned_loss=0.007917, over 657665.00 frames. 2024-03-15 22:30:23,551 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-15 22:30:35,403 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.whiten, num_groups=1, num_channels=384, metric=7.17 vs. limit=12.0 2024-03-15 22:30:36,558 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module2.balancer2.prob, batch_count=22003.333333333332, ans=0.125 2024-03-15 22:30:41,200 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=256, metric=21.09 vs. limit=22.5 2024-03-15 22:30:43,291 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=22003.333333333332, ans=0.125 2024-03-15 22:30:48,074 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.362e+01 1.053e+02 1.320e+02 1.724e+02 3.044e+02, threshold=2.640e+02, percent-clipped=7.0 2024-03-15 22:31:15,181 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_skip_rate, batch_count=22070.0, ans=0.0 2024-03-15 22:31:15,752 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.self_attn2.whiten, num_groups=1, num_channels=192, metric=17.86 vs. limit=22.5 2024-03-15 22:31:39,354 INFO [train_char.py:689] (1/2) Epoch 14, batch 50, loss[loss=0.08244, simple_loss=0.1312, pruned_loss=0.01687, over 24232.00 frames. 
], tot_loss[loss=0.08817, simple_loss=0.1392, pruned_loss=0.01858, over 1086582.56 frames. ], batch size: 328, lr: 1.99e-02, grad_scale: 32.0 2024-03-15 22:31:51,799 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=22170.0, ans=0.1 2024-03-15 22:31:53,212 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=22170.0, ans=0.125 2024-03-15 22:32:03,141 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=5.90 vs. limit=6.0 2024-03-15 22:32:04,056 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=22170.0, ans=0.125 2024-03-15 22:32:09,122 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=22203.333333333332, ans=0.1 2024-03-15 22:32:15,637 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.balancer2.prob, batch_count=22203.333333333332, ans=0.125 2024-03-15 22:32:15,686 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward1.hidden_balancer.prob, batch_count=22203.333333333332, ans=0.125 2024-03-15 22:32:45,119 INFO [train_char.py:689] (1/2) Epoch 14, batch 100, loss[loss=0.1063, simple_loss=0.1666, pruned_loss=0.02297, over 24070.00 frames. ], tot_loss[loss=0.08781, simple_loss=0.1391, pruned_loss=0.01827, over 1912870.58 frames. ], batch size: 250, lr: 1.98e-02, grad_scale: 32.0 2024-03-15 22:33:08,437 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.509e+01 1.179e+02 1.448e+02 1.825e+02 3.837e+02, threshold=2.896e+02, percent-clipped=4.0 2024-03-15 22:33:41,269 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module2.balancer2.prob, batch_count=22436.666666666668, ans=0.125 2024-03-15 22:33:48,580 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module2.balancer2.prob, batch_count=22436.666666666668, ans=0.125 2024-03-15 22:33:59,115 INFO [train_char.py:689] (1/2) Epoch 14, batch 150, loss[loss=0.11, simple_loss=0.1707, pruned_loss=0.02467, over 24061.00 frames. ], tot_loss[loss=0.08801, simple_loss=0.1395, pruned_loss=0.01824, over 2557409.46 frames. ], batch size: 236, lr: 1.98e-02, grad_scale: 16.0 2024-03-15 22:34:03,393 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module1.balancer2.min_positive, batch_count=22470.0, ans=0.05 2024-03-15 22:34:04,713 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_skip_rate, batch_count=22470.0, ans=0.0 2024-03-15 22:34:15,474 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=5.95 vs. 
limit=6.0 2024-03-15 22:34:18,989 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass.scale_min, batch_count=22503.333333333332, ans=0.2 2024-03-15 22:34:24,073 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_skip_rate, batch_count=22536.666666666668, ans=0.0 2024-03-15 22:34:42,195 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.balancer2.prob, batch_count=22570.0, ans=0.125 2024-03-15 22:34:43,485 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.3.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-15 22:34:49,868 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=22603.333333333332, ans=0.0 2024-03-15 22:34:52,534 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.ff3_skip_rate, batch_count=22603.333333333332, ans=0.005955797101449276 2024-03-15 22:34:55,073 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=22603.333333333332, ans=0.1 2024-03-15 22:34:57,545 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=22603.333333333332, ans=0.125 2024-03-15 22:35:01,518 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module1.balancer2.prob, batch_count=22603.333333333332, ans=0.125 2024-03-15 22:35:03,860 INFO [train_char.py:689] (1/2) Epoch 14, batch 200, loss[loss=0.09401, simple_loss=0.1508, pruned_loss=0.01861, over 24369.00 frames. ], tot_loss[loss=0.08847, simple_loss=0.1404, pruned_loss=0.01828, over 3058456.29 frames. ], batch size: 180, lr: 1.98e-02, grad_scale: 16.0 2024-03-15 22:35:09,341 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=22636.666666666668, ans=0.1 2024-03-15 22:35:10,895 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=384, metric=22.23 vs. limit=22.5 2024-03-15 22:35:15,881 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.attention_skip_rate, batch_count=22670.0, ans=0.0 2024-03-15 22:35:21,511 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=256, metric=15.87 vs. limit=22.5 2024-03-15 22:35:23,543 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.attention_skip_rate, batch_count=22670.0, ans=0.0 2024-03-15 22:35:28,457 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.371e+01 1.071e+02 1.309e+02 1.701e+02 3.687e+02, threshold=2.619e+02, percent-clipped=6.0 2024-03-15 22:36:07,287 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.balancer2.prob, batch_count=22803.333333333332, ans=0.125 2024-03-15 22:36:08,279 INFO [train_char.py:689] (1/2) Epoch 14, batch 250, loss[loss=0.09694, simple_loss=0.1555, pruned_loss=0.01918, over 24118.00 frames. ], tot_loss[loss=0.08789, simple_loss=0.1396, pruned_loss=0.01811, over 3452587.01 frames. 
2024-03-15 22:36:13,333 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module2.balancer2.prob, batch_count=22803.333333333332, ans=0.125
2024-03-15 22:36:19,736 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module2.balancer2.prob, batch_count=22836.666666666668, ans=0.125
2024-03-15 22:36:53,771 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.4.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00
2024-03-15 22:37:09,816 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.feed_forward2.out_whiten, num_groups=1, num_channels=512, metric=10.95 vs. limit=15.0
2024-03-15 22:37:19,307 INFO [train_char.py:689] (1/2) Epoch 14, batch 300, loss[loss=0.1028, simple_loss=0.1636, pruned_loss=0.02095, over 24032.00 frames. ], tot_loss[loss=0.08824, simple_loss=0.1402, pruned_loss=0.01814, over 3757754.04 frames. ], batch size: 250, lr: 1.97e-02, grad_scale: 16.0
2024-03-15 22:37:20,738 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.ff3_skip_rate, batch_count=22970.0, ans=0.0058760869565217385
2024-03-15 22:37:27,060 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=22970.0, ans=0.125
2024-03-15 22:37:29,463 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_module2.balancer1.min_positive, batch_count=22970.0, ans=0.025
2024-03-15 22:37:43,068 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.042e+01 1.084e+02 1.249e+02 1.783e+02 3.247e+02, threshold=2.497e+02, percent-clipped=4.0
2024-03-15 22:37:53,670 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=23036.666666666668, ans=0.125
2024-03-15 22:38:00,047 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=23070.0, ans=0.125
2024-03-15 22:38:22,488 INFO [train_char.py:689] (1/2) Epoch 14, batch 350, loss[loss=0.1048, simple_loss=0.164, pruned_loss=0.02273, over 24097.00 frames. ], tot_loss[loss=0.08944, simple_loss=0.142, pruned_loss=0.01843, over 3990898.54 frames. ], batch size: 236, lr: 1.97e-02, grad_scale: 16.0
2024-03-15 22:38:22,765 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=23136.666666666668, ans=0.125
2024-03-15 22:38:38,084 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.nonlin_attention.balancer.max_positive, batch_count=23170.0, ans=0.95
2024-03-15 22:38:57,746 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.whiten, num_groups=1, num_channels=192, metric=4.00 vs. limit=12.0
2024-03-15 22:39:00,349 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.conv_module1.whiten, num_groups=1, num_channels=192, metric=4.63 vs. limit=15.0
2024-03-15 22:39:29,952 INFO [train_char.py:689] (1/2) Epoch 14, batch 400, loss[loss=0.08704, simple_loss=0.1423, pruned_loss=0.01588, over 24279.00 frames. ], tot_loss[loss=0.08979, simple_loss=0.1423, pruned_loss=0.01861, over 4177546.03 frames. ], batch size: 296, lr: 1.96e-02, grad_scale: 32.0
2024-03-15 22:39:36,888 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=192, metric=4.58 vs. limit=10.0
2024-03-15 22:39:46,720 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=23336.666666666668, ans=0.125
2024-03-15 22:39:54,048 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.676e+01 1.024e+02 1.200e+02 1.525e+02 3.223e+02, threshold=2.400e+02, percent-clipped=2.0
2024-03-15 22:39:59,600 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.2.encoder.layers.2.self_attn_weights, loss-sum=0.000e+00
2024-03-15 22:40:30,507 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.out_proj.dropout_p, batch_count=23436.666666666668, ans=0.1
2024-03-15 22:40:31,790 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.out_proj.dropout_p, batch_count=23436.666666666668, ans=0.1
2024-03-15 22:40:34,866 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.conv_module2.whiten, num_groups=1, num_channels=384, metric=7.45 vs. limit=15.0
2024-03-15 22:40:35,486 INFO [train_char.py:689] (1/2) Epoch 14, batch 450, loss[loss=0.1017, simple_loss=0.1573, pruned_loss=0.02308, over 24123.00 frames. ], tot_loss[loss=0.09061, simple_loss=0.1434, pruned_loss=0.0189, over 4322658.65 frames. ], batch size: 199, lr: 1.96e-02, grad_scale: 32.0
2024-03-15 22:40:53,035 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.balancer.prob, batch_count=23503.333333333332, ans=0.125
2024-03-15 22:41:06,108 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.out_proj.dropout_p, batch_count=23536.666666666668, ans=0.1
2024-03-15 22:41:09,002 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.96 vs. limit=6.0
2024-03-15 22:41:28,977 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=14.00 vs. limit=15.0
2024-03-15 22:41:35,512 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=144, metric=9.56 vs. limit=10.0
2024-03-15 22:41:37,226 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=23603.333333333332, ans=0.1
2024-03-15 22:41:39,548 INFO [train_char.py:689] (1/2) Epoch 14, batch 500, loss[loss=0.09605, simple_loss=0.1566, pruned_loss=0.01773, over 24105.00 frames. ], tot_loss[loss=0.09136, simple_loss=0.1449, pruned_loss=0.01893, over 4436186.45 frames. ], batch size: 199, lr: 1.96e-02, grad_scale: 16.0
2024-03-15 22:41:41,399 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=256, metric=12.32 vs. limit=15.0
2024-03-15 22:41:42,431 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.bypass_mid.scale_min, batch_count=23636.666666666668, ans=0.2
2024-03-15 22:42:40,784 INFO [train_char.py:689] (1/2) Epoch 15, batch 0, loss[loss=0.07464, simple_loss=0.129, pruned_loss=0.01014, over 24255.00 frames. ], tot_loss[loss=0.07464, simple_loss=0.129, pruned_loss=0.01014, over 24255.00 frames. ], batch size: 140, lr: 1.89e-02, grad_scale: 32.0
2024-03-15 22:42:40,785 INFO [train_char.py:712] (1/2) Computing validation loss
2024-03-15 22:42:47,207 INFO [zipformer.py:1858] (1/2) name=encoder.encoders.3.encoder.layers.3.self_attn_weights, attn_weights_entropy = tensor([2.8769, 2.6825, 2.8125, 2.5207, 2.4892, 2.5649, 2.4620, 2.4304], device='cuda:1')
2024-03-15 22:42:54,295 INFO [train_char.py:721] (1/2) Epoch 15, validation: loss=0.06886, simple_loss=0.1225, pruned_loss=0.007616, over 657665.00 frames.
2024-03-15 22:42:54,296 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB
2024-03-15 22:42:57,450 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.bypass_mid.scale_min, batch_count=23660.0, ans=0.2
2024-03-15 22:43:11,612 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.091e+01 9.908e+01 1.153e+02 1.389e+02 3.573e+02, threshold=2.307e+02, percent-clipped=1.0
2024-03-15 22:43:24,584 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.bypass.skip_rate, batch_count=23726.666666666668, ans=0.04949747468305833
2024-03-15 22:43:34,406 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.conv_module2.whiten, num_groups=1, num_channels=384, metric=4.15 vs. limit=15.0
2024-03-15 22:43:47,234 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.bypass.scale_min, batch_count=23760.0, ans=0.2
2024-03-15 22:43:56,086 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=192, metric=9.38 vs. limit=15.0
2024-03-15 22:43:59,859 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.conv_module2.whiten, num_groups=1, num_channels=384, metric=2.86 vs. limit=15.0
2024-03-15 22:44:05,865 INFO [train_char.py:689] (1/2) Epoch 15, batch 50, loss[loss=0.0726, simple_loss=0.1214, pruned_loss=0.01191, over 24316.00 frames. ], tot_loss[loss=0.0845, simple_loss=0.1365, pruned_loss=0.01624, over 1092124.67 frames. ], batch size: 129, lr: 1.89e-02, grad_scale: 32.0
2024-03-15 22:44:06,292 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.ff2_skip_rate, batch_count=23826.666666666668, ans=0.005689855072463768
2024-03-15 22:44:42,144 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.conv_module1.whiten, num_groups=1, num_channels=256, metric=4.04 vs. limit=15.0
2024-03-15 22:44:58,616 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=23926.666666666668, ans=0.0
2024-03-15 22:45:04,955 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.bypass_mid.scale_min, batch_count=23960.0, ans=0.2
2024-03-15 22:45:15,030 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.1.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00
2024-03-15 22:45:17,468 INFO [train_char.py:689] (1/2) Epoch 15, batch 100, loss[loss=0.07898, simple_loss=0.1266, pruned_loss=0.01569, over 24209.00 frames. ], tot_loss[loss=0.0854, simple_loss=0.1368, pruned_loss=0.01701, over 1916047.54 frames. ], batch size: 116, lr: 1.88e-02, grad_scale: 32.0
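Each "loss[...]" record reports three numbers: the combined loss, the simple (linear-joiner) loss, and the pruned RNN-T loss. In this log they relate as loss = 0.5 * simple_loss + pruned_loss; that weighting is inferred from the logged values themselves, e.g. for Epoch 14, batch 200: 0.5 * 0.1508 + 0.01861 = 0.09401. A one-line sketch of that combination:

```python
# Sketch of how the logged loss fields combine. The 0.5 / 1.0 scales are
# inferred from the numbers in this log, not read from the recipe code.

def combined_loss(simple_loss: float, pruned_loss: float,
                  simple_loss_scale: float = 0.5,
                  pruned_loss_scale: float = 1.0) -> float:
    return simple_loss_scale * simple_loss + pruned_loss_scale * pruned_loss

# Epoch 14, batch 200 from the log above:
assert abs(combined_loss(0.1508, 0.01861) - 0.09401) < 1e-5
```

The per-batch loss[...] is computed on the current batch, while tot_loss[...] is a running, frame-weighted aggregate over the epoch so far (note its frame count growing from 3.06M to 4.44M across Epoch 14).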
2024-03-15 22:45:17,851 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass.scale_min, batch_count=23993.333333333332, ans=0.2
2024-03-15 22:45:27,395 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=23993.333333333332, ans=0.1
2024-03-15 22:45:39,811 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.932e+01 1.121e+02 1.256e+02 1.613e+02 3.118e+02, threshold=2.512e+02, percent-clipped=5.0
2024-03-15 22:45:44,061 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.bypass.scale_min, batch_count=24026.666666666668, ans=0.2
2024-03-15 22:45:59,935 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.whiten, num_groups=1, num_channels=256, metric=3.45 vs. limit=12.0
2024-03-15 22:46:00,219 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=12.50 vs. limit=15.0
2024-03-15 22:46:27,880 INFO [train_char.py:689] (1/2) Epoch 15, batch 150, loss[loss=0.06912, simple_loss=0.0994, pruned_loss=0.01942, over 22787.00 frames. ], tot_loss[loss=0.08443, simple_loss=0.1355, pruned_loss=0.01669, over 2557124.48 frames. ], batch size: 483, lr: 1.88e-02, grad_scale: 32.0
2024-03-15 22:46:39,787 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_skip_rate, batch_count=24193.333333333332, ans=0.0
2024-03-15 22:46:54,997 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=24226.666666666668, ans=0.0
2024-03-15 22:47:05,562 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.attention_skip_rate, batch_count=24260.0, ans=0.0
2024-03-15 22:47:13,402 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.bypass_mid.scale_min, batch_count=24260.0, ans=0.2
2024-03-15 22:47:14,750 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_skip_rate, batch_count=24260.0, ans=0.0
2024-03-15 22:47:32,166 INFO [train_char.py:689] (1/2) Epoch 15, batch 200, loss[loss=0.07669, simple_loss=0.1199, pruned_loss=0.01673, over 24032.00 frames. ], tot_loss[loss=0.08462, simple_loss=0.1358, pruned_loss=0.01672, over 3058325.48 frames. ], batch size: 381, lr: 1.88e-02, grad_scale: 32.0
2024-03-15 22:47:34,994 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module1.balancer1.min_positive, batch_count=24326.666666666668, ans=0.025
2024-03-15 22:47:49,168 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=24360.0, ans=0.1
2024-03-15 22:47:52,618 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.835e+01 1.004e+02 1.271e+02 1.513e+02 3.157e+02, threshold=2.541e+02, percent-clipped=4.0
2024-03-15 22:48:03,698 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=512, metric=12.76 vs. limit=15.0
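The recurring "Clipping_scale=2.0, grad-norm quartiles ... threshold=..." warnings from optim.py report the min/25%/median/75%/max of recently observed gradient norms; the clipping threshold is the median scaled by clipping_scale (in the warning above, 2.0 * 1.256e+02 = 2.512e+02), and percent-clipped says how often steps exceeded it. A sketch of that rule, assuming this interpretation (icefall's ScaledAdam does this with more bookkeeping; the function and variable names here are ours):

```python
# Sketch of median-based gradient clipping, inferred from the logged
# quartiles/threshold relationship (threshold = clipping_scale * median).

import torch

def clip_by_median(grads: list[torch.Tensor],
                   recent_norms: torch.Tensor,
                   clipping_scale: float = 2.0) -> float:
    # Quartiles of the recent gradient-norm history, as in the warning.
    quartiles = torch.quantile(
        recent_norms, torch.tensor([0.0, 0.25, 0.5, 0.75, 1.0]))
    threshold = clipping_scale * quartiles[2]        # scale the median
    total_norm = torch.norm(torch.stack([g.norm() for g in grads]))
    if total_norm > threshold:
        for g in grads:
            g.mul_(threshold / total_norm)           # shrink, don't zero
    return float(threshold)
```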
2024-03-15 22:48:23,567 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_skip_rate, batch_count=24426.666666666668, ans=0.0
2024-03-15 22:48:43,215 INFO [train_char.py:689] (1/2) Epoch 15, batch 250, loss[loss=0.08251, simple_loss=0.1384, pruned_loss=0.01332, over 24358.00 frames. ], tot_loss[loss=0.08456, simple_loss=0.1358, pruned_loss=0.01664, over 3446939.41 frames. ], batch size: 172, lr: 1.87e-02, grad_scale: 16.0
2024-03-15 22:48:46,394 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=512, metric=14.15 vs. limit=15.0
2024-03-15 22:49:10,168 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=24560.0, ans=0.125
2024-03-15 22:49:22,622 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module1.balancer1.prob, batch_count=24593.333333333332, ans=0.125
2024-03-15 22:49:29,064 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.bypass.scale_min, batch_count=24593.333333333332, ans=0.2
2024-03-15 22:49:46,326 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.self_attn1.whiten, num_groups=1, num_channels=192, metric=15.61 vs. limit=22.5
2024-03-15 22:49:46,535 INFO [train_char.py:689] (1/2) Epoch 15, batch 300, loss[loss=0.09631, simple_loss=0.1515, pruned_loss=0.02056, over 24222.00 frames. ], tot_loss[loss=0.08383, simple_loss=0.1348, pruned_loss=0.01645, over 3751480.73 frames. ], batch size: 296, lr: 1.87e-02, grad_scale: 16.0
2024-03-15 22:50:04,342 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.889e+01 1.067e+02 1.348e+02 1.795e+02 4.377e+02, threshold=2.695e+02, percent-clipped=7.0
2024-03-15 22:50:04,680 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.balancer2.prob, batch_count=24693.333333333332, ans=0.125
2024-03-15 22:50:16,249 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.4.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00
2024-03-15 22:50:16,260 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=24726.666666666668, ans=0.125
2024-03-15 22:50:32,897 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.whiten, num_groups=1, num_channels=192, metric=4.53 vs. limit=12.0
2024-03-15 22:50:54,784 INFO [train_char.py:689] (1/2) Epoch 15, batch 350, loss[loss=0.07088, simple_loss=0.1175, pruned_loss=0.01213, over 24218.00 frames. ], tot_loss[loss=0.08465, simple_loss=0.1363, pruned_loss=0.01651, over 3995508.28 frames. ], batch size: 122, lr: 1.87e-02, grad_scale: 16.0
2024-03-15 22:51:13,451 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.convnext.out_balancer.prob, batch_count=24860.0, ans=0.125
2024-03-15 22:51:26,361 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=24893.333333333332, ans=0.0
2024-03-15 22:51:33,862 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward3.hidden_balancer.prob, batch_count=24926.666666666668, ans=0.125
2024-03-15 22:51:39,019 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.bypass.skip_rate, batch_count=24926.666666666668, ans=0.09899494936611666
2024-03-15 22:51:59,421 INFO [train_char.py:689] (1/2) Epoch 15, batch 400, loss[loss=0.1006, simple_loss=0.1591, pruned_loss=0.02104, over 24286.00 frames. ], tot_loss[loss=0.08574, simple_loss=0.1377, pruned_loss=0.01687, over 4182639.52 frames. ], batch size: 267, lr: 1.86e-02, grad_scale: 32.0
2024-03-15 22:52:08,148 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_module1.balancer2.prob, batch_count=24993.333333333332, ans=0.125
2024-03-15 22:52:16,928 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.self_attn_weights.pos_emb_skip_rate, batch_count=25026.666666666668, ans=0.0
2024-03-15 22:52:17,769 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.899e+01 1.133e+02 1.427e+02 1.789e+02 2.973e+02, threshold=2.854e+02, percent-clipped=2.0
2024-03-15 22:52:49,191 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.nonlin_attention.whiten2, num_groups=1, num_channels=512, metric=4.70 vs. limit=15.0
2024-03-15 22:52:58,718 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_skip_rate, batch_count=25126.666666666668, ans=0.0
2024-03-15 22:53:02,340 INFO [train_char.py:689] (1/2) Epoch 15, batch 450, loss[loss=0.09382, simple_loss=0.1529, pruned_loss=0.01739, over 24242.00 frames. ], tot_loss[loss=0.08784, simple_loss=0.1411, pruned_loss=0.01731, over 4328059.13 frames. ], batch size: 212, lr: 1.86e-02, grad_scale: 32.0
2024-03-15 22:53:16,610 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.balancer2.prob, batch_count=25193.333333333332, ans=0.125
2024-03-15 22:53:18,885 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.bypass_mid.scale_min, batch_count=25193.333333333332, ans=0.2
2024-03-15 22:53:29,546 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=13.27 vs. limit=15.0
2024-03-15 22:53:30,940 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.conv_module1.whiten, num_groups=1, num_channels=192, metric=6.64 vs. limit=15.0
2024-03-15 22:53:35,238 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward1.hidden_balancer.prob, batch_count=25226.666666666668, ans=0.125
2024-03-15 22:53:47,022 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.whiten, num_groups=1, num_channels=384, metric=7.04 vs. limit=12.0
2024-03-15 22:53:55,015 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.bypass.skip_rate, batch_count=25293.333333333332, ans=0.09899494936611666
2024-03-15 22:54:07,942 INFO [train_char.py:689] (1/2) Epoch 15, batch 500, loss[loss=0.09747, simple_loss=0.1486, pruned_loss=0.0232, over 24090.00 frames. ], tot_loss[loss=0.08904, simple_loss=0.1432, pruned_loss=0.01745, over 4439656.74 frames. ], batch size: 188, lr: 1.86e-02, grad_scale: 32.0
2024-03-15 22:55:10,879 INFO [train_char.py:689] (1/2) Epoch 16, batch 0, loss[loss=0.0923, simple_loss=0.1445, pruned_loss=0.02004, over 24126.00 frames. ], tot_loss[loss=0.0923, simple_loss=0.1445, pruned_loss=0.02004, over 24126.00 frames. ], batch size: 279, lr: 1.80e-02, grad_scale: 32.0
2024-03-15 22:55:10,880 INFO [train_char.py:712] (1/2) Computing validation loss
2024-03-15 22:55:25,228 INFO [train_char.py:721] (1/2) Epoch 16, validation: loss=0.06804, simple_loss=0.1217, pruned_loss=0.007178, over 657665.00 frames.
2024-03-15 22:55:25,229 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB
2024-03-15 22:55:34,736 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.602e+01 9.561e+01 1.102e+02 1.272e+02 3.359e+02, threshold=2.205e+02, percent-clipped=1.0
2024-03-15 22:55:40,455 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=25383.333333333332, ans=0.125
2024-03-15 22:55:42,304 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=12.69 vs. limit=15.0
2024-03-15 22:55:53,903 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module1.balancer1.prob, batch_count=25416.666666666668, ans=0.125
2024-03-15 22:56:17,449 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.conv_module1.whiten, num_groups=1, num_channels=192, metric=7.37 vs. limit=15.0
2024-03-15 22:56:24,212 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.self_attn2.whiten, num_groups=1, num_channels=384, metric=22.73 vs. limit=22.5
2024-03-15 22:56:35,625 INFO [train_char.py:689] (1/2) Epoch 16, batch 50, loss[loss=0.06481, simple_loss=0.08992, pruned_loss=0.01985, over 22613.00 frames. ], tot_loss[loss=0.08292, simple_loss=0.1328, pruned_loss=0.01651, over 1081873.27 frames. ], batch size: 483, lr: 1.79e-02, grad_scale: 32.0
2024-03-15 22:56:40,141 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_skip_rate, batch_count=25516.666666666668, ans=0.0
2024-03-15 22:56:40,592 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=12.58 vs. limit=15.0
2024-03-15 22:57:20,898 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.ff2_skip_rate, batch_count=25616.666666666668, ans=0.005300724637681159
2024-03-15 22:57:46,072 INFO [train_char.py:689] (1/2) Epoch 16, batch 100, loss[loss=0.09265, simple_loss=0.152, pruned_loss=0.01667, over 24140.00 frames. ], tot_loss[loss=0.08244, simple_loss=0.1333, pruned_loss=0.0158, over 1909977.00 frames. ], batch size: 188, lr: 1.79e-02, grad_scale: 32.0
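The learning rate falls both within an epoch (1.87e-02 to 1.86e-02 across Epoch 15) and at each epoch boundary (1.86e-02 to 1.80e-02 at Epoch 16, batch 0). That shape is characteristic of icefall's Eden scheduler, which decays in both batch count and epoch; a sketch of its rule is below. The constants are placeholders and the log alone does not pin down this run's base_lr, lr_batches, or lr_epochs, so treat this as an assumed illustration rather than a reproduction of the logged values.

```python
# Sketch of the Eden learning-rate rule (decay in both batch and epoch).
# All hyperparameter values here are assumed for illustration.

def eden_lr(base_lr: float, batch: int, epoch: float,
            lr_batches: float = 7500.0, lr_epochs: float = 3.5) -> float:
    batch_factor = ((batch ** 2 + lr_batches ** 2) / lr_batches ** 2) ** -0.25
    epoch_factor = ((epoch ** 2 + lr_epochs ** 2) / lr_epochs ** 2) ** -0.25
    return base_lr * batch_factor * epoch_factor
```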
2024-03-15 22:57:55,179 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.570e+01 1.059e+02 1.253e+02 1.673e+02 2.974e+02, threshold=2.506e+02, percent-clipped=12.0
2024-03-15 22:58:02,592 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=9.15 vs. limit=15.0
2024-03-15 22:58:14,831 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=25750.0, ans=0.125
2024-03-15 22:58:16,229 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward2.hidden_balancer.prob, batch_count=25750.0, ans=0.125
2024-03-15 22:58:50,666 INFO [train_char.py:689] (1/2) Epoch 16, batch 150, loss[loss=0.0695, simple_loss=0.1194, pruned_loss=0.009782, over 24338.00 frames. ], tot_loss[loss=0.082, simple_loss=0.1331, pruned_loss=0.01544, over 2551346.80 frames. ], batch size: 146, lr: 1.79e-02, grad_scale: 32.0
2024-03-15 22:58:56,148 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.out_combiner.scale_min, batch_count=25850.0, ans=0.2
2024-03-15 22:59:11,081 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.attention_skip_rate, batch_count=25883.333333333332, ans=0.0
2024-03-15 22:59:37,346 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_skip_rate, batch_count=25950.0, ans=0.0
2024-03-15 22:59:59,109 INFO [train_char.py:689] (1/2) Epoch 16, batch 200, loss[loss=0.09685, simple_loss=0.1569, pruned_loss=0.01838, over 24124.00 frames. ], tot_loss[loss=0.08231, simple_loss=0.1337, pruned_loss=0.01545, over 3056106.21 frames. ], batch size: 279, lr: 1.79e-02, grad_scale: 32.0
2024-03-15 23:00:07,679 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.whiten, num_groups=1, num_channels=384, metric=4.54 vs. limit=12.0
2024-03-15 23:00:08,163 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.187e+01 1.070e+02 1.349e+02 1.623e+02 3.033e+02, threshold=2.699e+02, percent-clipped=5.0
2024-03-15 23:00:24,746 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.conv.2.prob, batch_count=26050.0, ans=0.125
2024-03-15 23:00:55,371 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module1.balancer1.prob, batch_count=26150.0, ans=0.125
2024-03-15 23:01:06,618 INFO [train_char.py:689] (1/2) Epoch 16, batch 250, loss[loss=0.06644, simple_loss=0.116, pruned_loss=0.008444, over 24305.00 frames. ], tot_loss[loss=0.08229, simple_loss=0.1339, pruned_loss=0.01534, over 3444787.78 frames. ], batch size: 146, lr: 1.78e-02, grad_scale: 16.0
2024-03-15 23:01:15,081 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=512, metric=17.38 vs. limit=22.5
2024-03-15 23:01:26,078 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_skip_rate, batch_count=26216.666666666668, ans=0.0
2024-03-15 23:01:29,851 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_skip_rate, batch_count=26216.666666666668, ans=0.0
2024-03-15 23:01:46,087 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.bypass.skip_rate, batch_count=26283.333333333332, ans=0.09899494936611666
2024-03-15 23:01:48,586 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.nonlin_attention.balancer.min_positive, batch_count=26283.333333333332, ans=0.05
2024-03-15 23:01:48,674 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.3.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00
2024-03-15 23:01:57,665 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.convnext.out_balancer.prob, batch_count=26316.666666666668, ans=0.125
2024-03-15 23:01:57,857 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=26316.666666666668, ans=0.1
2024-03-15 23:02:00,365 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.hidden_balancer.prob, batch_count=26316.666666666668, ans=0.125
2024-03-15 23:02:02,138 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=384, metric=13.76 vs. limit=22.5
2024-03-15 23:02:09,299 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.bypass_mid.scale_min, batch_count=26350.0, ans=0.2
2024-03-15 23:02:10,442 INFO [train_char.py:689] (1/2) Epoch 16, batch 300, loss[loss=0.06996, simple_loss=0.1192, pruned_loss=0.01034, over 24403.00 frames. ], tot_loss[loss=0.0832, simple_loss=0.1353, pruned_loss=0.01556, over 3744695.32 frames. ], batch size: 135, lr: 1.78e-02, grad_scale: 8.0
2024-03-15 23:02:11,406 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=192, metric=11.38 vs. limit=15.0
2024-03-15 23:02:19,495 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=5.62 vs. limit=6.0
2024-03-15 23:02:24,983 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.614e+01 9.908e+01 1.237e+02 1.661e+02 2.872e+02, threshold=2.474e+02, percent-clipped=1.0
2024-03-15 23:02:30,769 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=11.29 vs. limit=15.0
2024-03-15 23:03:17,524 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=26516.666666666668, ans=0.1
2024-03-15 23:03:18,476 INFO [train_char.py:689] (1/2) Epoch 16, batch 350, loss[loss=0.09224, simple_loss=0.1457, pruned_loss=0.01937, over 24117.00 frames. ], tot_loss[loss=0.08408, simple_loss=0.1366, pruned_loss=0.01577, over 3987952.15 frames. ], batch size: 188, lr: 1.78e-02, grad_scale: 4.0
2024-03-15 23:03:18,819 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=26516.666666666668, ans=0.125
2024-03-15 23:03:22,787 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=13.39 vs. limit=15.0
2024-03-15 23:03:29,224 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=256, metric=13.73 vs. limit=15.0
2024-03-15 23:03:46,302 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=256, metric=3.80 vs. limit=15.0
2024-03-15 23:04:25,032 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.conv_module1.whiten, num_groups=1, num_channels=384, metric=2.56 vs. limit=15.0
2024-03-15 23:04:28,234 INFO [train_char.py:689] (1/2) Epoch 16, batch 400, loss[loss=0.09904, simple_loss=0.159, pruned_loss=0.01955, over 24031.00 frames. ], tot_loss[loss=0.08497, simple_loss=0.1381, pruned_loss=0.01592, over 4176922.87 frames. ], batch size: 250, lr: 1.77e-02, grad_scale: 8.0
2024-03-15 23:04:40,587 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.718e+01 9.599e+01 1.139e+02 1.487e+02 2.327e+02, threshold=2.279e+02, percent-clipped=0.0
2024-03-15 23:04:44,539 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=26716.666666666668, ans=0.125
2024-03-15 23:04:47,158 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=26716.666666666668, ans=0.125
2024-03-15 23:04:48,388 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_skip_rate, batch_count=26716.666666666668, ans=0.0
2024-03-15 23:05:21,670 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_proj.dropout_p, batch_count=26816.666666666668, ans=0.1
2024-03-15 23:05:30,291 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=26850.0, ans=0.0
2024-03-15 23:05:31,362 INFO [train_char.py:689] (1/2) Epoch 16, batch 450, loss[loss=0.09739, simple_loss=0.1555, pruned_loss=0.01965, over 24277.00 frames. ], tot_loss[loss=0.08607, simple_loss=0.1397, pruned_loss=0.01624, over 4322950.09 frames. ], batch size: 267, lr: 1.77e-02, grad_scale: 8.0
2024-03-15 23:05:42,598 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=26850.0, ans=0.125
2024-03-15 23:05:58,119 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=12.23 vs. limit=15.0
2024-03-15 23:06:25,461 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=5.29 vs. limit=6.0
2024-03-15 23:06:36,229 INFO [train_char.py:689] (1/2) Epoch 16, batch 500, loss[loss=0.09772, simple_loss=0.1531, pruned_loss=0.02114, over 24096.00 frames. ], tot_loss[loss=0.08717, simple_loss=0.1416, pruned_loss=0.01637, over 4436521.90 frames. ], batch size: 199, lr: 1.77e-02, grad_scale: 8.0
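The "Whitening: name=... metric=X vs. limit=Y" records report how far a layer's activations are from having a white (identity-like) covariance: the metric is 1.0 for perfectly white features and grows as the covariance eigenvalues spread out, and a penalty only activates when the metric exceeds the scheduled limit. Below is a sketch of the metric, simplified to a single channel group; icefall's scaling.py computes the same quantity per group and attaches the penalty to the backward pass.

```python
# Sketch of the whitening metric: mean squared eigenvalue of the
# covariance divided by the squared mean eigenvalue (>= 1.0 by
# Cauchy-Schwarz; equals 1.0 only when the covariance is a multiple
# of the identity). Simplified to num_groups=1.

import torch

def whitening_metric(x: torch.Tensor) -> torch.Tensor:
    """x: (num_frames, num_channels) activations, float32."""
    x = x - x.mean(dim=0, keepdim=True)              # centered covariance
    covar = x.t() @ x                                # (C, C), symmetric
    mean_diag = covar.diag().mean()                  # trace / C
    mean_sq_diag = (covar ** 2).sum() / x.shape[1]   # trace(covar @ covar) / C
    return mean_sq_diag / (mean_diag ** 2 + 1e-20)
```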
2024-03-15 23:06:42,672 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module2.balancer2.prob, batch_count=27016.666666666668, ans=0.125
2024-03-15 23:07:37,103 INFO [train_char.py:689] (1/2) Epoch 17, batch 0, loss[loss=0.07332, simple_loss=0.1199, pruned_loss=0.01338, over 24154.00 frames. ], tot_loss[loss=0.07332, simple_loss=0.1199, pruned_loss=0.01338, over 24154.00 frames. ], batch size: 362, lr: 1.71e-02, grad_scale: 16.0
2024-03-15 23:07:37,104 INFO [train_char.py:712] (1/2) Computing validation loss
2024-03-15 23:07:50,913 INFO [train_char.py:721] (1/2) Epoch 17, validation: loss=0.06728, simple_loss=0.1211, pruned_loss=0.006709, over 657665.00 frames.
2024-03-15 23:07:50,914 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB
2024-03-15 23:07:56,238 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.930e+01 9.696e+01 1.142e+02 1.527e+02 2.129e+02, threshold=2.283e+02, percent-clipped=0.0
2024-03-15 23:08:05,100 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=384, metric=19.78 vs. limit=22.5
2024-03-15 23:08:19,333 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=27106.666666666668, ans=0.1
2024-03-15 23:08:19,365 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.balancer2.prob, batch_count=27106.666666666668, ans=0.125
2024-03-15 23:08:46,860 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module2.balancer1.prob, batch_count=27173.333333333332, ans=0.125
2024-03-15 23:09:05,994 INFO [train_char.py:689] (1/2) Epoch 17, batch 50, loss[loss=0.08328, simple_loss=0.141, pruned_loss=0.01277, over 24144.00 frames. ], tot_loss[loss=0.08211, simple_loss=0.1344, pruned_loss=0.01488, over 1090390.30 frames. ], batch size: 188, lr: 1.71e-02, grad_scale: 8.0
2024-03-15 23:09:11,879 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=27206.666666666668, ans=0.125
2024-03-15 23:09:38,535 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=27273.333333333332, ans=0.1
2024-03-15 23:09:43,636 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=27273.333333333332, ans=0.1
2024-03-15 23:10:12,026 INFO [train_char.py:689] (1/2) Epoch 17, batch 100, loss[loss=0.06299, simple_loss=0.1118, pruned_loss=0.007106, over 24283.00 frames. ], tot_loss[loss=0.08097, simple_loss=0.1323, pruned_loss=0.0148, over 1913199.99 frames. ], batch size: 140, lr: 1.71e-02, grad_scale: 8.0
2024-03-15 23:10:16,895 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.353e+01 1.077e+02 1.241e+02 1.620e+02 2.640e+02, threshold=2.483e+02, percent-clipped=3.0
2024-03-15 23:10:25,499 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.80 vs. limit=6.0
2024-03-15 23:11:20,609 INFO [train_char.py:689] (1/2) Epoch 17, batch 150, loss[loss=0.07511, simple_loss=0.1166, pruned_loss=0.0168, over 23809.00 frames. ], tot_loss[loss=0.08153, simple_loss=0.1334, pruned_loss=0.01481, over 2559737.53 frames. ], batch size: 439, lr: 1.71e-02, grad_scale: 8.0
2024-03-15 23:11:33,530 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.ff3_skip_rate, batch_count=27573.333333333332, ans=0.00487536231884058
2024-03-15 23:11:38,011 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=12.15 vs. limit=15.0
2024-03-15 23:11:57,949 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=8.35 vs. limit=10.0
2024-03-15 23:11:59,721 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_module1.balancer2.prob, batch_count=27606.666666666668, ans=0.125
2024-03-15 23:12:00,046 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.conv_module2.whiten, num_groups=1, num_channels=256, metric=4.53 vs. limit=15.0
2024-03-15 23:12:25,927 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module2.balancer1.prob, batch_count=27673.333333333332, ans=0.125
2024-03-15 23:12:25,950 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass.scale_min, batch_count=27673.333333333332, ans=0.2
2024-03-15 23:12:29,511 INFO [train_char.py:689] (1/2) Epoch 17, batch 200, loss[loss=0.08458, simple_loss=0.1398, pruned_loss=0.01469, over 24299.00 frames. ], tot_loss[loss=0.08214, simple_loss=0.1342, pruned_loss=0.01505, over 3057280.31 frames. ], batch size: 180, lr: 1.70e-02, grad_scale: 8.0
2024-03-15 23:12:35,567 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.192e+01 1.028e+02 1.235e+02 1.712e+02 3.164e+02, threshold=2.470e+02, percent-clipped=4.0
2024-03-15 23:12:35,977 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=27706.666666666668, ans=0.1
2024-03-15 23:12:39,814 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass_mid.scale_min, batch_count=27706.666666666668, ans=0.2
2024-03-15 23:12:49,951 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.ff3_skip_rate, batch_count=27740.0, ans=0.004839130434782609
2024-03-15 23:12:53,777 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.nonlin_attention.balancer.min_positive, batch_count=27773.333333333332, ans=0.05
2024-03-15 23:12:53,874 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=27773.333333333332, ans=0.1
2024-03-15 23:12:57,797 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.conv_module2.whiten, num_groups=1, num_channels=256, metric=4.11 vs. limit=15.0
2024-03-15 23:13:00,052 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.attention_skip_rate, batch_count=27773.333333333332, ans=0.0
2024-03-15 23:13:21,719 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.hidden_balancer.prob, batch_count=27840.0, ans=0.125
2024-03-15 23:13:30,422 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=14.72 vs. limit=15.0
2024-03-15 23:13:33,588 INFO [train_char.py:689] (1/2) Epoch 17, batch 250, loss[loss=0.09714, simple_loss=0.1576, pruned_loss=0.01832, over 24180.00 frames. ], tot_loss[loss=0.08228, simple_loss=0.1345, pruned_loss=0.01501, over 3449179.89 frames. ], batch size: 212, lr: 1.70e-02, grad_scale: 8.0
2024-03-15 23:13:36,398 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward1.hidden_balancer.prob, batch_count=27873.333333333332, ans=0.125
2024-03-15 23:14:15,307 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.self_attn1.whiten, num_groups=1, num_channels=192, metric=14.43 vs. limit=22.5
2024-03-15 23:14:15,990 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=27973.333333333332, ans=0.1
2024-03-15 23:14:25,197 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.whiten, num_groups=1, num_channels=512, metric=6.97 vs. limit=12.0
2024-03-15 23:14:43,994 INFO [train_char.py:689] (1/2) Epoch 17, batch 300, loss[loss=0.07036, simple_loss=0.1231, pruned_loss=0.008801, over 21636.00 frames. ], tot_loss[loss=0.0812, simple_loss=0.1333, pruned_loss=0.01457, over 3753374.63 frames. ], batch size: 86, lr: 1.70e-02, grad_scale: 8.0
2024-03-15 23:14:50,413 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.479e+01 9.549e+01 1.144e+02 1.473e+02 3.101e+02, threshold=2.289e+02, percent-clipped=3.0
2024-03-15 23:14:53,619 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=13.16 vs. limit=15.0
2024-03-15 23:15:15,602 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=28106.666666666668, ans=0.1
2024-03-15 23:15:23,500 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=14.64 vs. limit=15.0
2024-03-15 23:15:32,176 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module1.balancer2.prob, batch_count=28140.0, ans=0.125
2024-03-15 23:15:48,565 INFO [train_char.py:689] (1/2) Epoch 17, batch 350, loss[loss=0.07035, simple_loss=0.12, pruned_loss=0.01033, over 24287.00 frames. ], tot_loss[loss=0.08199, simple_loss=0.1345, pruned_loss=0.01472, over 3986762.00 frames. ], batch size: 146, lr: 1.69e-02, grad_scale: 8.0
2024-03-15 23:16:08,468 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.ff2_skip_rate, batch_count=28240.0, ans=0.004730434782608696
2024-03-15 23:16:09,078 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=192, metric=5.87 vs. limit=10.0
2024-03-15 23:16:55,527 INFO [train_char.py:689] (1/2) Epoch 17, batch 400, loss[loss=0.08607, simple_loss=0.1388, pruned_loss=0.01667, over 24432.00 frames. ], tot_loss[loss=0.08261, simple_loss=0.1357, pruned_loss=0.01477, over 4176782.86 frames. ], batch size: 165, lr: 1.69e-02, grad_scale: 16.0
2024-03-15 23:16:59,693 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.balancer1.prob, batch_count=28373.333333333332, ans=0.125
2024-03-15 23:17:01,979 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.098e+01 9.793e+01 1.243e+02 1.636e+02 2.677e+02, threshold=2.485e+02, percent-clipped=8.0
2024-03-15 23:17:08,587 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.0.layers.1.self_attn_weights, loss-sum=0.000e+00
2024-03-15 23:17:15,583 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=5.89 vs. limit=6.0
2024-03-15 23:17:17,741 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.self_attn_weights.whiten_keys.whitening_limit, batch_count=28406.666666666668, ans=6.0
2024-03-15 23:17:36,598 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=256, metric=18.35 vs. limit=15.0
2024-03-15 23:17:40,122 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_module2.balancer1.prob, batch_count=28473.333333333332, ans=0.125
2024-03-15 23:18:00,840 INFO [train_char.py:689] (1/2) Epoch 17, batch 450, loss[loss=0.1013, simple_loss=0.162, pruned_loss=0.02031, over 24265.00 frames. ], tot_loss[loss=0.08256, simple_loss=0.1357, pruned_loss=0.01472, over 4323906.61 frames. ], batch size: 267, lr: 1.69e-02, grad_scale: 16.0
2024-03-15 23:18:03,616 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.balancer2.prob, batch_count=28540.0, ans=0.125
2024-03-15 23:18:16,805 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.balancer1.prob, batch_count=28573.333333333332, ans=0.125
2024-03-15 23:18:20,013 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=10.31 vs. limit=15.0
2024-03-15 23:19:04,830 INFO [train_char.py:689] (1/2) Epoch 17, batch 500, loss[loss=0.0973, simple_loss=0.1557, pruned_loss=0.01943, over 24109.00 frames. ], tot_loss[loss=0.08464, simple_loss=0.139, pruned_loss=0.01515, over 4436070.93 frames. ], batch size: 199, lr: 1.69e-02, grad_scale: 16.0
2024-03-15 23:19:11,109 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.875e+01 9.948e+01 1.161e+02 1.379e+02 4.396e+02, threshold=2.322e+02, percent-clipped=2.0
2024-03-15 23:20:08,754 INFO [train_char.py:689] (1/2) Epoch 18, batch 0, loss[loss=0.07641, simple_loss=0.1262, pruned_loss=0.01331, over 24242.00 frames. ], tot_loss[loss=0.07641, simple_loss=0.1262, pruned_loss=0.01331, over 24242.00 frames. ], batch size: 328, lr: 1.64e-02, grad_scale: 32.0
2024-03-15 23:20:08,754 INFO [train_char.py:712] (1/2) Computing validation loss
2024-03-15 23:20:22,467 INFO [train_char.py:721] (1/2) Epoch 18, validation: loss=0.06657, simple_loss=0.1202, pruned_loss=0.006462, over 657665.00 frames.
2024-03-15 23:20:22,468 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB
2024-03-15 23:20:35,046 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=28730.0, ans=0.0
2024-03-15 23:20:55,786 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.conv.8.prob, batch_count=28796.666666666668, ans=0.125
2024-03-15 23:21:21,783 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.conv_module2.whiten, num_groups=1, num_channels=384, metric=3.27 vs. limit=15.0
2024-03-15 23:21:35,147 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.self_attn1.whiten.whitening_limit, batch_count=28863.333333333332, ans=22.5
2024-03-15 23:21:37,048 INFO [train_char.py:689] (1/2) Epoch 18, batch 50, loss[loss=0.09092, simple_loss=0.1557, pruned_loss=0.01307, over 24064.00 frames. ], tot_loss[loss=0.07707, simple_loss=0.1288, pruned_loss=0.01269, over 1086791.64 frames. ], batch size: 236, lr: 1.63e-02, grad_scale: 32.0
2024-03-15 23:21:58,944 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.balancer2.prob, batch_count=28930.0, ans=0.125
2024-03-15 23:22:17,606 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.conv_module1.whiten, num_groups=1, num_channels=256, metric=3.18 vs. limit=15.0
2024-03-15 23:22:17,823 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=5.29 vs. limit=6.0
2024-03-15 23:22:22,445 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_skip_rate, batch_count=28996.666666666668, ans=0.0
2024-03-15 23:22:47,206 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.162e+01 1.018e+02 1.249e+02 1.654e+02 4.023e+02, threshold=2.497e+02, percent-clipped=11.0
2024-03-15 23:22:47,569 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.3.encoder.layers.1.self_attn_weights, loss-sum=0.000e+00
2024-03-15 23:22:48,302 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.self_attn2.whiten, num_groups=1, num_channels=192, metric=17.44 vs. limit=22.5
2024-03-15 23:22:48,605 INFO [train_char.py:689] (1/2) Epoch 18, batch 100, loss[loss=0.07483, simple_loss=0.1269, pruned_loss=0.01138, over 24184.00 frames. ], tot_loss[loss=0.07852, simple_loss=0.1302, pruned_loss=0.01343, over 1903591.30 frames. ], batch size: 122, lr: 1.63e-02, grad_scale: 16.0
2024-03-15 23:22:56,689 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_module2.balancer1.max_abs, batch_count=29063.333333333332, ans=10.0
2024-03-15 23:23:06,820 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=29096.666666666668, ans=0.1
2024-03-15 23:23:15,985 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=29130.0, ans=0.1
2024-03-15 23:23:28,404 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=384, metric=15.05 vs. limit=22.5
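The grad_scale field in the batch records moves only in powers of two (4.0, 8.0, 16.0, 32.0 across the epochs above). That is the signature of dynamic fp16 loss scaling: the scale is halved whenever a step produces inf/nan gradients and grown back after a run of clean steps. A generic sketch of that policy is below; in real training loops PyTorch's torch.cuda.amp.GradScaler implements this logic, and the constants here are illustrative.

```python
# Sketch of dynamic loss scaling, consistent with the power-of-two
# grad_scale values in the log. Constants are assumed for illustration.

class LossScaleSketch:
    def __init__(self, scale: float = 16.0, growth_interval: int = 2000) -> None:
        self.scale = scale
        self._good_steps = 0
        self._growth_interval = growth_interval

    def update(self, found_overflow: bool) -> None:
        if found_overflow:
            self.scale /= 2.0          # shrink immediately on inf/nan grads
            self._good_steps = 0
        else:
            self._good_steps += 1
            if self._good_steps % self._growth_interval == 0:
                self.scale *= 2.0      # cautiously grow back
```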
limit=22.5 2024-03-15 23:23:37,008 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=29163.333333333332, ans=0.125 2024-03-15 23:23:51,872 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.feed_forward2.out_whiten, num_groups=1, num_channels=192, metric=7.94 vs. limit=15.0 2024-03-15 23:23:57,583 INFO [train_char.py:689] (1/2) Epoch 18, batch 150, loss[loss=0.06477, simple_loss=0.09931, pruned_loss=0.01512, over 22661.00 frames. ], tot_loss[loss=0.07983, simple_loss=0.1324, pruned_loss=0.01364, over 2552352.01 frames. ], batch size: 483, lr: 1.63e-02, grad_scale: 16.0 2024-03-15 23:24:23,069 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.ff2_skip_rate, batch_count=29296.666666666668, ans=0.004500724637681159 2024-03-15 23:24:30,621 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.ff3_skip_rate, batch_count=29296.666666666668, ans=0.004500724637681159 2024-03-15 23:24:55,721 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=29363.333333333332, ans=0.1 2024-03-15 23:25:00,565 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.073e+01 1.016e+02 1.234e+02 1.645e+02 2.905e+02, threshold=2.468e+02, percent-clipped=3.0 2024-03-15 23:25:01,775 INFO [train_char.py:689] (1/2) Epoch 18, batch 200, loss[loss=0.07959, simple_loss=0.1309, pruned_loss=0.01415, over 24178.00 frames. ], tot_loss[loss=0.08004, simple_loss=0.1324, pruned_loss=0.01383, over 3055157.80 frames. ], batch size: 344, lr: 1.63e-02, grad_scale: 16.0 2024-03-15 23:25:02,142 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_skip_rate, batch_count=29396.666666666668, ans=0.0 2024-03-15 23:25:02,707 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.conv_module2.whiten, num_groups=1, num_channels=192, metric=8.10 vs. limit=15.0 2024-03-15 23:25:37,673 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module1.balancer2.prob, batch_count=29463.333333333332, ans=0.125 2024-03-15 23:26:08,451 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=256, metric=8.23 vs. limit=15.0 2024-03-15 23:26:09,128 INFO [train_char.py:689] (1/2) Epoch 18, batch 250, loss[loss=0.09909, simple_loss=0.1626, pruned_loss=0.01779, over 24062.00 frames. ], tot_loss[loss=0.0809, simple_loss=0.1338, pruned_loss=0.014, over 3446014.74 frames. 
], batch size: 223, lr: 1.62e-02, grad_scale: 8.0 2024-03-15 23:26:12,054 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.ff2_skip_rate, batch_count=29563.333333333332, ans=0.004442753623188406 2024-03-15 23:26:19,673 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.1.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-15 23:26:46,892 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=29630.0, ans=0.0 2024-03-15 23:26:49,488 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module2.balancer2.prob, batch_count=29663.333333333332, ans=0.125 2024-03-15 23:26:51,964 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=29663.333333333332, ans=0.1 2024-03-15 23:26:54,485 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_skip_rate, batch_count=29663.333333333332, ans=0.0 2024-03-15 23:27:15,689 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.799e+01 9.910e+01 1.200e+02 1.468e+02 2.725e+02, threshold=2.399e+02, percent-clipped=2.0 2024-03-15 23:27:15,718 INFO [train_char.py:689] (1/2) Epoch 18, batch 300, loss[loss=0.09479, simple_loss=0.1554, pruned_loss=0.01709, over 24184.00 frames. ], tot_loss[loss=0.08141, simple_loss=0.1348, pruned_loss=0.014, over 3754976.88 frames. ], batch size: 266, lr: 1.62e-02, grad_scale: 8.0 2024-03-15 23:27:23,658 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=29730.0, ans=0.0 2024-03-15 23:27:48,771 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=384, metric=6.00 vs. limit=15.0 2024-03-15 23:28:21,609 INFO [train_char.py:689] (1/2) Epoch 18, batch 350, loss[loss=0.08516, simple_loss=0.1426, pruned_loss=0.01388, over 24063.00 frames. ], tot_loss[loss=0.08202, simple_loss=0.1359, pruned_loss=0.01408, over 3995632.53 frames. ], batch size: 199, lr: 1.62e-02, grad_scale: 8.0 2024-03-15 23:28:33,258 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_skip_rate, batch_count=29930.0, ans=0.0 2024-03-15 23:28:34,513 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module2.balancer2.prob, batch_count=29930.0, ans=0.125 2024-03-15 23:28:49,204 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.balancer2.prob, batch_count=29963.333333333332, ans=0.125 2024-03-15 23:28:54,165 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module2.balancer2.prob, batch_count=29963.333333333332, ans=0.125 2024-03-15 23:29:26,850 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.638e+01 9.578e+01 1.055e+02 1.386e+02 2.453e+02, threshold=2.109e+02, percent-clipped=1.0 2024-03-15 23:29:26,881 INFO [train_char.py:689] (1/2) Epoch 18, batch 400, loss[loss=0.09291, simple_loss=0.1563, pruned_loss=0.01476, over 24175.00 frames. ], tot_loss[loss=0.08339, simple_loss=0.1381, pruned_loss=0.01433, over 4183105.52 frames. 
], batch size: 266, lr: 1.62e-02, grad_scale: 16.0 2024-03-15 23:29:34,407 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.self_attn2.whiten.whitening_limit, batch_count=30063.333333333332, ans=22.5 2024-03-15 23:29:37,636 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.conv.5.prob, batch_count=30063.333333333332, ans=0.125 2024-03-15 23:29:42,882 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=30096.666666666668, ans=0.1 2024-03-15 23:29:46,347 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_skip_rate, batch_count=30096.666666666668, ans=0.0 2024-03-15 23:29:53,915 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward1.out_proj.dropout_p, batch_count=30130.0, ans=0.1 2024-03-15 23:29:56,393 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module1.balancer2.prob, batch_count=30130.0, ans=0.125 2024-03-15 23:30:17,900 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=15.45 vs. limit=15.0 2024-03-15 23:30:26,464 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=5.49 vs. limit=10.0 2024-03-15 23:30:32,011 INFO [train_char.py:689] (1/2) Epoch 18, batch 450, loss[loss=0.08286, simple_loss=0.1325, pruned_loss=0.01663, over 24203.00 frames. ], tot_loss[loss=0.08327, simple_loss=0.138, pruned_loss=0.01427, over 4329566.01 frames. ], batch size: 311, lr: 1.61e-02, grad_scale: 8.0 2024-03-15 23:30:56,181 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_skip_rate, batch_count=30263.333333333332, ans=0.0 2024-03-15 23:31:36,612 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=30396.666666666668, ans=0.1 2024-03-15 23:31:37,497 INFO [train_char.py:689] (1/2) Epoch 18, batch 500, loss[loss=0.09249, simple_loss=0.151, pruned_loss=0.01699, over 24057.00 frames. ], tot_loss[loss=0.08375, simple_loss=0.1389, pruned_loss=0.01432, over 4442441.37 frames. ], batch size: 236, lr: 1.61e-02, grad_scale: 8.0 2024-03-15 23:31:38,759 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.402e+01 8.967e+01 1.019e+02 1.195e+02 2.230e+02, threshold=2.038e+02, percent-clipped=1.0 2024-03-15 23:32:38,192 INFO [train_char.py:689] (1/2) Epoch 19, batch 0, loss[loss=0.06308, simple_loss=0.1056, pruned_loss=0.01026, over 23950.00 frames. ], tot_loss[loss=0.06308, simple_loss=0.1056, pruned_loss=0.01026, over 23950.00 frames. ], batch size: 407, lr: 1.57e-02, grad_scale: 16.0 2024-03-15 23:32:38,193 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-15 23:32:45,908 INFO [zipformer.py:1858] (1/2) name=encoder.encoders.0.layers.0.self_attn_weights, attn_weights_entropy = tensor([5.2214, 5.3421, 5.2424, 5.0110], device='cuda:1') 2024-03-15 23:32:51,651 INFO [train_char.py:721] (1/2) Epoch 19, validation: loss=0.06498, simple_loss=0.1179, pruned_loss=0.006013, over 657665.00 frames. 
2024-03-15 23:32:51,651 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-15 23:32:53,222 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_module2.balancer2.prob, batch_count=30420.0, ans=0.125 2024-03-15 23:32:56,564 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=192, metric=11.89 vs. limit=15.0 2024-03-15 23:33:17,746 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.self_attn2.whiten, num_groups=1, num_channels=192, metric=17.09 vs. limit=22.5 2024-03-15 23:33:19,837 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_skip_rate, batch_count=30453.333333333332, ans=0.0 2024-03-15 23:33:21,909 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=256, metric=10.40 vs. limit=15.0 2024-03-15 23:33:35,224 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=256, metric=10.91 vs. limit=15.0 2024-03-15 23:33:44,266 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.balancer1.prob, batch_count=30520.0, ans=0.125 2024-03-15 23:33:48,461 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=30520.0, ans=0.1 2024-03-15 23:33:52,644 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=4.76 vs. limit=15.0 2024-03-15 23:34:06,504 INFO [train_char.py:689] (1/2) Epoch 19, batch 50, loss[loss=0.07103, simple_loss=0.1228, pruned_loss=0.009656, over 24291.00 frames. ], tot_loss[loss=0.07712, simple_loss=0.1282, pruned_loss=0.01304, over 1080152.21 frames. ], batch size: 140, lr: 1.56e-02, grad_scale: 8.0 2024-03-15 23:34:08,113 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.ff3_skip_rate, batch_count=30586.666666666668, ans=0.004220289855072464 2024-03-15 23:34:16,537 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module2.balancer1.prob, batch_count=30586.666666666668, ans=0.125 2024-03-15 23:35:06,093 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 8.255e+01 1.088e+02 1.285e+02 1.838e+02 3.360e+02, threshold=2.569e+02, percent-clipped=17.0 2024-03-15 23:35:12,626 INFO [train_char.py:689] (1/2) Epoch 19, batch 100, loss[loss=0.06063, simple_loss=0.09016, pruned_loss=0.01554, over 22835.00 frames. ], tot_loss[loss=0.07821, simple_loss=0.1294, pruned_loss=0.01349, over 1901519.40 frames. ], batch size: 483, lr: 1.56e-02, grad_scale: 8.0 2024-03-15 23:35:15,535 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.bypass.skip_rate, batch_count=30753.333333333332, ans=0.07 2024-03-15 23:35:22,904 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_module1.balancer1.prob, batch_count=30753.333333333332, ans=0.125 2024-03-15 23:35:38,249 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=30786.666666666668, ans=0.125 2024-03-15 23:35:42,666 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=192, metric=5.71 vs. 
limit=10.0 2024-03-15 23:36:07,421 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.attention_skip_rate, batch_count=30886.666666666668, ans=0.0 2024-03-15 23:36:17,641 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.attention_skip_rate, batch_count=30886.666666666668, ans=0.0 2024-03-15 23:36:21,133 INFO [train_char.py:689] (1/2) Epoch 19, batch 150, loss[loss=0.06481, simple_loss=0.1129, pruned_loss=0.008371, over 24256.00 frames. ], tot_loss[loss=0.07865, simple_loss=0.1308, pruned_loss=0.01327, over 2549844.46 frames. ], batch size: 134, lr: 1.56e-02, grad_scale: 8.0 2024-03-15 23:36:35,853 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=14.71 vs. limit=15.0 2024-03-15 23:36:41,660 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.self_attn_weights.pos_emb_skip_rate, batch_count=30953.333333333332, ans=0.0 2024-03-15 23:36:42,937 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module2.balancer2.prob, batch_count=30953.333333333332, ans=0.125 2024-03-15 23:37:10,076 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.balancer1.prob, batch_count=31020.0, ans=0.125 2024-03-15 23:37:21,862 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=31053.333333333332, ans=0.125 2024-03-15 23:37:22,779 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.585e+01 9.639e+01 1.177e+02 1.532e+02 2.685e+02, threshold=2.354e+02, percent-clipped=1.0 2024-03-15 23:37:26,307 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=384, metric=8.24 vs. limit=15.0 2024-03-15 23:37:29,434 INFO [train_char.py:689] (1/2) Epoch 19, batch 200, loss[loss=0.06282, simple_loss=0.1091, pruned_loss=0.008286, over 24178.00 frames. ], tot_loss[loss=0.07776, simple_loss=0.1297, pruned_loss=0.01291, over 3051315.95 frames. ], batch size: 122, lr: 1.56e-02, grad_scale: 8.0 2024-03-15 23:37:30,093 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=11.52 vs. 
limit=15.0 2024-03-15 23:37:31,025 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=31086.666666666668, ans=0.1 2024-03-15 23:37:39,975 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass.scale_min, batch_count=31086.666666666668, ans=0.2 2024-03-15 23:37:41,330 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module2.balancer2.prob, batch_count=31120.0, ans=0.125 2024-03-15 23:37:43,884 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module2.balancer2.prob, batch_count=31120.0, ans=0.125 2024-03-15 23:37:49,000 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module2.balancer2.prob, batch_count=31120.0, ans=0.125 2024-03-15 23:38:19,635 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_skip_rate, batch_count=31220.0, ans=0.0 2024-03-15 23:38:22,121 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_skip_rate, batch_count=31220.0, ans=0.0 2024-03-15 23:38:28,359 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=31220.0, ans=0.1 2024-03-15 23:38:33,172 INFO [train_char.py:689] (1/2) Epoch 19, batch 250, loss[loss=0.08435, simple_loss=0.1404, pruned_loss=0.01417, over 24373.00 frames. ], tot_loss[loss=0.07914, simple_loss=0.1319, pruned_loss=0.01319, over 3448546.96 frames. ], batch size: 172, lr: 1.55e-02, grad_scale: 8.0 2024-03-15 23:38:47,258 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module1.balancer1.prob, batch_count=31253.333333333332, ans=0.125 2024-03-15 23:38:54,672 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.convnext.layerdrop_rate, batch_count=31286.666666666668, ans=0.015 2024-03-15 23:38:57,261 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward1.hidden_balancer.prob, batch_count=31286.666666666668, ans=0.125 2024-03-15 23:39:00,140 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward2.out_whiten, num_groups=1, num_channels=512, metric=14.89 vs. limit=15.0 2024-03-15 23:39:37,392 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=384, metric=13.46 vs. limit=22.5 2024-03-15 23:39:37,990 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.855e+01 9.459e+01 1.088e+02 1.598e+02 2.803e+02, threshold=2.176e+02, percent-clipped=3.0 2024-03-15 23:39:39,650 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_module1.balancer1.prob, batch_count=31386.666666666668, ans=0.125 2024-03-15 23:39:44,206 INFO [train_char.py:689] (1/2) Epoch 19, batch 300, loss[loss=0.09109, simple_loss=0.1521, pruned_loss=0.01502, over 24224.00 frames. ], tot_loss[loss=0.07963, simple_loss=0.1328, pruned_loss=0.01325, over 3753948.58 frames. 
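
Each train_char.py:689 entry reports the current batch's simple (non-pruned) transducer loss, its pruned loss, and a combined loss. Throughout this section the three numbers satisfy loss = 0.5 * simple_loss + pruned_loss up to display rounding (the epoch 19 / batch 300 entry just above: 0.5 * 0.1521 + 0.01502 = 0.09107 vs. the logged 0.09109), i.e. the run is long past warm-up and uses a simple-loss weight of 0.5. A sketch, with the function name illustrative rather than the train_char.py API:

    # Sketch: post-warm-up combination of the pruned-RNN-T losses,
    # assuming loss = simple_loss_scale * simple_loss + pruned_loss.
    def combined_loss(simple_loss: float, pruned_loss: float,
                      simple_loss_scale: float = 0.5) -> float:
        return simple_loss_scale * simple_loss + pruned_loss

    # Epoch 19, batch 300 entry above:
    print(combined_loss(0.1521, 0.01502))  # 0.09107 ~ logged loss=0.09109
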
], batch size: 212, lr: 1.55e-02, grad_scale: 8.0 2024-03-15 23:39:44,506 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.bypass.skip_rate, batch_count=31420.0, ans=0.07 2024-03-15 23:39:49,546 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module1.balancer2.prob, batch_count=31420.0, ans=0.125 2024-03-15 23:39:50,666 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_skip_rate, batch_count=31420.0, ans=0.0 2024-03-15 23:40:05,507 INFO [scaling.py:1023] (1/2) Whitening: name=encoder_embed.convnext.out_whiten, num_groups=1, num_channels=128, metric=4.39 vs. limit=5.0 2024-03-15 23:40:17,350 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module1.balancer1.prob, batch_count=31486.666666666668, ans=0.125 2024-03-15 23:40:29,861 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.ff3_skip_rate, batch_count=31520.0, ans=0.004017391304347827 2024-03-15 23:40:49,934 INFO [train_char.py:689] (1/2) Epoch 19, batch 350, loss[loss=0.08608, simple_loss=0.1499, pruned_loss=0.01111, over 21615.00 frames. ], tot_loss[loss=0.07966, simple_loss=0.1333, pruned_loss=0.01303, over 3990374.18 frames. ], batch size: 86, lr: 1.55e-02, grad_scale: 8.0 2024-03-15 23:40:54,683 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.self_attn2.whiten, num_groups=1, num_channels=192, metric=14.55 vs. limit=22.5 2024-03-15 23:40:55,812 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=256, metric=16.80 vs. limit=22.5 2024-03-15 23:40:56,505 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_module2.balancer2.prob, batch_count=31586.666666666668, ans=0.125 2024-03-15 23:40:59,136 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_skip_rate, batch_count=31586.666666666668, ans=0.0 2024-03-15 23:40:59,845 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.feed_forward2.out_whiten, num_groups=1, num_channels=256, metric=7.66 vs. limit=15.0 2024-03-15 23:41:23,911 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=11.30 vs. limit=15.0 2024-03-15 23:41:25,124 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.conv_module1.whiten, num_groups=1, num_channels=512, metric=3.84 vs. limit=15.0 2024-03-15 23:41:49,267 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.223e+01 9.840e+01 1.222e+02 1.502e+02 3.919e+02, threshold=2.443e+02, percent-clipped=4.0 2024-03-15 23:41:55,561 INFO [train_char.py:689] (1/2) Epoch 19, batch 400, loss[loss=0.07474, simple_loss=0.125, pruned_loss=0.01222, over 24176.00 frames. ], tot_loss[loss=0.08006, simple_loss=0.1338, pruned_loss=0.01315, over 4173120.75 frames. 
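
The WARNING lines from optim.py:487 print five grad-norm statistics over a window of recent optimizer steps (min, 25%, median, 75%, max), a clipping threshold, and the share of recent steps that were clipped. In every such line here the threshold is twice the logged median, up to rounding (2.443e+02 vs. 2 x 1.222e+02 just above; 2.182e+02 vs. 2 x 1.091e+02 later on), matching Clipping_scale=2.0: the cap adapts to a running median rather than being a fixed norm. A simplified sketch of that bookkeeping; the real ScaledAdam logic is more involved:

    # Sketch: clip gradient norms against clipping_scale * median of the
    # recently observed norms, and report quartiles as optim.py does.
    from collections import deque
    from statistics import median, quantiles

    class MedianClipper:
        def __init__(self, clipping_scale: float = 2.0, window: int = 128):
            self.clipping_scale = clipping_scale
            self.norms = deque(maxlen=window)

        def clip_factor(self, grad_norm: float) -> float:
            self.norms.append(grad_norm)
            threshold = self.clipping_scale * median(self.norms)
            return min(1.0, threshold / (grad_norm + 1e-20))  # scale grads by this

        def report(self) -> str:
            q1, med, q3 = quantiles(self.norms, n=4)
            return (f"grad-norm quartiles {min(self.norms):.3e} {q1:.3e} "
                    f"{med:.3e} {q3:.3e} {max(self.norms):.3e}, "
                    f"threshold={self.clipping_scale * med:.3e}")
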
], batch size: 344, lr: 1.55e-02, grad_scale: 16.0 2024-03-15 23:42:04,652 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module2.balancer2.prob, batch_count=31753.333333333332, ans=0.125 2024-03-15 23:42:05,812 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module1.balancer1.min_positive, batch_count=31753.333333333332, ans=0.025 2024-03-15 23:42:13,345 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_module2.balancer1.prob, batch_count=31786.666666666668, ans=0.125 2024-03-15 23:42:19,530 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.feed_forward2.hidden_balancer.prob, batch_count=31820.0, ans=0.125 2024-03-15 23:42:26,352 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_skip_rate, batch_count=31820.0, ans=0.0 2024-03-15 23:42:57,623 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_skip_rate, batch_count=31886.666666666668, ans=0.0 2024-03-15 23:43:01,166 INFO [train_char.py:689] (1/2) Epoch 19, batch 450, loss[loss=0.09231, simple_loss=0.1494, pruned_loss=0.01762, over 24062.00 frames. ], tot_loss[loss=0.081, simple_loss=0.1352, pruned_loss=0.0134, over 4321872.10 frames. ], batch size: 199, lr: 1.54e-02, grad_scale: 16.0 2024-03-15 23:43:20,600 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=31953.333333333332, ans=0.125 2024-03-15 23:43:46,404 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_proj.dropout_p, batch_count=32020.0, ans=0.1 2024-03-15 23:43:46,430 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module1.balancer2.prob, batch_count=32020.0, ans=0.125 2024-03-15 23:43:49,971 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward1.hidden_balancer.prob, batch_count=32020.0, ans=0.125 2024-03-15 23:43:59,846 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.543e+01 9.840e+01 1.222e+02 1.592e+02 3.029e+02, threshold=2.443e+02, percent-clipped=4.0 2024-03-15 23:44:00,077 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=32053.333333333332, ans=0.1 2024-03-15 23:44:06,296 INFO [train_char.py:689] (1/2) Epoch 19, batch 500, loss[loss=0.09228, simple_loss=0.149, pruned_loss=0.01777, over 24092.00 frames. ], tot_loss[loss=0.08186, simple_loss=0.1367, pruned_loss=0.01349, over 4435929.51 frames. ], batch size: 223, lr: 1.54e-02, grad_scale: 16.0 2024-03-15 23:45:06,794 INFO [train_char.py:689] (1/2) Epoch 20, batch 0, loss[loss=0.0736, simple_loss=0.1246, pruned_loss=0.01129, over 24237.00 frames. ], tot_loss[loss=0.0736, simple_loss=0.1246, pruned_loss=0.01129, over 24237.00 frames. 
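
During each validation pass the script also prints attn_weights_entropy tensors (zipformer.py:1858) with one value per attention head, as in the tensor([...], device='cuda:1') lines just below. A natural reading, assumed here rather than taken from zipformer.py, is the entropy of each head's post-softmax attention distribution averaged over query positions: a head attending almost uniformly over K keys sits near ln(K), while a peaky head scores much lower, which would explain values near 5.2 for the first encoder stack versus 2-3 for the deeper stack below.

    # Sketch: per-head entropy of attention weights, assuming shape
    # (num_heads, num_queries, num_keys) with rows already softmaxed.
    import torch

    def attn_weights_entropy(attn_weights: torch.Tensor) -> torch.Tensor:
        ent = -(attn_weights * (attn_weights + 1e-20).log()).sum(dim=-1)
        return ent.mean(dim=-1)  # average over queries -> one value per head

    w = torch.softmax(torch.randn(4, 10, 200), dim=-1)
    print(attn_weights_entropy(w))  # ~4.8 per head; uniform would be ln(200)~5.3
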
], batch size: 311, lr: 1.50e-02, grad_scale: 32.0 2024-03-15 23:45:06,800 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-15 23:45:14,115 INFO [zipformer.py:1858] (1/2) name=encoder.encoders.4.encoder.layers.2.self_attn_weights, attn_weights_entropy = tensor([2.7072, 2.9985, 3.0530, 2.8715], device='cuda:1') 2024-03-15 23:45:20,125 INFO [zipformer.py:1858] (1/2) name=encoder.encoders.4.encoder.layers.2.self_attn_weights, attn_weights_entropy = tensor([2.0079, 2.5320, 2.2053, 2.3778], device='cuda:1') 2024-03-15 23:45:20,705 INFO [train_char.py:721] (1/2) Epoch 20, validation: loss=0.06562, simple_loss=0.1193, pruned_loss=0.00597, over 657665.00 frames. 2024-03-15 23:45:20,705 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-15 23:45:33,697 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=32110.0, ans=0.125 2024-03-15 23:45:36,836 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=12.28 vs. limit=15.0 2024-03-15 23:45:41,991 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module2.balancer1.prob, batch_count=32143.333333333332, ans=0.125 2024-03-15 23:45:44,727 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module1.balancer1.prob, batch_count=32143.333333333332, ans=0.125 2024-03-15 23:46:29,689 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.bypass.skip_rate, batch_count=32243.333333333332, ans=0.09899494936611666 2024-03-15 23:46:35,673 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=256, metric=12.79 vs. limit=22.5 2024-03-15 23:46:37,769 INFO [train_char.py:689] (1/2) Epoch 20, batch 50, loss[loss=0.06515, simple_loss=0.1104, pruned_loss=0.009954, over 24146.00 frames. ], tot_loss[loss=0.07709, simple_loss=0.1294, pruned_loss=0.01239, over 1083196.80 frames. ], batch size: 122, lr: 1.50e-02, grad_scale: 16.0 2024-03-15 23:46:57,008 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=32310.0, ans=0.1 2024-03-15 23:47:05,393 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=11.12 vs. limit=15.0 2024-03-15 23:47:19,018 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.balancer2.prob, batch_count=32376.666666666668, ans=0.125 2024-03-15 23:47:34,781 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.077e+01 1.038e+02 1.248e+02 1.651e+02 2.797e+02, threshold=2.495e+02, percent-clipped=4.0 2024-03-15 23:47:39,939 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module2.balancer1.prob, batch_count=32410.0, ans=0.125 2024-03-15 23:47:48,947 INFO [train_char.py:689] (1/2) Epoch 20, batch 100, loss[loss=0.08671, simple_loss=0.1472, pruned_loss=0.01309, over 24076.00 frames. ], tot_loss[loss=0.07733, simple_loss=0.1299, pruned_loss=0.0124, over 1903388.80 frames. ], batch size: 188, lr: 1.50e-02, grad_scale: 16.0 2024-03-15 23:48:09,856 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.whiten, num_groups=1, num_channels=384, metric=5.49 vs. 
limit=12.0 2024-03-15 23:48:12,603 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=384, metric=8.37 vs. limit=10.0 2024-03-15 23:48:33,867 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_skip_rate, batch_count=32543.333333333332, ans=0.0 2024-03-15 23:48:34,675 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.feed_forward2.out_whiten, num_groups=1, num_channels=192, metric=6.60 vs. limit=15.0 2024-03-15 23:48:37,121 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=12.98 vs. limit=15.0 2024-03-15 23:48:40,476 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_skip_rate, batch_count=32543.333333333332, ans=0.0 2024-03-15 23:48:48,187 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.ff3_skip_rate, batch_count=32576.666666666668, ans=0.003787681159420289 2024-03-15 23:48:51,913 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.convnext.layerdrop_rate, batch_count=32576.666666666668, ans=0.015 2024-03-15 23:48:59,278 INFO [train_char.py:689] (1/2) Epoch 20, batch 150, loss[loss=0.06183, simple_loss=0.09288, pruned_loss=0.01539, over 22728.00 frames. ], tot_loss[loss=0.07668, simple_loss=0.1294, pruned_loss=0.01199, over 2550897.40 frames. ], batch size: 483, lr: 1.49e-02, grad_scale: 16.0 2024-03-15 23:49:33,235 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module1.balancer2.prob, batch_count=32676.666666666668, ans=0.125 2024-03-15 23:49:51,019 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.740e+01 9.891e+01 1.306e+02 1.570e+02 2.698e+02, threshold=2.611e+02, percent-clipped=2.0 2024-03-15 23:50:03,633 INFO [train_char.py:689] (1/2) Epoch 20, batch 200, loss[loss=0.06316, simple_loss=0.1153, pruned_loss=0.005515, over 24289.00 frames. ], tot_loss[loss=0.07699, simple_loss=0.1299, pruned_loss=0.01202, over 3050259.02 frames. ], batch size: 140, lr: 1.49e-02, grad_scale: 8.0 2024-03-15 23:50:03,824 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.balancer1.prob, batch_count=32776.666666666664, ans=0.125 2024-03-15 23:50:49,327 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_skip_rate, batch_count=32876.666666666664, ans=0.0 2024-03-15 23:50:59,491 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.balancer1.prob, batch_count=32910.0, ans=0.125 2024-03-15 23:51:02,070 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.bypass_mid.scale_min, batch_count=32910.0, ans=0.2 2024-03-15 23:51:07,202 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward1.out_proj.dropout_p, batch_count=32910.0, ans=0.1 2024-03-15 23:51:10,668 INFO [train_char.py:689] (1/2) Epoch 20, batch 250, loss[loss=0.07428, simple_loss=0.1263, pruned_loss=0.01112, over 24388.00 frames. ], tot_loss[loss=0.07731, simple_loss=0.1306, pruned_loss=0.01201, over 3440377.75 frames. 
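
The scaling.py:1023 Whitening lines compare a measured statistic against a scheduled whitening_limit ("metric=12.98 vs. limit=15.0" above): the Whiten modules only switch their penalty on once the metric exceeds the limit, which is why most readings hover just below their limits and only the occasional one (15.45 vs. 15.0 earlier in this section) trips it. One statistic with the right behaviour, used below as an assumption rather than a quote of scaling.py, is n * tr(C^2) / tr(C)^2 for the n-channel feature covariance C: by Cauchy-Schwarz it is at least 1.0, with equality exactly when C is a multiple of the identity (fully "white"), and it grows as a few directions dominate. The sketch also assumes num_groups=1, whereas entries like whiten_keys elsewhere in the log use grouped channels.

    # Sketch: a whitening metric equal to 1.0 for perfectly "white"
    # features (covariance proportional to identity), larger otherwise.
    # Assumes x of shape (num_frames, num_channels), num_groups=1.
    import torch

    def whitening_metric(x: torch.Tensor) -> float:
        x = x - x.mean(dim=0, keepdim=True)
        cov = (x.T @ x) / x.shape[0]
        n = cov.shape[0]
        return (n * (cov @ cov).diagonal().sum() / cov.diagonal().sum() ** 2).item()

    x = torch.randn(1000, 192)
    print(whitening_metric(x))   # ~1.2: nearly white, well under limit=15.0
    x[:, 0] *= 30.0              # let one channel dominate
    print(whitening_metric(x))   # ~130: this would trip the whitening penalty
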
], batch size: 158, lr: 1.49e-02, grad_scale: 8.0 2024-03-15 23:51:22,096 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_module1.balancer2.prob, batch_count=32943.333333333336, ans=0.125 2024-03-15 23:51:34,012 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.conv_module2.whiten, num_groups=1, num_channels=256, metric=5.56 vs. limit=15.0 2024-03-15 23:51:35,013 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.2.encoder.layers.1.self_attn_weights, loss-sum=0.000e+00 2024-03-15 23:51:48,888 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=33010.0, ans=0.125 2024-03-15 23:51:58,374 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=256, metric=7.09 vs. limit=15.0 2024-03-15 23:52:02,891 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module1.balancer1.prob, batch_count=33043.333333333336, ans=0.125 2024-03-15 23:52:05,094 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.317e+01 9.646e+01 1.177e+02 1.466e+02 2.336e+02, threshold=2.353e+02, percent-clipped=0.0 2024-03-15 23:52:05,356 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=33076.666666666664, ans=0.125 2024-03-15 23:52:07,090 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.whiten, num_groups=1, num_channels=384, metric=3.20 vs. limit=12.0 2024-03-15 23:52:18,069 INFO [train_char.py:689] (1/2) Epoch 20, batch 300, loss[loss=0.06791, simple_loss=0.1178, pruned_loss=0.009018, over 24297.00 frames. ], tot_loss[loss=0.07728, simple_loss=0.1306, pruned_loss=0.012, over 3750464.96 frames. ], batch size: 146, lr: 1.49e-02, grad_scale: 8.0 2024-03-15 23:52:32,535 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.nonlin_attention.whiten2, num_groups=1, num_channels=512, metric=4.81 vs. limit=15.0 2024-03-15 23:52:35,686 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module2.balancer1.prob, batch_count=33143.333333333336, ans=0.125 2024-03-15 23:52:59,186 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=33210.0, ans=0.1 2024-03-15 23:53:25,173 INFO [train_char.py:689] (1/2) Epoch 20, batch 350, loss[loss=0.07502, simple_loss=0.1321, pruned_loss=0.008962, over 24366.00 frames. ], tot_loss[loss=0.078, simple_loss=0.1316, pruned_loss=0.01221, over 3991502.57 frames. 
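
In each train_char.py:689 entry the leading loss[...] block is the current batch and tot_loss[...] is a smoothed value "over N frames", where N climbs through an epoch but saturates: the batch 50 entries read about 1.08M frames (roughly 44 batches' worth at ~24k frames each) and the batch 500 entries about 4.44M (roughly 184 batches' worth), not 500 batches' worth. That profile is exactly the geometric sum of an exponentially decayed, frame-weighted accumulator with decay 1 - 1/200 per batch; a sketch under that assumption (class and field names hypothetical):

    # Sketch: tot_loss as an exponentially decayed, frame-weighted sum,
    # assuming state <- state * (1 - 1/reset_interval) + batch each step,
    # with an assumed reset_interval of 200 batches.
    class LossTracker:
        def __init__(self, reset_interval: int = 200):
            self.decay = 1.0 - 1.0 / reset_interval
            self.weighted_sum = 0.0
            self.frames = 0.0

        def update(self, loss: float, num_frames: float) -> None:
            self.weighted_sum = self.weighted_sum * self.decay + loss * num_frames
            self.frames = self.frames * self.decay + num_frames

        @property
        def tot_loss(self) -> float:
            return self.weighted_sum / self.frames

    tracker = LossTracker()
    for _ in range(500):              # ~24k encoder frames per batch
        tracker.update(0.078, 24000.0)
    print(f"{tracker.frames:.4}")     # ~4.4e6, cf. "over 4438865.90 frames."
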
], batch size: 180, lr: 1.48e-02, grad_scale: 8.0 2024-03-15 23:53:52,168 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_skip_rate, batch_count=33343.333333333336, ans=0.0 2024-03-15 23:53:56,066 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.ff2_skip_rate, batch_count=33343.333333333336, ans=0.003621014492753623 2024-03-15 23:54:15,547 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.374e+01 9.351e+01 1.091e+02 1.464e+02 2.330e+02, threshold=2.182e+02, percent-clipped=0.0 2024-03-15 23:54:22,139 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module1.balancer1.prob, batch_count=33410.0, ans=0.125 2024-03-15 23:54:27,079 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=33443.333333333336, ans=0.1 2024-03-15 23:54:27,934 INFO [train_char.py:689] (1/2) Epoch 20, batch 400, loss[loss=0.07232, simple_loss=0.1259, pruned_loss=0.009357, over 24217.00 frames. ], tot_loss[loss=0.07813, simple_loss=0.1319, pruned_loss=0.0122, over 4180371.29 frames. ], batch size: 328, lr: 1.48e-02, grad_scale: 16.0 2024-03-15 23:54:39,089 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward2.out_whiten.whitening_limit, batch_count=33443.333333333336, ans=15.0 2024-03-15 23:54:55,734 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.self_attn2.whiten, num_groups=1, num_channels=512, metric=12.03 vs. limit=22.5 2024-03-15 23:55:01,720 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.5.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-15 23:55:09,182 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.bypass.scale_min, batch_count=33543.333333333336, ans=0.2 2024-03-15 23:55:27,793 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module2.balancer2.min_positive, batch_count=33576.666666666664, ans=0.05 2024-03-15 23:55:33,919 INFO [train_char.py:689] (1/2) Epoch 20, batch 450, loss[loss=0.07352, simple_loss=0.1294, pruned_loss=0.008809, over 24423.00 frames. ], tot_loss[loss=0.07924, simple_loss=0.1336, pruned_loss=0.01242, over 4326880.63 frames. ], batch size: 165, lr: 1.48e-02, grad_scale: 16.0 2024-03-15 23:55:37,867 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.balancer1.prob, batch_count=33610.0, ans=0.125 2024-03-15 23:56:02,286 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_module2.balancer2.min_abs, batch_count=33676.666666666664, ans=0.5 2024-03-15 23:56:12,251 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_skip_rate, batch_count=33710.0, ans=0.0 2024-03-15 23:56:14,298 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.whiten, num_groups=1, num_channels=192, metric=4.57 vs. 
limit=12.0 2024-03-15 23:56:17,260 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_skip_rate, batch_count=33710.0, ans=0.0 2024-03-15 23:56:18,584 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass.skip_rate, batch_count=33710.0, ans=0.04949747468305833 2024-03-15 23:56:23,439 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=33710.0, ans=0.1 2024-03-15 23:56:25,714 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.181e+01 9.140e+01 1.097e+02 1.429e+02 2.929e+02, threshold=2.193e+02, percent-clipped=4.0 2024-03-15 23:56:25,912 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.0.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-15 23:56:38,421 INFO [train_char.py:689] (1/2) Epoch 20, batch 500, loss[loss=0.08828, simple_loss=0.1467, pruned_loss=0.01495, over 24250.00 frames. ], tot_loss[loss=0.08052, simple_loss=0.1359, pruned_loss=0.01258, over 4438865.90 frames. ], batch size: 224, lr: 1.48e-02, grad_scale: 16.0 2024-03-15 23:57:40,515 INFO [train_char.py:689] (1/2) Epoch 21, batch 0, loss[loss=0.07438, simple_loss=0.1296, pruned_loss=0.009571, over 24387.00 frames. ], tot_loss[loss=0.07438, simple_loss=0.1296, pruned_loss=0.009571, over 24387.00 frames. ], batch size: 152, lr: 1.44e-02, grad_scale: 32.0 2024-03-15 23:57:40,515 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-15 23:57:54,054 INFO [train_char.py:721] (1/2) Epoch 21, validation: loss=0.06442, simple_loss=0.1177, pruned_loss=0.005578, over 657665.00 frames. 2024-03-15 23:57:54,055 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-15 23:58:33,777 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_skip_rate, batch_count=33900.0, ans=0.0 2024-03-15 23:58:40,431 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=33900.0, ans=0.0 2024-03-15 23:58:54,127 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=33933.333333333336, ans=0.125 2024-03-15 23:58:55,353 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.nonlin_attention.balancer.prob, batch_count=33933.333333333336, ans=0.125 2024-03-15 23:59:01,950 INFO [train_char.py:689] (1/2) Epoch 21, batch 50, loss[loss=0.07365, simple_loss=0.131, pruned_loss=0.008163, over 24334.00 frames. ], tot_loss[loss=0.07479, simple_loss=0.1273, pruned_loss=0.01115, over 1076955.57 frames. 
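
Every loss line also carries the current grad_scale, which moves between 8.0, 16.0 and 32.0 in this span. That is fp16 dynamic loss scaling: the backward pass runs on scale * loss to keep half-precision gradients away from underflow, the scale is halved and the step skipped whenever an inf/nan is detected, and it grows again after a stretch of stable steps, which is why it climbs to 32.0 near epoch boundaries here and falls back to 8.0 a few hundred batches later. A generic PyTorch sketch; init_scale and growth_interval below are illustrative, not this run's settings:

    # Sketch: dynamic loss scaling for fp16 training; grad_scale in the
    # log corresponds to the scaler's current scale.
    import torch

    model = torch.nn.Linear(80, 10).cuda()
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    scaler = torch.cuda.amp.GradScaler(init_scale=8.0, growth_interval=2000)

    for _ in range(3):
        opt.zero_grad()
        x = torch.randn(16, 80, device="cuda")
        with torch.cuda.amp.autocast():
            loss = model(x).square().mean()
        scaler.scale(loss).backward()  # backward on scale * loss
        scaler.step(opt)               # unscales grads; skips step on inf/nan
        scaler.update()                # grow scale if stable, halve on overflow
        print(scaler.get_scale())
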
], batch size: 180, lr: 1.44e-02, grad_scale: 32.0 2024-03-15 23:59:06,292 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.balancer1.prob, batch_count=33966.666666666664, ans=0.125 2024-03-15 23:59:07,665 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=33966.666666666664, ans=0.125 2024-03-15 23:59:07,673 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.balancer2.prob, batch_count=33966.666666666664, ans=0.125 2024-03-15 23:59:10,267 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=33966.666666666664, ans=0.0 2024-03-15 23:59:30,906 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.ff3_skip_rate, batch_count=34000.0, ans=0.003478260869565218 2024-03-15 23:59:44,822 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass.scale_min, batch_count=34033.333333333336, ans=0.2 2024-03-15 23:59:46,406 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.self_attn2.whiten, num_groups=1, num_channels=384, metric=16.55 vs. limit=22.5 2024-03-15 23:59:52,536 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=34066.666666666664, ans=0.0 2024-03-15 23:59:54,818 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.798e+01 9.597e+01 1.137e+02 1.446e+02 3.282e+02, threshold=2.274e+02, percent-clipped=6.0 2024-03-16 00:00:06,655 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.attention_skip_rate, batch_count=34100.0, ans=0.0 2024-03-16 00:00:14,713 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=11.62 vs. limit=15.0 2024-03-16 00:00:15,260 INFO [train_char.py:689] (1/2) Epoch 21, batch 100, loss[loss=0.06909, simple_loss=0.1254, pruned_loss=0.00641, over 24285.00 frames. ], tot_loss[loss=0.07421, simple_loss=0.1263, pruned_loss=0.01108, over 1908188.82 frames. ], batch size: 116, lr: 1.44e-02, grad_scale: 16.0 2024-03-16 00:00:16,361 INFO [scaling.py:1023] (1/2) Whitening: name=encoder_embed.out_whiten, num_groups=1, num_channels=192, metric=7.66 vs. limit=8.0 2024-03-16 00:00:27,565 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=7.70 vs. 
limit=15.0 2024-03-16 00:00:55,874 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=34233.333333333336, ans=0.125 2024-03-16 00:01:01,017 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.bypass.skip_rate, batch_count=34233.333333333336, ans=0.09899494936611666 2024-03-16 00:01:01,047 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module2.balancer1.prob, batch_count=34233.333333333336, ans=0.125 2024-03-16 00:01:04,788 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_skip_rate, batch_count=34233.333333333336, ans=0.0 2024-03-16 00:01:07,332 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module1.balancer1.prob, batch_count=34266.666666666664, ans=0.125 2024-03-16 00:01:20,139 INFO [train_char.py:689] (1/2) Epoch 21, batch 150, loss[loss=0.06718, simple_loss=0.116, pruned_loss=0.009169, over 24115.00 frames. ], tot_loss[loss=0.07419, simple_loss=0.1267, pruned_loss=0.01084, over 2554317.08 frames. ], batch size: 362, lr: 1.43e-02, grad_scale: 16.0 2024-03-16 00:01:20,417 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module2.balancer2.min_positive, batch_count=34300.0, ans=0.05 2024-03-16 00:01:23,089 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_module1.balancer2.prob, batch_count=34300.0, ans=0.125 2024-03-16 00:02:04,025 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.613e+01 9.775e+01 1.277e+02 1.595e+02 3.121e+02, threshold=2.554e+02, percent-clipped=7.0 2024-03-16 00:02:12,248 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.balancer1.prob, batch_count=34433.333333333336, ans=0.125 2024-03-16 00:02:14,791 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module1.balancer2.min_positive, batch_count=34433.333333333336, ans=0.05 2024-03-16 00:02:14,813 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.balancer1.prob, batch_count=34433.333333333336, ans=0.125 2024-03-16 00:02:16,099 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=34433.333333333336, ans=0.125 2024-03-16 00:02:28,872 INFO [train_char.py:689] (1/2) Epoch 21, batch 200, loss[loss=0.09057, simple_loss=0.1563, pruned_loss=0.01241, over 24108.00 frames. ], tot_loss[loss=0.07502, simple_loss=0.1277, pruned_loss=0.01115, over 3052447.95 frames. 
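
The learning rate in these entries decays on two clocks: it drifts down within an epoch (1.44e-02 at the start of epoch 21, 1.43e-02 by batch 150) and steps down across epochs (1.56e-02 in epoch 19, 1.38e-02 by epoch 22). This is the shape of icefall's Eden schedule, which multiplies inverse-fourth-root decay terms in the step count and the epoch count; the sketch below gives only that core rule, with illustrative lr_batches/lr_epochs defaults, and omits the warm-up and other scaling factors a real run applies, so it will not reproduce the logged values exactly.

    # Sketch of the core Eden rule:
    # lr = base_lr * ((step/lr_batches)^2 + 1)^-0.25
    #              * ((epoch/lr_epochs)^2 + 1)^-0.25
    def eden_lr(base_lr: float, step: int, epoch: int,
                lr_batches: float = 7500.0, lr_epochs: float = 3.5) -> float:
        step_factor = ((step / lr_batches) ** 2 + 1.0) ** -0.25
        epoch_factor = ((epoch / lr_epochs) ** 2 + 1.0) ** -0.25
        return base_lr * step_factor * epoch_factor

    for epoch, step in [(1, 0), (19, 30400), (22, 36000)]:
        print(epoch, f"{eden_lr(0.045, step, epoch):.2e}")
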
], batch size: 236, lr: 1.43e-02, grad_scale: 16.0 2024-03-16 00:02:29,066 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.out_proj.dropout_p, batch_count=34466.666666666664, ans=0.1 2024-03-16 00:02:35,707 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.balancer1.prob, batch_count=34466.666666666664, ans=0.125 2024-03-16 00:02:39,520 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module1.balancer2.prob, batch_count=34466.666666666664, ans=0.125 2024-03-16 00:02:39,551 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.5.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-16 00:03:00,762 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.out_proj.dropout_p, batch_count=34533.333333333336, ans=0.1 2024-03-16 00:03:05,835 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module1.balancer1.min_positive, batch_count=34533.333333333336, ans=0.025 2024-03-16 00:03:13,618 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_module1.balancer2.prob, batch_count=34566.666666666664, ans=0.125 2024-03-16 00:03:36,402 INFO [train_char.py:689] (1/2) Epoch 21, batch 250, loss[loss=0.08842, simple_loss=0.1511, pruned_loss=0.01289, over 24104.00 frames. ], tot_loss[loss=0.07561, simple_loss=0.1284, pruned_loss=0.0114, over 3445456.39 frames. ], batch size: 279, lr: 1.43e-02, grad_scale: 16.0 2024-03-16 00:03:37,918 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=34633.333333333336, ans=0.125 2024-03-16 00:03:38,009 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module2.balancer2.min_positive, batch_count=34633.333333333336, ans=0.05 2024-03-16 00:04:05,994 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=5.80 vs. limit=6.0 2024-03-16 00:04:18,597 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=256, metric=4.50 vs. limit=15.0 2024-03-16 00:04:20,354 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.173e+01 9.619e+01 1.155e+02 1.493e+02 2.796e+02, threshold=2.311e+02, percent-clipped=1.0 2024-03-16 00:04:40,728 INFO [train_char.py:689] (1/2) Epoch 21, batch 300, loss[loss=0.07139, simple_loss=0.1237, pruned_loss=0.009527, over 24378.00 frames. ], tot_loss[loss=0.07599, simple_loss=0.1288, pruned_loss=0.0116, over 3753269.67 frames. 
], batch size: 158, lr: 1.43e-02, grad_scale: 16.0 2024-03-16 00:05:14,676 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.bypass_mid.scale_min, batch_count=34866.666666666664, ans=0.2 2024-03-16 00:05:30,249 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_skip_rate, batch_count=34900.0, ans=0.0 2024-03-16 00:05:36,528 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module2.balancer1.prob, batch_count=34933.333333333336, ans=0.125 2024-03-16 00:05:37,807 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module2.balancer2.prob, batch_count=34933.333333333336, ans=0.125 2024-03-16 00:05:47,538 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=13.46 vs. limit=15.0 2024-03-16 00:05:50,488 INFO [train_char.py:689] (1/2) Epoch 21, batch 350, loss[loss=0.06833, simple_loss=0.1176, pruned_loss=0.00955, over 24152.00 frames. ], tot_loss[loss=0.07669, simple_loss=0.1299, pruned_loss=0.01171, over 3993861.63 frames. ], batch size: 362, lr: 1.42e-02, grad_scale: 16.0 2024-03-16 00:06:02,090 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=35000.0, ans=0.0 2024-03-16 00:06:33,004 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.808e+01 1.023e+02 1.285e+02 1.704e+02 2.901e+02, threshold=2.569e+02, percent-clipped=6.0 2024-03-16 00:06:33,695 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.nonlin_attention.whiten1, num_groups=1, num_channels=384, metric=6.96 vs. limit=10.0 2024-03-16 00:06:41,627 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.conv_module1.whiten, num_groups=1, num_channels=512, metric=4.03 vs. limit=15.0 2024-03-16 00:06:45,028 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module2.balancer2.prob, batch_count=35100.0, ans=0.125 2024-03-16 00:06:56,260 INFO [train_char.py:689] (1/2) Epoch 21, batch 400, loss[loss=0.07228, simple_loss=0.1242, pruned_loss=0.0102, over 24391.00 frames. ], tot_loss[loss=0.07756, simple_loss=0.1314, pruned_loss=0.01187, over 4181555.47 frames. ], batch size: 158, lr: 1.42e-02, grad_scale: 32.0 2024-03-16 00:07:19,571 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=9.70 vs. limit=15.0 2024-03-16 00:07:30,354 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass.scale_min, batch_count=35200.0, ans=0.2 2024-03-16 00:07:32,225 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=5.30 vs. limit=6.0 2024-03-16 00:07:47,983 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_module2.balancer1.max_abs, batch_count=35266.666666666664, ans=10.0 2024-03-16 00:08:01,180 INFO [train_char.py:689] (1/2) Epoch 21, batch 450, loss[loss=0.07621, simple_loss=0.1297, pruned_loss=0.01135, over 24368.00 frames. ], tot_loss[loss=0.07811, simple_loss=0.1325, pruned_loss=0.01184, over 4328133.90 frames. 
], batch size: 172, lr: 1.42e-02, grad_scale: 32.0 2024-03-16 00:08:01,416 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=35300.0, ans=0.1 2024-03-16 00:08:04,408 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=384, metric=14.49 vs. limit=22.5 2024-03-16 00:08:09,138 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.out_combiner.scale_min, batch_count=35300.0, ans=0.2 2024-03-16 00:08:28,419 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.bypass.scale_min, batch_count=35366.666666666664, ans=0.2 2024-03-16 00:08:44,289 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.539e+01 9.378e+01 1.043e+02 1.385e+02 2.513e+02, threshold=2.085e+02, percent-clipped=0.0 2024-03-16 00:08:47,152 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.bypass_mid.scale_min, batch_count=35400.0, ans=0.2 2024-03-16 00:09:05,578 INFO [train_char.py:689] (1/2) Epoch 21, batch 500, loss[loss=0.08561, simple_loss=0.1448, pruned_loss=0.0132, over 24134.00 frames. ], tot_loss[loss=0.07924, simple_loss=0.1347, pruned_loss=0.01191, over 4440700.00 frames. ], batch size: 236, lr: 1.42e-02, grad_scale: 32.0 2024-03-16 00:10:02,947 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.whiten, num_groups=1, num_channels=512, metric=12.43 vs. limit=12.0 2024-03-16 00:10:07,692 INFO [train_char.py:689] (1/2) Epoch 22, batch 0, loss[loss=0.06737, simple_loss=0.1128, pruned_loss=0.01099, over 23918.00 frames. ], tot_loss[loss=0.06737, simple_loss=0.1128, pruned_loss=0.01099, over 23918.00 frames. ], batch size: 407, lr: 1.38e-02, grad_scale: 32.0 2024-03-16 00:10:07,692 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 00:10:21,687 INFO [train_char.py:721] (1/2) Epoch 22, validation: loss=0.06405, simple_loss=0.1169, pruned_loss=0.005585, over 657665.00 frames. 2024-03-16 00:10:21,688 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 00:10:34,350 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.balancer1.prob, batch_count=35523.333333333336, ans=0.125 2024-03-16 00:10:50,477 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.bypass.scale_min, batch_count=35556.666666666664, ans=0.2 2024-03-16 00:10:50,528 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module2.balancer2.prob, batch_count=35556.666666666664, ans=0.125 2024-03-16 00:11:06,305 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.balancer2.prob, batch_count=35590.0, ans=0.125 2024-03-16 00:11:31,872 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.whiten, num_groups=1, num_channels=384, metric=6.40 vs. limit=12.0 2024-03-16 00:11:33,680 INFO [train_char.py:689] (1/2) Epoch 22, batch 50, loss[loss=0.07054, simple_loss=0.1257, pruned_loss=0.00768, over 24212.00 frames. ], tot_loss[loss=0.07396, simple_loss=0.1271, pruned_loss=0.01038, over 1089297.88 frames. 
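
The "batch size" field swings from 86 to 483 cuts in this section while the per-batch frame count stays pinned near 21-24k, which says batches are packed by total duration rather than by count: the 483-cut batch (22835 frames) and the 86-cut batch (21615 frames) hold roughly the same amount of audio. Assuming the usual 100 fbank frames per second and 4x encoder subsampling, ~24k frames is on the order of 1000 seconds per batch. A toy packing loop in plain Python (not the lhotse DynamicBucketingSampler implementation; the max_duration constant is illustrative):

    # Sketch: duration-based batching. Pack utterances until the batch
    # would exceed max_duration seconds of audio, so the number of cuts
    # per batch varies inversely with average cut length.
    from typing import Iterable, Iterator, List

    def pack_by_duration(durations: Iterable[float],
                         max_duration: float = 1000.0) -> Iterator[List[float]]:
        batch, total = [], 0.0
        for d in durations:
            if batch and total + d > max_duration:
                yield batch
                batch, total = [], 0.0
            batch.append(d)
            total += d
        if batch:
            yield batch

    print(len(next(pack_by_duration([2.1] * 1000))))  # 476 short cuts per batch
    print(len(next(pack_by_duration([8.3] * 1000))))  # 120 long cuts per batch
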
], batch size: 122, lr: 1.38e-02, grad_scale: 32.0 2024-03-16 00:11:42,180 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.ff3_skip_rate, batch_count=35656.666666666664, ans=0.0031181159420289855 2024-03-16 00:12:10,728 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.629e+01 9.247e+01 1.091e+02 1.260e+02 3.544e+02, threshold=2.182e+02, percent-clipped=4.0 2024-03-16 00:12:18,883 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.feed_forward2.hidden_balancer.prob, batch_count=35756.666666666664, ans=0.125 2024-03-16 00:12:39,204 INFO [train_char.py:689] (1/2) Epoch 22, batch 100, loss[loss=0.07113, simple_loss=0.1283, pruned_loss=0.006999, over 23884.00 frames. ], tot_loss[loss=0.07458, simple_loss=0.1283, pruned_loss=0.01042, over 1917880.95 frames. ], batch size: 107, lr: 1.38e-02, grad_scale: 16.0 2024-03-16 00:12:43,503 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module2.balancer2.prob, batch_count=35823.333333333336, ans=0.125 2024-03-16 00:12:56,134 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.out_proj.dropout_p, batch_count=35856.666666666664, ans=0.1 2024-03-16 00:12:58,839 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward1.hidden_balancer.prob, batch_count=35856.666666666664, ans=0.125 2024-03-16 00:13:15,504 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module1.balancer2.prob, batch_count=35890.0, ans=0.125 2024-03-16 00:13:21,569 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.bypass_mid.scale_min, batch_count=35923.333333333336, ans=0.2 2024-03-16 00:13:31,867 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module2.balancer2.prob, batch_count=35956.666666666664, ans=0.125 2024-03-16 00:13:34,166 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.balancer1.prob, batch_count=35956.666666666664, ans=0.125 2024-03-16 00:13:43,050 INFO [train_char.py:689] (1/2) Epoch 22, batch 150, loss[loss=0.07894, simple_loss=0.1288, pruned_loss=0.01454, over 24164.00 frames. ], tot_loss[loss=0.07401, simple_loss=0.1271, pruned_loss=0.01045, over 2557218.09 frames. ], batch size: 344, lr: 1.38e-02, grad_scale: 16.0 2024-03-16 00:13:51,335 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.bypass.scale_min, batch_count=35990.0, ans=0.2 2024-03-16 00:14:26,679 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.506e+01 9.508e+01 1.078e+02 1.395e+02 3.027e+02, threshold=2.155e+02, percent-clipped=6.0 2024-03-16 00:14:35,069 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=5.09 vs. limit=6.0 2024-03-16 00:14:41,741 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.self_attn2.whiten, num_groups=1, num_channels=192, metric=16.65 vs. 
limit=22.5 2024-03-16 00:14:51,571 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.2.encoder.layers.1.self_attn_weights, loss-sum=0.000e+00 2024-03-16 00:14:54,234 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.attention_skip_rate, batch_count=36156.666666666664, ans=0.0 2024-03-16 00:14:55,164 INFO [train_char.py:689] (1/2) Epoch 22, batch 200, loss[loss=0.06133, simple_loss=0.09267, pruned_loss=0.01499, over 22771.00 frames. ], tot_loss[loss=0.07341, simple_loss=0.1261, pruned_loss=0.01037, over 3062057.05 frames. ], batch size: 483, lr: 1.38e-02, grad_scale: 16.0 2024-03-16 00:15:16,765 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.bypass.scale_min, batch_count=36190.0, ans=0.2 2024-03-16 00:15:24,549 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module2.balancer1.min_positive, batch_count=36223.333333333336, ans=0.025 2024-03-16 00:15:24,593 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_module2.balancer2.min_abs, batch_count=36223.333333333336, ans=0.5 2024-03-16 00:15:35,909 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass_mid.scale_min, batch_count=36256.666666666664, ans=0.2 2024-03-16 00:15:48,688 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=36290.0, ans=0.1 2024-03-16 00:15:58,754 INFO [train_char.py:689] (1/2) Epoch 22, batch 250, loss[loss=0.08157, simple_loss=0.1363, pruned_loss=0.01341, over 24239.00 frames. ], tot_loss[loss=0.07352, simple_loss=0.1263, pruned_loss=0.01036, over 3447444.81 frames. ], batch size: 296, lr: 1.37e-02, grad_scale: 8.0 2024-03-16 00:15:59,353 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.conv_module1.whiten, num_groups=1, num_channels=256, metric=2.61 vs. limit=15.0 2024-03-16 00:16:00,938 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.conv_module1.whiten, num_groups=1, num_channels=384, metric=5.08 vs. limit=15.0 2024-03-16 00:16:18,084 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module1.balancer1.prob, batch_count=36356.666666666664, ans=0.125 2024-03-16 00:16:35,847 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.385e+01 9.496e+01 1.113e+02 1.323e+02 2.572e+02, threshold=2.227e+02, percent-clipped=2.0 2024-03-16 00:16:37,776 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.self_attn_weights.whiten_keys.whitening_limit, batch_count=36423.333333333336, ans=6.0 2024-03-16 00:17:02,190 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.conv_module1.whiten, num_groups=1, num_channels=384, metric=4.90 vs. limit=15.0 2024-03-16 00:17:05,699 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.bypass_mid.scale_min, batch_count=36456.666666666664, ans=0.2 2024-03-16 00:17:09,067 INFO [train_char.py:689] (1/2) Epoch 22, batch 300, loss[loss=0.06557, simple_loss=0.114, pruned_loss=0.008554, over 24343.00 frames. ], tot_loss[loss=0.07346, simple_loss=0.1261, pruned_loss=0.0104, over 3747494.93 frames. 
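
Much of the scaling.py:214 traffic concerns Balancer modules (the balancer1.prob / balancer2.prob, min_positive, min_abs and max_abs readings above). As a working model, assumed here rather than quoted from scaling.py: a Balancer is the identity in the forward pass, and in the backward pass it nudges gradients so that per-channel activation statistics, such as the fraction of positive values or the mean absolute value, stay inside a target range, applied only on a random fraction of steps (the prob values like 0.125 above, themselves scheduled). A heavily simplified sketch of the gradient-nudging idea, with illustrative constants:

    # Sketch (simplified): identity in forward; backward adds a small
    # term pushing each channel's fraction of positive activations
    # toward [min_positive, max_positive].
    import torch

    class BalancerSketch(torch.autograd.Function):
        @staticmethod
        def forward(ctx, x, min_positive, max_positive, scale):
            ctx.save_for_backward(x)
            ctx.cfg = (min_positive, max_positive, scale)
            return x

        @staticmethod
        def backward(ctx, grad_out):
            (x,) = ctx.saved_tensors
            min_pos, max_pos, scale = ctx.cfg
            pos_frac = (x > 0).float().mean(dim=0, keepdim=True)  # per channel
            # +1 where a channel is positive too rarely, -1 where too often;
            # subtracting this from grad_out makes SGD push x the other way.
            push = (pos_frac < min_pos).float() - (pos_frac > max_pos).float()
            return grad_out - scale * push * grad_out.abs().mean(), None, None, None

    x = torch.randn(32, 256, requires_grad=True)
    y = BalancerSketch.apply(x, 0.05, 0.95, 1e-4)
    y.sum().backward()  # x.grad now carries the balancing nudge
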
], batch size: 129, lr: 1.37e-02, grad_scale: 8.0 2024-03-16 00:17:24,267 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward3.hidden_balancer.prob, batch_count=36523.333333333336, ans=0.125 2024-03-16 00:17:30,038 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=5.40 vs. limit=6.0 2024-03-16 00:17:33,316 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.ff2_skip_rate, batch_count=36556.666666666664, ans=0.0029224637681159425 2024-03-16 00:18:12,013 INFO [train_char.py:689] (1/2) Epoch 22, batch 350, loss[loss=0.07387, simple_loss=0.125, pruned_loss=0.0114, over 24379.00 frames. ], tot_loss[loss=0.0743, simple_loss=0.1274, pruned_loss=0.01061, over 3989333.07 frames. ], batch size: 152, lr: 1.37e-02, grad_scale: 8.0 2024-03-16 00:18:12,678 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.self_attn2.whiten, num_groups=1, num_channels=384, metric=15.32 vs. limit=22.5 2024-03-16 00:18:14,940 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.3.encoder.layers.2.self_attn_weights, loss-sum=0.000e+00 2024-03-16 00:18:53,111 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.793e+01 9.479e+01 1.096e+02 1.512e+02 2.433e+02, threshold=2.193e+02, percent-clipped=6.0 2024-03-16 00:18:53,407 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=36756.666666666664, ans=0.125 2024-03-16 00:18:59,890 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.balancer1.prob, batch_count=36756.666666666664, ans=0.125 2024-03-16 00:19:03,472 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.convnext.layerdrop_rate, batch_count=36756.666666666664, ans=0.015 2024-03-16 00:19:13,682 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_skip_rate, batch_count=36790.0, ans=0.0 2024-03-16 00:19:19,697 INFO [train_char.py:689] (1/2) Epoch 22, batch 400, loss[loss=0.08594, simple_loss=0.1459, pruned_loss=0.013, over 24105.00 frames. ], tot_loss[loss=0.07509, simple_loss=0.1286, pruned_loss=0.01078, over 4177542.71 frames. ], batch size: 199, lr: 1.37e-02, grad_scale: 16.0 2024-03-16 00:19:21,252 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.bypass.skip_rate, batch_count=36823.333333333336, ans=0.09899494936611666 2024-03-16 00:19:35,398 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=36856.666666666664, ans=0.125 2024-03-16 00:19:36,126 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=192, metric=11.70 vs. limit=15.0 2024-03-16 00:19:42,350 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=5.13 vs. 
limit=6.0 2024-03-16 00:19:47,913 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.ff2_skip_rate, batch_count=36890.0, ans=0.0028499999999999992 2024-03-16 00:19:54,466 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.ff2_skip_rate, batch_count=36890.0, ans=0.0028499999999999992 2024-03-16 00:19:58,285 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.self_attn_weights.pos_emb_skip_rate, batch_count=36923.333333333336, ans=0.0 2024-03-16 00:20:00,883 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=36923.333333333336, ans=0.0 2024-03-16 00:20:09,790 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.bypass_mid.scale_min, batch_count=36923.333333333336, ans=0.2 2024-03-16 00:20:19,803 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.balancer1.prob, batch_count=36956.666666666664, ans=0.125 2024-03-16 00:20:23,523 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=36956.666666666664, ans=0.1 2024-03-16 00:20:25,854 INFO [train_char.py:689] (1/2) Epoch 22, batch 450, loss[loss=0.07522, simple_loss=0.1341, pruned_loss=0.008162, over 24097.00 frames. ], tot_loss[loss=0.07609, simple_loss=0.1303, pruned_loss=0.01095, over 4324164.73 frames. ], batch size: 199, lr: 1.37e-02, grad_scale: 16.0 2024-03-16 00:20:28,565 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_skip_rate, batch_count=36990.0, ans=0.0 2024-03-16 00:20:48,565 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.attention_skip_rate, batch_count=37023.333333333336, ans=0.0 2024-03-16 00:20:49,968 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.balancer2.prob, batch_count=37056.666666666664, ans=0.125 2024-03-16 00:21:02,542 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.162e+01 9.120e+01 1.035e+02 1.225e+02 2.594e+02, threshold=2.069e+02, percent-clipped=6.0 2024-03-16 00:21:22,197 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.bypass.scale_min, batch_count=37123.333333333336, ans=0.2 2024-03-16 00:21:29,367 INFO [train_char.py:689] (1/2) Epoch 22, batch 500, loss[loss=0.08906, simple_loss=0.1485, pruned_loss=0.01482, over 24229.00 frames. ], tot_loss[loss=0.07743, simple_loss=0.1327, pruned_loss=0.0111, over 4437307.42 frames. ], batch size: 224, lr: 1.36e-02, grad_scale: 16.0 2024-03-16 00:21:29,617 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=37156.666666666664, ans=0.1 2024-03-16 00:21:33,178 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=37156.666666666664, ans=0.125 2024-03-16 00:22:27,666 INFO [train_char.py:689] (1/2) Epoch 23, batch 0, loss[loss=0.07846, simple_loss=0.1358, pruned_loss=0.01054, over 24217.00 frames. ], tot_loss[loss=0.07846, simple_loss=0.1358, pruned_loss=0.01054, over 24217.00 frames. 
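
Each epoch in this log opens the same way: a "batch 0" training entry, then "Computing validation loss", a validation report over exactly 657665.00 frames, and the peak-memory line, as in the epoch 23 block just below. The constant frame count is expected since the dev set is fixed; only the validation loss moves, and it is still inching down (0.06498 at epoch 19, 0.06442 at epoch 21, 0.06381 at epoch 23). Assuming, as above, that these counts are post-subsampling encoder frames at 25 per second of audio, the logged numbers also give the rough sizes involved:

    # Rough sizes implied by the logged frame counts (assuming they are
    # encoder frames: 100 fbank frames/s divided by subsampling of 4).
    ENC_FPS = 100.0 / 4
    print(24217.0 / ENC_FPS)             # ~969 s of audio in one training batch
    print(657665.0 / ENC_FPS / 3600.0)   # ~7.3 h in the fixed validation set
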
], batch size: 224, lr: 1.33e-02, grad_scale: 32.0 2024-03-16 00:22:27,667 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 00:22:41,530 INFO [train_char.py:721] (1/2) Epoch 23, validation: loss=0.06381, simple_loss=0.1166, pruned_loss=0.005493, over 657665.00 frames. 2024-03-16 00:22:41,530 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 00:23:11,192 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.ff2_skip_rate, batch_count=37213.333333333336, ans=0.0027797101449275356 2024-03-16 00:23:35,956 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward1.out_whiten.whitening_limit, batch_count=37280.0, ans=15.0 2024-03-16 00:23:57,125 INFO [train_char.py:689] (1/2) Epoch 23, batch 50, loss[loss=0.0649, simple_loss=0.1093, pruned_loss=0.01024, over 23972.00 frames. ], tot_loss[loss=0.07095, simple_loss=0.1232, pruned_loss=0.009358, over 1085780.84 frames. ], batch size: 407, lr: 1.33e-02, grad_scale: 32.0 2024-03-16 00:24:02,854 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.balancer1.prob, batch_count=37346.666666666664, ans=0.125 2024-03-16 00:24:10,994 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_skip_rate, batch_count=37380.0, ans=0.0 2024-03-16 00:24:12,461 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.bypass.scale_min, batch_count=37380.0, ans=0.2 2024-03-16 00:24:19,053 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=37380.0, ans=0.1 2024-03-16 00:24:26,276 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.505e+01 8.832e+01 1.099e+02 1.359e+02 2.976e+02, threshold=2.198e+02, percent-clipped=5.0 2024-03-16 00:24:27,843 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_skip_rate, batch_count=37413.333333333336, ans=0.0 2024-03-16 00:24:36,198 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=512, metric=4.69 vs. limit=15.0 2024-03-16 00:24:51,008 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.bypass_mid.scale_min, batch_count=37480.0, ans=0.2 2024-03-16 00:25:02,325 INFO [train_char.py:689] (1/2) Epoch 23, batch 100, loss[loss=0.06521, simple_loss=0.1099, pruned_loss=0.01025, over 23961.00 frames. ], tot_loss[loss=0.07292, simple_loss=0.1253, pruned_loss=0.01027, over 1914168.43 frames. ], batch size: 407, lr: 1.33e-02, grad_scale: 32.0 2024-03-16 00:25:20,246 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=37546.666666666664, ans=0.1 2024-03-16 00:26:08,505 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.balancer1.prob, batch_count=37646.666666666664, ans=0.125 2024-03-16 00:26:16,005 INFO [train_char.py:689] (1/2) Epoch 23, batch 150, loss[loss=0.08062, simple_loss=0.1368, pruned_loss=0.01223, over 24255.00 frames. ], tot_loss[loss=0.07373, simple_loss=0.127, pruned_loss=0.01022, over 2558794.64 frames. 
], batch size: 296, lr: 1.33e-02, grad_scale: 32.0 2024-03-16 00:26:27,735 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.out_proj.dropout_p, batch_count=37713.333333333336, ans=0.1 2024-03-16 00:26:44,409 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.776e+01 9.534e+01 1.222e+02 1.511e+02 3.389e+02, threshold=2.445e+02, percent-clipped=7.0 2024-03-16 00:26:50,372 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=4.98 vs. limit=15.0 2024-03-16 00:27:20,182 INFO [train_char.py:689] (1/2) Epoch 23, batch 200, loss[loss=0.0626, simple_loss=0.107, pruned_loss=0.009125, over 23994.00 frames. ], tot_loss[loss=0.07389, simple_loss=0.1274, pruned_loss=0.01021, over 3060326.01 frames. ], batch size: 408, lr: 1.32e-02, grad_scale: 32.0 2024-03-16 00:27:29,279 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.attention_skip_rate, batch_count=37846.666666666664, ans=0.0 2024-03-16 00:27:29,772 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=8.12 vs. limit=10.0 2024-03-16 00:27:31,931 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.attention_skip_rate, batch_count=37880.0, ans=0.0 2024-03-16 00:28:10,148 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=5.09 vs. limit=6.0 2024-03-16 00:28:20,435 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.bypass_mid.scale_min, batch_count=37980.0, ans=0.2 2024-03-16 00:28:23,930 INFO [train_char.py:689] (1/2) Epoch 23, batch 250, loss[loss=0.06041, simple_loss=0.1059, pruned_loss=0.007475, over 24330.00 frames. ], tot_loss[loss=0.07307, simple_loss=0.1261, pruned_loss=0.01003, over 3453987.05 frames. ], batch size: 129, lr: 1.32e-02, grad_scale: 32.0 2024-03-16 00:28:41,300 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_skip_rate, batch_count=38013.333333333336, ans=0.0 2024-03-16 00:28:56,738 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.bypass.skip_rate, batch_count=38080.0, ans=0.09899494936611666 2024-03-16 00:28:58,982 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.329e+01 1.019e+02 1.304e+02 1.708e+02 3.266e+02, threshold=2.609e+02, percent-clipped=5.0 2024-03-16 00:29:20,862 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.ff3_skip_rate, batch_count=38146.666666666664, ans=0.0025768115942028996 2024-03-16 00:29:24,681 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.ff2_skip_rate, batch_count=38146.666666666664, ans=0.0025768115942028996 2024-03-16 00:29:27,372 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.balancer1.prob, batch_count=38146.666666666664, ans=0.125 2024-03-16 00:29:34,765 INFO [train_char.py:689] (1/2) Epoch 23, batch 300, loss[loss=0.08128, simple_loss=0.136, pruned_loss=0.01328, over 24161.00 frames. ], tot_loss[loss=0.07385, simple_loss=0.1273, pruned_loss=0.0102, over 3753467.53 frames. 
], batch size: 188, lr: 1.32e-02, grad_scale: 32.0 2024-03-16 00:29:55,956 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=256, metric=11.93 vs. limit=15.0 2024-03-16 00:30:05,410 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_skip_rate, batch_count=38246.666666666664, ans=0.0 2024-03-16 00:30:16,977 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=38280.0, ans=0.1 2024-03-16 00:30:31,540 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.bypass.skip_rate, batch_count=38313.333333333336, ans=0.035 2024-03-16 00:30:37,487 INFO [train_char.py:689] (1/2) Epoch 23, batch 350, loss[loss=0.08851, simple_loss=0.1504, pruned_loss=0.01331, over 24257.00 frames. ], tot_loss[loss=0.07426, simple_loss=0.128, pruned_loss=0.01028, over 3990708.20 frames. ], batch size: 212, lr: 1.32e-02, grad_scale: 32.0 2024-03-16 00:30:41,430 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.balancer2.prob, batch_count=38346.666666666664, ans=0.125 2024-03-16 00:31:09,688 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.909e+01 9.004e+01 1.052e+02 1.311e+02 2.813e+02, threshold=2.104e+02, percent-clipped=1.0 2024-03-16 00:31:18,875 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.3.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-16 00:31:27,764 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.attention_skip_rate, batch_count=38446.666666666664, ans=0.0 2024-03-16 00:31:29,135 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.balancer2.prob, batch_count=38446.666666666664, ans=0.125 2024-03-16 00:31:36,700 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=38480.0, ans=0.1 2024-03-16 00:31:40,857 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=12.12 vs. limit=15.0 2024-03-16 00:31:45,104 INFO [train_char.py:689] (1/2) Epoch 23, batch 400, loss[loss=0.0753, simple_loss=0.1351, pruned_loss=0.007748, over 24345.00 frames. ], tot_loss[loss=0.0754, simple_loss=0.1299, pruned_loss=0.01046, over 4178557.80 frames. ], batch size: 180, lr: 1.32e-02, grad_scale: 32.0 2024-03-16 00:32:15,877 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=38580.0, ans=0.0 2024-03-16 00:32:15,982 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.ff3_skip_rate, batch_count=38580.0, ans=0.0024826086956521737 2024-03-16 00:32:16,340 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=11.52 vs. limit=15.0 2024-03-16 00:32:41,130 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=38646.666666666664, ans=0.1 2024-03-16 00:32:51,177 INFO [train_char.py:689] (1/2) Epoch 23, batch 450, loss[loss=0.08531, simple_loss=0.1457, pruned_loss=0.01248, over 24140.00 frames. ], tot_loss[loss=0.07637, simple_loss=0.1314, pruned_loss=0.01065, over 4324441.69 frames. 
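In each Clipping_scale warning above, the five numbers are order statistics of recent gradient norms (min, 25%, median, 75%, max), and the reported threshold equals clipping_scale times the median: in the last warning, 2.0 x 1.052e+02 = 2.104e+02, exactly the logged threshold. A minimal sketch of that bookkeeping, assuming a simple window of recent norms; optim.py's actual windowing and check schedule may differ:

    import numpy as np

    def clipping_report(grad_norms, clipping_scale=2.0):
        """Summarize recent gradient norms the way the warnings above do.

        grad_norms: per-step gradient norms (a sliding window in the real
        optimizer; here just a list). Returns the five order statistics,
        the clipping threshold (clipping_scale * median), and the percent
        of steps whose norm exceeded it.
        """
        norms = np.asarray(grad_norms, dtype=np.float64)
        quartiles = np.quantile(norms, [0.0, 0.25, 0.5, 0.75, 1.0])
        threshold = clipping_scale * quartiles[2]
        percent_clipped = 100.0 * np.mean(norms > threshold)
        return quartiles, threshold, percent_clipped

    # Example with made-up norms in the same range as the log:
    qs, thr, pct = clipping_report([71.6, 91.2, 103.5, 122.5, 259.4] * 10)
    print(qs, thr, pct)  # threshold 207.0, 20% clipped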
], batch size: 188, lr: 1.31e-02, grad_scale: 32.0 2024-03-16 00:33:20,464 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.094e+01 9.392e+01 1.088e+02 1.385e+02 2.141e+02, threshold=2.177e+02, percent-clipped=1.0 2024-03-16 00:33:45,460 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.balancer2.prob, batch_count=38813.333333333336, ans=0.125 2024-03-16 00:33:55,181 INFO [train_char.py:689] (1/2) Epoch 23, batch 500, loss[loss=0.08571, simple_loss=0.1425, pruned_loss=0.01445, over 24094.00 frames. ], tot_loss[loss=0.07684, simple_loss=0.1323, pruned_loss=0.0107, over 4437108.02 frames. ], batch size: 199, lr: 1.31e-02, grad_scale: 32.0 2024-03-16 00:34:50,095 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=256, metric=16.66 vs. limit=22.5 2024-03-16 00:34:55,667 INFO [train_char.py:689] (1/2) Epoch 24, batch 0, loss[loss=0.08033, simple_loss=0.1364, pruned_loss=0.01214, over 24217.00 frames. ], tot_loss[loss=0.08033, simple_loss=0.1364, pruned_loss=0.01214, over 24217.00 frames. ], batch size: 212, lr: 1.28e-02, grad_scale: 32.0 2024-03-16 00:34:55,668 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 00:35:09,607 INFO [train_char.py:721] (1/2) Epoch 24, validation: loss=0.06293, simple_loss=0.1149, pruned_loss=0.00547, over 657665.00 frames. 2024-03-16 00:35:09,608 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 00:35:19,007 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=38870.0, ans=0.125 2024-03-16 00:35:33,776 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module1.balancer1.prob, batch_count=38903.333333333336, ans=0.125 2024-03-16 00:36:11,032 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.ff2_skip_rate, batch_count=39003.333333333336, ans=0.0023905797101449276 2024-03-16 00:36:18,596 INFO [train_char.py:689] (1/2) Epoch 24, batch 50, loss[loss=0.07179, simple_loss=0.1229, pruned_loss=0.01035, over 24175.00 frames. ], tot_loss[loss=0.07122, simple_loss=0.1238, pruned_loss=0.009315, over 1085072.50 frames. ], batch size: 344, lr: 1.28e-02, grad_scale: 32.0 2024-03-16 00:36:38,748 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.636e+01 9.591e+01 1.123e+02 1.366e+02 2.605e+02, threshold=2.246e+02, percent-clipped=2.0 2024-03-16 00:37:18,475 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=39136.666666666664, ans=0.0 2024-03-16 00:37:23,105 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=192, metric=11.56 vs. limit=15.0 2024-03-16 00:37:35,073 INFO [train_char.py:689] (1/2) Epoch 24, batch 100, loss[loss=0.06627, simple_loss=0.1175, pruned_loss=0.007505, over 24274.00 frames. ], tot_loss[loss=0.07178, simple_loss=0.1251, pruned_loss=0.00923, over 1907982.62 frames. ], batch size: 116, lr: 1.28e-02, grad_scale: 32.0 2024-03-16 00:38:10,917 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.conv_module2.whiten, num_groups=1, num_channels=512, metric=4.42 vs. 
limit=15.0 2024-03-16 00:38:18,026 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=39303.333333333336, ans=0.125 2024-03-16 00:38:21,780 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward1.out_proj.dropout_p, batch_count=39303.333333333336, ans=0.1 2024-03-16 00:38:24,467 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward1.hidden_balancer.prob, batch_count=39303.333333333336, ans=0.125 2024-03-16 00:38:40,239 INFO [train_char.py:689] (1/2) Epoch 24, batch 150, loss[loss=0.0873, simple_loss=0.1538, pruned_loss=0.0104, over 24096.00 frames. ], tot_loss[loss=0.07147, simple_loss=0.1244, pruned_loss=0.009287, over 2551759.29 frames. ], batch size: 236, lr: 1.28e-02, grad_scale: 32.0 2024-03-16 00:38:40,438 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.bypass.skip_rate, batch_count=39370.0, ans=0.035 2024-03-16 00:38:57,520 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=39403.333333333336, ans=0.0 2024-03-16 00:38:59,617 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.918e+01 9.440e+01 1.166e+02 1.727e+02 3.008e+02, threshold=2.332e+02, percent-clipped=10.0 2024-03-16 00:39:00,416 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=13.62 vs. limit=15.0 2024-03-16 00:39:19,783 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=384, metric=8.06 vs. limit=15.0 2024-03-16 00:39:44,691 INFO [train_char.py:689] (1/2) Epoch 24, batch 200, loss[loss=0.0634, simple_loss=0.1129, pruned_loss=0.006926, over 24253.00 frames. ], tot_loss[loss=0.07178, simple_loss=0.1252, pruned_loss=0.009155, over 3057878.72 frames. ], batch size: 116, lr: 1.28e-02, grad_scale: 32.0 2024-03-16 00:40:28,069 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=256, metric=14.73 vs. limit=15.0 2024-03-16 00:40:38,498 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=15.18 vs. limit=15.0 2024-03-16 00:40:40,601 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass.scale_min, batch_count=39636.666666666664, ans=0.2 2024-03-16 00:40:56,958 INFO [train_char.py:689] (1/2) Epoch 24, batch 250, loss[loss=0.05718, simple_loss=0.1004, pruned_loss=0.006974, over 23826.00 frames. ], tot_loss[loss=0.07205, simple_loss=0.1255, pruned_loss=0.009312, over 3445751.24 frames. 
], batch size: 439, lr: 1.27e-02, grad_scale: 32.0 2024-03-16 00:41:15,996 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.764e+01 9.455e+01 1.157e+02 1.660e+02 2.705e+02, threshold=2.315e+02, percent-clipped=5.0 2024-03-16 00:41:21,667 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module2.balancer2.prob, batch_count=39770.0, ans=0.125 2024-03-16 00:41:49,617 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_module2.balancer1.prob, batch_count=39836.666666666664, ans=0.125 2024-03-16 00:42:00,749 INFO [train_char.py:689] (1/2) Epoch 24, batch 300, loss[loss=0.06945, simple_loss=0.1163, pruned_loss=0.01132, over 24337.00 frames. ], tot_loss[loss=0.07315, simple_loss=0.1273, pruned_loss=0.009508, over 3753298.56 frames. ], batch size: 152, lr: 1.27e-02, grad_scale: 32.0 2024-03-16 00:42:51,885 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass.skip_rate, batch_count=39970.0, ans=0.04949747468305833 2024-03-16 00:43:06,040 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.balancer2.prob, batch_count=40003.333333333336, ans=0.125 2024-03-16 00:43:11,860 INFO [train_char.py:689] (1/2) Epoch 24, batch 350, loss[loss=0.05932, simple_loss=0.1091, pruned_loss=0.004769, over 24298.00 frames. ], tot_loss[loss=0.07342, simple_loss=0.1277, pruned_loss=0.009578, over 3994158.74 frames. ], batch size: 140, lr: 1.27e-02, grad_scale: 32.0 2024-03-16 00:43:27,763 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=5.29 vs. limit=6.0 2024-03-16 00:43:30,646 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.130e+01 9.112e+01 1.091e+02 1.543e+02 3.899e+02, threshold=2.182e+02, percent-clipped=9.0 2024-03-16 00:43:42,221 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.balancer1.prob, batch_count=40103.333333333336, ans=0.125 2024-03-16 00:43:53,526 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_skip_rate, batch_count=40136.666666666664, ans=0.0 2024-03-16 00:43:59,889 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_module2.balancer1.prob, batch_count=40136.666666666664, ans=0.125 2024-03-16 00:44:01,224 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=40170.0, ans=0.125 2024-03-16 00:44:11,876 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.whiten, num_groups=1, num_channels=192, metric=4.49 vs. limit=12.0 2024-03-16 00:44:14,815 INFO [train_char.py:689] (1/2) Epoch 24, batch 400, loss[loss=0.09013, simple_loss=0.1529, pruned_loss=0.01366, over 24210.00 frames. ], tot_loss[loss=0.07363, simple_loss=0.128, pruned_loss=0.009636, over 4182465.45 frames. ], batch size: 212, lr: 1.27e-02, grad_scale: 32.0 2024-03-16 00:44:15,734 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=192, metric=10.68 vs. 
limit=15.0 2024-03-16 00:44:24,645 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=40203.333333333336, ans=0.125 2024-03-16 00:44:25,972 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.self_attn2.whiten.whitening_limit, batch_count=40203.333333333336, ans=22.5 2024-03-16 00:44:58,003 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_module1.balancer1.prob, batch_count=40303.333333333336, ans=0.125 2024-03-16 00:45:19,293 INFO [train_char.py:689] (1/2) Epoch 24, batch 450, loss[loss=0.07428, simple_loss=0.1327, pruned_loss=0.007945, over 24331.00 frames. ], tot_loss[loss=0.0742, simple_loss=0.1289, pruned_loss=0.009766, over 4327549.87 frames. ], batch size: 172, lr: 1.27e-02, grad_scale: 32.0 2024-03-16 00:45:29,337 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=40370.0, ans=0.125 2024-03-16 00:45:36,308 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=10.50 vs. limit=15.0 2024-03-16 00:45:38,207 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.out_proj.dropout_p, batch_count=40403.333333333336, ans=0.1 2024-03-16 00:45:40,473 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.338e+01 9.125e+01 1.057e+02 1.365e+02 2.938e+02, threshold=2.113e+02, percent-clipped=6.0 2024-03-16 00:46:24,701 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.balancer2.prob, batch_count=40536.666666666664, ans=0.125 2024-03-16 00:46:25,758 INFO [train_char.py:689] (1/2) Epoch 24, batch 500, loss[loss=0.07743, simple_loss=0.1374, pruned_loss=0.008709, over 24079.00 frames. ], tot_loss[loss=0.07561, simple_loss=0.1313, pruned_loss=0.009946, over 4437566.18 frames. ], batch size: 199, lr: 1.26e-02, grad_scale: 32.0 2024-03-16 00:46:28,502 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.bypass.skip_rate, batch_count=40536.666666666664, ans=0.035 2024-03-16 00:47:26,528 INFO [train_char.py:689] (1/2) Epoch 25, batch 0, loss[loss=0.06227, simple_loss=0.1043, pruned_loss=0.01012, over 23920.00 frames. ], tot_loss[loss=0.06227, simple_loss=0.1043, pruned_loss=0.01012, over 23920.00 frames. ], batch size: 407, lr: 1.24e-02, grad_scale: 32.0 2024-03-16 00:47:26,528 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 00:47:40,438 INFO [train_char.py:721] (1/2) Epoch 25, validation: loss=0.06241, simple_loss=0.1148, pruned_loss=0.005015, over 657665.00 frames. 2024-03-16 00:47:40,439 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 00:48:20,528 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module1.balancer1.prob, batch_count=40660.0, ans=0.125 2024-03-16 00:48:25,087 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.bypass.scale_min, batch_count=40660.0, ans=0.2 2024-03-16 00:48:47,659 INFO [train_char.py:689] (1/2) Epoch 25, batch 50, loss[loss=0.07645, simple_loss=0.1358, pruned_loss=0.008526, over 24231.00 frames. ], tot_loss[loss=0.07163, simple_loss=0.1257, pruned_loss=0.0088, over 1092860.96 frames. 
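Each ScheduledFloat line above records a hyperparameter (a skip rate, dropout probability, balancer probability, and so on) whose current value ("ans") is a function of batch_count, typically interpolated piecewise-linearly between breakpoints and held constant past the last one. A toy sketch of that idea; the breakpoints below are invented for illustration and are not the schedule attached to any name in this log (icefall's ScheduledFloat in scaling.py is more general):

    def scheduled_float(batch_count, points):
        """Piecewise-linear schedule.

        points: sorted list of (batch_count, value) breakpoints; values are
        clamped to the first/last breakpoint outside their range.
        """
        if batch_count <= points[0][0]:
            return points[0][1]
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            if batch_count <= x1:
                t = (batch_count - x0) / (x1 - x0)
                return y0 + t * (y1 - y0)
        return points[-1][1]

    # e.g. a skip rate decaying from 0.2 to 0.0 over the first 20k batches:
    print(scheduled_float(36890.0, [(0.0, 0.2), (20000.0, 0.0)]))  # -> 0.0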
], batch size: 212, lr: 1.24e-02, grad_scale: 32.0 2024-03-16 00:48:58,710 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module1.balancer1.prob, batch_count=40726.666666666664, ans=0.125 2024-03-16 00:49:03,488 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.592e+01 8.746e+01 1.014e+02 1.328e+02 3.091e+02, threshold=2.027e+02, percent-clipped=5.0 2024-03-16 00:49:24,037 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=40793.333333333336, ans=0.0 2024-03-16 00:49:39,577 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward3.hidden_balancer.prob, batch_count=40826.666666666664, ans=0.125 2024-03-16 00:49:55,606 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=6.05 vs. limit=15.0 2024-03-16 00:49:58,915 INFO [train_char.py:689] (1/2) Epoch 25, batch 100, loss[loss=0.08134, simple_loss=0.1445, pruned_loss=0.009063, over 24038.00 frames. ], tot_loss[loss=0.07157, simple_loss=0.1248, pruned_loss=0.009157, over 1915621.87 frames. ], batch size: 250, lr: 1.23e-02, grad_scale: 32.0 2024-03-16 00:50:04,539 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module1.balancer1.prob, batch_count=40893.333333333336, ans=0.125 2024-03-16 00:50:34,210 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.bypass.skip_rate, batch_count=40960.0, ans=0.07 2024-03-16 00:51:03,087 INFO [train_char.py:689] (1/2) Epoch 25, batch 150, loss[loss=0.08982, simple_loss=0.1559, pruned_loss=0.01189, over 24064.00 frames. ], tot_loss[loss=0.07191, simple_loss=0.1253, pruned_loss=0.009249, over 2556020.76 frames. ], batch size: 236, lr: 1.23e-02, grad_scale: 32.0 2024-03-16 00:51:12,925 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=384, metric=12.76 vs. limit=22.5 2024-03-16 00:51:13,475 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.183e+01 8.559e+01 1.034e+02 1.250e+02 2.777e+02, threshold=2.067e+02, percent-clipped=6.0 2024-03-16 00:51:24,218 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_skip_rate, batch_count=41093.333333333336, ans=0.0 2024-03-16 00:51:25,375 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module2.balancer2.prob, batch_count=41093.333333333336, ans=0.125 2024-03-16 00:51:34,327 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=41126.666666666664, ans=0.125 2024-03-16 00:51:43,187 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module2.balancer2.prob, batch_count=41160.0, ans=0.125 2024-03-16 00:52:05,327 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.0.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-16 00:52:11,794 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.4.encoder.layers.2.self_attn_weights, loss-sum=0.000e+00 2024-03-16 00:52:12,268 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=5.70 vs. 
limit=6.0 2024-03-16 00:52:15,418 INFO [train_char.py:689] (1/2) Epoch 25, batch 200, loss[loss=0.06011, simple_loss=0.1059, pruned_loss=0.007165, over 23984.00 frames. ], tot_loss[loss=0.07132, simple_loss=0.1244, pruned_loss=0.009143, over 3056390.78 frames. ], batch size: 407, lr: 1.23e-02, grad_scale: 32.0 2024-03-16 00:52:43,371 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_module1.balancer1.min_positive, batch_count=41293.333333333336, ans=0.025 2024-03-16 00:52:45,965 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_skip_rate, batch_count=41293.333333333336, ans=0.0 2024-03-16 00:52:56,415 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass.skip_rate, batch_count=41326.666666666664, ans=0.04949747468305833 2024-03-16 00:53:19,649 INFO [train_char.py:689] (1/2) Epoch 25, batch 250, loss[loss=0.08539, simple_loss=0.1464, pruned_loss=0.01221, over 24205.00 frames. ], tot_loss[loss=0.07184, simple_loss=0.1255, pruned_loss=0.009073, over 3445014.60 frames. ], batch size: 212, lr: 1.23e-02, grad_scale: 32.0 2024-03-16 00:53:22,444 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module1.balancer1.prob, batch_count=41393.333333333336, ans=0.125 2024-03-16 00:53:29,929 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.523e+01 9.981e+01 1.281e+02 1.798e+02 3.331e+02, threshold=2.561e+02, percent-clipped=15.0 2024-03-16 00:53:44,260 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.balancer2.prob, batch_count=41460.0, ans=0.125 2024-03-16 00:53:57,523 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=12.01 vs. limit=15.0 2024-03-16 00:54:14,776 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_proj.dropout_p, batch_count=41526.666666666664, ans=0.1 2024-03-16 00:54:27,369 INFO [train_char.py:689] (1/2) Epoch 25, batch 300, loss[loss=0.08249, simple_loss=0.1464, pruned_loss=0.009281, over 24232.00 frames. ], tot_loss[loss=0.07199, simple_loss=0.1257, pruned_loss=0.009153, over 3746080.43 frames. ], batch size: 212, lr: 1.23e-02, grad_scale: 32.0 2024-03-16 00:54:54,752 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.ff2_skip_rate, batch_count=41626.666666666664, ans=0.0018202898550724635 2024-03-16 00:55:16,851 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.conv_module1.whiten, num_groups=1, num_channels=512, metric=3.71 vs. limit=15.0 2024-03-16 00:55:27,162 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=7.33 vs. limit=10.0 2024-03-16 00:55:29,217 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.attention_skip_rate, batch_count=41693.333333333336, ans=0.0 2024-03-16 00:55:33,941 INFO [train_char.py:689] (1/2) Epoch 25, batch 350, loss[loss=0.0602, simple_loss=0.1124, pruned_loss=0.004007, over 24279.00 frames. ], tot_loss[loss=0.07177, simple_loss=0.1252, pruned_loss=0.009161, over 3987958.17 frames. 
], batch size: 140, lr: 1.23e-02, grad_scale: 32.0 2024-03-16 00:55:43,858 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.725e+01 9.360e+01 1.127e+02 1.468e+02 2.530e+02, threshold=2.255e+02, percent-clipped=0.0 2024-03-16 00:55:59,467 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.nonlin_attention.balancer.prob, batch_count=41793.333333333336, ans=0.125 2024-03-16 00:56:13,435 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.bypass.skip_rate, batch_count=41826.666666666664, ans=0.09899494936611666 2024-03-16 00:56:40,667 INFO [train_char.py:689] (1/2) Epoch 25, batch 400, loss[loss=0.08224, simple_loss=0.1471, pruned_loss=0.008666, over 24082.00 frames. ], tot_loss[loss=0.07262, simple_loss=0.1267, pruned_loss=0.009263, over 4175978.60 frames. ], batch size: 199, lr: 1.22e-02, grad_scale: 32.0 2024-03-16 00:56:50,913 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.out_combiner.scale_min, batch_count=41893.333333333336, ans=0.2 2024-03-16 00:57:07,183 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module2.balancer2.min_abs, batch_count=41960.0, ans=0.5 2024-03-16 00:57:12,720 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.whiten, num_groups=1, num_channels=512, metric=7.79 vs. limit=12.0 2024-03-16 00:57:23,004 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=41993.333333333336, ans=0.1 2024-03-16 00:57:33,320 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=42026.666666666664, ans=0.125 2024-03-16 00:57:46,914 INFO [train_char.py:689] (1/2) Epoch 25, batch 450, loss[loss=0.06819, simple_loss=0.1211, pruned_loss=0.007657, over 24222.00 frames. ], tot_loss[loss=0.07342, simple_loss=0.1281, pruned_loss=0.00936, over 4323966.55 frames. ], batch size: 296, lr: 1.22e-02, grad_scale: 32.0 2024-03-16 00:57:52,166 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.ff3_skip_rate, batch_count=42060.0, ans=0.0017260869565217385 2024-03-16 00:57:56,976 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.340e+01 9.649e+01 1.189e+02 1.567e+02 2.547e+02, threshold=2.378e+02, percent-clipped=5.0 2024-03-16 00:58:01,169 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.balancer2.prob, batch_count=42093.333333333336, ans=0.125 2024-03-16 00:58:34,373 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=42160.0, ans=0.1 2024-03-16 00:58:50,826 INFO [train_char.py:689] (1/2) Epoch 25, batch 500, loss[loss=0.0756, simple_loss=0.1349, pruned_loss=0.008175, over 24085.00 frames. ], tot_loss[loss=0.07407, simple_loss=0.1294, pruned_loss=0.009358, over 4435812.52 frames. 
], batch size: 199, lr: 1.22e-02, grad_scale: 32.0 2024-03-16 00:58:51,109 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_skip_rate, batch_count=42226.666666666664, ans=0.0 2024-03-16 00:58:55,034 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=42226.666666666664, ans=0.125 2024-03-16 00:59:51,613 INFO [train_char.py:689] (1/2) Epoch 26, batch 0, loss[loss=0.05096, simple_loss=0.07995, pruned_loss=0.01099, over 22500.00 frames. ], tot_loss[loss=0.05096, simple_loss=0.07995, pruned_loss=0.01099, over 22500.00 frames. ], batch size: 483, lr: 1.20e-02, grad_scale: 32.0 2024-03-16 00:59:51,614 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 01:00:05,299 INFO [train_char.py:721] (1/2) Epoch 26, validation: loss=0.06222, simple_loss=0.1142, pruned_loss=0.005137, over 657665.00 frames. 2024-03-16 01:00:05,300 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 01:00:09,733 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_skip_rate, batch_count=42250.0, ans=0.0 2024-03-16 01:00:23,056 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.out_proj.dropout_p, batch_count=42283.333333333336, ans=0.1 2024-03-16 01:00:44,002 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.balancer2.prob, batch_count=42316.666666666664, ans=0.125 2024-03-16 01:01:22,067 INFO [train_char.py:689] (1/2) Epoch 26, batch 50, loss[loss=0.07395, simple_loss=0.1302, pruned_loss=0.008875, over 24392.00 frames. ], tot_loss[loss=0.07027, simple_loss=0.1226, pruned_loss=0.00897, over 1080303.66 frames. ], batch size: 165, lr: 1.19e-02, grad_scale: 32.0 2024-03-16 01:01:23,443 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.542e+01 9.054e+01 1.032e+02 1.316e+02 2.856e+02, threshold=2.065e+02, percent-clipped=1.0 2024-03-16 01:01:31,865 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=42416.666666666664, ans=0.125 2024-03-16 01:01:34,718 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.bypass.skip_rate, batch_count=42450.0, ans=0.035 2024-03-16 01:02:03,350 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.attention_skip_rate, batch_count=42516.666666666664, ans=0.0 2024-03-16 01:02:08,848 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=5.97 vs. limit=10.0 2024-03-16 01:02:18,811 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.balancer2.prob, batch_count=42550.0, ans=0.125 2024-03-16 01:02:23,960 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=42550.0, ans=0.1 2024-03-16 01:02:26,625 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=42583.333333333336, ans=0.1 2024-03-16 01:02:27,629 INFO [train_char.py:689] (1/2) Epoch 26, batch 100, loss[loss=0.07202, simple_loss=0.1254, pruned_loss=0.009308, over 24455.00 frames. ], tot_loss[loss=0.07091, simple_loss=0.1238, pruned_loss=0.009002, over 1910069.75 frames. 
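The Whitening lines above compare a per-module metric against a limit. The metric is 1.0 when the centered covariance of the module's output features (within each group) is a multiple of the identity, and grows as the eigenvalues spread out; when it exceeds the limit, a penalty gradient pushes the activations back toward "white". A sketch of one such metric, written from that description rather than copied from scaling.py, so details may differ from icefall's implementation:

    import torch

    def whitening_metric(x: torch.Tensor, num_groups: int) -> torch.Tensor:
        """x: (num_frames, num_channels). Returns a scalar >= 1 that equals 1
        exactly when each group's centered covariance is a multiple of the
        identity. A from-the-description sketch, not verbatim icefall code."""
        num_frames, num_channels = x.shape
        cpg = num_channels // num_groups  # channels per group
        x = x.reshape(num_frames, num_groups, cpg).transpose(0, 1)
        x = x - x.mean(dim=1, keepdim=True)       # center within each group
        covar = torch.matmul(x.transpose(1, 2), x)  # (num_groups, cpg, cpg)
        mean_diag = covar.diagonal(dim1=1, dim2=2).mean()
        mean_sq = (covar ** 2).sum() / (num_groups * cpg)
        return mean_sq / (mean_diag ** 2 + 1e-20)

    # White noise scores near 1; correlated features score higher:
    print(whitening_metric(torch.randn(1000, 192), num_groups=1))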
], batch size: 165, lr: 1.19e-02, grad_scale: 32.0 2024-03-16 01:02:57,811 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.balancer2.prob, batch_count=42650.0, ans=0.125 2024-03-16 01:03:25,057 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.ff2_skip_rate, batch_count=42716.666666666664, ans=0.0015833333333333342 2024-03-16 01:03:36,995 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=256, metric=2.25 vs. limit=15.0 2024-03-16 01:03:37,635 INFO [train_char.py:689] (1/2) Epoch 26, batch 150, loss[loss=0.07775, simple_loss=0.1369, pruned_loss=0.009274, over 24119.00 frames. ], tot_loss[loss=0.07158, simple_loss=0.1255, pruned_loss=0.008828, over 2558498.16 frames. ], batch size: 279, lr: 1.19e-02, grad_scale: 32.0 2024-03-16 01:03:38,869 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.214e+01 8.616e+01 1.049e+02 1.357e+02 3.119e+02, threshold=2.097e+02, percent-clipped=5.0 2024-03-16 01:03:48,329 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.balancer2.prob, batch_count=42750.0, ans=0.125 2024-03-16 01:03:57,543 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.conv_module1.whiten, num_groups=1, num_channels=384, metric=2.78 vs. limit=15.0 2024-03-16 01:04:12,815 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.ff2_skip_rate, batch_count=42816.666666666664, ans=0.0015615942028985516 2024-03-16 01:04:20,625 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.bypass_mid.scale_min, batch_count=42850.0, ans=0.2 2024-03-16 01:04:31,073 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_module2.balancer1.prob, batch_count=42850.0, ans=0.125 2024-03-16 01:04:37,356 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.ff3_skip_rate, batch_count=42883.333333333336, ans=0.001547101449275362 2024-03-16 01:04:42,427 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=42883.333333333336, ans=0.125 2024-03-16 01:04:44,263 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=5.48 vs. limit=6.0 2024-03-16 01:04:47,238 INFO [train_char.py:689] (1/2) Epoch 26, batch 200, loss[loss=0.06914, simple_loss=0.1226, pruned_loss=0.00783, over 23961.00 frames. ], tot_loss[loss=0.0711, simple_loss=0.1244, pruned_loss=0.008891, over 3054089.56 frames. ], batch size: 107, lr: 1.19e-02, grad_scale: 64.0 2024-03-16 01:05:25,954 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=43016.666666666664, ans=0.125 2024-03-16 01:05:27,527 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.nonlin_attention.whiten1, num_groups=1, num_channels=384, metric=6.25 vs. limit=10.0 2024-03-16 01:05:53,153 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.nonlin_attention.balancer.prob, batch_count=43083.333333333336, ans=0.125 2024-03-16 01:05:54,201 INFO [train_char.py:689] (1/2) Epoch 26, batch 250, loss[loss=0.0692, simple_loss=0.1257, pruned_loss=0.006355, over 24296.00 frames. 
], tot_loss[loss=0.07159, simple_loss=0.1253, pruned_loss=0.008957, over 3441752.31 frames. ], batch size: 180, lr: 1.19e-02, grad_scale: 64.0 2024-03-16 01:05:55,325 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.181e+01 9.839e+01 1.170e+02 1.591e+02 2.994e+02, threshold=2.341e+02, percent-clipped=13.0 2024-03-16 01:06:24,868 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.balancer2.prob, batch_count=43150.0, ans=0.125 2024-03-16 01:06:35,606 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_module2.balancer1.max_abs, batch_count=43183.333333333336, ans=10.0 2024-03-16 01:06:36,718 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.bypass_mid.scale_min, batch_count=43183.333333333336, ans=0.2 2024-03-16 01:06:36,790 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.attention_skip_rate, batch_count=43183.333333333336, ans=0.0 2024-03-16 01:06:44,553 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.balancer2.prob, batch_count=43183.333333333336, ans=0.125 2024-03-16 01:06:53,582 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.5.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-16 01:07:01,061 INFO [train_char.py:689] (1/2) Epoch 26, batch 300, loss[loss=0.08155, simple_loss=0.1442, pruned_loss=0.00943, over 24275.00 frames. ], tot_loss[loss=0.07173, simple_loss=0.1256, pruned_loss=0.008928, over 3745364.92 frames. ], batch size: 267, lr: 1.19e-02, grad_scale: 64.0 2024-03-16 01:07:03,055 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=14.41 vs. limit=15.0 2024-03-16 01:07:48,030 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_module2.balancer2.prob, batch_count=43350.0, ans=0.125 2024-03-16 01:07:51,780 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=43383.333333333336, ans=0.1 2024-03-16 01:07:54,405 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.bypass.skip_rate, batch_count=43383.333333333336, ans=0.07 2024-03-16 01:08:01,018 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_skip_rate, batch_count=43383.333333333336, ans=0.0 2024-03-16 01:08:08,331 INFO [train_char.py:689] (1/2) Epoch 26, batch 350, loss[loss=0.08933, simple_loss=0.152, pruned_loss=0.01335, over 24132.00 frames. ], tot_loss[loss=0.07137, simple_loss=0.1253, pruned_loss=0.008732, over 3984244.19 frames. ], batch size: 223, lr: 1.18e-02, grad_scale: 64.0 2024-03-16 01:08:09,593 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.708e+01 9.030e+01 1.152e+02 1.439e+02 3.027e+02, threshold=2.303e+02, percent-clipped=2.0 2024-03-16 01:08:13,524 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_module1.balancer1.prob, batch_count=43416.666666666664, ans=0.125 2024-03-16 01:08:28,788 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.balancer1.prob, batch_count=43450.0, ans=0.125 2024-03-16 01:08:30,680 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.92 vs. 
limit=6.0 2024-03-16 01:08:34,809 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.self_attn_weights.pos_emb_skip_rate, batch_count=43483.333333333336, ans=0.0 2024-03-16 01:08:40,607 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module1.balancer2.min_positive, batch_count=43483.333333333336, ans=0.05 2024-03-16 01:08:59,127 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.conv_module2.whiten, num_groups=1, num_channels=192, metric=2.84 vs. limit=15.0 2024-03-16 01:09:10,823 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.balancer1.prob, batch_count=43550.0, ans=0.125 2024-03-16 01:09:13,259 INFO [train_char.py:689] (1/2) Epoch 26, batch 400, loss[loss=0.07599, simple_loss=0.1366, pruned_loss=0.007675, over 24224.00 frames. ], tot_loss[loss=0.07185, simple_loss=0.126, pruned_loss=0.008844, over 4168455.40 frames. ], batch size: 188, lr: 1.18e-02, grad_scale: 64.0 2024-03-16 01:09:14,742 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_skip_rate, batch_count=43583.333333333336, ans=0.0 2024-03-16 01:09:18,776 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.self_attn2.whiten, num_groups=1, num_channels=384, metric=15.49 vs. limit=22.5 2024-03-16 01:09:48,242 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=9.88 vs. limit=15.0 2024-03-16 01:09:52,001 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.whiten, num_groups=1, num_channels=256, metric=4.48 vs. limit=12.0 2024-03-16 01:09:58,595 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=43683.333333333336, ans=0.1 2024-03-16 01:10:06,120 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_skip_rate, batch_count=43716.666666666664, ans=0.0 2024-03-16 01:10:09,433 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.self_attn1.whiten, num_groups=1, num_channels=192, metric=15.15 vs. limit=22.5 2024-03-16 01:10:13,621 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=43716.666666666664, ans=0.125 2024-03-16 01:10:18,223 INFO [train_char.py:689] (1/2) Epoch 26, batch 450, loss[loss=0.08058, simple_loss=0.14, pruned_loss=0.0106, over 24124.00 frames. ], tot_loss[loss=0.0725, simple_loss=0.127, pruned_loss=0.008986, over 4317777.47 frames. ], batch size: 199, lr: 1.18e-02, grad_scale: 64.0 2024-03-16 01:10:19,472 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.036e+01 9.228e+01 1.049e+02 1.301e+02 2.571e+02, threshold=2.097e+02, percent-clipped=3.0 2024-03-16 01:10:31,005 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=43783.333333333336, ans=0.1 2024-03-16 01:11:06,642 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=43850.0, ans=0.0 2024-03-16 01:11:21,377 INFO [train_char.py:689] (1/2) Epoch 26, batch 500, loss[loss=0.08316, simple_loss=0.1467, pruned_loss=0.009797, over 24077.00 frames. ], tot_loss[loss=0.07339, simple_loss=0.1288, pruned_loss=0.008986, over 4432357.77 frames. 
], batch size: 236, lr: 1.18e-02, grad_scale: 64.0 2024-03-16 01:12:22,592 INFO [train_char.py:689] (1/2) Epoch 27, batch 0, loss[loss=0.05178, simple_loss=0.08293, pruned_loss=0.01032, over 22490.00 frames. ], tot_loss[loss=0.05178, simple_loss=0.08293, pruned_loss=0.01032, over 22490.00 frames. ], batch size: 483, lr: 1.16e-02, grad_scale: 64.0 2024-03-16 01:12:22,593 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 01:12:36,245 INFO [train_char.py:721] (1/2) Epoch 27, validation: loss=0.06174, simple_loss=0.1133, pruned_loss=0.005092, over 657665.00 frames. 2024-03-16 01:12:36,246 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 01:13:16,023 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=44006.666666666664, ans=0.125 2024-03-16 01:13:21,398 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.balancer2.prob, batch_count=44040.0, ans=0.125 2024-03-16 01:13:21,476 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=44040.0, ans=0.0 2024-03-16 01:13:26,855 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.bypass_mid.scale_min, batch_count=44040.0, ans=0.2 2024-03-16 01:13:38,434 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.805e+01 9.058e+01 1.114e+02 1.474e+02 2.719e+02, threshold=2.229e+02, percent-clipped=8.0 2024-03-16 01:13:42,206 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=384, metric=26.87 vs. limit=22.5 2024-03-16 01:13:46,529 INFO [train_char.py:689] (1/2) Epoch 27, batch 50, loss[loss=0.07883, simple_loss=0.1413, pruned_loss=0.008165, over 24187.00 frames. ], tot_loss[loss=0.06711, simple_loss=0.1181, pruned_loss=0.008042, over 1085475.23 frames. ], batch size: 212, lr: 1.15e-02, grad_scale: 64.0 2024-03-16 01:14:02,570 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.bypass_mid.scale_min, batch_count=44106.666666666664, ans=0.2 2024-03-16 01:14:27,416 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_skip_rate, batch_count=44173.333333333336, ans=0.0 2024-03-16 01:14:33,895 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.4.encoder.layers.1.self_attn_weights, loss-sum=0.000e+00 2024-03-16 01:14:36,505 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_skip_rate, batch_count=44206.666666666664, ans=0.0 2024-03-16 01:14:58,427 INFO [train_char.py:689] (1/2) Epoch 27, batch 100, loss[loss=0.05167, simple_loss=0.0856, pruned_loss=0.008869, over 22850.00 frames. ], tot_loss[loss=0.06691, simple_loss=0.1186, pruned_loss=0.007619, over 1912898.65 frames. 
], batch size: 483, lr: 1.15e-02, grad_scale: 64.0 2024-03-16 01:15:06,656 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module2.balancer1.prob, batch_count=44273.333333333336, ans=0.125 2024-03-16 01:15:38,364 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.conv.2.prob, batch_count=44340.0, ans=0.125 2024-03-16 01:15:58,696 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.926e+01 8.384e+01 1.010e+02 1.337e+02 2.347e+02, threshold=2.019e+02, percent-clipped=1.0 2024-03-16 01:16:06,523 INFO [train_char.py:689] (1/2) Epoch 27, batch 150, loss[loss=0.07822, simple_loss=0.1385, pruned_loss=0.008953, over 24168.00 frames. ], tot_loss[loss=0.06859, simple_loss=0.1212, pruned_loss=0.008002, over 2552156.58 frames. ], batch size: 251, lr: 1.15e-02, grad_scale: 64.0 2024-03-16 01:16:15,413 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=44440.0, ans=0.1 2024-03-16 01:16:32,030 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.bypass.scale_min, batch_count=44506.666666666664, ans=0.2 2024-03-16 01:16:33,318 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_skip_rate, batch_count=44506.666666666664, ans=0.0 2024-03-16 01:16:38,283 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.balancer2.prob, batch_count=44506.666666666664, ans=0.125 2024-03-16 01:16:57,546 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=44573.333333333336, ans=0.125 2024-03-16 01:17:10,147 INFO [train_char.py:689] (1/2) Epoch 27, batch 200, loss[loss=0.06775, simple_loss=0.1209, pruned_loss=0.007314, over 24304.00 frames. ], tot_loss[loss=0.06995, simple_loss=0.1235, pruned_loss=0.008204, over 3045580.25 frames. ], batch size: 116, lr: 1.15e-02, grad_scale: 32.0 2024-03-16 01:17:18,877 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.conv_module1.whiten, num_groups=1, num_channels=384, metric=4.06 vs. limit=15.0 2024-03-16 01:17:32,412 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.bypass.skip_rate, batch_count=44640.0, ans=0.04949747468305833 2024-03-16 01:17:34,976 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.bypass_mid.scale_min, batch_count=44640.0, ans=0.2 2024-03-16 01:17:45,630 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=44673.333333333336, ans=0.125 2024-03-16 01:18:05,397 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.whiten, num_groups=1, num_channels=192, metric=4.46 vs. limit=12.0 2024-03-16 01:18:11,978 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.091e+01 8.642e+01 1.018e+02 1.321e+02 3.556e+02, threshold=2.035e+02, percent-clipped=6.0 2024-03-16 01:18:18,394 INFO [train_char.py:689] (1/2) Epoch 27, batch 250, loss[loss=0.06295, simple_loss=0.1076, pruned_loss=0.009144, over 23932.00 frames. ], tot_loss[loss=0.07019, simple_loss=0.1238, pruned_loss=0.008278, over 3439589.14 frames. 
], batch size: 407, lr: 1.15e-02, grad_scale: 32.0 2024-03-16 01:18:37,690 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward2.hidden_balancer.prob, batch_count=44806.666666666664, ans=0.125 2024-03-16 01:18:41,504 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.ff2_skip_rate, batch_count=44806.666666666664, ans=0.0011289855072463776 2024-03-16 01:18:52,577 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=5.57 vs. limit=6.0 2024-03-16 01:19:03,078 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=44873.333333333336, ans=0.1 2024-03-16 01:19:10,573 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.3.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-16 01:19:11,720 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=44906.666666666664, ans=0.125 2024-03-16 01:19:12,953 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.ff2_skip_rate, batch_count=44906.666666666664, ans=0.001107246376811595 2024-03-16 01:19:15,531 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module1.balancer1.prob, batch_count=44906.666666666664, ans=0.125 2024-03-16 01:19:25,549 INFO [train_char.py:689] (1/2) Epoch 27, batch 300, loss[loss=0.06269, simple_loss=0.1167, pruned_loss=0.00436, over 23931.00 frames. ], tot_loss[loss=0.06988, simple_loss=0.1233, pruned_loss=0.008225, over 3741150.63 frames. ], batch size: 107, lr: 1.15e-02, grad_scale: 32.0 2024-03-16 01:19:32,513 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.conv_module1.whiten, num_groups=1, num_channels=384, metric=2.91 vs. limit=15.0 2024-03-16 01:19:52,593 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.balancer2.prob, batch_count=45006.666666666664, ans=0.125 2024-03-16 01:19:59,898 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=45006.666666666664, ans=0.1 2024-03-16 01:20:04,856 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module1.balancer1.min_positive, batch_count=45040.0, ans=0.025 2024-03-16 01:20:19,087 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_module1.balancer2.min_positive, batch_count=45073.333333333336, ans=0.05 2024-03-16 01:20:25,183 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.367e+01 8.387e+01 1.017e+02 1.390e+02 3.075e+02, threshold=2.034e+02, percent-clipped=7.0 2024-03-16 01:20:30,678 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.bypass.skip_rate, batch_count=45106.666666666664, ans=0.09899494936611666 2024-03-16 01:20:31,657 INFO [train_char.py:689] (1/2) Epoch 27, batch 350, loss[loss=0.06536, simple_loss=0.1158, pruned_loss=0.007464, over 24146.00 frames. ], tot_loss[loss=0.07064, simple_loss=0.1246, pruned_loss=0.008343, over 3985579.41 frames. 
], batch size: 362, lr: 1.14e-02, grad_scale: 32.0 2024-03-16 01:20:49,201 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=45140.0, ans=0.125 2024-03-16 01:20:53,574 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=10.71 vs. limit=15.0 2024-03-16 01:20:54,478 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.out_combiner.scale_min, batch_count=45140.0, ans=0.2 2024-03-16 01:20:59,534 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward1.hidden_balancer.prob, batch_count=45173.333333333336, ans=0.125 2024-03-16 01:21:00,760 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module1.balancer2.prob, batch_count=45173.333333333336, ans=0.125 2024-03-16 01:21:06,853 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module1.balancer1.min_positive, batch_count=45173.333333333336, ans=0.025 2024-03-16 01:21:22,593 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=45206.666666666664, ans=0.1 2024-03-16 01:21:24,425 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.whiten, num_groups=1, num_channels=256, metric=4.94 vs. limit=12.0 2024-03-16 01:21:34,559 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.conv.5.prob, batch_count=45240.0, ans=0.125 2024-03-16 01:21:38,234 INFO [train_char.py:689] (1/2) Epoch 27, batch 400, loss[loss=0.06079, simple_loss=0.1113, pruned_loss=0.005164, over 24301.00 frames. ], tot_loss[loss=0.07069, simple_loss=0.1246, pruned_loss=0.008403, over 4174686.43 frames. ], batch size: 146, lr: 1.14e-02, grad_scale: 32.0 2024-03-16 01:21:48,039 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.bypass.scale_min, batch_count=45273.333333333336, ans=0.2 2024-03-16 01:21:49,692 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.conv_module2.whiten, num_groups=1, num_channels=512, metric=6.79 vs. limit=15.0 2024-03-16 01:21:52,981 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.3.encoder.layers.2.self_attn_weights, loss-sum=0.000e+00 2024-03-16 01:22:12,131 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.attention_skip_rate, batch_count=45340.0, ans=0.0 2024-03-16 01:22:16,580 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=512, metric=4.33 vs. limit=15.0 2024-03-16 01:22:27,856 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=192, metric=11.61 vs. limit=15.0 2024-03-16 01:22:36,485 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.967e+01 8.648e+01 1.013e+02 1.374e+02 2.811e+02, threshold=2.025e+02, percent-clipped=5.0 2024-03-16 01:22:39,068 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module2.balancer2.min_abs, batch_count=45406.666666666664, ans=0.5 2024-03-16 01:22:42,388 INFO [train_char.py:689] (1/2) Epoch 27, batch 450, loss[loss=0.07113, simple_loss=0.1263, pruned_loss=0.008001, over 24276.00 frames. ], tot_loss[loss=0.07145, simple_loss=0.126, pruned_loss=0.008435, over 4321520.78 frames. 
], batch size: 296, lr: 1.14e-02, grad_scale: 32.0 2024-03-16 01:22:46,121 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.nonlin_attention.balancer.prob, batch_count=45440.0, ans=0.125 2024-03-16 01:23:06,507 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.bypass.skip_rate, batch_count=45506.666666666664, ans=0.035 2024-03-16 01:23:24,089 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=45540.0, ans=0.125 2024-03-16 01:23:27,788 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward1.hidden_balancer.prob, batch_count=45540.0, ans=0.125 2024-03-16 01:23:45,793 INFO [train_char.py:689] (1/2) Epoch 27, batch 500, loss[loss=0.06747, simple_loss=0.1222, pruned_loss=0.006392, over 24332.00 frames. ], tot_loss[loss=0.07234, simple_loss=0.1278, pruned_loss=0.008433, over 4436773.42 frames. ], batch size: 172, lr: 1.14e-02, grad_scale: 32.0 2024-03-16 01:23:49,654 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.conv.2.prob, batch_count=45606.666666666664, ans=0.125 2024-03-16 01:24:49,613 INFO [train_char.py:689] (1/2) Epoch 28, batch 0, loss[loss=0.07291, simple_loss=0.126, pruned_loss=0.009925, over 24343.00 frames. ], tot_loss[loss=0.07291, simple_loss=0.126, pruned_loss=0.009925, over 24343.00 frames. ], batch size: 180, lr: 1.12e-02, grad_scale: 32.0 2024-03-16 01:24:49,613 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 01:25:03,338 INFO [train_char.py:721] (1/2) Epoch 28, validation: loss=0.06111, simple_loss=0.1123, pruned_loss=0.00498, over 657665.00 frames. 2024-03-16 01:25:03,339 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 01:25:28,965 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_module1.balancer1.prob, batch_count=45663.333333333336, ans=0.125 2024-03-16 01:25:37,540 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=45696.666666666664, ans=0.125 2024-03-16 01:25:47,118 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.attention_skip_rate, batch_count=45730.0, ans=0.0 2024-03-16 01:25:56,696 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn2.whiten, num_groups=1, num_channels=512, metric=15.87 vs. limit=22.5 2024-03-16 01:25:59,951 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.629e+01 8.506e+01 9.862e+01 1.220e+02 2.471e+02, threshold=1.972e+02, percent-clipped=3.0 2024-03-16 01:26:16,555 INFO [train_char.py:689] (1/2) Epoch 28, batch 50, loss[loss=0.06031, simple_loss=0.1117, pruned_loss=0.004475, over 24351.00 frames. ], tot_loss[loss=0.06817, simple_loss=0.1204, pruned_loss=0.00799, over 1088222.36 frames. ], batch size: 116, lr: 1.12e-02, grad_scale: 32.0 2024-03-16 01:26:23,838 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.attention_skip_rate, batch_count=45796.666666666664, ans=0.0 2024-03-16 01:26:51,942 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.nonlin_attention.whiten2, num_groups=1, num_channels=512, metric=4.67 vs. 
limit=15.0 2024-03-16 01:27:23,714 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.balancer1.prob, batch_count=45930.0, ans=0.125 2024-03-16 01:27:24,969 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_skip_rate, batch_count=45930.0, ans=0.0 2024-03-16 01:27:27,340 INFO [train_char.py:689] (1/2) Epoch 28, batch 100, loss[loss=0.05576, simple_loss=0.08973, pruned_loss=0.0109, over 22861.00 frames. ], tot_loss[loss=0.06995, simple_loss=0.1238, pruned_loss=0.008062, over 1914330.77 frames. ], batch size: 483, lr: 1.12e-02, grad_scale: 32.0 2024-03-16 01:27:35,779 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=11.32 vs. limit=15.0 2024-03-16 01:27:35,967 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=192, metric=10.43 vs. limit=15.0 2024-03-16 01:27:45,445 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_module1.whiten.whitening_limit, batch_count=45996.666666666664, ans=15.0 2024-03-16 01:28:08,295 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward2.hidden_balancer.prob, batch_count=46063.333333333336, ans=0.125 2024-03-16 01:28:17,002 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.871e+01 8.880e+01 1.076e+02 1.571e+02 3.203e+02, threshold=2.152e+02, percent-clipped=12.0 2024-03-16 01:28:36,664 INFO [train_char.py:689] (1/2) Epoch 28, batch 150, loss[loss=0.06487, simple_loss=0.1069, pruned_loss=0.01139, over 23786.00 frames. ], tot_loss[loss=0.07019, simple_loss=0.1242, pruned_loss=0.008107, over 2559002.64 frames. ], batch size: 439, lr: 1.11e-02, grad_scale: 32.0 2024-03-16 01:29:02,116 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_skip_rate, batch_count=46196.666666666664, ans=0.0 2024-03-16 01:29:38,456 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module2.balancer2.prob, batch_count=46263.333333333336, ans=0.125 2024-03-16 01:29:40,794 INFO [train_char.py:689] (1/2) Epoch 28, batch 200, loss[loss=0.06111, simple_loss=0.1049, pruned_loss=0.008671, over 23893.00 frames. ], tot_loss[loss=0.07086, simple_loss=0.1252, pruned_loss=0.008245, over 3057583.37 frames. ], batch size: 407, lr: 1.11e-02, grad_scale: 32.0 2024-03-16 01:29:54,120 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_skip_rate, batch_count=46330.0, ans=0.0 2024-03-16 01:30:01,863 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.attention_skip_rate, batch_count=46330.0, ans=0.0 2024-03-16 01:30:33,635 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.302e+01 9.610e+01 1.209e+02 1.821e+02 3.486e+02, threshold=2.419e+02, percent-clipped=12.0 2024-03-16 01:30:37,638 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.ff3_skip_rate, batch_count=46430.0, ans=0.000776086956521739 2024-03-16 01:30:49,069 INFO [train_char.py:689] (1/2) Epoch 28, batch 250, loss[loss=0.0581, simple_loss=0.1063, pruned_loss=0.004961, over 24301.00 frames. ], tot_loss[loss=0.07033, simple_loss=0.1247, pruned_loss=0.007991, over 3446093.84 frames. 
], batch size: 146, lr: 1.11e-02, grad_scale: 32.0 2024-03-16 01:30:54,497 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.balancer1.prob, batch_count=46463.333333333336, ans=0.125 2024-03-16 01:31:00,809 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass.scale_min, batch_count=46496.666666666664, ans=0.2 2024-03-16 01:31:14,504 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.attention_skip_rate, batch_count=46530.0, ans=0.0 2024-03-16 01:31:26,368 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=46530.0, ans=0.1 2024-03-16 01:31:34,871 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=7.29 vs. limit=10.0 2024-03-16 01:31:38,137 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_skip_rate, batch_count=46563.333333333336, ans=0.0 2024-03-16 01:31:42,284 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.bypass_mid.scale_min, batch_count=46563.333333333336, ans=0.2 2024-03-16 01:31:51,455 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=9.33 vs. limit=15.0 2024-03-16 01:31:55,003 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module2.balancer1.prob, batch_count=46596.666666666664, ans=0.125 2024-03-16 01:31:57,416 INFO [train_char.py:689] (1/2) Epoch 28, batch 300, loss[loss=0.06653, simple_loss=0.1142, pruned_loss=0.009441, over 24131.00 frames. ], tot_loss[loss=0.07041, simple_loss=0.1248, pruned_loss=0.008033, over 3751656.90 frames. ], batch size: 362, lr: 1.11e-02, grad_scale: 32.0 2024-03-16 01:32:06,689 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=46630.0, ans=0.0 2024-03-16 01:32:14,468 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_skip_rate, batch_count=46663.333333333336, ans=0.0 2024-03-16 01:32:21,913 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=46696.666666666664, ans=0.1 2024-03-16 01:32:32,393 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=12.36 vs. limit=15.0 2024-03-16 01:32:47,804 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.647e+01 9.048e+01 1.125e+02 1.475e+02 2.737e+02, threshold=2.249e+02, percent-clipped=1.0 2024-03-16 01:33:02,887 INFO [train_char.py:689] (1/2) Epoch 28, batch 350, loss[loss=0.05616, simple_loss=0.1039, pruned_loss=0.004198, over 24242.00 frames. ], tot_loss[loss=0.07057, simple_loss=0.1251, pruned_loss=0.008015, over 3992997.42 frames. 
], batch size: 122, lr: 1.11e-02, grad_scale: 32.0 2024-03-16 01:33:09,448 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward2.hidden_balancer.prob, batch_count=46796.666666666664, ans=0.125 2024-03-16 01:33:13,173 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=46796.666666666664, ans=0.1 2024-03-16 01:33:29,074 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=46863.333333333336, ans=0.125 2024-03-16 01:33:49,353 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward2.hidden_balancer.prob, batch_count=46896.666666666664, ans=0.125 2024-03-16 01:34:03,136 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.nonlin_attention.balancer.prob, batch_count=46930.0, ans=0.125 2024-03-16 01:34:08,028 INFO [train_char.py:689] (1/2) Epoch 28, batch 400, loss[loss=0.05911, simple_loss=0.1071, pruned_loss=0.005545, over 24244.00 frames. ], tot_loss[loss=0.07093, simple_loss=0.1258, pruned_loss=0.00805, over 4180909.58 frames. ], batch size: 140, lr: 1.11e-02, grad_scale: 32.0 2024-03-16 01:34:19,412 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.bypass.skip_rate, batch_count=46996.666666666664, ans=0.035 2024-03-16 01:34:40,125 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=47030.0, ans=0.125 2024-03-16 01:34:51,209 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_skip_rate, batch_count=47063.333333333336, ans=0.0 2024-03-16 01:34:51,612 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=5.37 vs. limit=10.0 2024-03-16 01:34:58,203 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.909e+01 8.534e+01 1.061e+02 1.413e+02 2.296e+02, threshold=2.123e+02, percent-clipped=2.0 2024-03-16 01:35:05,431 INFO [scaling.py:1023] (1/2) Whitening: name=encoder_embed.out_whiten, num_groups=1, num_channels=192, metric=6.60 vs. limit=8.0 2024-03-16 01:35:12,869 INFO [train_char.py:689] (1/2) Epoch 28, batch 450, loss[loss=0.08234, simple_loss=0.1461, pruned_loss=0.009308, over 24207.00 frames. ], tot_loss[loss=0.07154, simple_loss=0.1268, pruned_loss=0.008131, over 4326327.11 frames. ], batch size: 212, lr: 1.10e-02, grad_scale: 32.0 2024-03-16 01:35:13,134 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.bypass.scale_min, batch_count=47130.0, ans=0.2 2024-03-16 01:35:19,186 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.bypass_mid.scale_min, batch_count=47130.0, ans=0.2 2024-03-16 01:35:53,918 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=47230.0, ans=0.1 2024-03-16 01:36:14,121 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.bypass_mid.scale_min, batch_count=47263.333333333336, ans=0.2 2024-03-16 01:36:14,845 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.whiten, num_groups=1, num_channels=192, metric=4.61 vs. 
limit=12.0 2024-03-16 01:36:16,875 INFO [train_char.py:689] (1/2) Epoch 28, batch 500, loss[loss=0.08102, simple_loss=0.1428, pruned_loss=0.009617, over 24054.00 frames. ], tot_loss[loss=0.07245, simple_loss=0.1286, pruned_loss=0.008162, over 4438408.80 frames. ], batch size: 250, lr: 1.10e-02, grad_scale: 32.0 2024-03-16 01:36:19,685 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.hidden_balancer.prob, batch_count=47296.666666666664, ans=0.125 2024-03-16 01:37:20,888 INFO [train_char.py:689] (1/2) Epoch 29, batch 0, loss[loss=0.06284, simple_loss=0.1156, pruned_loss=0.005046, over 24253.00 frames. ], tot_loss[loss=0.06284, simple_loss=0.1156, pruned_loss=0.005046, over 24253.00 frames. ], batch size: 134, lr: 1.08e-02, grad_scale: 32.0 2024-03-16 01:37:20,888 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 01:37:34,416 INFO [train_char.py:721] (1/2) Epoch 29, validation: loss=0.06048, simple_loss=0.1117, pruned_loss=0.004651, over 657665.00 frames. 2024-03-16 01:37:34,416 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 01:38:21,761 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.586e+01 8.663e+01 1.045e+02 1.445e+02 2.263e+02, threshold=2.091e+02, percent-clipped=4.0 2024-03-16 01:38:45,508 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=5.01 vs. limit=6.0 2024-03-16 01:38:47,197 INFO [train_char.py:689] (1/2) Epoch 29, batch 50, loss[loss=0.05444, simple_loss=0.1012, pruned_loss=0.003858, over 24397.00 frames. ], tot_loss[loss=0.06626, simple_loss=0.1185, pruned_loss=0.007009, over 1085896.42 frames. ], batch size: 129, lr: 1.08e-02, grad_scale: 32.0 2024-03-16 01:38:52,637 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.nonlin_attention.balancer.max_positive, batch_count=47486.666666666664, ans=0.95 2024-03-16 01:39:15,612 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_module1.balancer1.max_abs, batch_count=47520.0, ans=10.0 2024-03-16 01:39:16,966 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module1.balancer1.min_positive, batch_count=47520.0, ans=0.025 2024-03-16 01:39:26,101 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=47553.333333333336, ans=0.1 2024-03-16 01:39:55,083 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.ff3_skip_rate, batch_count=47620.0, ans=0.0005173913043478272 2024-03-16 01:40:03,731 INFO [train_char.py:689] (1/2) Epoch 29, batch 100, loss[loss=0.05789, simple_loss=0.1061, pruned_loss=0.004829, over 24383.00 frames. ], tot_loss[loss=0.06661, simple_loss=0.1189, pruned_loss=0.007148, over 1912397.90 frames. 
], batch size: 129, lr: 1.08e-02, grad_scale: 32.0 2024-03-16 01:40:06,656 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_skip_rate, batch_count=47653.333333333336, ans=0.0 2024-03-16 01:40:28,388 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=47720.0, ans=0.0 2024-03-16 01:40:41,294 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.bypass.scale_min, batch_count=47753.333333333336, ans=0.2 2024-03-16 01:40:41,846 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=15.56 vs. limit=15.0 2024-03-16 01:40:43,682 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.750e+01 7.958e+01 9.896e+01 1.269e+02 2.677e+02, threshold=1.979e+02, percent-clipped=3.0 2024-03-16 01:40:56,045 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=8.90 vs. limit=10.0 2024-03-16 01:41:05,921 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.balancer2.prob, batch_count=47786.666666666664, ans=0.125 2024-03-16 01:41:08,051 INFO [train_char.py:689] (1/2) Epoch 29, batch 150, loss[loss=0.06989, simple_loss=0.126, pruned_loss=0.006879, over 24238.00 frames. ], tot_loss[loss=0.06662, simple_loss=0.1192, pruned_loss=0.007026, over 2555463.93 frames. ], batch size: 328, lr: 1.08e-02, grad_scale: 32.0 2024-03-16 01:41:12,249 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=47820.0, ans=0.0 2024-03-16 01:41:22,896 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=384, metric=15.10 vs. limit=22.5 2024-03-16 01:41:32,804 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module1.balancer1.prob, batch_count=47886.666666666664, ans=0.125 2024-03-16 01:41:57,381 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=47920.0, ans=0.0 2024-03-16 01:42:12,357 INFO [train_char.py:689] (1/2) Epoch 29, batch 200, loss[loss=0.08335, simple_loss=0.1514, pruned_loss=0.00766, over 24102.00 frames. ], tot_loss[loss=0.06735, simple_loss=0.1203, pruned_loss=0.007182, over 3050040.62 frames. ], batch size: 251, lr: 1.08e-02, grad_scale: 32.0 2024-03-16 01:42:27,001 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=47986.666666666664, ans=0.0 2024-03-16 01:42:56,554 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.774e+01 8.855e+01 1.142e+02 1.615e+02 2.916e+02, threshold=2.284e+02, percent-clipped=12.0 2024-03-16 01:43:03,279 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=48086.666666666664, ans=0.1 2024-03-16 01:43:09,523 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_skip_rate, batch_count=48086.666666666664, ans=0.0 2024-03-16 01:43:12,576 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=12.27 vs. 
limit=15.0 2024-03-16 01:43:24,822 INFO [train_char.py:689] (1/2) Epoch 29, batch 250, loss[loss=0.06207, simple_loss=0.1085, pruned_loss=0.007798, over 24081.00 frames. ], tot_loss[loss=0.06839, simple_loss=0.1221, pruned_loss=0.007354, over 3442621.43 frames. ], batch size: 361, lr: 1.08e-02, grad_scale: 32.0 2024-03-16 01:43:27,782 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.bypass_mid.scale_min, batch_count=48153.333333333336, ans=0.2 2024-03-16 01:43:45,045 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=256, metric=11.14 vs. limit=15.0 2024-03-16 01:43:45,732 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_skip_rate, batch_count=48186.666666666664, ans=0.0 2024-03-16 01:43:46,870 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=48186.666666666664, ans=0.1 2024-03-16 01:43:55,077 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.whiten, num_groups=1, num_channels=256, metric=5.19 vs. limit=12.0 2024-03-16 01:43:59,650 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module1.balancer2.prob, batch_count=48220.0, ans=0.125 2024-03-16 01:44:23,268 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.conv_module2.whiten, num_groups=1, num_channels=512, metric=4.66 vs. limit=15.0 2024-03-16 01:44:29,021 INFO [train_char.py:689] (1/2) Epoch 29, batch 300, loss[loss=0.05842, simple_loss=0.09903, pruned_loss=0.008909, over 23961.00 frames. ], tot_loss[loss=0.06873, simple_loss=0.1226, pruned_loss=0.007435, over 3747561.60 frames. ], batch size: 407, lr: 1.07e-02, grad_scale: 32.0 2024-03-16 01:44:30,660 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module2.balancer2.prob, batch_count=48320.0, ans=0.125 2024-03-16 01:44:49,247 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward1.out_whiten.whitening_limit, batch_count=48353.333333333336, ans=15.0 2024-03-16 01:44:49,929 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=48353.333333333336, ans=0.1 2024-03-16 01:44:55,270 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=512, metric=3.58 vs. limit=15.0 2024-03-16 01:44:56,696 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.self_attn2.whiten, num_groups=1, num_channels=192, metric=14.14 vs. limit=22.5 2024-03-16 01:45:08,649 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.balancer1.prob, batch_count=48420.0, ans=0.125 2024-03-16 01:45:13,382 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.609e+01 9.105e+01 1.101e+02 1.405e+02 2.449e+02, threshold=2.201e+02, percent-clipped=1.0 2024-03-16 01:45:20,242 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.conv_module2.whiten, num_groups=1, num_channels=512, metric=5.69 vs. limit=15.0 2024-03-16 01:45:37,221 INFO [train_char.py:689] (1/2) Epoch 29, batch 350, loss[loss=0.06157, simple_loss=0.1121, pruned_loss=0.005515, over 24419.00 frames. ], tot_loss[loss=0.06866, simple_loss=0.1223, pruned_loss=0.007519, over 3989993.65 frames. 
], batch size: 158, lr: 1.07e-02, grad_scale: 32.0 2024-03-16 01:45:37,488 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=48486.666666666664, ans=0.1 2024-03-16 01:45:43,602 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.1.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-16 01:46:02,608 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward2.hidden_balancer.prob, batch_count=48553.333333333336, ans=0.125 2024-03-16 01:46:25,710 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward1.hidden_balancer.prob, batch_count=48586.666666666664, ans=0.125 2024-03-16 01:46:41,489 INFO [train_char.py:689] (1/2) Epoch 29, batch 400, loss[loss=0.06621, simple_loss=0.1214, pruned_loss=0.005528, over 24374.00 frames. ], tot_loss[loss=0.06955, simple_loss=0.124, pruned_loss=0.007571, over 4178770.42 frames. ], batch size: 158, lr: 1.07e-02, grad_scale: 32.0 2024-03-16 01:46:42,904 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass.skip_rate, batch_count=48653.333333333336, ans=0.04949747468305833 2024-03-16 01:46:54,650 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.3.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-16 01:46:54,724 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_module2.balancer1.prob, batch_count=48686.666666666664, ans=0.125 2024-03-16 01:46:55,395 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.self_attn2.whiten, num_groups=1, num_channels=192, metric=14.41 vs. limit=22.5 2024-03-16 01:46:57,353 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward3.hidden_balancer.prob, batch_count=48686.666666666664, ans=0.125 2024-03-16 01:47:03,011 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.conv_module2.whiten, num_groups=1, num_channels=192, metric=2.98 vs. limit=15.0 2024-03-16 01:47:21,784 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.696e+01 8.309e+01 9.654e+01 1.256e+02 2.583e+02, threshold=1.931e+02, percent-clipped=1.0 2024-03-16 01:47:21,931 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.conv.2.prob, batch_count=48753.333333333336, ans=0.125 2024-03-16 01:47:47,001 INFO [train_char.py:689] (1/2) Epoch 29, batch 450, loss[loss=0.08097, simple_loss=0.1433, pruned_loss=0.009346, over 24256.00 frames. ], tot_loss[loss=0.0701, simple_loss=0.125, pruned_loss=0.007599, over 4326100.56 frames. ], batch size: 212, lr: 1.07e-02, grad_scale: 32.0 2024-03-16 01:48:03,626 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=48853.333333333336, ans=0.1 2024-03-16 01:48:23,242 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=48886.666666666664, ans=0.125 2024-03-16 01:48:51,898 INFO [train_char.py:689] (1/2) Epoch 29, batch 500, loss[loss=0.07912, simple_loss=0.139, pruned_loss=0.009612, over 24278.00 frames. ], tot_loss[loss=0.07111, simple_loss=0.1269, pruned_loss=0.007643, over 4439385.50 frames. 
], batch size: 267, lr: 1.07e-02, grad_scale: 32.0 2024-03-16 01:49:47,780 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=49010.0, ans=0.125 2024-03-16 01:49:52,950 INFO [train_char.py:689] (1/2) Epoch 30, batch 0, loss[loss=0.0593, simple_loss=0.1012, pruned_loss=0.008697, over 23804.00 frames. ], tot_loss[loss=0.0593, simple_loss=0.1012, pruned_loss=0.008697, over 23804.00 frames. ], batch size: 439, lr: 1.05e-02, grad_scale: 32.0 2024-03-16 01:49:52,951 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 01:50:02,226 INFO [zipformer.py:1858] (1/2) name=encoder.encoders.0.layers.1.self_attn_weights, attn_weights_entropy = tensor([4.6278, 4.7825, 4.8615, 4.5560], device='cuda:1') 2024-03-16 01:50:04,816 INFO [zipformer.py:1858] (1/2) name=encoder.encoders.0.layers.1.self_attn_weights, attn_weights_entropy = tensor([4.6251, 4.7967, 4.8655, 4.5556], device='cuda:1') 2024-03-16 01:50:06,565 INFO [train_char.py:721] (1/2) Epoch 30, validation: loss=0.06041, simple_loss=0.1115, pruned_loss=0.00465, over 657665.00 frames. 2024-03-16 01:50:06,566 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 01:50:33,774 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.self_attn_weights.pos_emb_skip_rate, batch_count=49076.666666666664, ans=0.0 2024-03-16 01:50:38,243 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=7.26 vs. limit=15.0 2024-03-16 01:50:38,727 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.930e+01 8.448e+01 1.014e+02 1.293e+02 3.079e+02, threshold=2.028e+02, percent-clipped=8.0 2024-03-16 01:51:07,463 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.nonlin_attention.whiten2, num_groups=1, num_channels=512, metric=4.47 vs. limit=15.0 2024-03-16 01:51:19,149 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass.skip_rate, batch_count=49143.333333333336, ans=0.04949747468305833 2024-03-16 01:51:21,503 INFO [train_char.py:689] (1/2) Epoch 30, batch 50, loss[loss=0.06385, simple_loss=0.1168, pruned_loss=0.005447, over 23810.00 frames. ], tot_loss[loss=0.06556, simple_loss=0.1172, pruned_loss=0.00698, over 1085821.37 frames. ], batch size: 107, lr: 1.05e-02, grad_scale: 32.0 2024-03-16 01:51:35,770 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module2.balancer1.prob, batch_count=49176.666666666664, ans=0.125 2024-03-16 01:51:45,818 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=192, metric=9.66 vs. limit=15.0 2024-03-16 01:51:54,552 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=49243.333333333336, ans=0.125 2024-03-16 01:51:59,567 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=49243.333333333336, ans=0.1 2024-03-16 01:52:15,204 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.balancer2.prob, batch_count=49276.666666666664, ans=0.125 2024-03-16 01:52:30,946 INFO [train_char.py:689] (1/2) Epoch 30, batch 100, loss[loss=0.05542, simple_loss=0.1044, pruned_loss=0.003239, over 24320.00 frames. 
], tot_loss[loss=0.06708, simple_loss=0.1197, pruned_loss=0.007256, over 1906590.85 frames. ], batch size: 146, lr: 1.05e-02, grad_scale: 32.0 2024-03-16 01:52:37,569 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module2.balancer1.prob, batch_count=49343.333333333336, ans=0.125 2024-03-16 01:52:42,904 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_skip_rate, batch_count=49376.666666666664, ans=0.0 2024-03-16 01:52:46,715 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.conv.8.prob, batch_count=49376.666666666664, ans=0.125 2024-03-16 01:52:52,623 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=5.97 vs. limit=6.0 2024-03-16 01:53:01,914 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.123e+01 9.141e+01 1.242e+02 1.610e+02 3.460e+02, threshold=2.484e+02, percent-clipped=11.0 2024-03-16 01:53:35,870 INFO [train_char.py:689] (1/2) Epoch 30, batch 150, loss[loss=0.07789, simple_loss=0.1374, pruned_loss=0.009201, over 24120.00 frames. ], tot_loss[loss=0.06797, simple_loss=0.1211, pruned_loss=0.007422, over 2550558.64 frames. ], batch size: 279, lr: 1.05e-02, grad_scale: 32.0 2024-03-16 01:53:41,439 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module2.balancer2.prob, batch_count=49510.0, ans=0.125 2024-03-16 01:54:15,365 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.balancer1.prob, batch_count=49576.666666666664, ans=0.125 2024-03-16 01:54:23,239 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module2.balancer2.prob, batch_count=49610.0, ans=0.125 2024-03-16 01:54:38,809 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward2.hidden_balancer.prob, batch_count=49643.333333333336, ans=0.125 2024-03-16 01:54:42,916 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=49643.333333333336, ans=0.125 2024-03-16 01:54:45,578 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_skip_rate, batch_count=49643.333333333336, ans=0.0 2024-03-16 01:54:50,366 INFO [train_char.py:689] (1/2) Epoch 30, batch 200, loss[loss=0.07116, simple_loss=0.1272, pruned_loss=0.007558, over 24380.00 frames. ], tot_loss[loss=0.06745, simple_loss=0.1202, pruned_loss=0.007371, over 3047537.80 frames. ], batch size: 172, lr: 1.04e-02, grad_scale: 32.0 2024-03-16 01:54:53,023 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module1.balancer2.min_positive, batch_count=49676.666666666664, ans=0.05 2024-03-16 01:55:20,845 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.211e+01 9.065e+01 1.195e+02 1.639e+02 4.488e+02, threshold=2.390e+02, percent-clipped=6.0 2024-03-16 01:55:33,177 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=13.69 vs. limit=15.0 2024-03-16 01:55:43,561 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=14.12 vs. 
limit=15.0 2024-03-16 01:55:45,716 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.balancer1.prob, batch_count=49810.0, ans=0.125 2024-03-16 01:55:54,329 INFO [train_char.py:689] (1/2) Epoch 30, batch 250, loss[loss=0.0799, simple_loss=0.1436, pruned_loss=0.00812, over 24102.00 frames. ], tot_loss[loss=0.06799, simple_loss=0.121, pruned_loss=0.007472, over 3442776.95 frames. ], batch size: 199, lr: 1.04e-02, grad_scale: 32.0 2024-03-16 01:56:07,893 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=384, metric=16.01 vs. limit=22.5 2024-03-16 01:56:16,408 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.conv.5.prob, batch_count=49876.666666666664, ans=0.125 2024-03-16 01:56:34,104 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=49943.333333333336, ans=0.0 2024-03-16 01:56:38,265 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=5.12 vs. limit=15.0 2024-03-16 01:57:04,642 INFO [train_char.py:689] (1/2) Epoch 30, batch 300, loss[loss=0.05959, simple_loss=0.1115, pruned_loss=0.00386, over 24280.00 frames. ], tot_loss[loss=0.06785, simple_loss=0.121, pruned_loss=0.007365, over 3752351.28 frames. ], batch size: 116, lr: 1.04e-02, grad_scale: 32.0 2024-03-16 01:57:10,961 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_skip_rate, batch_count=50010.0, ans=0.0 2024-03-16 01:57:14,783 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=50010.0, ans=0.125 2024-03-16 01:57:26,855 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_skip_rate, batch_count=50043.333333333336, ans=0.0 2024-03-16 01:57:33,804 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.289e+01 9.764e+01 1.255e+02 1.703e+02 3.273e+02, threshold=2.510e+02, percent-clipped=8.0 2024-03-16 01:57:43,633 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=5.71 vs. limit=6.0 2024-03-16 01:57:49,236 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward1.hidden_balancer.prob, batch_count=50110.0, ans=0.125 2024-03-16 01:57:56,671 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.2.encoder.layers.2.self_attn_weights, loss-sum=0.000e+00 2024-03-16 01:57:56,766 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_skip_rate, batch_count=50143.333333333336, ans=0.0 2024-03-16 01:58:06,484 INFO [train_char.py:689] (1/2) Epoch 30, batch 350, loss[loss=0.07285, simple_loss=0.1315, pruned_loss=0.007095, over 24141.00 frames. ], tot_loss[loss=0.06858, simple_loss=0.1223, pruned_loss=0.00745, over 3992772.09 frames. ], batch size: 188, lr: 1.04e-02, grad_scale: 32.0 2024-03-16 01:58:07,527 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.conv_module2.whiten, num_groups=1, num_channels=192, metric=11.09 vs. 
limit=15.0 2024-03-16 01:58:16,646 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.ff2_skip_rate, batch_count=50176.666666666664, ans=0.0 2024-03-16 01:58:46,702 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.balancer1.prob, batch_count=50276.666666666664, ans=0.125 2024-03-16 01:59:12,429 INFO [train_char.py:689] (1/2) Epoch 30, batch 400, loss[loss=0.07758, simple_loss=0.1382, pruned_loss=0.008479, over 24123.00 frames. ], tot_loss[loss=0.06894, simple_loss=0.123, pruned_loss=0.007425, over 4180263.50 frames. ], batch size: 251, lr: 1.04e-02, grad_scale: 32.0 2024-03-16 01:59:22,703 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward2.hidden_balancer.prob, batch_count=50343.333333333336, ans=0.125 2024-03-16 01:59:35,972 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=192, metric=5.55 vs. limit=10.0 2024-03-16 01:59:42,545 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.655e+01 8.249e+01 9.557e+01 1.227e+02 2.032e+02, threshold=1.911e+02, percent-clipped=0.0 2024-03-16 01:59:43,971 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.hidden_balancer.prob, batch_count=50410.0, ans=0.125 2024-03-16 01:59:58,634 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=50443.333333333336, ans=0.1 2024-03-16 02:00:17,282 INFO [train_char.py:689] (1/2) Epoch 30, batch 450, loss[loss=0.08072, simple_loss=0.1474, pruned_loss=0.007032, over 24066.00 frames. ], tot_loss[loss=0.0698, simple_loss=0.1246, pruned_loss=0.007496, over 4326501.60 frames. ], batch size: 236, lr: 1.04e-02, grad_scale: 32.0 2024-03-16 02:00:21,324 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.balancer.min_positive, batch_count=50510.0, ans=0.05 2024-03-16 02:00:23,832 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=50510.0, ans=0.125 2024-03-16 02:00:47,815 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.whiten, num_groups=1, num_channels=512, metric=6.16 vs. limit=12.0 2024-03-16 02:00:58,129 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=50610.0, ans=0.125 2024-03-16 02:01:15,048 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=50643.333333333336, ans=0.125 2024-03-16 02:01:18,274 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward3.out_whiten.whitening_limit, batch_count=50643.333333333336, ans=15.0 2024-03-16 02:01:21,333 INFO [train_char.py:689] (1/2) Epoch 30, batch 500, loss[loss=0.08237, simple_loss=0.1433, pruned_loss=0.01072, over 24150.00 frames. ], tot_loss[loss=0.07109, simple_loss=0.1269, pruned_loss=0.007635, over 4439010.67 frames. ], batch size: 223, lr: 1.04e-02, grad_scale: 32.0 2024-03-16 02:02:22,871 INFO [train_char.py:689] (1/2) Epoch 31, batch 0, loss[loss=0.0659, simple_loss=0.1189, pruned_loss=0.006444, over 24210.00 frames. ], tot_loss[loss=0.0659, simple_loss=0.1189, pruned_loss=0.006444, over 24210.00 frames. 
], batch size: 311, lr: 1.02e-02, grad_scale: 32.0 2024-03-16 02:02:22,872 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 02:02:36,557 INFO [train_char.py:721] (1/2) Epoch 31, validation: loss=0.06049, simple_loss=0.1115, pruned_loss=0.004723, over 657665.00 frames. 2024-03-16 02:02:36,558 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 02:02:59,634 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.675e+01 8.591e+01 1.068e+02 1.376e+02 2.315e+02, threshold=2.136e+02, percent-clipped=7.0 2024-03-16 02:03:24,971 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=50800.0, ans=0.125 2024-03-16 02:03:43,404 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module2.balancer1.prob, batch_count=50833.333333333336, ans=0.125 2024-03-16 02:03:51,324 INFO [train_char.py:689] (1/2) Epoch 31, batch 50, loss[loss=0.06738, simple_loss=0.1207, pruned_loss=0.007032, over 24166.00 frames. ], tot_loss[loss=0.06529, simple_loss=0.1179, pruned_loss=0.006342, over 1086701.15 frames. ], batch size: 344, lr: 1.02e-02, grad_scale: 32.0 2024-03-16 02:03:51,692 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_skip_rate, batch_count=50866.666666666664, ans=0.0 2024-03-16 02:03:53,115 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.ff2_skip_rate, batch_count=50866.666666666664, ans=0.0 2024-03-16 02:04:10,632 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.4.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-16 02:04:25,040 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.attention_skip_rate, batch_count=50933.333333333336, ans=0.0 2024-03-16 02:04:27,428 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass.scale_min, batch_count=50933.333333333336, ans=0.2 2024-03-16 02:04:27,526 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=50933.333333333336, ans=0.1 2024-03-16 02:04:49,693 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.ff3_skip_rate, batch_count=51000.0, ans=0.0 2024-03-16 02:04:57,149 INFO [train_char.py:689] (1/2) Epoch 31, batch 100, loss[loss=0.06785, simple_loss=0.1191, pruned_loss=0.008322, over 24223.00 frames. ], tot_loss[loss=0.06582, simple_loss=0.1175, pruned_loss=0.007066, over 1902417.03 frames. ], batch size: 344, lr: 1.02e-02, grad_scale: 32.0 2024-03-16 02:04:57,908 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=6.87 vs. limit=10.0 2024-03-16 02:05:09,421 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.feed_forward2.out_whiten, num_groups=1, num_channels=256, metric=7.21 vs. 
limit=15.0 2024-03-16 02:05:22,285 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=51066.666666666664, ans=0.1 2024-03-16 02:05:24,571 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.055e+01 9.150e+01 1.136e+02 1.489e+02 2.776e+02, threshold=2.272e+02, percent-clipped=11.0 2024-03-16 02:05:28,857 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=51100.0, ans=0.1 2024-03-16 02:05:29,415 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=15.25 vs. limit=15.0 2024-03-16 02:05:55,647 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward3.hidden_balancer.prob, batch_count=51166.666666666664, ans=0.125 2024-03-16 02:06:07,006 INFO [train_char.py:689] (1/2) Epoch 31, batch 150, loss[loss=0.07782, simple_loss=0.1388, pruned_loss=0.008414, over 24127.00 frames. ], tot_loss[loss=0.0663, simple_loss=0.1181, pruned_loss=0.007268, over 2544391.44 frames. ], batch size: 199, lr: 1.01e-02, grad_scale: 64.0 2024-03-16 02:06:24,711 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.bypass.scale_min, batch_count=51233.333333333336, ans=0.2 2024-03-16 02:06:57,166 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=51300.0, ans=0.0 2024-03-16 02:07:15,085 INFO [train_char.py:689] (1/2) Epoch 31, batch 200, loss[loss=0.06933, simple_loss=0.1221, pruned_loss=0.008279, over 24344.00 frames. ], tot_loss[loss=0.06644, simple_loss=0.1184, pruned_loss=0.00724, over 3049867.32 frames. ], batch size: 158, lr: 1.01e-02, grad_scale: 64.0 2024-03-16 02:07:17,914 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=51366.666666666664, ans=0.125 2024-03-16 02:07:26,037 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward2.out_whiten.whitening_limit, batch_count=51366.666666666664, ans=15.0 2024-03-16 02:07:35,986 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=512, metric=16.85 vs. limit=22.5 2024-03-16 02:07:36,479 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.945e+01 8.396e+01 1.015e+02 1.465e+02 2.987e+02, threshold=2.030e+02, percent-clipped=3.0 2024-03-16 02:07:42,253 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=5.93 vs. limit=6.0 2024-03-16 02:07:46,676 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_skip_rate, batch_count=51433.333333333336, ans=0.0 2024-03-16 02:08:07,715 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.self_attn2.whiten, num_groups=1, num_channels=512, metric=12.88 vs. limit=22.5 2024-03-16 02:08:21,666 INFO [train_char.py:689] (1/2) Epoch 31, batch 250, loss[loss=0.06119, simple_loss=0.1111, pruned_loss=0.005648, over 24430.00 frames. ], tot_loss[loss=0.0668, simple_loss=0.1189, pruned_loss=0.007346, over 3442568.60 frames. 
], batch size: 158, lr: 1.01e-02, grad_scale: 64.0 2024-03-16 02:08:25,586 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.balancer2.prob, batch_count=51533.333333333336, ans=0.125 2024-03-16 02:08:33,472 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.nonlin_attention.balancer.prob, batch_count=51566.666666666664, ans=0.125 2024-03-16 02:08:34,641 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_skip_rate, batch_count=51566.666666666664, ans=0.0 2024-03-16 02:08:43,751 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.ff3_skip_rate, batch_count=51566.666666666664, ans=0.0 2024-03-16 02:08:51,182 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.ff2_skip_rate, batch_count=51600.0, ans=0.0 2024-03-16 02:08:59,286 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=5.61 vs. limit=6.0 2024-03-16 02:09:23,659 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_skip_rate, batch_count=51666.666666666664, ans=0.0 2024-03-16 02:09:28,431 INFO [train_char.py:689] (1/2) Epoch 31, batch 300, loss[loss=0.05671, simple_loss=0.1065, pruned_loss=0.003472, over 24261.00 frames. ], tot_loss[loss=0.06693, simple_loss=0.1194, pruned_loss=0.007209, over 3751095.21 frames. ], batch size: 140, lr: 1.01e-02, grad_scale: 64.0 2024-03-16 02:09:36,271 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass.scale_min, batch_count=51700.0, ans=0.2 2024-03-16 02:09:39,373 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=512, metric=4.86 vs. limit=15.0 2024-03-16 02:09:46,578 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_module2.balancer1.min_positive, batch_count=51733.333333333336, ans=0.025 2024-03-16 02:09:49,873 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.345e+01 8.460e+01 1.034e+02 1.389e+02 2.588e+02, threshold=2.067e+02, percent-clipped=8.0 2024-03-16 02:10:01,595 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.bypass.scale_min, batch_count=51766.666666666664, ans=0.2 2024-03-16 02:10:33,775 INFO [train_char.py:689] (1/2) Epoch 31, batch 350, loss[loss=0.07023, simple_loss=0.126, pruned_loss=0.00722, over 24331.00 frames. ], tot_loss[loss=0.06731, simple_loss=0.1203, pruned_loss=0.007169, over 3993016.50 frames. ], batch size: 180, lr: 1.01e-02, grad_scale: 64.0 2024-03-16 02:10:38,997 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module1.balancer1.prob, batch_count=51866.666666666664, ans=0.125 2024-03-16 02:11:02,343 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.conv_module1.whiten, num_groups=1, num_channels=256, metric=12.36 vs. 
limit=15.0 2024-03-16 02:11:15,047 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=51966.666666666664, ans=0.125 2024-03-16 02:11:16,267 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=51966.666666666664, ans=0.125 2024-03-16 02:11:17,931 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=14.55 vs. limit=15.0 2024-03-16 02:11:38,289 INFO [train_char.py:689] (1/2) Epoch 31, batch 400, loss[loss=0.07259, simple_loss=0.1285, pruned_loss=0.008336, over 24172.00 frames. ], tot_loss[loss=0.06799, simple_loss=0.1215, pruned_loss=0.007253, over 4180396.26 frames. ], batch size: 310, lr: 1.01e-02, grad_scale: 64.0 2024-03-16 02:12:01,017 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 7.228e+01 9.105e+01 1.103e+02 1.407e+02 2.463e+02, threshold=2.206e+02, percent-clipped=6.0 2024-03-16 02:12:09,902 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=52100.0, ans=0.1 2024-03-16 02:12:26,311 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=52133.333333333336, ans=0.125 2024-03-16 02:12:29,848 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=52166.666666666664, ans=0.1 2024-03-16 02:12:43,572 INFO [train_char.py:689] (1/2) Epoch 31, batch 450, loss[loss=0.06623, simple_loss=0.1222, pruned_loss=0.005123, over 24371.00 frames. ], tot_loss[loss=0.06909, simple_loss=0.1235, pruned_loss=0.007349, over 4326730.37 frames. ], batch size: 172, lr: 1.01e-02, grad_scale: 64.0 2024-03-16 02:12:45,074 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward1.out_proj.dropout_p, batch_count=52200.0, ans=0.1 2024-03-16 02:13:35,041 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_skip_rate, batch_count=52333.333333333336, ans=0.0 2024-03-16 02:13:38,903 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=52333.333333333336, ans=0.1 2024-03-16 02:13:44,919 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_skip_rate, batch_count=52333.333333333336, ans=0.0 2024-03-16 02:13:47,178 INFO [train_char.py:689] (1/2) Epoch 31, batch 500, loss[loss=0.07166, simple_loss=0.1296, pruned_loss=0.006865, over 24383.00 frames. ], tot_loss[loss=0.07036, simple_loss=0.1257, pruned_loss=0.007511, over 4440287.64 frames. ], batch size: 172, lr: 1.01e-02, grad_scale: 64.0 2024-03-16 02:14:42,640 INFO [train_char.py:689] (1/2) Epoch 32, batch 0, loss[loss=0.06122, simple_loss=0.1068, pruned_loss=0.007843, over 24015.00 frames. ], tot_loss[loss=0.06122, simple_loss=0.1068, pruned_loss=0.007843, over 24015.00 frames. ], batch size: 381, lr: 9.89e-03, grad_scale: 64.0 2024-03-16 02:14:42,641 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 02:14:56,362 INFO [train_char.py:721] (1/2) Epoch 32, validation: loss=0.06052, simple_loss=0.1115, pruned_loss=0.004789, over 657665.00 frames. 
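(Aside on the scaling.py:214 lines: most of the INFO traffic above reports ScheduledFloat values such as "name=..., batch_count=45140.0, ans=0.125". In icefall's zipformer these appear to be piecewise-linear functions of the training batch count, so regularization knobs like conv_skip_rate anneal on a fixed schedule. A minimal sketch of that interpolation follows; the helper name and the breakpoints are illustrative assumptions, not the recipe's actual schedules.)

```python
# Sketch of a ScheduledFloat-style value: piecewise-linear in batch_count.
# The breakpoints below are made up for illustration; the recipe defines its own.

def scheduled_float(batch_count: float, points) -> float:
    """Interpolate (batch_count, value) breakpoints, clamping outside the range."""
    x0, y0 = points[0]
    if batch_count <= x0:
        return y0
    for x1, y1 in points[1:]:
        if batch_count <= x1:
            t = (batch_count - x0) / (x1 - x0)  # position between breakpoints
            return y0 + t * (y1 - y0)
        x0, y0 = x1, y1
    return points[-1][1]  # past the last breakpoint: hold the final value

# e.g. a skip-rate that anneals from 0.3 to 0.0 and then stays there:
skip_rate = [(0.0, 0.3), (4000.0, 0.1), (20000.0, 0.0)]
print(scheduled_float(45140.0, skip_rate))  # -> 0.0 at this point in training
```

Read that way, an entry like "conv_skip_rate, batch_count=45930.0, ans=0.0" simply means the schedule has annealed that skip probability to zero by this stage of training.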
2024-03-16 02:14:56,363 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 02:14:58,165 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.bypass_mid.scale_min, batch_count=52390.0, ans=0.2 2024-03-16 02:15:02,648 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=5.01 vs. limit=6.0 2024-03-16 02:15:05,001 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=52390.0, ans=0.1 2024-03-16 02:15:09,985 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.604e+01 8.417e+01 9.526e+01 1.144e+02 3.547e+02, threshold=1.905e+02, percent-clipped=2.0 2024-03-16 02:15:10,469 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=52423.333333333336, ans=0.1 2024-03-16 02:15:11,854 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_module2.balancer1.prob, batch_count=52423.333333333336, ans=0.125 2024-03-16 02:15:29,547 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.out_combiner.scale_min, batch_count=52456.666666666664, ans=0.2 2024-03-16 02:15:30,810 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=52456.666666666664, ans=0.1 2024-03-16 02:15:30,910 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_skip_rate, batch_count=52456.666666666664, ans=0.0 2024-03-16 02:15:38,598 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=52490.0, ans=0.125 2024-03-16 02:15:38,633 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=52490.0, ans=0.125 2024-03-16 02:16:00,049 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass.skip_rate, batch_count=52523.333333333336, ans=0.04949747468305833 2024-03-16 02:16:01,761 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=14.96 vs. limit=15.0 2024-03-16 02:16:01,924 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=512, metric=13.45 vs. limit=15.0 2024-03-16 02:16:02,710 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_skip_rate, batch_count=52523.333333333336, ans=0.0 2024-03-16 02:16:06,470 INFO [train_char.py:689] (1/2) Epoch 32, batch 50, loss[loss=0.07406, simple_loss=0.1319, pruned_loss=0.00811, over 24293.00 frames. ], tot_loss[loss=0.0652, simple_loss=0.1161, pruned_loss=0.007161, over 1079681.63 frames. ], batch size: 267, lr: 9.88e-03, grad_scale: 64.0 2024-03-16 02:16:32,738 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.balancer_ff2.min_abs, batch_count=52590.0, ans=0.1 2024-03-16 02:16:40,816 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.whiten, num_groups=1, num_channels=384, metric=4.78 vs. 
limit=12.0 2024-03-16 02:16:50,793 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.ff3_skip_rate, batch_count=52656.666666666664, ans=0.0 2024-03-16 02:17:17,643 INFO [train_char.py:689] (1/2) Epoch 32, batch 100, loss[loss=0.06399, simple_loss=0.1059, pruned_loss=0.01103, over 24011.00 frames. ], tot_loss[loss=0.06366, simple_loss=0.1137, pruned_loss=0.006829, over 1904522.60 frames. ], batch size: 381, lr: 9.87e-03, grad_scale: 64.0 2024-03-16 02:17:24,911 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.self_attn2.whiten, num_groups=1, num_channels=384, metric=13.66 vs. limit=22.5 2024-03-16 02:17:25,940 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.5.encoder.layers.1.self_attn_weights, loss-sum=0.000e+00 2024-03-16 02:17:30,821 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.622e+01 8.593e+01 1.094e+02 1.492e+02 2.911e+02, threshold=2.188e+02, percent-clipped=11.0 2024-03-16 02:17:42,467 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=52790.0, ans=0.125 2024-03-16 02:17:46,267 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward3.hidden_balancer.prob, batch_count=52790.0, ans=0.125 2024-03-16 02:17:58,861 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.balancer2.prob, batch_count=52823.333333333336, ans=0.125 2024-03-16 02:18:24,616 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.ff2_skip_rate, batch_count=52890.0, ans=0.0 2024-03-16 02:18:24,676 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_module2.balancer2.prob, batch_count=52890.0, ans=0.125 2024-03-16 02:18:25,683 INFO [train_char.py:689] (1/2) Epoch 32, batch 150, loss[loss=0.06796, simple_loss=0.1241, pruned_loss=0.005932, over 24397.00 frames. ], tot_loss[loss=0.06444, simple_loss=0.1154, pruned_loss=0.006727, over 2553163.72 frames. ], batch size: 165, lr: 9.85e-03, grad_scale: 64.0 2024-03-16 02:18:26,033 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.5.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-16 02:19:22,153 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module1.balancer2.prob, batch_count=53023.333333333336, ans=0.125 2024-03-16 02:19:29,513 INFO [train_char.py:689] (1/2) Epoch 32, batch 200, loss[loss=0.07301, simple_loss=0.1295, pruned_loss=0.008236, over 24270.00 frames. ], tot_loss[loss=0.06478, simple_loss=0.1162, pruned_loss=0.006697, over 3053494.56 frames. ], batch size: 296, lr: 9.84e-03, grad_scale: 64.0 2024-03-16 02:19:37,606 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=53056.666666666664, ans=0.1 2024-03-16 02:19:46,248 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.329e+01 8.565e+01 1.072e+02 1.480e+02 2.596e+02, threshold=2.144e+02, percent-clipped=2.0 2024-03-16 02:19:53,352 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=10.46 vs. 
limit=15.0 2024-03-16 02:19:58,007 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.bypass.skip_rate, batch_count=53123.333333333336, ans=0.07 2024-03-16 02:20:14,379 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.attention_skip_rate, batch_count=53156.666666666664, ans=0.0 2024-03-16 02:20:21,642 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass.scale_min, batch_count=53156.666666666664, ans=0.2 2024-03-16 02:20:21,683 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=53156.666666666664, ans=0.1 2024-03-16 02:20:25,485 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward3.hidden_balancer.prob, batch_count=53190.0, ans=0.125 2024-03-16 02:20:36,587 INFO [train_char.py:689] (1/2) Epoch 32, batch 250, loss[loss=0.06193, simple_loss=0.1136, pruned_loss=0.005109, over 24233.00 frames. ], tot_loss[loss=0.06544, simple_loss=0.1176, pruned_loss=0.006631, over 3446259.78 frames. ], batch size: 328, lr: 9.83e-03, grad_scale: 64.0 2024-03-16 02:20:45,601 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.4.encoder.layers.1.self_attn_weights, loss-sum=0.000e+00 2024-03-16 02:21:38,325 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=53356.666666666664, ans=0.1 2024-03-16 02:21:43,242 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=53356.666666666664, ans=0.1 2024-03-16 02:21:44,524 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=53356.666666666664, ans=0.1 2024-03-16 02:21:46,758 INFO [train_char.py:689] (1/2) Epoch 32, batch 300, loss[loss=0.06622, simple_loss=0.12, pruned_loss=0.006225, over 24424.00 frames. ], tot_loss[loss=0.06632, simple_loss=0.119, pruned_loss=0.006833, over 3747054.29 frames. ], batch size: 165, lr: 9.82e-03, grad_scale: 64.0 2024-03-16 02:21:49,593 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=53390.0, ans=0.125 2024-03-16 02:21:59,468 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.117e+01 8.738e+01 1.014e+02 1.367e+02 3.046e+02, threshold=2.028e+02, percent-clipped=3.0 2024-03-16 02:22:31,842 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_skip_rate, batch_count=53490.0, ans=0.0 2024-03-16 02:22:39,903 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.nonlin_attention.whiten1, num_groups=1, num_channels=384, metric=6.88 vs. limit=10.0 2024-03-16 02:22:45,866 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=53523.333333333336, ans=0.125 2024-03-16 02:22:48,095 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.bypass_mid.scale_min, batch_count=53523.333333333336, ans=0.2 2024-03-16 02:22:52,646 INFO [train_char.py:689] (1/2) Epoch 32, batch 350, loss[loss=0.05706, simple_loss=0.1069, pruned_loss=0.003585, over 24233.00 frames. ], tot_loss[loss=0.06658, simple_loss=0.1195, pruned_loss=0.00683, over 3990685.59 frames. 
2024-03-16 02:23:27,442 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=10.48 vs. limit=15.0
2024-03-16 02:23:28,693 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.self_attn1.whiten, num_groups=1, num_channels=512, metric=13.41 vs. limit=22.5
2024-03-16 02:23:30,629 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module2.balancer2.prob, batch_count=53623.333333333336, ans=0.125
2024-03-16 02:23:32,556 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=5.20 vs. limit=6.0
2024-03-16 02:23:57,622 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module2.balancer1.max_abs, batch_count=53690.0, ans=10.0
2024-03-16 02:23:58,280 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=8.87 vs. limit=15.0
2024-03-16 02:23:59,952 INFO [train_char.py:689] (1/2) Epoch 32, batch 400, loss[loss=0.06493, simple_loss=0.1202, pruned_loss=0.00482, over 24333.00 frames. ], tot_loss[loss=0.06722, simple_loss=0.1206, pruned_loss=0.006899, over 4178553.43 frames. ], batch size: 180, lr: 9.79e-03, grad_scale: 64.0
2024-03-16 02:24:07,739 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_skip_rate, batch_count=53723.333333333336, ans=0.0
2024-03-16 02:24:11,635 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=53756.666666666664, ans=0.125
2024-03-16 02:24:12,625 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.176e+01 8.405e+01 1.034e+02 1.494e+02 2.729e+02, threshold=2.068e+02, percent-clipped=9.0
2024-03-16 02:24:24,346 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.ff3_skip_rate, batch_count=53790.0, ans=0.0
2024-03-16 02:24:40,498 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_module2.balancer2.prob, batch_count=53823.333333333336, ans=0.125
2024-03-16 02:24:50,309 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.nonlin_attention.balancer.min_positive, batch_count=53856.666666666664, ans=0.05
2024-03-16 02:24:54,263 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.bypass.skip_rate, batch_count=53856.666666666664, ans=0.035
2024-03-16 02:24:56,837 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=53856.666666666664, ans=0.125
2024-03-16 02:24:58,096 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.ff2_skip_rate, batch_count=53856.666666666664, ans=0.0
2024-03-16 02:24:59,707 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=384, metric=12.60 vs. limit=22.5
2024-03-16 02:25:04,149 INFO [train_char.py:689] (1/2) Epoch 32, batch 450, loss[loss=0.0668, simple_loss=0.1254, pruned_loss=0.004091, over 24327.00 frames. ], tot_loss[loss=0.06826, simple_loss=0.1227, pruned_loss=0.006936, over 4326936.25 frames. ], batch size: 180, lr: 9.78e-03, grad_scale: 64.0
2024-03-16 02:25:53,417 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_module2.balancer2.prob, batch_count=53990.0, ans=0.125
2024-03-16 02:26:05,743 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=12.95 vs. limit=15.0
2024-03-16 02:26:08,676 INFO [train_char.py:689] (1/2) Epoch 32, batch 500, loss[loss=0.07693, simple_loss=0.1362, pruned_loss=0.00881, over 24086.00 frames. ], tot_loss[loss=0.06904, simple_loss=0.1239, pruned_loss=0.007114, over 4439263.12 frames. ], batch size: 223, lr: 9.77e-03, grad_scale: 64.0
2024-03-16 02:26:10,041 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.conv.8.prob, batch_count=54056.666666666664, ans=0.125
2024-03-16 02:26:10,195 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.attention_skip_rate, batch_count=54056.666666666664, ans=0.0
2024-03-16 02:26:11,519 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=54056.666666666664, ans=0.125
2024-03-16 02:27:05,862 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward1.out_proj.dropout_p, batch_count=54080.0, ans=0.1
2024-03-16 02:27:08,641 INFO [train_char.py:689] (1/2) Epoch 33, batch 0, loss[loss=0.0693, simple_loss=0.1234, pruned_loss=0.007585, over 24331.00 frames. ], tot_loss[loss=0.0693, simple_loss=0.1234, pruned_loss=0.007585, over 24331.00 frames. ], batch size: 180, lr: 9.61e-03, grad_scale: 64.0
2024-03-16 02:27:08,641 INFO [train_char.py:712] (1/2) Computing validation loss
2024-03-16 02:27:22,225 INFO [train_char.py:721] (1/2) Epoch 33, validation: loss=0.05978, simple_loss=0.1102, pruned_loss=0.004664, over 657665.00 frames.
2024-03-16 02:27:22,226 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB
2024-03-16 02:27:26,295 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.862e+01 8.567e+01 1.007e+02 1.294e+02 2.240e+02, threshold=2.013e+02, percent-clipped=1.0
2024-03-16 02:27:43,394 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=384, metric=13.94 vs. limit=15.0
2024-03-16 02:27:49,849 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.bypass_mid.scale_min, batch_count=54113.333333333336, ans=0.2
2024-03-16 02:27:57,828 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.balancer2.prob, batch_count=54146.666666666664, ans=0.125
2024-03-16 02:28:07,430 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.balancer2.prob, batch_count=54180.0, ans=0.125
2024-03-16 02:28:14,502 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_skip_rate, batch_count=54180.0, ans=0.0
2024-03-16 02:28:25,082 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.ff3_skip_rate, batch_count=54213.333333333336, ans=0.0
2024-03-16 02:28:34,602 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass.scale_min, batch_count=54246.666666666664, ans=0.2
2024-03-16 02:28:35,663 INFO [train_char.py:689] (1/2) Epoch 33, batch 50, loss[loss=0.06076, simple_loss=0.1066, pruned_loss=0.007446, over 24165.00 frames. ], tot_loss[loss=0.0663, simple_loss=0.1187, pruned_loss=0.006928, over 1091351.96 frames. ], batch size: 362, lr: 9.60e-03, grad_scale: 64.0
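Note on the `Computing validation loss` / `validation: loss=...` pairs: they are emitted at batch 0 of each epoch and always report the same frame count (657665.00), i.e. a fixed dev set scored with a frame-weighted average. A sketch of that computation; the `model(batch)` interface returning a summed loss and a frame count is an assumption standing in for the transducer loss code in train_char.py:

import torch

@torch.no_grad()
def compute_validation_loss(model, dev_loader):
    """Frame-weighted average loss over a fixed dev set."""
    model.eval()
    tot_loss, tot_frames = 0.0, 0.0
    for batch in dev_loader:
        loss, num_frames = model(batch)  # hypothetical interface
        tot_loss += loss.item()
        tot_frames += num_frames
    model.train()
    # e.g. "validation: loss=0.05978 ... over 657665.00 frames."
    return tot_loss / tot_frames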
2024-03-16 02:28:42,652 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.bypass.skip_rate, batch_count=54246.666666666664, ans=0.09899494936611666
2024-03-16 02:28:45,421 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.ff2_skip_rate, batch_count=54246.666666666664, ans=0.0
2024-03-16 02:28:56,142 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.ff3_skip_rate, batch_count=54280.0, ans=0.0
2024-03-16 02:29:01,154 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.balancer1.prob, batch_count=54313.333333333336, ans=0.125
2024-03-16 02:29:05,031 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.balancer1.prob, batch_count=54313.333333333336, ans=0.125
2024-03-16 02:29:10,046 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.attention_skip_rate, batch_count=54313.333333333336, ans=0.0
2024-03-16 02:29:18,430 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=256, metric=10.75 vs. limit=22.5
2024-03-16 02:29:40,747 INFO [train_char.py:689] (1/2) Epoch 33, batch 100, loss[loss=0.05552, simple_loss=0.106, pruned_loss=0.002514, over 24276.00 frames. ], tot_loss[loss=0.06531, simple_loss=0.1177, pruned_loss=0.006444, over 1918448.48 frames. ], batch size: 140, lr: 9.59e-03, grad_scale: 64.0
2024-03-16 02:29:44,587 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.212e+01 8.692e+01 1.036e+02 1.314e+02 2.598e+02, threshold=2.072e+02, percent-clipped=6.0
2024-03-16 02:29:46,180 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=54413.333333333336, ans=0.125
2024-03-16 02:29:51,772 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward2.out_whiten, num_groups=1, num_channels=512, metric=11.76 vs. limit=15.0
2024-03-16 02:29:57,779 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.nonlin_attention.balancer.prob, batch_count=54446.666666666664, ans=0.125
2024-03-16 02:29:58,761 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.convnext.layerdrop_rate, batch_count=54446.666666666664, ans=0.015
2024-03-16 02:29:59,495 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=12.89 vs. limit=15.0
2024-03-16 02:30:30,229 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.ff3_skip_rate, batch_count=54513.333333333336, ans=0.0
2024-03-16 02:30:39,232 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward1.hidden_balancer.prob, batch_count=54546.666666666664, ans=0.125
2024-03-16 02:30:47,548 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.self_attn1.whiten, num_groups=1, num_channels=384, metric=17.57 vs. limit=22.5
2024-03-16 02:30:48,869 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=256, metric=11.10 vs. limit=15.0
2024-03-16 02:30:49,295 INFO [train_char.py:689] (1/2) Epoch 33, batch 150, loss[loss=0.05832, simple_loss=0.1109, pruned_loss=0.002876, over 24356.00 frames. ], tot_loss[loss=0.0656, simple_loss=0.1182, pruned_loss=0.006525, over 2555268.70 frames. ], batch size: 146, lr: 9.58e-03, grad_scale: 64.0
2024-03-16 02:30:56,095 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.ff3_skip_rate, batch_count=54580.0, ans=0.0
2024-03-16 02:31:05,524 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=54613.333333333336, ans=0.125
2024-03-16 02:31:10,594 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module1.balancer1.prob, batch_count=54613.333333333336, ans=0.125
2024-03-16 02:31:11,878 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=54613.333333333336, ans=0.1
2024-03-16 02:31:28,763 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.hidden_balancer.prob, batch_count=54646.666666666664, ans=0.125
2024-03-16 02:31:41,598 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module2.balancer2.min_abs, batch_count=54680.0, ans=0.5
2024-03-16 02:31:45,342 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_module2.balancer2.min_abs, batch_count=54713.333333333336, ans=0.5
2024-03-16 02:31:54,352 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.nonlin_attention.balancer.min_positive, batch_count=54713.333333333336, ans=0.05
2024-03-16 02:31:58,016 INFO [train_char.py:689] (1/2) Epoch 33, batch 200, loss[loss=0.07271, simple_loss=0.1264, pruned_loss=0.009522, over 24217.00 frames. ], tot_loss[loss=0.06581, simple_loss=0.1183, pruned_loss=0.006661, over 3052655.55 frames. ], batch size: 311, lr: 9.56e-03, grad_scale: 64.0
2024-03-16 02:32:01,779 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.254e+01 8.119e+01 1.013e+02 1.326e+02 2.731e+02, threshold=2.027e+02, percent-clipped=6.0
2024-03-16 02:32:41,884 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module2.balancer1.prob, batch_count=54846.666666666664, ans=0.125
2024-03-16 02:32:43,201 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=54846.666666666664, ans=0.125
2024-03-16 02:32:52,101 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.ff3_skip_rate, batch_count=54880.0, ans=0.0
2024-03-16 02:32:57,148 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module1.balancer1.prob, batch_count=54880.0, ans=0.125
2024-03-16 02:32:57,199 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module2.balancer2.prob, batch_count=54880.0, ans=0.125
2024-03-16 02:33:01,910 INFO [train_char.py:689] (1/2) Epoch 33, batch 250, loss[loss=0.05914, simple_loss=0.1048, pruned_loss=0.006744, over 24024.00 frames. ], tot_loss[loss=0.06569, simple_loss=0.118, pruned_loss=0.006707, over 3442611.31 frames. ], batch size: 381, lr: 9.55e-03, grad_scale: 64.0
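Note on the `ScheduledFloat: name=..., batch_count=..., ans=...` records: they show hyperparameters (dropout probabilities, skip rates, balancer limits) whose values are functions of the global batch count rather than constants. The logged values are consistent with a piecewise-linear schedule that is pinned after its last breakpoint, which is why many records print a constant ans (0.125, 0.1, 0.0) at these batch counts. A toy version; the real class lives in icefall's scaling.py, and the breakpoints below are made up for illustration:

class ScheduledFloat:
    """A float that interpolates linearly between (batch_count, value) points."""

    def __init__(self, *points):
        self.points = sorted(points)  # e.g. (0.0, 0.5), (4000.0, 0.125)

    def value(self, batch_count: float) -> float:
        pts = self.points
        if batch_count <= pts[0][0]:
            return pts[0][1]
        if batch_count >= pts[-1][0]:
            return pts[-1][1]  # pinned after the last breakpoint
        for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
            if x0 <= batch_count <= x1:
                t = (batch_count - x0) / (x1 - x0)
                return y0 + t * (y1 - y0)

skip_rate = ScheduledFloat((0.0, 0.5), (4000.0, 0.125))
print(skip_rate.value(54880.0))  # -> 0.125, like the records above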
2024-03-16 02:33:17,246 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.nonlin_attention.balancer.prob, batch_count=54946.666666666664, ans=0.125
2024-03-16 02:33:27,214 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.balancer1.prob, batch_count=54946.666666666664, ans=0.125
2024-03-16 02:33:27,233 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.bypass.scale_min, batch_count=54946.666666666664, ans=0.2
2024-03-16 02:33:32,428 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=54980.0, ans=0.125
2024-03-16 02:33:36,046 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_module1.balancer1.prob, batch_count=54980.0, ans=0.125
2024-03-16 02:33:49,752 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.bypass.scale_min, batch_count=55013.333333333336, ans=0.2
2024-03-16 02:33:56,220 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.ff3_skip_rate, batch_count=55013.333333333336, ans=0.0
2024-03-16 02:34:03,780 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.3.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00
2024-03-16 02:34:12,305 INFO [train_char.py:689] (1/2) Epoch 33, batch 300, loss[loss=0.07551, simple_loss=0.1351, pruned_loss=0.007937, over 24149.00 frames. ], tot_loss[loss=0.06575, simple_loss=0.1182, pruned_loss=0.006651, over 3748627.76 frames. ], batch size: 279, lr: 9.54e-03, grad_scale: 64.0
2024-03-16 02:34:16,095 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.114e+01 8.552e+01 1.120e+02 1.349e+02 2.763e+02, threshold=2.240e+02, percent-clipped=5.0
2024-03-16 02:34:23,996 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module1.balancer2.prob, batch_count=55113.333333333336, ans=0.125
2024-03-16 02:34:25,337 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.balancer1.prob, batch_count=55113.333333333336, ans=0.125
2024-03-16 02:34:26,564 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module2.balancer1.prob, batch_count=55113.333333333336, ans=0.125
2024-03-16 02:34:26,630 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=55113.333333333336, ans=0.0
2024-03-16 02:34:40,479 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.bypass.scale_min, batch_count=55146.666666666664, ans=0.2
2024-03-16 02:34:40,494 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module1.balancer2.min_abs, batch_count=55146.666666666664, ans=0.5
2024-03-16 02:34:49,528 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.balancer2.prob, batch_count=55180.0, ans=0.125
2024-03-16 02:34:55,171 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=256, metric=2.75 vs. limit=15.0
2024-03-16 02:34:55,623 INFO [scaling.py:1023] (1/2) Whitening: name=encoder_embed.convnext.out_whiten, num_groups=1, num_channels=128, metric=4.20 vs. limit=5.0
2024-03-16 02:35:15,982 INFO [train_char.py:689] (1/2) Epoch 33, batch 350, loss[loss=0.05804, simple_loss=0.1082, pruned_loss=0.003945, over 24240.00 frames. ], tot_loss[loss=0.06598, simple_loss=0.1186, pruned_loss=0.006689, over 3984766.68 frames. ], batch size: 122, lr: 9.53e-03, grad_scale: 64.0
2024-03-16 02:35:38,274 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.nonlin_attention.whiten2.whitening_limit, batch_count=55280.0, ans=15.0
2024-03-16 02:35:57,295 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.balancer_na.min_abs, batch_count=55346.666666666664, ans=0.02
2024-03-16 02:36:07,590 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=55346.666666666664, ans=0.1
2024-03-16 02:36:23,596 INFO [train_char.py:689] (1/2) Epoch 33, batch 400, loss[loss=0.06551, simple_loss=0.1112, pruned_loss=0.009905, over 24066.00 frames. ], tot_loss[loss=0.0666, simple_loss=0.1197, pruned_loss=0.006747, over 4171643.75 frames. ], batch size: 362, lr: 9.52e-03, grad_scale: 64.0
2024-03-16 02:36:27,444 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.306e+01 8.897e+01 1.076e+02 1.362e+02 2.416e+02, threshold=2.151e+02, percent-clipped=4.0
2024-03-16 02:36:35,403 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.balancer2.prob, batch_count=55446.666666666664, ans=0.125
2024-03-16 02:36:48,019 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=55480.0, ans=0.0
2024-03-16 02:36:52,958 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.balancer1.prob, batch_count=55480.0, ans=0.125
2024-03-16 02:36:56,932 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.balancer_ff3.min_abs, batch_count=55480.0, ans=0.2
2024-03-16 02:36:58,231 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.bypass.scale_min, batch_count=55480.0, ans=0.2
2024-03-16 02:37:28,081 INFO [train_char.py:689] (1/2) Epoch 33, batch 450, loss[loss=0.07366, simple_loss=0.1347, pruned_loss=0.006324, over 24107.00 frames. ], tot_loss[loss=0.06768, simple_loss=0.1216, pruned_loss=0.006886, over 4319977.06 frames. ], batch size: 236, lr: 9.50e-03, grad_scale: 64.0
2024-03-16 02:37:29,938 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=256, metric=2.44 vs. limit=15.0
2024-03-16 02:37:34,737 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_module1.balancer1.prob, batch_count=55580.0, ans=0.125
2024-03-16 02:37:42,128 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.balancer1.prob, batch_count=55613.333333333336, ans=0.125
2024-03-16 02:37:44,548 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.bypass_mid.scale_min, batch_count=55613.333333333336, ans=0.2
2024-03-16 02:38:20,969 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.bypass.skip_rate, batch_count=55713.333333333336, ans=0.035
2024-03-16 02:38:25,168 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=6.50 vs. limit=15.0
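Note on the `Whitening: name=..., metric=X vs. limit=Y` records: they compare a measured "whiteness" statistic of a module's activations against a (possibly scheduled) limit, and a penalty pulls the features back toward an isotropic covariance when the metric exceeds the limit. A natural statistic of this kind is mean(eig^2) / mean(eig)^2 over the eigenvalues of the feature covariance, which equals 1.0 for a perfectly white distribution and grows as variance concentrates in a few directions. The sketch below computes that statistic; treating it as what the logged metric measures is an assumption (icefall's exact definition is in scaling.py):

import torch

def whitening_metric(x: torch.Tensor, num_groups: int = 1) -> float:
    """mean(eig^2) / mean(eig)^2 of the per-group feature covariance.

    x: (num_frames, num_channels). Returns 1.0 for an isotropic covariance;
    large values mean highly non-white features.
    """
    n, c = x.shape
    assert c % num_groups == 0
    x = x.reshape(n, num_groups, c // num_groups).transpose(0, 1)  # (g, n, d)
    cov = torch.matmul(x.transpose(1, 2), x) / n                   # (g, d, d)
    d = cov.shape[-1]
    # sum of squared entries of a symmetric matrix == trace(cov @ cov)
    num = (cov * cov).sum(dim=(1, 2)) / d
    den = (cov.diagonal(dim1=1, dim2=2).sum(dim=1) / d) ** 2
    return (num / den).mean().item()

x = torch.randn(1000, 192)
print(whitening_metric(x))  # ~ 1 + d/n for white Gaussian input (sampling noise)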
2024-03-16 02:38:26,068 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.bypass.scale_min, batch_count=55713.333333333336, ans=0.2
2024-03-16 02:38:28,045 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.self_attn1.whiten, num_groups=1, num_channels=192, metric=12.91 vs. limit=22.5
2024-03-16 02:38:32,118 INFO [train_char.py:689] (1/2) Epoch 33, batch 500, loss[loss=0.06904, simple_loss=0.1269, pruned_loss=0.005566, over 24114.00 frames. ], tot_loss[loss=0.06855, simple_loss=0.1234, pruned_loss=0.006853, over 4434516.44 frames. ], batch size: 199, lr: 9.49e-03, grad_scale: 64.0
2024-03-16 02:38:35,969 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.980e+01 8.635e+01 9.935e+01 1.233e+02 2.467e+02, threshold=1.987e+02, percent-clipped=3.0
2024-03-16 02:39:33,651 INFO [train_char.py:689] (1/2) Epoch 34, batch 0, loss[loss=0.073, simple_loss=0.1342, pruned_loss=0.005881, over 23932.00 frames. ], tot_loss[loss=0.073, simple_loss=0.1342, pruned_loss=0.005881, over 23932.00 frames. ], batch size: 107, lr: 9.35e-03, grad_scale: 64.0
2024-03-16 02:39:33,652 INFO [train_char.py:712] (1/2) Computing validation loss
2024-03-16 02:39:52,196 INFO [zipformer.py:1858] (1/2) name=encoder.encoders.3.encoder.layers.2.self_attn_weights, attn_weights_entropy = tensor([2.4216, 1.9669, 2.3986, 2.3925, 2.4658, 1.4360, 2.5386, 2.4312], device='cuda:1')
2024-03-16 02:39:52,699 INFO [train_char.py:721] (1/2) Epoch 34, validation: loss=0.05905, simple_loss=0.1091, pruned_loss=0.004522, over 657665.00 frames.
2024-03-16 02:39:52,700 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB
2024-03-16 02:39:56,997 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_skip_rate, batch_count=55770.0, ans=0.0
2024-03-16 02:39:57,092 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.ff3_skip_rate, batch_count=55770.0, ans=0.0
2024-03-16 02:40:15,333 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.ff2_skip_rate, batch_count=55803.333333333336, ans=0.0
2024-03-16 02:40:22,036 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_skip_rate, batch_count=55836.666666666664, ans=0.0
2024-03-16 02:40:34,635 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.conv_module1.whiten, num_groups=1, num_channels=256, metric=2.79 vs. limit=15.0
2024-03-16 02:41:02,237 INFO [train_char.py:689] (1/2) Epoch 34, batch 50, loss[loss=0.05652, simple_loss=0.107, pruned_loss=0.003008, over 24197.00 frames. ], tot_loss[loss=0.06499, simple_loss=0.117, pruned_loss=0.006491, over 1087015.61 frames. ], batch size: 122, lr: 9.33e-03, grad_scale: 64.0
2024-03-16 02:41:02,995 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.self_attn1.whiten, num_groups=1, num_channels=512, metric=14.34 vs. limit=22.5
2024-03-16 02:41:16,102 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_module2.balancer2.prob, batch_count=55970.0, ans=0.125
2024-03-16 02:41:26,933 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=55970.0, ans=0.125
2024-03-16 02:41:40,592 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=56003.333333333336, ans=0.1
2024-03-16 02:41:49,658 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.attention_skip_rate, batch_count=56036.666666666664, ans=0.0
2024-03-16 02:41:51,546 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.whiten, num_groups=1, num_channels=512, metric=7.09 vs. limit=12.0
2024-03-16 02:41:56,857 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=12.61 vs. limit=15.0
2024-03-16 02:42:00,092 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.balancer2.prob, batch_count=56070.0, ans=0.125
2024-03-16 02:42:08,583 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.955e+01 8.419e+01 1.063e+02 1.403e+02 2.842e+02, threshold=2.126e+02, percent-clipped=6.0
2024-03-16 02:42:13,644 INFO [train_char.py:689] (1/2) Epoch 34, batch 100, loss[loss=0.06791, simple_loss=0.1242, pruned_loss=0.005828, over 24226.00 frames. ], tot_loss[loss=0.06496, simple_loss=0.1171, pruned_loss=0.006388, over 1911802.50 frames. ], batch size: 311, lr: 9.32e-03, grad_scale: 64.0
2024-03-16 02:42:28,121 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_skip_rate, batch_count=56136.666666666664, ans=0.0
2024-03-16 02:42:40,440 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=56136.666666666664, ans=0.1
2024-03-16 02:42:49,367 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=56170.0, ans=0.1
2024-03-16 02:43:22,204 INFO [train_char.py:689] (1/2) Epoch 34, batch 150, loss[loss=0.05902, simple_loss=0.1043, pruned_loss=0.006885, over 23962.00 frames. ], tot_loss[loss=0.06461, simple_loss=0.1167, pruned_loss=0.006249, over 2557873.73 frames. ], batch size: 407, lr: 9.31e-03, grad_scale: 64.0
2024-03-16 02:44:20,868 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.260e+01 8.335e+01 1.032e+02 1.458e+02 2.258e+02, threshold=2.065e+02, percent-clipped=2.0
2024-03-16 02:44:26,201 INFO [train_char.py:689] (1/2) Epoch 34, batch 200, loss[loss=0.06274, simple_loss=0.1159, pruned_loss=0.004775, over 24164.00 frames. ], tot_loss[loss=0.06471, simple_loss=0.1169, pruned_loss=0.006243, over 3055374.04 frames. ], batch size: 311, lr: 9.30e-03, grad_scale: 64.0
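Note on the occasional `zipformer.py:1858 ... attn_weights_entropy = tensor([...])` records (one appears at the Epoch 34 validation above): they summarize how peaked each attention head is, one entropy value per head. A sketch of that diagnostic, assuming weights of shape (num_heads, tgt_len, src_len); the exact shapes and reduction in zipformer.py may differ:

import torch

def attn_weights_entropy(attn_weights: torch.Tensor) -> torch.Tensor:
    """Mean entropy per head of softmax attention weights.

    attn_weights: (num_heads, tgt_len, src_len), rows summing to 1.
    Low entropy = sharply peaked heads; high entropy = diffuse attention.
    """
    eps = 1.0e-20
    ent = -(attn_weights * (attn_weights + eps).log()).sum(dim=-1)  # (heads, tgt)
    return ent.mean(dim=-1)  # one value per head, like the logged tensor

w = torch.softmax(torch.randn(8, 50, 50), dim=-1)
print(attn_weights_entropy(w))  # tensor of 8 per-head entropies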
2024-03-16 02:44:36,578 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.ff2_skip_rate, batch_count=56436.666666666664, ans=0.0
2024-03-16 02:44:36,589 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.bypass_mid.scale_min, batch_count=56436.666666666664, ans=0.2
2024-03-16 02:44:59,799 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_module1.balancer2.prob, batch_count=56503.333333333336, ans=0.125
2024-03-16 02:45:19,129 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_skip_rate, batch_count=56536.666666666664, ans=0.0
2024-03-16 02:45:27,255 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.73 vs. limit=6.0
2024-03-16 02:45:33,954 INFO [train_char.py:689] (1/2) Epoch 34, batch 250, loss[loss=0.05889, simple_loss=0.1046, pruned_loss=0.006611, over 23985.00 frames. ], tot_loss[loss=0.06509, simple_loss=0.1178, pruned_loss=0.006218, over 3448375.67 frames. ], batch size: 381, lr: 9.29e-03, grad_scale: 32.0
2024-03-16 02:45:34,257 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=56603.333333333336, ans=0.125
2024-03-16 02:45:35,465 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module1.balancer2.prob, batch_count=56603.333333333336, ans=0.125
2024-03-16 02:46:08,657 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=56670.0, ans=0.0
2024-03-16 02:46:09,844 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module1.balancer1.prob, batch_count=56670.0, ans=0.125
2024-03-16 02:46:19,684 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=144, metric=9.13 vs. limit=10.0
2024-03-16 02:46:31,793 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.balancer.prob, batch_count=56736.666666666664, ans=0.125
2024-03-16 02:46:37,818 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.626e+01 8.444e+01 1.114e+02 1.603e+02 3.194e+02, threshold=2.227e+02, percent-clipped=9.0
2024-03-16 02:46:41,678 INFO [train_char.py:689] (1/2) Epoch 34, batch 300, loss[loss=0.073, simple_loss=0.1326, pruned_loss=0.006715, over 24134.00 frames. ], tot_loss[loss=0.06494, simple_loss=0.1174, pruned_loss=0.00624, over 3753605.60 frames. ], batch size: 279, lr: 9.28e-03, grad_scale: 32.0
2024-03-16 02:47:27,974 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=56870.0, ans=0.1
2024-03-16 02:47:28,412 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=11.65 vs. limit=15.0
2024-03-16 02:47:40,974 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.conv_module1.whiten, num_groups=1, num_channels=512, metric=4.53 vs. limit=15.0
2024-03-16 02:47:47,919 INFO [train_char.py:689] (1/2) Epoch 34, batch 350, loss[loss=0.06126, simple_loss=0.1147, pruned_loss=0.003912, over 24375.00 frames. ], tot_loss[loss=0.06532, simple_loss=0.1181, pruned_loss=0.00626, over 3989668.84 frames. ], batch size: 172, lr: 9.27e-03, grad_scale: 32.0
2024-03-16 02:47:52,923 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass_mid.scale_min, batch_count=56936.666666666664, ans=0.2
2024-03-16 02:47:59,110 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.balancer1.prob, batch_count=56936.666666666664, ans=0.125
2024-03-16 02:48:34,796 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.self_attn2.whiten.whitening_limit, batch_count=57036.666666666664, ans=22.5
2024-03-16 02:48:39,483 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=57070.0, ans=0.125
2024-03-16 02:48:49,081 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.754e+01 8.377e+01 1.088e+02 1.366e+02 2.811e+02, threshold=2.177e+02, percent-clipped=2.0
2024-03-16 02:48:53,055 INFO [train_char.py:689] (1/2) Epoch 34, batch 400, loss[loss=0.0568, simple_loss=0.1024, pruned_loss=0.005591, over 24027.00 frames. ], tot_loss[loss=0.06615, simple_loss=0.1195, pruned_loss=0.006397, over 4177319.50 frames. ], batch size: 381, lr: 9.25e-03, grad_scale: 32.0
2024-03-16 02:49:00,283 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.balancer_ff3.min_abs, batch_count=57103.333333333336, ans=0.2
2024-03-16 02:49:26,598 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.attention_skip_rate, batch_count=57170.0, ans=0.0
2024-03-16 02:49:37,823 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.balancer2.prob, batch_count=57203.333333333336, ans=0.125
2024-03-16 02:49:54,725 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=192, metric=5.80 vs. limit=10.0
2024-03-16 02:49:56,389 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.balancer2.prob, batch_count=57236.666666666664, ans=0.125
2024-03-16 02:49:58,752 INFO [train_char.py:689] (1/2) Epoch 34, batch 450, loss[loss=0.07778, simple_loss=0.1399, pruned_loss=0.007823, over 24131.00 frames. ], tot_loss[loss=0.06742, simple_loss=0.1217, pruned_loss=0.006542, over 4323387.56 frames. ], batch size: 237, lr: 9.24e-03, grad_scale: 16.0
2024-03-16 02:49:58,976 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=57270.0, ans=0.1
2024-03-16 02:50:50,676 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.whiten1.whitening_limit, batch_count=57403.333333333336, ans=10.0
2024-03-16 02:50:52,621 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_skip_rate, batch_count=57403.333333333336, ans=0.0
2024-03-16 02:50:59,486 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=10.66 vs. limit=15.0
2024-03-16 02:51:01,377 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.622e+01 8.077e+01 9.168e+01 1.186e+02 1.930e+02, threshold=1.834e+02, percent-clipped=1.0
2024-03-16 02:51:03,958 INFO [train_char.py:689] (1/2) Epoch 34, batch 500, loss[loss=0.07138, simple_loss=0.1303, pruned_loss=0.006212, over 24163.00 frames. ], tot_loss[loss=0.06847, simple_loss=0.1235, pruned_loss=0.006698, over 4435866.83 frames. ], batch size: 251, lr: 9.23e-03, grad_scale: 16.0
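Note on the `grad_scale` field: it drops from 64.0 to 32.0 (Epoch 34, batch 250) and then to 16.0 (batch 450) before recovering to 32.0 at Epoch 35. That is the fingerprint of dynamic loss scaling under fp16 training: the scale halves whenever a step overflows and grows back after a run of stable steps. A sketch using PyTorch's stock API; icefall wraps its own variant, so treat the exact knobs as an assumption:

import torch

scaler = torch.cuda.amp.GradScaler(
    init_scale=64.0,        # matches the grad_scale first seen in this section
    backoff_factor=0.5,     # halve on inf/nan: 64 -> 32 -> 16
    growth_factor=2.0,      # double again after growth_interval clean steps
    growth_interval=2000,
)

def training_step(model, optimizer, batch):
    optimizer.zero_grad()
    with torch.cuda.amp.autocast():
        loss = model(batch)            # hypothetical: returns a scalar loss
    scaler.scale(loss).backward()
    scaler.step(optimizer)             # skipped internally if grads overflowed
    scaler.update()                    # adjusts the scale logged as grad_scale
    return loss.detach()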
], tot_loss[loss=0.06847, simple_loss=0.1235, pruned_loss=0.006698, over 4435866.83 frames. ], batch size: 251, lr: 9.23e-03, grad_scale: 16.0 2024-03-16 02:51:06,793 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=57436.666666666664, ans=0.1 2024-03-16 02:52:05,657 INFO [train_char.py:689] (1/2) Epoch 35, batch 0, loss[loss=0.05916, simple_loss=0.1012, pruned_loss=0.008585, over 24015.00 frames. ], tot_loss[loss=0.05916, simple_loss=0.1012, pruned_loss=0.008585, over 24015.00 frames. ], batch size: 381, lr: 9.10e-03, grad_scale: 32.0 2024-03-16 02:52:05,657 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 02:52:19,449 INFO [train_char.py:721] (1/2) Epoch 35, validation: loss=0.05958, simple_loss=0.11, pruned_loss=0.004554, over 657665.00 frames. 2024-03-16 02:52:19,450 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 02:52:28,850 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.11 vs. limit=6.0 2024-03-16 02:53:12,371 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=57560.0, ans=0.1 2024-03-16 02:53:37,227 INFO [train_char.py:689] (1/2) Epoch 35, batch 50, loss[loss=0.05901, simple_loss=0.1112, pruned_loss=0.003397, over 24274.00 frames. ], tot_loss[loss=0.06324, simple_loss=0.1144, pruned_loss=0.006046, over 1073900.62 frames. ], batch size: 116, lr: 9.08e-03, grad_scale: 32.0 2024-03-16 02:53:50,478 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=384, metric=20.44 vs. limit=22.5 2024-03-16 02:53:58,154 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module2.balancer2.prob, batch_count=57660.0, ans=0.125 2024-03-16 02:54:10,060 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=512, metric=14.08 vs. limit=15.0 2024-03-16 02:54:28,217 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.ff3_skip_rate, batch_count=57726.666666666664, ans=0.0 2024-03-16 02:54:36,878 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.719e+01 8.040e+01 1.010e+02 1.352e+02 2.642e+02, threshold=2.021e+02, percent-clipped=9.0 2024-03-16 02:54:41,204 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.ff2_skip_rate, batch_count=57760.0, ans=0.0 2024-03-16 02:54:48,504 INFO [train_char.py:689] (1/2) Epoch 35, batch 100, loss[loss=0.05695, simple_loss=0.1069, pruned_loss=0.003511, over 24260.00 frames. ], tot_loss[loss=0.06415, simple_loss=0.116, pruned_loss=0.006169, over 1895341.82 frames. ], batch size: 116, lr: 9.07e-03, grad_scale: 32.0 2024-03-16 02:54:51,339 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=57793.333333333336, ans=0.1 2024-03-16 02:55:05,668 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=5.36 vs. 
limit=10.0 2024-03-16 02:55:07,885 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.attention_skip_rate, batch_count=57826.666666666664, ans=0.0 2024-03-16 02:55:52,845 INFO [train_char.py:689] (1/2) Epoch 35, batch 150, loss[loss=0.05985, simple_loss=0.1055, pruned_loss=0.007076, over 23975.00 frames. ], tot_loss[loss=0.06438, simple_loss=0.1163, pruned_loss=0.00623, over 2537759.29 frames. ], batch size: 381, lr: 9.06e-03, grad_scale: 32.0 2024-03-16 02:56:26,374 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.bypass_mid.scale_min, batch_count=58026.666666666664, ans=0.2 2024-03-16 02:56:35,608 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=58060.0, ans=0.125 2024-03-16 02:56:39,394 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_module2.balancer1.prob, batch_count=58060.0, ans=0.125 2024-03-16 02:56:40,440 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=58060.0, ans=0.125 2024-03-16 02:56:50,352 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.885e+01 8.780e+01 1.179e+02 1.482e+02 2.837e+02, threshold=2.359e+02, percent-clipped=8.0 2024-03-16 02:56:53,184 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.out_proj.dropout_p, batch_count=58093.333333333336, ans=0.1 2024-03-16 02:56:53,227 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=58093.333333333336, ans=0.1 2024-03-16 02:57:01,860 INFO [train_char.py:689] (1/2) Epoch 35, batch 200, loss[loss=0.06785, simple_loss=0.1224, pruned_loss=0.006657, over 24228.00 frames. ], tot_loss[loss=0.06482, simple_loss=0.1175, pruned_loss=0.006056, over 3045130.54 frames. ], batch size: 328, lr: 9.05e-03, grad_scale: 32.0 2024-03-16 02:57:02,785 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=256, metric=16.77 vs. limit=22.5 2024-03-16 02:57:17,296 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=58160.0, ans=0.1 2024-03-16 02:57:22,289 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=58160.0, ans=0.0 2024-03-16 02:57:23,492 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=58160.0, ans=0.1 2024-03-16 02:57:37,716 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward3.hidden_balancer.prob, batch_count=58193.333333333336, ans=0.125 2024-03-16 02:58:09,645 INFO [train_char.py:689] (1/2) Epoch 35, batch 250, loss[loss=0.07881, simple_loss=0.1391, pruned_loss=0.009246, over 24207.00 frames. ], tot_loss[loss=0.06574, simple_loss=0.1189, pruned_loss=0.006278, over 3440820.84 frames. 
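Note on the two loss fields in each progress record: `loss[...]` is the current batch and `tot_loss[...] over N frames` is a frame-weighted aggregate over the epoch so far, which is why N grows between reports (1073900 -> 1895341 -> 2537759 within Epoch 35 above). A minimal sketch of such a tracker; icefall's MetricsTracker may also apply decay, so this purely accumulating version is an assumption:

class LossTracker:
    """Frame-weighted running average, printed as 'tot_loss[...] over N frames'."""

    def __init__(self):
        self.sums = {"loss": 0.0, "simple_loss": 0.0, "pruned_loss": 0.0}
        self.frames = 0.0

    def update(self, batch_losses: dict, num_frames: float):
        # batch_losses holds per-frame values, e.g. {"loss": 0.06785, ...}
        for k, v in batch_losses.items():
            self.sums[k] += v * num_frames
        self.frames += num_frames

    def __str__(self):
        avg = {k: v / self.frames for k, v in self.sums.items()}
        return (f"tot_loss[loss={avg['loss']:.4}, simple_loss={avg['simple_loss']:.4}, "
                f"pruned_loss={avg['pruned_loss']:.4}, over {self.frames:.2f} frames. ]")

t = LossTracker()
t.update({"loss": 0.06785, "simple_loss": 0.1224, "pruned_loss": 0.006657}, 24228.0)
print(t)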
2024-03-16 02:58:31,666 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module2.balancer2.prob, batch_count=58326.666666666664, ans=0.125
2024-03-16 02:58:34,181 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=58360.0, ans=0.125
2024-03-16 02:58:51,901 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.balancer2.prob, batch_count=58393.333333333336, ans=0.125
2024-03-16 02:58:57,361 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=14.24 vs. limit=15.0
2024-03-16 02:59:02,643 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_module1.balancer2.prob, batch_count=58426.666666666664, ans=0.125
2024-03-16 02:59:04,958 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.348e+01 8.615e+01 1.130e+02 1.487e+02 2.732e+02, threshold=2.259e+02, percent-clipped=2.0
2024-03-16 02:59:16,505 INFO [train_char.py:689] (1/2) Epoch 35, batch 300, loss[loss=0.06302, simple_loss=0.118, pruned_loss=0.004009, over 24339.00 frames. ], tot_loss[loss=0.06603, simple_loss=0.1196, pruned_loss=0.006234, over 3750863.38 frames. ], batch size: 180, lr: 9.03e-03, grad_scale: 32.0
2024-03-16 02:59:29,600 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=256, metric=2.76 vs. limit=15.0
2024-03-16 03:00:12,929 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.attention_skip_rate, batch_count=58593.333333333336, ans=0.0
2024-03-16 03:00:21,645 INFO [train_char.py:689] (1/2) Epoch 35, batch 350, loss[loss=0.07007, simple_loss=0.1296, pruned_loss=0.005285, over 24130.00 frames. ], tot_loss[loss=0.06595, simple_loss=0.1198, pruned_loss=0.006065, over 3993120.78 frames. ], batch size: 279, lr: 9.02e-03, grad_scale: 32.0
2024-03-16 03:00:28,476 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.out_proj.dropout_p, batch_count=58626.666666666664, ans=0.1
2024-03-16 03:00:41,638 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=58660.0, ans=0.125
2024-03-16 03:00:47,682 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.bypass.skip_rate, batch_count=58693.333333333336, ans=0.09899494936611666
2024-03-16 03:01:15,780 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.986e+01 8.331e+01 1.020e+02 1.326e+02 2.718e+02, threshold=2.039e+02, percent-clipped=3.0
2024-03-16 03:01:23,627 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=58760.0, ans=0.1
2024-03-16 03:01:26,438 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=256, metric=2.62 vs. limit=15.0
2024-03-16 03:01:27,155 INFO [train_char.py:689] (1/2) Epoch 35, batch 400, loss[loss=0.07876, simple_loss=0.1405, pruned_loss=0.008531, over 24108.00 frames. ], tot_loss[loss=0.0668, simple_loss=0.1211, pruned_loss=0.006249, over 4180483.35 frames. ], batch size: 236, lr: 9.01e-03, grad_scale: 32.0
2024-03-16 03:01:48,364 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.whiten, num_groups=1, num_channels=512, metric=7.32 vs. limit=12.0
2024-03-16 03:02:00,160 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward1.hidden_balancer.prob, batch_count=58860.0, ans=0.125
2024-03-16 03:02:06,509 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module2.balancer1.prob, batch_count=58893.333333333336, ans=0.125
2024-03-16 03:02:23,629 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=8.54 vs. limit=15.0
2024-03-16 03:02:33,080 INFO [train_char.py:689] (1/2) Epoch 35, batch 450, loss[loss=0.06295, simple_loss=0.1154, pruned_loss=0.005254, over 24437.00 frames. ], tot_loss[loss=0.06704, simple_loss=0.1216, pruned_loss=0.00625, over 4328157.66 frames. ], batch size: 165, lr: 9.00e-03, grad_scale: 32.0
2024-03-16 03:02:50,824 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.bypass.skip_rate, batch_count=58993.333333333336, ans=0.07
2024-03-16 03:03:05,953 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_module2.balancer2.prob, batch_count=59026.666666666664, ans=0.125
2024-03-16 03:03:26,318 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.321e+01 8.212e+01 9.902e+01 1.368e+02 2.400e+02, threshold=1.980e+02, percent-clipped=4.0
2024-03-16 03:03:27,047 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=5.98 vs. limit=6.0
2024-03-16 03:03:27,940 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.out_combiner.scale_min, batch_count=59093.333333333336, ans=0.2
2024-03-16 03:03:37,711 INFO [train_char.py:689] (1/2) Epoch 35, batch 500, loss[loss=0.06607, simple_loss=0.1207, pruned_loss=0.005713, over 24207.00 frames. ], tot_loss[loss=0.06797, simple_loss=0.1233, pruned_loss=0.006327, over 4438830.61 frames. ], batch size: 296, lr: 8.99e-03, grad_scale: 32.0
2024-03-16 03:04:38,880 INFO [train_char.py:689] (1/2) Epoch 36, batch 0, loss[loss=0.07365, simple_loss=0.1358, pruned_loss=0.005756, over 21462.00 frames. ], tot_loss[loss=0.07365, simple_loss=0.1358, pruned_loss=0.005756, over 21462.00 frames. ], batch size: 85, lr: 8.86e-03, grad_scale: 32.0
2024-03-16 03:04:38,880 INFO [train_char.py:712] (1/2) Computing validation loss
2024-03-16 03:04:46,599 INFO [zipformer.py:1858] (1/2) name=encoder.encoders.4.encoder.layers.0.self_attn_weights, attn_weights_entropy = tensor([4.1094, 3.5970, 3.8034, 3.6533], device='cuda:1')
2024-03-16 03:04:52,703 INFO [train_char.py:721] (1/2) Epoch 36, validation: loss=0.05972, simple_loss=0.1105, pruned_loss=0.004482, over 657665.00 frames.
2024-03-16 03:04:52,704 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB
2024-03-16 03:04:56,442 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=384, metric=7.83 vs. limit=10.0
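Note on the `lr:` field: it decays smoothly within each epoch (9.87e-03 at Epoch 32, batch 100 down to 8.75e-03 by the end of Epoch 36) with a small extra step at each epoch boundary, consistent with a schedule damped by both batch count and epoch. A sketch in the shape of icefall's Eden scheduler; the exponents and default constants below are assumptions, not read from this log:

def eden_lr(base_lr: float, batch: int, epoch: float,
            lr_batches: float = 7500.0, lr_epochs: float = 3.5) -> float:
    """base_lr damped by both the batch count and the (fractional) epoch."""
    batch_factor = ((batch / lr_batches) ** 2 + 1.0) ** -0.25
    epoch_factor = ((epoch / lr_epochs) ** 2 + 1.0) ** -0.25
    return base_lr * batch_factor * epoch_factor

# Decays smoothly in both arguments, producing the gentle per-batch drift and
# the per-epoch steps seen in the records above.
print(eden_lr(0.045, batch=55000, epoch=34.0))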
2024-03-16 03:05:07,967 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward2.hidden_balancer.prob, batch_count=59183.333333333336, ans=0.125
2024-03-16 03:05:10,657 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.0.layers.0.self_attn_weights, loss-sum=0.000e+00
2024-03-16 03:05:33,908 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=8.18 vs. limit=10.0
2024-03-16 03:06:02,823 INFO [train_char.py:689] (1/2) Epoch 36, batch 50, loss[loss=0.05937, simple_loss=0.1078, pruned_loss=0.005467, over 24178.00 frames. ], tot_loss[loss=0.06401, simple_loss=0.1164, pruned_loss=0.005809, over 1083424.13 frames. ], batch size: 344, lr: 8.85e-03, grad_scale: 32.0
2024-03-16 03:06:14,435 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_skip_rate, batch_count=59316.666666666664, ans=0.0
2024-03-16 03:06:23,459 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass_mid.scale_min, batch_count=59350.0, ans=0.2
2024-03-16 03:06:37,501 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_skip_rate, batch_count=59383.333333333336, ans=0.0
2024-03-16 03:06:44,126 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.ff2_skip_rate, batch_count=59383.333333333336, ans=0.0
2024-03-16 03:06:44,773 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=11.92 vs. limit=15.0
2024-03-16 03:06:51,373 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.conv_module1.whiten, num_groups=1, num_channels=192, metric=8.70 vs. limit=15.0
2024-03-16 03:06:54,085 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.032e+01 8.670e+01 1.104e+02 1.415e+02 2.881e+02, threshold=2.207e+02, percent-clipped=7.0
2024-03-16 03:07:01,999 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.bypass.skip_rate, batch_count=59450.0, ans=0.07
2024-03-16 03:07:04,635 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward3.hidden_balancer.prob, batch_count=59450.0, ans=0.125
2024-03-16 03:07:05,795 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=59450.0, ans=0.1
2024-03-16 03:07:14,607 INFO [train_char.py:689] (1/2) Epoch 36, batch 100, loss[loss=0.0731, simple_loss=0.1367, pruned_loss=0.004748, over 24074.00 frames. ], tot_loss[loss=0.06391, simple_loss=0.1166, pruned_loss=0.005609, over 1907375.16 frames. ], batch size: 199, lr: 8.84e-03, grad_scale: 32.0
2024-03-16 03:07:19,482 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.feed_forward2.out_whiten, num_groups=1, num_channels=256, metric=7.32 vs. limit=15.0
2024-03-16 03:07:42,889 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_module2.balancer1.prob, batch_count=59516.666666666664, ans=0.125
2024-03-16 03:07:44,145 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module2.balancer2.prob, batch_count=59550.0, ans=0.125
2024-03-16 03:07:55,649 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.ff2_skip_rate, batch_count=59550.0, ans=0.0
2024-03-16 03:07:59,435 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.nonlin_attention.balancer.max_positive, batch_count=59583.333333333336, ans=0.95
2024-03-16 03:08:23,812 INFO [train_char.py:689] (1/2) Epoch 36, batch 150, loss[loss=0.05825, simple_loss=0.0988, pruned_loss=0.008856, over 23821.00 frames. ], tot_loss[loss=0.06368, simple_loss=0.1158, pruned_loss=0.005772, over 2552027.17 frames. ], batch size: 439, lr: 8.83e-03, grad_scale: 32.0
2024-03-16 03:08:27,171 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=384, metric=20.86 vs. limit=22.5
2024-03-16 03:08:35,892 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module2.balancer2.prob, batch_count=59683.333333333336, ans=0.125
2024-03-16 03:08:54,373 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=7.75 vs. limit=15.0
2024-03-16 03:09:07,550 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.412e+01 8.156e+01 1.033e+02 1.452e+02 2.863e+02, threshold=2.065e+02, percent-clipped=1.0
2024-03-16 03:09:07,786 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.attention_skip_rate, batch_count=59750.0, ans=0.0
2024-03-16 03:09:28,279 INFO [train_char.py:689] (1/2) Epoch 36, batch 200, loss[loss=0.06689, simple_loss=0.1178, pruned_loss=0.00801, over 24357.00 frames. ], tot_loss[loss=0.06416, simple_loss=0.1165, pruned_loss=0.005891, over 3045558.72 frames. ], batch size: 172, lr: 8.81e-03, grad_scale: 32.0
2024-03-16 03:09:38,088 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=14.27 vs. limit=15.0
2024-03-16 03:10:26,616 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module2.balancer1.prob, batch_count=59950.0, ans=0.125
2024-03-16 03:10:27,117 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.whiten, num_groups=1, num_channels=256, metric=5.05 vs. limit=12.0
2024-03-16 03:10:36,404 INFO [train_char.py:689] (1/2) Epoch 36, batch 250, loss[loss=0.0784, simple_loss=0.1436, pruned_loss=0.006584, over 24177.00 frames. ], tot_loss[loss=0.064, simple_loss=0.1163, pruned_loss=0.00586, over 3436926.27 frames. ], batch size: 224, lr: 8.80e-03, grad_scale: 32.0
2024-03-16 03:10:48,648 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.0.layers.0.self_attn_weights, loss-sum=0.000e+00
2024-03-16 03:11:08,786 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=60050.0, ans=0.125
2024-03-16 03:11:08,820 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.attention_skip_rate, batch_count=60050.0, ans=0.0
2024-03-16 03:11:20,467 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.conv_module1.whiten, num_groups=1, num_channels=384, metric=2.78 vs. limit=15.0
2024-03-16 03:11:23,803 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.765e+01 9.277e+01 1.100e+02 1.418e+02 3.207e+02, threshold=2.200e+02, percent-clipped=6.0
2024-03-16 03:11:31,794 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass.skip_rate, batch_count=60116.666666666664, ans=0.04949747468305833
2024-03-16 03:11:44,083 INFO [train_char.py:689] (1/2) Epoch 36, batch 300, loss[loss=0.05834, simple_loss=0.1001, pruned_loss=0.008268, over 24017.00 frames. ], tot_loss[loss=0.06384, simple_loss=0.1161, pruned_loss=0.005793, over 3748235.86 frames. ], batch size: 381, lr: 8.79e-03, grad_scale: 32.0
2024-03-16 03:11:55,826 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.balancer.min_positive, batch_count=60183.333333333336, ans=0.05
2024-03-16 03:12:27,666 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.nonlin_attention.balancer.prob, batch_count=60250.0, ans=0.125
2024-03-16 03:12:35,208 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.bypass.scale_min, batch_count=60250.0, ans=0.2
2024-03-16 03:12:36,371 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward3.hidden_balancer.prob, batch_count=60250.0, ans=0.125
2024-03-16 03:12:53,581 INFO [train_char.py:689] (1/2) Epoch 36, batch 350, loss[loss=0.05997, simple_loss=0.1115, pruned_loss=0.00422, over 24222.00 frames. ], tot_loss[loss=0.06458, simple_loss=0.1175, pruned_loss=0.005843, over 3990756.82 frames. ], batch size: 328, lr: 8.78e-03, grad_scale: 32.0
2024-03-16 03:13:09,187 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.ff2_skip_rate, batch_count=60350.0, ans=0.0
2024-03-16 03:13:30,645 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.out_combiner.scale_min, batch_count=60416.666666666664, ans=0.2
2024-03-16 03:13:36,738 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.888e+01 8.417e+01 1.067e+02 1.450e+02 2.294e+02, threshold=2.134e+02, percent-clipped=4.0
2024-03-16 03:13:46,920 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward3.hidden_balancer.prob, batch_count=60450.0, ans=0.125
2024-03-16 03:13:53,204 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=60450.0, ans=0.1
2024-03-16 03:13:56,781 INFO [train_char.py:689] (1/2) Epoch 36, batch 400, loss[loss=0.06521, simple_loss=0.1105, pruned_loss=0.009944, over 24144.00 frames. ], tot_loss[loss=0.0652, simple_loss=0.1185, pruned_loss=0.005944, over 4178393.32 frames. ], batch size: 362, lr: 8.77e-03, grad_scale: 32.0
2024-03-16 03:14:09,929 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=60483.333333333336, ans=0.0
2024-03-16 03:14:36,505 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.bypass.skip_rate, batch_count=60550.0, ans=0.035
2024-03-16 03:14:46,834 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.bypass_mid.scale_min, batch_count=60583.333333333336, ans=0.2
2024-03-16 03:14:54,402 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.nonlin_attention.balancer.prob, batch_count=60616.666666666664, ans=0.125
2024-03-16 03:15:04,166 INFO [train_char.py:689] (1/2) Epoch 36, batch 450, loss[loss=0.07193, simple_loss=0.1332, pruned_loss=0.005322, over 24105.00 frames. ], tot_loss[loss=0.06628, simple_loss=0.1204, pruned_loss=0.00607, over 4324469.18 frames. ], batch size: 236, lr: 8.76e-03, grad_scale: 32.0
2024-03-16 03:15:22,031 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.feed_forward2.out_whiten, num_groups=1, num_channels=256, metric=9.05 vs. limit=15.0
2024-03-16 03:15:39,191 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.bypass.scale_min, batch_count=60716.666666666664, ans=0.2
2024-03-16 03:15:50,391 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.868e+01 8.279e+01 1.038e+02 1.350e+02 2.212e+02, threshold=2.075e+02, percent-clipped=2.0
2024-03-16 03:15:58,267 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=4.39 vs. limit=15.0
2024-03-16 03:16:04,329 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.attention_skip_rate, batch_count=60783.333333333336, ans=0.0
2024-03-16 03:16:10,972 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.whiten, num_groups=1, num_channels=512, metric=6.31 vs. limit=12.0
2024-03-16 03:16:11,406 INFO [train_char.py:689] (1/2) Epoch 36, batch 500, loss[loss=0.07174, simple_loss=0.1332, pruned_loss=0.005126, over 24195.00 frames. ], tot_loss[loss=0.06676, simple_loss=0.1213, pruned_loss=0.006104, over 4438089.33 frames. ], batch size: 212, lr: 8.75e-03, grad_scale: 32.0
2024-03-16 03:16:13,424 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.conv_module1.whiten, num_groups=1, num_channels=512, metric=2.98 vs. limit=15.0
2024-03-16 03:17:06,752 INFO [train_char.py:689] (1/2) Epoch 37, batch 0, loss[loss=0.06287, simple_loss=0.1189, pruned_loss=0.003404, over 24300.00 frames. ], tot_loss[loss=0.06287, simple_loss=0.1189, pruned_loss=0.003404, over 24300.00 frames. ], batch size: 140, lr: 8.63e-03, grad_scale: 32.0
2024-03-16 03:17:06,753 INFO [train_char.py:712] (1/2) Computing validation loss
2024-03-16 03:17:20,473 INFO [train_char.py:721] (1/2) Epoch 37, validation: loss=0.05978, simple_loss=0.1109, pruned_loss=0.004319, over 657665.00 frames.
2024-03-16 03:17:20,474 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 03:17:26,300 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward2.hidden_balancer.prob, batch_count=60840.0, ans=0.125 2024-03-16 03:17:35,514 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.ff3_skip_rate, batch_count=60873.333333333336, ans=0.0 2024-03-16 03:17:40,766 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module2.balancer2.prob, batch_count=60873.333333333336, ans=0.125 2024-03-16 03:17:54,687 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=60906.666666666664, ans=0.1 2024-03-16 03:18:01,524 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=60940.0, ans=0.1 2024-03-16 03:18:08,564 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=5.85 vs. limit=6.0 2024-03-16 03:18:13,290 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.balancer2.prob, batch_count=60973.333333333336, ans=0.125 2024-03-16 03:18:27,629 INFO [train_char.py:689] (1/2) Epoch 37, batch 50, loss[loss=0.06941, simple_loss=0.1248, pruned_loss=0.007036, over 24126.00 frames. ], tot_loss[loss=0.06289, simple_loss=0.1153, pruned_loss=0.005233, over 1086531.54 frames. ], batch size: 279, lr: 8.62e-03, grad_scale: 32.0 2024-03-16 03:18:50,085 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=61040.0, ans=0.125 2024-03-16 03:18:51,418 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.balancer2.prob, batch_count=61040.0, ans=0.125 2024-03-16 03:19:07,805 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.880e+01 8.751e+01 1.106e+02 1.447e+02 2.400e+02, threshold=2.212e+02, percent-clipped=3.0 2024-03-16 03:19:13,140 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module1.balancer2.min_positive, batch_count=61106.666666666664, ans=0.05 2024-03-16 03:19:25,015 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.balancer2.prob, batch_count=61140.0, ans=0.125 2024-03-16 03:19:31,716 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=256, metric=13.46 vs. limit=22.5 2024-03-16 03:19:32,770 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=61140.0, ans=0.1 2024-03-16 03:19:37,789 INFO [train_char.py:689] (1/2) Epoch 37, batch 100, loss[loss=0.06629, simple_loss=0.1224, pruned_loss=0.005073, over 24327.00 frames. ], tot_loss[loss=0.06368, simple_loss=0.1167, pruned_loss=0.005352, over 1916811.34 frames. 
], batch size: 180, lr: 8.61e-03, grad_scale: 32.0 2024-03-16 03:19:38,034 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.balancer2.prob, batch_count=61173.333333333336, ans=0.125 2024-03-16 03:19:44,637 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.ff3_skip_rate, batch_count=61173.333333333336, ans=0.0 2024-03-16 03:19:56,203 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=61206.666666666664, ans=0.125 2024-03-16 03:20:05,313 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=61240.0, ans=0.1 2024-03-16 03:20:16,433 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=61273.333333333336, ans=0.1 2024-03-16 03:20:35,238 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.conv_module2.whiten, num_groups=1, num_channels=384, metric=6.72 vs. limit=15.0 2024-03-16 03:20:42,519 INFO [train_char.py:689] (1/2) Epoch 37, batch 150, loss[loss=0.06963, simple_loss=0.1286, pruned_loss=0.005315, over 24099.00 frames. ], tot_loss[loss=0.06379, simple_loss=0.1165, pruned_loss=0.005538, over 2555736.15 frames. ], batch size: 279, lr: 8.60e-03, grad_scale: 32.0 2024-03-16 03:20:44,238 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.5.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-16 03:21:16,987 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.525e+01 8.338e+01 1.063e+02 1.477e+02 3.765e+02, threshold=2.125e+02, percent-clipped=4.0 2024-03-16 03:21:29,985 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=61440.0, ans=0.125 2024-03-16 03:21:40,269 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.bypass.skip_rate, batch_count=61473.333333333336, ans=0.07 2024-03-16 03:21:40,607 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=18.60 vs. limit=15.0 2024-03-16 03:21:45,379 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module2.balancer2.prob, batch_count=61473.333333333336, ans=0.125 2024-03-16 03:21:45,387 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.balancer_na.min_abs, batch_count=61473.333333333336, ans=0.02 2024-03-16 03:21:50,332 INFO [train_char.py:689] (1/2) Epoch 37, batch 200, loss[loss=0.06643, simple_loss=0.1215, pruned_loss=0.005681, over 24337.00 frames. ], tot_loss[loss=0.064, simple_loss=0.117, pruned_loss=0.005476, over 3059964.83 frames. 
], batch size: 180, lr: 8.59e-03, grad_scale: 32.0 2024-03-16 03:22:19,270 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.nonlin_attention.balancer.prob, batch_count=61573.333333333336, ans=0.125 2024-03-16 03:22:43,865 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.bypass_mid.scale_min, batch_count=61640.0, ans=0.2 2024-03-16 03:22:52,705 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module1.balancer1.prob, batch_count=61640.0, ans=0.125 2024-03-16 03:22:57,631 INFO [train_char.py:689] (1/2) Epoch 37, batch 250, loss[loss=0.07101, simple_loss=0.1299, pruned_loss=0.00606, over 24171.00 frames. ], tot_loss[loss=0.06389, simple_loss=0.1166, pruned_loss=0.00559, over 3451123.61 frames. ], batch size: 251, lr: 8.58e-03, grad_scale: 32.0 2024-03-16 03:23:31,474 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.916e+01 8.382e+01 1.027e+02 1.348e+02 2.463e+02, threshold=2.054e+02, percent-clipped=8.0 2024-03-16 03:23:48,319 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_skip_rate, batch_count=61806.666666666664, ans=0.0 2024-03-16 03:23:50,854 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_skip_rate, batch_count=61806.666666666664, ans=0.0 2024-03-16 03:24:00,738 INFO [train_char.py:689] (1/2) Epoch 37, batch 300, loss[loss=0.07221, simple_loss=0.1345, pruned_loss=0.004973, over 24053.00 frames. ], tot_loss[loss=0.06415, simple_loss=0.1169, pruned_loss=0.005724, over 3755958.62 frames. ], batch size: 250, lr: 8.57e-03, grad_scale: 32.0 2024-03-16 03:24:06,658 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=256, metric=12.97 vs. limit=15.0 2024-03-16 03:24:18,867 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=384, metric=9.84 vs. limit=15.0 2024-03-16 03:24:39,296 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.conv_module2.whiten, num_groups=1, num_channels=512, metric=5.48 vs. limit=15.0 2024-03-16 03:24:43,809 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_proj.dropout_p, batch_count=61940.0, ans=0.1 2024-03-16 03:24:53,920 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward3.hidden_balancer.prob, batch_count=61940.0, ans=0.125 2024-03-16 03:25:10,679 INFO [train_char.py:689] (1/2) Epoch 37, batch 350, loss[loss=0.05837, simple_loss=0.1021, pruned_loss=0.007314, over 23966.00 frames. ], tot_loss[loss=0.06402, simple_loss=0.1167, pruned_loss=0.005687, over 3996139.83 frames. ], batch size: 407, lr: 8.56e-03, grad_scale: 32.0 2024-03-16 03:25:22,607 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.92 vs. limit=6.0 2024-03-16 03:25:30,008 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.conv_module1.whiten, num_groups=1, num_channels=512, metric=3.55 vs. limit=15.0 2024-03-16 03:25:42,318 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.self_attn1.whiten, num_groups=1, num_channels=192, metric=12.77 vs. 
limit=22.5 2024-03-16 03:25:43,817 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.924e+01 8.426e+01 1.161e+02 1.562e+02 3.148e+02, threshold=2.322e+02, percent-clipped=6.0 2024-03-16 03:26:07,444 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.balancer2.prob, batch_count=62140.0, ans=0.125 2024-03-16 03:26:16,502 INFO [train_char.py:689] (1/2) Epoch 37, batch 400, loss[loss=0.06521, simple_loss=0.1186, pruned_loss=0.005894, over 24341.00 frames. ], tot_loss[loss=0.06486, simple_loss=0.1183, pruned_loss=0.005723, over 4183950.54 frames. ], batch size: 146, lr: 8.55e-03, grad_scale: 32.0 2024-03-16 03:26:19,842 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=8.08 vs. limit=10.0 2024-03-16 03:26:22,874 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.attention_skip_rate, batch_count=62173.333333333336, ans=0.0 2024-03-16 03:26:27,200 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.46 vs. limit=6.0 2024-03-16 03:26:36,793 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_skip_rate, batch_count=62206.666666666664, ans=0.0 2024-03-16 03:26:38,553 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=12.93 vs. limit=15.0 2024-03-16 03:27:05,776 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.attention_skip_rate, batch_count=62306.666666666664, ans=0.0 2024-03-16 03:27:10,696 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.ff3_skip_rate, batch_count=62306.666666666664, ans=0.0 2024-03-16 03:27:19,378 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.out_combiner.scale_min, batch_count=62306.666666666664, ans=0.2 2024-03-16 03:27:21,624 INFO [train_char.py:689] (1/2) Epoch 37, batch 450, loss[loss=0.05925, simple_loss=0.1108, pruned_loss=0.003873, over 24405.00 frames. ], tot_loss[loss=0.06581, simple_loss=0.12, pruned_loss=0.0058, over 4329371.77 frames. ], batch size: 158, lr: 8.54e-03, grad_scale: 32.0 2024-03-16 03:27:38,308 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.self_attn1.whiten.whitening_limit, batch_count=62373.333333333336, ans=22.5 2024-03-16 03:27:52,663 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_module2.balancer1.prob, batch_count=62406.666666666664, ans=0.125 2024-03-16 03:27:54,854 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.436e+01 9.101e+01 1.096e+02 1.426e+02 2.997e+02, threshold=2.191e+02, percent-clipped=2.0 2024-03-16 03:28:15,012 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=256, metric=3.43 vs. limit=15.0 2024-03-16 03:28:15,302 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=256, metric=11.14 vs. limit=15.0 2024-03-16 03:28:24,507 INFO [train_char.py:689] (1/2) Epoch 37, batch 500, loss[loss=0.07502, simple_loss=0.1348, pruned_loss=0.007616, over 24038.00 frames. ], tot_loss[loss=0.06682, simple_loss=0.1219, pruned_loss=0.005867, over 4440806.59 frames. 
], batch size: 250, lr: 8.53e-03, grad_scale: 32.0 2024-03-16 03:29:25,840 INFO [train_char.py:689] (1/2) Epoch 38, batch 0, loss[loss=0.06515, simple_loss=0.1217, pruned_loss=0.004291, over 24091.00 frames. ], tot_loss[loss=0.06515, simple_loss=0.1217, pruned_loss=0.004291, over 24091.00 frames. ], batch size: 279, lr: 8.41e-03, grad_scale: 32.0 2024-03-16 03:29:25,841 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 03:29:39,843 INFO [train_char.py:721] (1/2) Epoch 38, validation: loss=0.05841, simple_loss=0.1082, pruned_loss=0.004321, over 657665.00 frames. 2024-03-16 03:29:39,844 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 03:29:45,516 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_proj.dropout_p, batch_count=62530.0, ans=0.1 2024-03-16 03:30:06,854 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.convnext.hidden_balancer.prob, batch_count=62596.666666666664, ans=0.125 2024-03-16 03:30:10,961 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=62596.666666666664, ans=0.0 2024-03-16 03:30:50,500 INFO [train_char.py:689] (1/2) Epoch 38, batch 50, loss[loss=0.07012, simple_loss=0.1296, pruned_loss=0.005317, over 24130.00 frames. ], tot_loss[loss=0.06232, simple_loss=0.1137, pruned_loss=0.005463, over 1082079.29 frames. ], batch size: 199, lr: 8.40e-03, grad_scale: 32.0 2024-03-16 03:30:59,342 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.conv_module1.whiten, num_groups=1, num_channels=256, metric=9.69 vs. limit=15.0 2024-03-16 03:31:12,338 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.ff2_skip_rate, batch_count=62730.0, ans=0.0 2024-03-16 03:31:13,687 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.bypass.skip_rate, batch_count=62730.0, ans=0.07 2024-03-16 03:31:13,762 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.attention_skip_rate, batch_count=62730.0, ans=0.0 2024-03-16 03:31:20,020 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.727e+01 8.596e+01 1.170e+02 1.433e+02 3.372e+02, threshold=2.341e+02, percent-clipped=1.0 2024-03-16 03:31:22,976 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=62763.333333333336, ans=0.125 2024-03-16 03:31:37,107 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass.skip_rate, batch_count=62796.666666666664, ans=0.04949747468305833 2024-03-16 03:31:40,082 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=13.44 vs. limit=15.0 2024-03-16 03:31:47,157 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_skip_rate, batch_count=62830.0, ans=0.0 2024-03-16 03:31:54,716 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.3.encoder.layers.2.self_attn_weights, loss-sum=0.000e+00 2024-03-16 03:31:58,227 INFO [train_char.py:689] (1/2) Epoch 38, batch 100, loss[loss=0.07084, simple_loss=0.1276, pruned_loss=0.007046, over 24213.00 frames. ], tot_loss[loss=0.06245, simple_loss=0.1143, pruned_loss=0.005318, over 1911342.52 frames. 
], batch size: 311, lr: 8.39e-03, grad_scale: 32.0 2024-03-16 03:32:00,263 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.whiten, num_groups=1, num_channels=384, metric=9.16 vs. limit=12.0 2024-03-16 03:32:03,753 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module2.balancer2.prob, batch_count=62863.333333333336, ans=0.125 2024-03-16 03:32:35,539 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward2.out_whiten, num_groups=1, num_channels=512, metric=10.54 vs. limit=15.0 2024-03-16 03:32:46,208 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=62963.333333333336, ans=0.125 2024-03-16 03:32:50,123 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.ff2_skip_rate, batch_count=62996.666666666664, ans=0.0 2024-03-16 03:33:02,754 INFO [train_char.py:689] (1/2) Epoch 38, batch 150, loss[loss=0.07343, simple_loss=0.1365, pruned_loss=0.005176, over 24060.00 frames. ], tot_loss[loss=0.06288, simple_loss=0.1151, pruned_loss=0.005317, over 2558765.89 frames. ], batch size: 199, lr: 8.38e-03, grad_scale: 32.0 2024-03-16 03:33:13,870 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=63030.0, ans=0.0 2024-03-16 03:33:19,059 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=63063.333333333336, ans=0.0 2024-03-16 03:33:30,710 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=256, metric=7.58 vs. limit=15.0 2024-03-16 03:33:32,250 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.556e+01 8.114e+01 1.022e+02 1.417e+02 2.954e+02, threshold=2.044e+02, percent-clipped=2.0 2024-03-16 03:33:45,295 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.balancer1.prob, batch_count=63130.0, ans=0.125 2024-03-16 03:33:53,021 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=63130.0, ans=0.0 2024-03-16 03:33:54,817 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=14.98 vs. limit=15.0 2024-03-16 03:34:09,547 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=7.21 vs. limit=10.0 2024-03-16 03:34:13,864 INFO [train_char.py:689] (1/2) Epoch 38, batch 200, loss[loss=0.05274, simple_loss=0.09985, pruned_loss=0.002813, over 24295.00 frames. ], tot_loss[loss=0.06278, simple_loss=0.1151, pruned_loss=0.005222, over 3057137.76 frames. ], batch size: 140, lr: 8.38e-03, grad_scale: 32.0 2024-03-16 03:34:36,381 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=5.69 vs. 
limit=6.0 2024-03-16 03:34:40,840 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=63263.333333333336, ans=0.0 2024-03-16 03:35:04,937 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.ff2_skip_rate, batch_count=63330.0, ans=0.0 2024-03-16 03:35:17,828 INFO [train_char.py:689] (1/2) Epoch 38, batch 250, loss[loss=0.06217, simple_loss=0.1169, pruned_loss=0.003712, over 24287.00 frames. ], tot_loss[loss=0.06328, simple_loss=0.116, pruned_loss=0.005264, over 3449862.96 frames. ], batch size: 180, lr: 8.37e-03, grad_scale: 32.0 2024-03-16 03:35:38,807 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=256, metric=12.69 vs. limit=15.0 2024-03-16 03:35:43,250 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.430e+01 9.189e+01 1.105e+02 1.475e+02 2.826e+02, threshold=2.209e+02, percent-clipped=4.0 2024-03-16 03:35:49,935 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.balancer2.prob, batch_count=63430.0, ans=0.125 2024-03-16 03:36:05,089 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module1.balancer1.prob, batch_count=63463.333333333336, ans=0.125 2024-03-16 03:36:07,718 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_skip_rate, batch_count=63463.333333333336, ans=0.0 2024-03-16 03:36:25,295 INFO [train_char.py:689] (1/2) Epoch 38, batch 300, loss[loss=0.06122, simple_loss=0.1099, pruned_loss=0.006292, over 24430.00 frames. ], tot_loss[loss=0.06375, simple_loss=0.1168, pruned_loss=0.005346, over 3753906.22 frames. ], batch size: 152, lr: 8.36e-03, grad_scale: 32.0 2024-03-16 03:37:03,876 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.self_attn_weights.whiten_keys.whitening_limit, batch_count=63596.666666666664, ans=6.0 2024-03-16 03:37:14,573 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.balancer1.prob, batch_count=63630.0, ans=0.125 2024-03-16 03:37:17,457 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.80 vs. limit=6.0 2024-03-16 03:37:21,807 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=192, metric=10.62 vs. limit=15.0 2024-03-16 03:37:24,597 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=63663.333333333336, ans=0.1 2024-03-16 03:37:30,834 INFO [train_char.py:689] (1/2) Epoch 38, batch 350, loss[loss=0.07243, simple_loss=0.1322, pruned_loss=0.006326, over 24260.00 frames. ], tot_loss[loss=0.06449, simple_loss=0.1181, pruned_loss=0.00544, over 3989645.80 frames. ], batch size: 267, lr: 8.35e-03, grad_scale: 32.0 2024-03-16 03:37:36,467 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=384, metric=7.74 vs. 
limit=10.0 2024-03-16 03:37:49,839 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=63730.0, ans=0.125 2024-03-16 03:37:55,638 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 4.675e+01 7.628e+01 9.692e+01 1.330e+02 2.100e+02, threshold=1.938e+02, percent-clipped=0.0 2024-03-16 03:38:28,311 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=63830.0, ans=0.1 2024-03-16 03:38:36,815 INFO [train_char.py:689] (1/2) Epoch 38, batch 400, loss[loss=0.06401, simple_loss=0.1192, pruned_loss=0.004432, over 24380.00 frames. ], tot_loss[loss=0.06503, simple_loss=0.1191, pruned_loss=0.005506, over 4178313.63 frames. ], batch size: 172, lr: 8.34e-03, grad_scale: 32.0 2024-03-16 03:38:40,931 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=63863.333333333336, ans=0.1 2024-03-16 03:38:55,485 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.self_attn1.whiten, num_groups=1, num_channels=192, metric=12.95 vs. limit=22.5 2024-03-16 03:38:58,525 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_skip_rate, batch_count=63896.666666666664, ans=0.0 2024-03-16 03:38:59,744 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_skip_rate, batch_count=63896.666666666664, ans=0.0 2024-03-16 03:39:06,145 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.ff3_skip_rate, batch_count=63930.0, ans=0.0 2024-03-16 03:39:08,814 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.bypass.scale_min, batch_count=63930.0, ans=0.2 2024-03-16 03:39:41,140 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.ff3_skip_rate, batch_count=64030.0, ans=0.0 2024-03-16 03:39:42,199 INFO [train_char.py:689] (1/2) Epoch 38, batch 450, loss[loss=0.0713, simple_loss=0.134, pruned_loss=0.004297, over 24100.00 frames. ], tot_loss[loss=0.06538, simple_loss=0.1194, pruned_loss=0.005675, over 4323323.22 frames. ], batch size: 199, lr: 8.33e-03, grad_scale: 64.0 2024-03-16 03:39:46,109 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module1.balancer2.prob, batch_count=64030.0, ans=0.125 2024-03-16 03:40:07,422 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.187e+01 8.210e+01 9.441e+01 1.316e+02 3.422e+02, threshold=1.888e+02, percent-clipped=6.0 2024-03-16 03:40:13,673 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.balancer2.prob, batch_count=64096.666666666664, ans=0.125 2024-03-16 03:40:16,115 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.attention_skip_rate, batch_count=64096.666666666664, ans=0.0 2024-03-16 03:40:23,712 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=13.50 vs. limit=15.0 2024-03-16 03:40:30,825 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=256, metric=10.38 vs. 
limit=15.0 2024-03-16 03:40:32,707 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=64163.333333333336, ans=0.125 2024-03-16 03:40:33,350 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.self_attn2.whiten, num_groups=1, num_channels=384, metric=22.11 vs. limit=22.5 2024-03-16 03:40:41,512 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module2.balancer1.prob, batch_count=64163.333333333336, ans=0.125 2024-03-16 03:40:46,345 INFO [train_char.py:689] (1/2) Epoch 38, batch 500, loss[loss=0.07501, simple_loss=0.1345, pruned_loss=0.007775, over 24074.00 frames. ], tot_loss[loss=0.06639, simple_loss=0.1215, pruned_loss=0.005646, over 4434790.41 frames. ], batch size: 251, lr: 8.32e-03, grad_scale: 64.0 2024-03-16 03:41:47,680 INFO [train_char.py:689] (1/2) Epoch 39, batch 0, loss[loss=0.06264, simple_loss=0.1135, pruned_loss=0.005875, over 24117.00 frames. ], tot_loss[loss=0.06264, simple_loss=0.1135, pruned_loss=0.005875, over 24117.00 frames. ], batch size: 310, lr: 8.21e-03, grad_scale: 64.0 2024-03-16 03:41:47,681 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 03:41:57,040 INFO [zipformer.py:1858] (1/2) name=encoder.encoders.0.layers.0.self_attn_weights, attn_weights_entropy = tensor([4.7214, 4.9542, 4.9101, 4.6452], device='cuda:1') 2024-03-16 03:42:00,684 INFO [zipformer.py:1858] (1/2) name=encoder.encoders.2.encoder.layers.0.self_attn_weights, attn_weights_entropy = tensor([4.2624, 4.1978, 3.6285, 3.3878], device='cuda:1') 2024-03-16 03:42:01,448 INFO [train_char.py:721] (1/2) Epoch 39, validation: loss=0.05877, simple_loss=0.1092, pruned_loss=0.004156, over 657665.00 frames. 2024-03-16 03:42:01,449 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 03:42:37,320 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=5.64 vs. limit=6.0 2024-03-16 03:42:47,557 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.balancer1.prob, batch_count=64320.0, ans=0.125 2024-03-16 03:42:50,546 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.balancer2.prob, batch_count=64320.0, ans=0.125 2024-03-16 03:42:56,866 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=64320.0, ans=0.125 2024-03-16 03:42:59,588 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.ff3_skip_rate, batch_count=64320.0, ans=0.0 2024-03-16 03:43:01,419 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=13.39 vs. limit=15.0 2024-03-16 03:43:17,020 INFO [train_char.py:689] (1/2) Epoch 39, batch 50, loss[loss=0.05976, simple_loss=0.1116, pruned_loss=0.003964, over 24380.00 frames. ], tot_loss[loss=0.06198, simple_loss=0.113, pruned_loss=0.005469, over 1084793.35 frames. ], batch size: 158, lr: 8.20e-03, grad_scale: 64.0 2024-03-16 03:43:25,711 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.feed_forward2.out_whiten, num_groups=1, num_channels=256, metric=9.28 vs. 
limit=15.0 2024-03-16 03:43:33,969 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.158e+01 8.215e+01 9.996e+01 1.469e+02 2.861e+02, threshold=1.999e+02, percent-clipped=8.0 2024-03-16 03:43:52,732 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=64453.333333333336, ans=0.1 2024-03-16 03:44:22,050 INFO [train_char.py:689] (1/2) Epoch 39, batch 100, loss[loss=0.07463, simple_loss=0.1368, pruned_loss=0.006231, over 24197.00 frames. ], tot_loss[loss=0.06162, simple_loss=0.1126, pruned_loss=0.005313, over 1914292.32 frames. ], batch size: 212, lr: 8.19e-03, grad_scale: 64.0 2024-03-16 03:44:23,683 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_module1.balancer1.prob, batch_count=64553.333333333336, ans=0.125 2024-03-16 03:45:20,744 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=9.73 vs. limit=15.0 2024-03-16 03:45:31,665 INFO [train_char.py:689] (1/2) Epoch 39, batch 150, loss[loss=0.06802, simple_loss=0.1209, pruned_loss=0.007583, over 24211.00 frames. ], tot_loss[loss=0.06234, simple_loss=0.114, pruned_loss=0.005343, over 2550381.55 frames. ], batch size: 311, lr: 8.18e-03, grad_scale: 64.0 2024-03-16 03:45:33,146 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.ff3_skip_rate, batch_count=64720.0, ans=0.0 2024-03-16 03:45:45,770 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.bypass_mid.scale_min, batch_count=64720.0, ans=0.2 2024-03-16 03:45:48,202 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=64753.333333333336, ans=0.0 2024-03-16 03:45:48,362 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.balancer2.prob, batch_count=64753.333333333336, ans=0.125 2024-03-16 03:45:52,896 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.482e+01 7.912e+01 1.019e+02 1.370e+02 2.594e+02, threshold=2.039e+02, percent-clipped=7.0 2024-03-16 03:46:20,262 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.ff3_skip_rate, batch_count=64820.0, ans=0.0 2024-03-16 03:46:26,315 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.convnext.out_balancer.prob, batch_count=64853.333333333336, ans=0.125 2024-03-16 03:46:33,056 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.5.encoder.layers.1.self_attn_weights, loss-sum=0.000e+00 2024-03-16 03:46:40,248 INFO [train_char.py:689] (1/2) Epoch 39, batch 200, loss[loss=0.07605, simple_loss=0.1404, pruned_loss=0.005862, over 24223.00 frames. ], tot_loss[loss=0.06261, simple_loss=0.1145, pruned_loss=0.00536, over 3055284.74 frames. ], batch size: 212, lr: 8.17e-03, grad_scale: 64.0 2024-03-16 03:46:52,959 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.75 vs. limit=6.0 2024-03-16 03:47:48,584 INFO [train_char.py:689] (1/2) Epoch 39, batch 250, loss[loss=0.05847, simple_loss=0.1051, pruned_loss=0.005898, over 24148.00 frames. ], tot_loss[loss=0.06348, simple_loss=0.1162, pruned_loss=0.005407, over 3447552.56 frames. 
], batch size: 362, lr: 8.16e-03, grad_scale: 64.0 2024-03-16 03:48:04,159 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_skip_rate, batch_count=65086.666666666664, ans=0.0 2024-03-16 03:48:05,238 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.308e+01 8.296e+01 1.001e+02 1.321e+02 2.719e+02, threshold=2.002e+02, percent-clipped=3.0 2024-03-16 03:48:14,650 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_skip_rate, batch_count=65120.0, ans=0.0 2024-03-16 03:48:32,695 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module2.balancer2.min_abs, batch_count=65153.333333333336, ans=0.5 2024-03-16 03:48:36,218 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.2.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-16 03:48:47,169 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_module2.balancer2.prob, batch_count=65186.666666666664, ans=0.125 2024-03-16 03:48:54,293 INFO [train_char.py:689] (1/2) Epoch 39, batch 300, loss[loss=0.05822, simple_loss=0.1061, pruned_loss=0.005193, over 24143.00 frames. ], tot_loss[loss=0.06379, simple_loss=0.1167, pruned_loss=0.005443, over 3754971.45 frames. ], batch size: 344, lr: 8.15e-03, grad_scale: 64.0 2024-03-16 03:49:01,978 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=65220.0, ans=0.125 2024-03-16 03:49:05,429 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_skip_rate, batch_count=65253.333333333336, ans=0.0 2024-03-16 03:49:44,689 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_module1.balancer1.prob, batch_count=65320.0, ans=0.125 2024-03-16 03:49:46,074 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=65353.333333333336, ans=0.1 2024-03-16 03:49:59,793 INFO [train_char.py:689] (1/2) Epoch 39, batch 350, loss[loss=0.05905, simple_loss=0.1114, pruned_loss=0.003327, over 24350.00 frames. ], tot_loss[loss=0.06379, simple_loss=0.1168, pruned_loss=0.005404, over 3995813.23 frames. ], batch size: 152, lr: 8.14e-03, grad_scale: 32.0 2024-03-16 03:50:01,824 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=512, metric=11.16 vs. limit=15.0 2024-03-16 03:50:07,661 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.ff2_skip_rate, batch_count=65386.666666666664, ans=0.0 2024-03-16 03:50:17,238 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.bypass_mid.scale_min, batch_count=65420.0, ans=0.2 2024-03-16 03:50:19,591 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.214e+01 8.670e+01 1.064e+02 1.435e+02 2.767e+02, threshold=2.127e+02, percent-clipped=6.0 2024-03-16 03:50:22,194 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_skip_rate, batch_count=65420.0, ans=0.0 2024-03-16 03:50:22,768 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=12.33 vs. 
limit=15.0 2024-03-16 03:50:37,069 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module2.balancer2.prob, batch_count=65453.333333333336, ans=0.125 2024-03-16 03:50:37,128 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.bypass.skip_rate, batch_count=65453.333333333336, ans=0.07 2024-03-16 03:50:46,841 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=65486.666666666664, ans=0.125 2024-03-16 03:50:49,279 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.2.encoder.layers.2.self_attn_weights, loss-sum=0.000e+00 2024-03-16 03:51:00,749 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module1.balancer1.prob, batch_count=65520.0, ans=0.125 2024-03-16 03:51:04,149 INFO [train_char.py:689] (1/2) Epoch 39, batch 400, loss[loss=0.05302, simple_loss=0.095, pruned_loss=0.005522, over 23967.00 frames. ], tot_loss[loss=0.06391, simple_loss=0.1171, pruned_loss=0.005384, over 4182259.38 frames. ], batch size: 407, lr: 8.13e-03, grad_scale: 32.0 2024-03-16 03:51:07,954 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module1.balancer1.prob, batch_count=65553.33333333333, ans=0.125 2024-03-16 03:51:22,069 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module2.balancer1.prob, batch_count=65586.66666666667, ans=0.125 2024-03-16 03:51:35,846 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_module2.balancer2.prob, batch_count=65620.0, ans=0.125 2024-03-16 03:52:01,586 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.conv_module2.whiten, num_groups=1, num_channels=256, metric=5.25 vs. limit=15.0 2024-03-16 03:52:04,800 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module1.balancer2.prob, batch_count=65686.66666666667, ans=0.125 2024-03-16 03:52:09,737 INFO [train_char.py:689] (1/2) Epoch 39, batch 450, loss[loss=0.06346, simple_loss=0.1181, pruned_loss=0.004396, over 24406.00 frames. ], tot_loss[loss=0.06497, simple_loss=0.1192, pruned_loss=0.005396, over 4327885.99 frames. ], batch size: 165, lr: 8.13e-03, grad_scale: 32.0 2024-03-16 03:52:11,821 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.whiten, num_groups=1, num_channels=384, metric=4.83 vs. limit=12.0 2024-03-16 03:52:16,261 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=65720.0, ans=0.125 2024-03-16 03:52:24,959 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=65753.33333333333, ans=0.1 2024-03-16 03:52:28,197 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.763e+01 7.580e+01 9.089e+01 1.133e+02 2.105e+02, threshold=1.818e+02, percent-clipped=0.0 2024-03-16 03:52:53,295 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=13.83 vs. 
limit=15.0 2024-03-16 03:52:58,929 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_skip_rate, batch_count=65820.0, ans=0.0 2024-03-16 03:53:09,448 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=7.32 vs. limit=10.0 2024-03-16 03:53:13,555 INFO [train_char.py:689] (1/2) Epoch 39, batch 500, loss[loss=0.07605, simple_loss=0.1382, pruned_loss=0.00693, over 24108.00 frames. ], tot_loss[loss=0.06599, simple_loss=0.1212, pruned_loss=0.005394, over 4439663.84 frames. ], batch size: 236, lr: 8.12e-03, grad_scale: 32.0 2024-03-16 03:53:13,815 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.bypass.scale_min, batch_count=65886.66666666667, ans=0.2 2024-03-16 03:53:17,819 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=65886.66666666667, ans=0.0 2024-03-16 03:54:14,410 INFO [train_char.py:689] (1/2) Epoch 40, batch 0, loss[loss=0.05488, simple_loss=0.0978, pruned_loss=0.005981, over 23945.00 frames. ], tot_loss[loss=0.05488, simple_loss=0.0978, pruned_loss=0.005981, over 23945.00 frames. ], batch size: 381, lr: 8.01e-03, grad_scale: 32.0 2024-03-16 03:54:14,411 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 03:54:28,172 INFO [train_char.py:721] (1/2) Epoch 40, validation: loss=0.0578, simple_loss=0.1073, pruned_loss=0.004123, over 657665.00 frames. 2024-03-16 03:54:28,173 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 03:54:52,099 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn2.whiten, num_groups=1, num_channels=512, metric=14.27 vs. limit=22.5 2024-03-16 03:54:58,468 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module2.balancer1.prob, batch_count=65976.66666666667, ans=0.125 2024-03-16 03:55:03,728 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.balancer2.prob, batch_count=65976.66666666667, ans=0.125 2024-03-16 03:55:13,462 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module2.balancer1.prob, batch_count=66010.0, ans=0.125 2024-03-16 03:55:19,332 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.self_attn2.whiten, num_groups=1, num_channels=384, metric=14.27 vs. limit=22.5 2024-03-16 03:55:21,722 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=66010.0, ans=0.1 2024-03-16 03:55:38,826 INFO [train_char.py:689] (1/2) Epoch 40, batch 50, loss[loss=0.05822, simple_loss=0.1074, pruned_loss=0.00452, over 24450.00 frames. ], tot_loss[loss=0.06176, simple_loss=0.113, pruned_loss=0.005257, over 1086475.39 frames. ], batch size: 165, lr: 8.00e-03, grad_scale: 32.0 2024-03-16 03:55:48,243 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.621e+01 7.695e+01 9.241e+01 1.128e+02 2.685e+02, threshold=1.848e+02, percent-clipped=6.0 2024-03-16 03:55:56,499 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.bypass.scale_min, batch_count=66110.0, ans=0.2 2024-03-16 03:56:02,633 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=6.11 vs. 
limit=6.0 2024-03-16 03:56:20,117 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=66143.33333333333, ans=0.1 2024-03-16 03:56:28,522 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=7.12 vs. limit=10.0 2024-03-16 03:56:49,940 INFO [train_char.py:689] (1/2) Epoch 40, batch 100, loss[loss=0.06233, simple_loss=0.1124, pruned_loss=0.00613, over 24095.00 frames. ], tot_loss[loss=0.06188, simple_loss=0.1141, pruned_loss=0.004817, over 1906358.30 frames. ], batch size: 361, lr: 7.99e-03, grad_scale: 32.0 2024-03-16 03:56:57,998 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.ff3_skip_rate, batch_count=66243.33333333333, ans=0.0 2024-03-16 03:56:58,000 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.bypass.skip_rate, batch_count=66243.33333333333, ans=0.07 2024-03-16 03:57:02,232 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=384, metric=19.87 vs. limit=22.5 2024-03-16 03:57:07,907 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=66276.66666666667, ans=0.0 2024-03-16 03:57:15,421 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_module2.balancer1.prob, batch_count=66310.0, ans=0.125 2024-03-16 03:57:20,643 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=66310.0, ans=0.0 2024-03-16 03:57:23,856 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=9.82 vs. limit=15.0 2024-03-16 03:57:29,456 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=8.30 vs. limit=15.0 2024-03-16 03:57:44,630 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=66376.66666666667, ans=0.1 2024-03-16 03:57:44,939 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=7.05 vs. limit=15.0 2024-03-16 03:57:49,715 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass.skip_rate, batch_count=66376.66666666667, ans=0.04949747468305833 2024-03-16 03:57:52,824 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=15.29 vs. limit=15.0 2024-03-16 03:57:53,702 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_skip_rate, batch_count=66376.66666666667, ans=0.0 2024-03-16 03:57:58,490 INFO [train_char.py:689] (1/2) Epoch 40, batch 150, loss[loss=0.05663, simple_loss=0.1062, pruned_loss=0.003524, over 24182.00 frames. ], tot_loss[loss=0.06247, simple_loss=0.1152, pruned_loss=0.004853, over 2544605.20 frames. 
], batch size: 122, lr: 7.99e-03, grad_scale: 32.0 2024-03-16 03:58:03,844 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.bypass_mid.scale_min, batch_count=66410.0, ans=0.2 2024-03-16 03:58:07,371 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.073e+01 8.361e+01 1.025e+02 1.396e+02 3.471e+02, threshold=2.051e+02, percent-clipped=12.0 2024-03-16 03:58:25,058 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=256, metric=15.20 vs. limit=15.0 2024-03-16 03:58:35,717 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.ff3_skip_rate, batch_count=66510.0, ans=0.0 2024-03-16 03:58:39,592 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.ff2_skip_rate, batch_count=66510.0, ans=0.0 2024-03-16 03:58:39,718 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=66510.0, ans=0.1 2024-03-16 03:58:47,341 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=66510.0, ans=0.1 2024-03-16 03:59:02,569 INFO [train_char.py:689] (1/2) Epoch 40, batch 200, loss[loss=0.06696, simple_loss=0.1218, pruned_loss=0.006087, over 24254.00 frames. ], tot_loss[loss=0.06264, simple_loss=0.1149, pruned_loss=0.005163, over 3046690.12 frames. ], batch size: 296, lr: 7.98e-03, grad_scale: 32.0 2024-03-16 03:59:16,261 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.bypass.skip_rate, batch_count=66576.66666666667, ans=0.09899494936611666 2024-03-16 03:59:20,058 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=66610.0, ans=0.125 2024-03-16 03:59:26,595 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_module2.balancer1.prob, batch_count=66610.0, ans=0.125 2024-03-16 03:59:30,475 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.balancer1.prob, batch_count=66610.0, ans=0.125 2024-03-16 03:59:49,736 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=11.59 vs. limit=15.0 2024-03-16 04:00:08,095 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module2.balancer2.prob, batch_count=66710.0, ans=0.125 2024-03-16 04:00:09,284 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=66710.0, ans=0.125 2024-03-16 04:00:13,373 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.bypass.skip_rate, batch_count=66743.33333333333, ans=0.04949747468305833 2024-03-16 04:00:14,294 INFO [train_char.py:689] (1/2) Epoch 40, batch 250, loss[loss=0.05939, simple_loss=0.1045, pruned_loss=0.007122, over 24010.00 frames. ], tot_loss[loss=0.06264, simple_loss=0.115, pruned_loss=0.005156, over 3438144.89 frames. ], batch size: 381, lr: 7.97e-03, grad_scale: 32.0 2024-03-16 04:00:18,760 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=256, metric=11.06 vs. 
limit=22.5 2024-03-16 04:00:19,732 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=66743.33333333333, ans=0.125 2024-03-16 04:00:23,216 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 4.667e+01 8.395e+01 9.941e+01 1.251e+02 2.533e+02, threshold=1.988e+02, percent-clipped=4.0 2024-03-16 04:00:40,740 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=144, metric=7.03 vs. limit=10.0 2024-03-16 04:00:58,889 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module2.balancer1.max_abs, batch_count=66843.33333333333, ans=10.0 2024-03-16 04:01:04,026 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward2.hidden_balancer.prob, batch_count=66843.33333333333, ans=0.125 2024-03-16 04:01:21,865 INFO [train_char.py:689] (1/2) Epoch 40, batch 300, loss[loss=0.0666, simple_loss=0.118, pruned_loss=0.007624, over 24213.00 frames. ], tot_loss[loss=0.06279, simple_loss=0.1154, pruned_loss=0.005092, over 3749875.03 frames. ], batch size: 328, lr: 7.96e-03, grad_scale: 32.0 2024-03-16 04:01:25,161 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=256, metric=12.59 vs. limit=15.0 2024-03-16 04:01:32,718 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.feed_forward2.out_whiten, num_groups=1, num_channels=192, metric=6.64 vs. limit=15.0 2024-03-16 04:01:34,596 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=66943.33333333333, ans=0.125 2024-03-16 04:01:37,228 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.out_combiner.scale_min, batch_count=66943.33333333333, ans=0.2 2024-03-16 04:01:44,941 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=66943.33333333333, ans=0.125 2024-03-16 04:01:52,398 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_module2.balancer2.min_positive, batch_count=66976.66666666667, ans=0.05 2024-03-16 04:02:06,149 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.1.encoder.layers.1.self_attn_weights, loss-sum=0.000e+00 2024-03-16 04:02:27,355 INFO [train_char.py:689] (1/2) Epoch 40, batch 350, loss[loss=0.05606, simple_loss=0.1031, pruned_loss=0.004502, over 24134.00 frames. ], tot_loss[loss=0.06262, simple_loss=0.115, pruned_loss=0.005124, over 3991588.59 frames. ], batch size: 362, lr: 7.95e-03, grad_scale: 32.0 2024-03-16 04:02:27,636 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.bypass.scale_min, batch_count=67076.66666666667, ans=0.2 2024-03-16 04:02:36,125 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.888e+01 8.606e+01 1.082e+02 1.293e+02 2.282e+02, threshold=2.165e+02, percent-clipped=4.0 2024-03-16 04:02:47,348 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_module2.balancer2.prob, batch_count=67110.0, ans=0.125 2024-03-16 04:02:47,816 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.21 vs. 
limit=6.0 2024-03-16 04:03:08,598 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.balancer1.prob, batch_count=67176.66666666667, ans=0.125 2024-03-16 04:03:19,705 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward2.hidden_balancer.prob, batch_count=67210.0, ans=0.125 2024-03-16 04:03:22,295 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.self_attn_weights.pos_emb_skip_rate, batch_count=67210.0, ans=0.0 2024-03-16 04:03:34,236 INFO [train_char.py:689] (1/2) Epoch 40, batch 400, loss[loss=0.05647, simple_loss=0.1032, pruned_loss=0.004844, over 24138.00 frames. ], tot_loss[loss=0.06404, simple_loss=0.1174, pruned_loss=0.005328, over 4177818.91 frames. ], batch size: 362, lr: 7.94e-03, grad_scale: 32.0 2024-03-16 04:03:35,595 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_module1.balancer2.prob, batch_count=67243.33333333333, ans=0.125 2024-03-16 04:03:39,362 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=67243.33333333333, ans=0.1 2024-03-16 04:03:50,287 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_skip_rate, batch_count=67276.66666666667, ans=0.0 2024-03-16 04:03:52,610 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.bypass.skip_rate, batch_count=67276.66666666667, ans=0.035 2024-03-16 04:04:00,213 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.bypass.skip_rate, batch_count=67310.0, ans=0.09899494936611666 2024-03-16 04:04:05,232 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=67310.0, ans=0.0 2024-03-16 04:04:14,701 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.conv_module1.whiten, num_groups=1, num_channels=384, metric=2.85 vs. limit=15.0 2024-03-16 04:04:37,886 INFO [train_char.py:689] (1/2) Epoch 40, batch 450, loss[loss=0.06635, simple_loss=0.1211, pruned_loss=0.005781, over 24372.00 frames. ], tot_loss[loss=0.06445, simple_loss=0.1182, pruned_loss=0.00533, over 4324834.73 frames. 
], batch size: 172, lr: 7.93e-03, grad_scale: 32.0 2024-03-16 04:04:47,922 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.029e+01 8.375e+01 1.018e+02 1.239e+02 2.132e+02, threshold=2.036e+02, percent-clipped=0.0 2024-03-16 04:05:10,288 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module1.balancer1.max_abs, batch_count=67476.66666666667, ans=10.0 2024-03-16 04:05:11,560 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=67476.66666666667, ans=0.1 2024-03-16 04:05:16,318 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.attention_skip_rate, batch_count=67510.0, ans=0.0 2024-03-16 04:05:22,643 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module1.balancer2.prob, batch_count=67510.0, ans=0.125 2024-03-16 04:05:38,312 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.bypass.skip_rate, batch_count=67543.33333333333, ans=0.035 2024-03-16 04:05:38,432 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_module2.balancer2.prob, batch_count=67543.33333333333, ans=0.125 2024-03-16 04:05:43,145 INFO [train_char.py:689] (1/2) Epoch 40, batch 500, loss[loss=0.06878, simple_loss=0.1295, pruned_loss=0.004026, over 24153.00 frames. ], tot_loss[loss=0.06546, simple_loss=0.1203, pruned_loss=0.005311, over 4437578.54 frames. ], batch size: 251, lr: 7.92e-03, grad_scale: 32.0 2024-03-16 04:06:41,619 INFO [train_char.py:689] (1/2) Epoch 41, batch 0, loss[loss=0.06707, simple_loss=0.1217, pruned_loss=0.006207, over 24144.00 frames. ], tot_loss[loss=0.06707, simple_loss=0.1217, pruned_loss=0.006207, over 24144.00 frames. ], batch size: 279, lr: 7.82e-03, grad_scale: 32.0 2024-03-16 04:06:41,620 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 04:06:55,495 INFO [train_char.py:721] (1/2) Epoch 41, validation: loss=0.05831, simple_loss=0.1085, pruned_loss=0.004049, over 657665.00 frames. 2024-03-16 04:06:55,496 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 04:06:55,829 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.bypass_mid.scale_min, batch_count=67600.0, ans=0.2 2024-03-16 04:07:14,890 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.attention_skip_rate, batch_count=67633.33333333333, ans=0.0 2024-03-16 04:07:37,412 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.bypass.scale_min, batch_count=67666.66666666667, ans=0.2 2024-03-16 04:07:41,414 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=67700.0, ans=0.125 2024-03-16 04:07:55,032 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=67733.33333333333, ans=0.125 2024-03-16 04:07:57,729 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.attention_skip_rate, batch_count=67733.33333333333, ans=0.0 2024-03-16 04:08:12,010 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.601e+01 7.689e+01 9.251e+01 1.241e+02 2.488e+02, threshold=1.850e+02, percent-clipped=3.0 2024-03-16 04:08:12,041 INFO [train_char.py:689] (1/2) Epoch 41, batch 50, loss[loss=0.06414, simple_loss=0.1212, pruned_loss=0.003562, over 24338.00 frames. 
], tot_loss[loss=0.06205, simple_loss=0.1152, pruned_loss=0.004442, over 1076874.35 frames. ], batch size: 180, lr: 7.82e-03, grad_scale: 32.0 2024-03-16 04:08:18,305 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.whiten, num_groups=1, num_channels=512, metric=5.42 vs. limit=12.0 2024-03-16 04:08:22,760 INFO [scaling.py:1023] (1/2) Whitening: name=encoder_embed.convnext.out_whiten, num_groups=1, num_channels=128, metric=3.98 vs. limit=5.0 2024-03-16 04:08:43,409 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.conv_module1.whiten, num_groups=1, num_channels=512, metric=3.87 vs. limit=15.0 2024-03-16 04:08:44,693 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.self_attn2.whiten, num_groups=1, num_channels=384, metric=21.50 vs. limit=22.5 2024-03-16 04:09:07,765 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.ff3_skip_rate, batch_count=67866.66666666667, ans=0.0 2024-03-16 04:09:22,850 INFO [train_char.py:689] (1/2) Epoch 41, batch 100, loss[loss=0.05727, simple_loss=0.1037, pruned_loss=0.005418, over 23983.00 frames. ], tot_loss[loss=0.06218, simple_loss=0.1152, pruned_loss=0.004599, over 1906916.29 frames. ], batch size: 381, lr: 7.81e-03, grad_scale: 32.0 2024-03-16 04:09:37,302 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=67966.66666666667, ans=0.1 2024-03-16 04:09:45,009 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=67966.66666666667, ans=0.125 2024-03-16 04:09:53,052 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_module1.balancer2.prob, batch_count=68000.0, ans=0.125 2024-03-16 04:09:58,383 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.balancer2.prob, batch_count=68000.0, ans=0.125 2024-03-16 04:10:03,763 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=256, metric=5.60 vs. limit=15.0 2024-03-16 04:10:22,688 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.balancer1.prob, batch_count=68066.66666666667, ans=0.125 2024-03-16 04:10:27,701 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.008e+01 8.767e+01 1.112e+02 1.458e+02 2.855e+02, threshold=2.223e+02, percent-clipped=15.0 2024-03-16 04:10:27,732 INFO [train_char.py:689] (1/2) Epoch 41, batch 150, loss[loss=0.05517, simple_loss=0.1037, pruned_loss=0.00332, over 24185.00 frames. ], tot_loss[loss=0.06282, simple_loss=0.1162, pruned_loss=0.004735, over 2548715.71 frames. ], batch size: 122, lr: 7.80e-03, grad_scale: 32.0 2024-03-16 04:10:29,170 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.convnext.hidden_balancer.prob, batch_count=68100.0, ans=0.125 2024-03-16 04:11:02,233 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.attention_skip_rate, batch_count=68166.66666666667, ans=0.0 2024-03-16 04:11:02,656 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.82 vs. 
limit=6.0 2024-03-16 04:11:11,350 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.ff3_skip_rate, batch_count=68200.0, ans=0.0 2024-03-16 04:11:21,065 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=384, metric=29.47 vs. limit=22.5 2024-03-16 04:11:38,185 INFO [train_char.py:689] (1/2) Epoch 41, batch 200, loss[loss=0.06862, simple_loss=0.1303, pruned_loss=0.003483, over 24024.00 frames. ], tot_loss[loss=0.0626, simple_loss=0.1157, pruned_loss=0.004748, over 3048121.61 frames. ], batch size: 199, lr: 7.79e-03, grad_scale: 32.0 2024-03-16 04:11:58,749 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward2.out_whiten.whitening_limit, batch_count=68300.0, ans=15.0 2024-03-16 04:12:04,704 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.hidden_balancer.prob, batch_count=68333.33333333333, ans=0.125 2024-03-16 04:12:17,598 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass_mid.scale_min, batch_count=68366.66666666667, ans=0.2 2024-03-16 04:12:18,100 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.nonlin_attention.whiten2, num_groups=1, num_channels=512, metric=5.76 vs. limit=15.0 2024-03-16 04:12:18,152 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=512, metric=17.56 vs. limit=22.5 2024-03-16 04:12:34,569 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=5.76 vs. limit=6.0 2024-03-16 04:12:40,564 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=68400.0, ans=0.125 2024-03-16 04:12:45,389 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.nonlin_attention.balancer.max_positive, batch_count=68433.33333333333, ans=0.95 2024-03-16 04:12:46,459 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 4.687e+01 8.116e+01 1.013e+02 1.265e+02 2.717e+02, threshold=2.026e+02, percent-clipped=5.0 2024-03-16 04:12:46,489 INFO [train_char.py:689] (1/2) Epoch 41, batch 250, loss[loss=0.05397, simple_loss=0.1014, pruned_loss=0.003285, over 24319.00 frames. ], tot_loss[loss=0.06204, simple_loss=0.1147, pruned_loss=0.004704, over 3442977.38 frames. ], batch size: 129, lr: 7.78e-03, grad_scale: 32.0 2024-03-16 04:13:29,698 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=68533.33333333333, ans=0.0 2024-03-16 04:13:55,248 INFO [train_char.py:689] (1/2) Epoch 41, batch 300, loss[loss=0.05477, simple_loss=0.09408, pruned_loss=0.007732, over 23791.00 frames. ], tot_loss[loss=0.06227, simple_loss=0.1148, pruned_loss=0.00487, over 3743740.53 frames. 
], batch size: 439, lr: 7.77e-03, grad_scale: 32.0 2024-03-16 04:13:58,029 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.ff3_skip_rate, batch_count=68600.0, ans=0.0 2024-03-16 04:14:10,766 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.balancer2.prob, batch_count=68633.33333333333, ans=0.125 2024-03-16 04:14:36,852 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=68700.0, ans=0.125 2024-03-16 04:14:50,391 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.balancer2.prob, batch_count=68733.33333333333, ans=0.125 2024-03-16 04:15:01,135 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 4.706e+01 8.610e+01 1.058e+02 1.394e+02 2.614e+02, threshold=2.115e+02, percent-clipped=7.0 2024-03-16 04:15:01,169 INFO [train_char.py:689] (1/2) Epoch 41, batch 350, loss[loss=0.06285, simple_loss=0.1181, pruned_loss=0.003798, over 24397.00 frames. ], tot_loss[loss=0.06248, simple_loss=0.1151, pruned_loss=0.004903, over 3986067.77 frames. ], batch size: 172, lr: 7.77e-03, grad_scale: 32.0 2024-03-16 04:15:02,634 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=68766.66666666667, ans=0.125 2024-03-16 04:15:05,237 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.bypass.scale_min, batch_count=68766.66666666667, ans=0.2 2024-03-16 04:15:43,968 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.nonlin_attention.balancer.prob, batch_count=68866.66666666667, ans=0.125 2024-03-16 04:15:53,932 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.bypass_mid.scale_min, batch_count=68900.0, ans=0.2 2024-03-16 04:15:57,723 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module1.balancer2.prob, batch_count=68900.0, ans=0.125 2024-03-16 04:16:04,282 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=68900.0, ans=0.1 2024-03-16 04:16:06,564 INFO [train_char.py:689] (1/2) Epoch 41, batch 400, loss[loss=0.07181, simple_loss=0.1314, pruned_loss=0.006125, over 23989.00 frames. ], tot_loss[loss=0.06319, simple_loss=0.1164, pruned_loss=0.004998, over 4175429.83 frames. 
], batch size: 250, lr: 7.76e-03, grad_scale: 32.0 2024-03-16 04:16:08,125 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module2.balancer2.prob, batch_count=68933.33333333333, ans=0.125 2024-03-16 04:16:13,281 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.ff2_skip_rate, batch_count=68933.33333333333, ans=0.0 2024-03-16 04:16:17,109 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.4.encoder.layers.2.self_attn_weights, loss-sum=0.000e+00 2024-03-16 04:16:31,468 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.attention_skip_rate, batch_count=68966.66666666667, ans=0.0 2024-03-16 04:16:36,430 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=69000.0, ans=0.1 2024-03-16 04:16:59,072 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.bypass_mid.scale_min, batch_count=69066.66666666667, ans=0.2 2024-03-16 04:17:12,743 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.353e+01 8.046e+01 9.625e+01 1.198e+02 2.490e+02, threshold=1.925e+02, percent-clipped=3.0 2024-03-16 04:17:12,773 INFO [train_char.py:689] (1/2) Epoch 41, batch 450, loss[loss=0.06804, simple_loss=0.1271, pruned_loss=0.004485, over 24135.00 frames. ], tot_loss[loss=0.06412, simple_loss=0.1182, pruned_loss=0.005041, over 4321651.53 frames. ], batch size: 223, lr: 7.75e-03, grad_scale: 32.0 2024-03-16 04:17:29,039 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.bypass_mid.scale_min, batch_count=69133.33333333333, ans=0.2 2024-03-16 04:17:42,598 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_module1.balancer2.prob, batch_count=69166.66666666667, ans=0.125 2024-03-16 04:17:59,773 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.whiten, num_groups=1, num_channels=384, metric=7.23 vs. limit=12.0 2024-03-16 04:18:09,754 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=256, metric=14.31 vs. limit=15.0 2024-03-16 04:18:16,721 INFO [train_char.py:689] (1/2) Epoch 41, batch 500, loss[loss=0.06897, simple_loss=0.1275, pruned_loss=0.005232, over 24146.00 frames. ], tot_loss[loss=0.06473, simple_loss=0.1194, pruned_loss=0.005039, over 4435096.75 frames. ], batch size: 251, lr: 7.74e-03, grad_scale: 32.0 2024-03-16 04:18:21,271 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module1.balancer2.prob, batch_count=69266.66666666667, ans=0.125 2024-03-16 04:19:15,595 INFO [train_char.py:689] (1/2) Epoch 42, batch 0, loss[loss=0.05833, simple_loss=0.1034, pruned_loss=0.006644, over 24251.00 frames. ], tot_loss[loss=0.05833, simple_loss=0.1034, pruned_loss=0.006644, over 24251.00 frames. ], batch size: 328, lr: 7.65e-03, grad_scale: 32.0 2024-03-16 04:19:15,596 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 04:19:29,718 INFO [train_char.py:721] (1/2) Epoch 42, validation: loss=0.05734, simple_loss=0.1066, pruned_loss=0.004037, over 657665.00 frames. 2024-03-16 04:19:29,719 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 04:19:33,247 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=5.70 vs. 
limit=6.0 2024-03-16 04:19:47,333 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=69323.33333333333, ans=0.125 2024-03-16 04:19:54,321 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.balancer1.prob, batch_count=69323.33333333333, ans=0.125 2024-03-16 04:20:11,183 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=69356.66666666667, ans=0.125 2024-03-16 04:20:32,427 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.178e+01 8.024e+01 9.663e+01 1.246e+02 2.316e+02, threshold=1.933e+02, percent-clipped=3.0 2024-03-16 04:20:41,972 INFO [train_char.py:689] (1/2) Epoch 42, batch 50, loss[loss=0.06415, simple_loss=0.1157, pruned_loss=0.006281, over 24206.00 frames. ], tot_loss[loss=0.05853, simple_loss=0.1076, pruned_loss=0.004752, over 1083207.56 frames. ], batch size: 311, lr: 7.64e-03, grad_scale: 32.0 2024-03-16 04:20:54,872 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=13.96 vs. limit=15.0 2024-03-16 04:21:34,854 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module1.balancer2.prob, batch_count=69556.66666666667, ans=0.125 2024-03-16 04:21:58,384 INFO [train_char.py:689] (1/2) Epoch 42, batch 100, loss[loss=0.06107, simple_loss=0.1142, pruned_loss=0.003984, over 24429.00 frames. ], tot_loss[loss=0.06133, simple_loss=0.1128, pruned_loss=0.00495, over 1913642.04 frames. ], batch size: 165, lr: 7.63e-03, grad_scale: 32.0 2024-03-16 04:22:05,245 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_module2.balancer1.prob, batch_count=69623.33333333333, ans=0.125 2024-03-16 04:22:47,186 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_skip_rate, batch_count=69723.33333333333, ans=0.0 2024-03-16 04:22:53,188 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 4.851e+01 9.031e+01 1.137e+02 1.503e+02 3.377e+02, threshold=2.274e+02, percent-clipped=9.0 2024-03-16 04:23:02,321 INFO [train_char.py:689] (1/2) Epoch 42, batch 150, loss[loss=0.05655, simple_loss=0.09839, pruned_loss=0.007351, over 23838.00 frames. ], tot_loss[loss=0.06125, simple_loss=0.1129, pruned_loss=0.004821, over 2560348.48 frames. ], batch size: 439, lr: 7.62e-03, grad_scale: 32.0 2024-03-16 04:23:20,265 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.balancer1.prob, batch_count=69823.33333333333, ans=0.125 2024-03-16 04:23:24,118 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=69823.33333333333, ans=0.125 2024-03-16 04:23:32,087 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=69856.66666666667, ans=0.1 2024-03-16 04:23:42,764 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.whiten, num_groups=1, num_channels=256, metric=5.42 vs. 
limit=12.0 2024-03-16 04:23:53,580 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.bypass_mid.scale_min, batch_count=69923.33333333333, ans=0.2 2024-03-16 04:24:06,286 INFO [train_char.py:689] (1/2) Epoch 42, batch 200, loss[loss=0.05451, simple_loss=0.09733, pruned_loss=0.005848, over 24030.00 frames. ], tot_loss[loss=0.06112, simple_loss=0.1126, pruned_loss=0.004815, over 3054613.17 frames. ], batch size: 408, lr: 7.61e-03, grad_scale: 32.0 2024-03-16 04:24:17,977 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.ff3_skip_rate, batch_count=69990.0, ans=0.0 2024-03-16 04:24:20,511 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module2.balancer1.prob, batch_count=69990.0, ans=0.125 2024-03-16 04:24:49,515 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=12.08 vs. limit=15.0 2024-03-16 04:25:06,413 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=70090.0, ans=0.125 2024-03-16 04:25:08,630 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.710e+01 8.104e+01 9.775e+01 1.256e+02 2.245e+02, threshold=1.955e+02, percent-clipped=0.0 2024-03-16 04:25:17,554 INFO [train_char.py:689] (1/2) Epoch 42, batch 250, loss[loss=0.07532, simple_loss=0.1394, pruned_loss=0.005599, over 24086.00 frames. ], tot_loss[loss=0.06144, simple_loss=0.1134, pruned_loss=0.004759, over 3442322.82 frames. ], batch size: 236, lr: 7.61e-03, grad_scale: 32.0 2024-03-16 04:25:22,799 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.balancer1.prob, batch_count=70123.33333333333, ans=0.125 2024-03-16 04:25:34,073 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=70156.66666666667, ans=0.125 2024-03-16 04:25:37,899 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=70156.66666666667, ans=0.1 2024-03-16 04:25:44,960 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.whiten, num_groups=1, num_channels=384, metric=7.10 vs. limit=12.0 2024-03-16 04:25:49,309 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module1.balancer1.min_positive, batch_count=70190.0, ans=0.025 2024-03-16 04:25:54,973 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=384, metric=12.70 vs. limit=22.5 2024-03-16 04:26:19,168 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward2.out_whiten, num_groups=1, num_channels=512, metric=11.56 vs. limit=15.0 2024-03-16 04:26:20,993 INFO [train_char.py:689] (1/2) Epoch 42, batch 300, loss[loss=0.07198, simple_loss=0.1332, pruned_loss=0.00536, over 24215.00 frames. ], tot_loss[loss=0.06208, simple_loss=0.1147, pruned_loss=0.004738, over 3749795.21 frames. 
], batch size: 266, lr: 7.60e-03, grad_scale: 32.0 2024-03-16 04:26:22,534 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward2.hidden_balancer.prob, batch_count=70290.0, ans=0.125 2024-03-16 04:26:33,899 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass.skip_rate, batch_count=70323.33333333333, ans=0.04949747468305833 2024-03-16 04:27:20,482 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.974e+01 7.732e+01 9.408e+01 1.206e+02 2.073e+02, threshold=1.882e+02, percent-clipped=1.0 2024-03-16 04:27:29,483 INFO [train_char.py:689] (1/2) Epoch 42, batch 350, loss[loss=0.06847, simple_loss=0.1292, pruned_loss=0.003889, over 24104.00 frames. ], tot_loss[loss=0.06235, simple_loss=0.1152, pruned_loss=0.004739, over 3986705.06 frames. ], batch size: 199, lr: 7.59e-03, grad_scale: 32.0 2024-03-16 04:27:43,867 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.nonlin_attention.balancer.min_positive, batch_count=70490.0, ans=0.05 2024-03-16 04:27:45,135 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass.scale_min, batch_count=70490.0, ans=0.2 2024-03-16 04:27:46,367 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.ff2_skip_rate, batch_count=70490.0, ans=0.0 2024-03-16 04:27:53,668 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.attention_skip_rate, batch_count=70523.33333333333, ans=0.0 2024-03-16 04:28:32,419 INFO [train_char.py:689] (1/2) Epoch 42, batch 400, loss[loss=0.06222, simple_loss=0.1098, pruned_loss=0.00734, over 24178.00 frames. ], tot_loss[loss=0.06304, simple_loss=0.1164, pruned_loss=0.004862, over 4176191.17 frames. ], batch size: 344, lr: 7.58e-03, grad_scale: 32.0 2024-03-16 04:28:40,463 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward2.hidden_balancer.prob, batch_count=70623.33333333333, ans=0.125 2024-03-16 04:28:49,990 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=70656.66666666667, ans=0.1 2024-03-16 04:29:29,637 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.709e+01 7.851e+01 1.003e+02 1.420e+02 2.377e+02, threshold=2.006e+02, percent-clipped=9.0 2024-03-16 04:29:31,256 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.bypass_mid.scale_min, batch_count=70756.66666666667, ans=0.2 2024-03-16 04:29:38,355 INFO [train_char.py:689] (1/2) Epoch 42, batch 450, loss[loss=0.07402, simple_loss=0.1341, pruned_loss=0.006967, over 24092.00 frames. ], tot_loss[loss=0.06377, simple_loss=0.1179, pruned_loss=0.004838, over 4323911.50 frames. 
], batch size: 199, lr: 7.57e-03, grad_scale: 32.0 2024-03-16 04:29:44,920 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.bypass_mid.scale_min, batch_count=70790.0, ans=0.2 2024-03-16 04:29:57,992 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=70823.33333333333, ans=0.125 2024-03-16 04:30:05,396 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=70856.66666666667, ans=0.125 2024-03-16 04:30:37,667 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=70923.33333333333, ans=0.0 2024-03-16 04:30:43,385 INFO [train_char.py:689] (1/2) Epoch 42, batch 500, loss[loss=0.07294, simple_loss=0.1325, pruned_loss=0.006713, over 24114.00 frames. ], tot_loss[loss=0.06405, simple_loss=0.1183, pruned_loss=0.004909, over 4438891.99 frames. ], batch size: 223, lr: 7.57e-03, grad_scale: 32.0 2024-03-16 04:30:43,674 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module1.balancer1.prob, batch_count=70956.66666666667, ans=0.125 2024-03-16 04:30:48,547 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.2.encoder.layers.1.self_attn_weights, loss-sum=0.000e+00 2024-03-16 04:31:44,758 INFO [train_char.py:689] (1/2) Epoch 43, batch 0, loss[loss=0.05914, simple_loss=0.1101, pruned_loss=0.00408, over 24194.00 frames. ], tot_loss[loss=0.05914, simple_loss=0.1101, pruned_loss=0.00408, over 24194.00 frames. ], batch size: 122, lr: 7.47e-03, grad_scale: 32.0 2024-03-16 04:31:44,758 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 04:31:57,344 INFO [zipformer.py:1858] (1/2) name=encoder.encoders.5.encoder.layers.0.self_attn_weights, attn_weights_entropy = tensor([4.1854, 3.8055, 3.7611, 3.8634], device='cuda:1') 2024-03-16 04:31:58,277 INFO [train_char.py:721] (1/2) Epoch 43, validation: loss=0.05703, simple_loss=0.1061, pruned_loss=0.00396, over 657665.00 frames. 
2024-03-16 04:31:58,278 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 04:31:58,767 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.balancer1.prob, batch_count=70980.0, ans=0.125 2024-03-16 04:32:05,436 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=70980.0, ans=0.1 2024-03-16 04:32:10,739 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=71013.33333333333, ans=0.125 2024-03-16 04:32:13,496 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=71013.33333333333, ans=0.1 2024-03-16 04:32:32,036 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward2.hidden_balancer.prob, batch_count=71046.66666666667, ans=0.125 2024-03-16 04:32:34,850 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.ff2_skip_rate, batch_count=71046.66666666667, ans=0.0 2024-03-16 04:32:46,887 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.456e+01 8.096e+01 1.078e+02 1.458e+02 3.187e+02, threshold=2.155e+02, percent-clipped=5.0 2024-03-16 04:33:05,937 INFO [train_char.py:689] (1/2) Epoch 43, batch 50, loss[loss=0.06608, simple_loss=0.1227, pruned_loss=0.004721, over 24347.00 frames. ], tot_loss[loss=0.06191, simple_loss=0.1147, pruned_loss=0.004537, over 1090710.56 frames. ], batch size: 172, lr: 7.47e-03, grad_scale: 32.0 2024-03-16 04:33:46,518 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=512, metric=13.52 vs. limit=15.0 2024-03-16 04:33:48,745 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_module1.balancer2.prob, batch_count=71213.33333333333, ans=0.125 2024-03-16 04:33:51,538 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.ff3_skip_rate, batch_count=71246.66666666667, ans=0.0 2024-03-16 04:34:16,491 INFO [train_char.py:689] (1/2) Epoch 43, batch 100, loss[loss=0.04159, simple_loss=0.0752, pruned_loss=0.00399, over 22839.00 frames. ], tot_loss[loss=0.06149, simple_loss=0.1136, pruned_loss=0.004683, over 1912783.26 frames. ], batch size: 483, lr: 7.46e-03, grad_scale: 32.0 2024-03-16 04:34:50,288 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.conv_module1.whiten, num_groups=1, num_channels=512, metric=4.43 vs. limit=15.0 2024-03-16 04:34:56,260 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module1.balancer1.prob, batch_count=71413.33333333333, ans=0.125 2024-03-16 04:35:03,833 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.462e+01 8.199e+01 1.186e+02 1.417e+02 2.618e+02, threshold=2.372e+02, percent-clipped=5.0 2024-03-16 04:35:13,313 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.ff3_skip_rate, batch_count=71446.66666666667, ans=0.0 2024-03-16 04:35:18,722 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=512, metric=4.61 vs. limit=15.0 2024-03-16 04:35:21,859 INFO [train_char.py:689] (1/2) Epoch 43, batch 150, loss[loss=0.0568, simple_loss=0.1082, pruned_loss=0.002679, over 24211.00 frames. 
], tot_loss[loss=0.06156, simple_loss=0.1138, pruned_loss=0.004668, over 2556032.19 frames. ], batch size: 122, lr: 7.45e-03, grad_scale: 32.0 2024-03-16 04:35:28,585 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass_mid.scale_min, batch_count=71480.0, ans=0.2 2024-03-16 04:35:35,286 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.nonlin_attention.balancer.max_positive, batch_count=71513.33333333333, ans=0.95 2024-03-16 04:36:33,707 INFO [train_char.py:689] (1/2) Epoch 43, batch 200, loss[loss=0.0732, simple_loss=0.1354, pruned_loss=0.005494, over 24135.00 frames. ], tot_loss[loss=0.06153, simple_loss=0.1139, pruned_loss=0.004591, over 3055930.72 frames. ], batch size: 251, lr: 7.44e-03, grad_scale: 16.0 2024-03-16 04:36:33,998 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module2.balancer1.prob, batch_count=71646.66666666667, ans=0.125 2024-03-16 04:36:40,498 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module1.balancer1.prob, batch_count=71646.66666666667, ans=0.125 2024-03-16 04:36:41,691 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.conv.8.prob, batch_count=71646.66666666667, ans=0.125 2024-03-16 04:36:41,893 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.bypass.skip_rate, batch_count=71646.66666666667, ans=0.07 2024-03-16 04:36:45,760 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.balancer2.prob, batch_count=71680.0, ans=0.125 2024-03-16 04:37:01,418 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.conv_module2.whiten, num_groups=1, num_channels=192, metric=3.04 vs. limit=15.0 2024-03-16 04:37:04,591 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=71713.33333333333, ans=0.1 2024-03-16 04:37:11,057 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module2.balancer1.max_abs, batch_count=71746.66666666667, ans=10.0 2024-03-16 04:37:20,811 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.506e+01 7.657e+01 9.834e+01 1.161e+02 2.413e+02, threshold=1.967e+02, percent-clipped=1.0 2024-03-16 04:37:35,680 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=71780.0, ans=0.1 2024-03-16 04:37:37,738 INFO [train_char.py:689] (1/2) Epoch 43, batch 250, loss[loss=0.05643, simple_loss=0.1078, pruned_loss=0.002552, over 21551.00 frames. ], tot_loss[loss=0.06143, simple_loss=0.1137, pruned_loss=0.004591, over 3442185.39 frames. 
], batch size: 86, lr: 7.44e-03, grad_scale: 16.0 2024-03-16 04:37:49,566 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.attention_skip_rate, batch_count=71846.66666666667, ans=0.0 2024-03-16 04:37:57,216 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=71846.66666666667, ans=0.0 2024-03-16 04:38:22,988 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=71913.33333333333, ans=0.125 2024-03-16 04:38:25,507 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=71913.33333333333, ans=0.125 2024-03-16 04:38:25,974 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=384, metric=3.58 vs. limit=15.0 2024-03-16 04:38:26,003 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward2.out_whiten.whitening_limit, batch_count=71913.33333333333, ans=15.0 2024-03-16 04:38:29,950 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=8.55 vs. limit=15.0 2024-03-16 04:38:39,415 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.balancer2.prob, batch_count=71946.66666666667, ans=0.125 2024-03-16 04:38:44,798 INFO [train_char.py:689] (1/2) Epoch 43, batch 300, loss[loss=0.07081, simple_loss=0.1327, pruned_loss=0.004477, over 24108.00 frames. ], tot_loss[loss=0.06157, simple_loss=0.1141, pruned_loss=0.00452, over 3747836.53 frames. ], batch size: 223, lr: 7.43e-03, grad_scale: 16.0 2024-03-16 04:38:45,029 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.1.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-16 04:38:52,202 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=71980.0, ans=0.125 2024-03-16 04:39:14,240 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_skip_rate, batch_count=72046.66666666667, ans=0.0 2024-03-16 04:39:31,890 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_skip_rate, batch_count=72080.0, ans=0.0 2024-03-16 04:39:35,330 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.018e+01 8.350e+01 1.044e+02 1.466e+02 2.517e+02, threshold=2.089e+02, percent-clipped=4.0 2024-03-16 04:39:51,839 INFO [train_char.py:689] (1/2) Epoch 43, batch 350, loss[loss=0.07444, simple_loss=0.1359, pruned_loss=0.00649, over 24095.00 frames. ], tot_loss[loss=0.0619, simple_loss=0.1147, pruned_loss=0.004552, over 3990430.23 frames. ], batch size: 199, lr: 7.42e-03, grad_scale: 16.0 2024-03-16 04:40:07,776 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.self_attn2.whiten, num_groups=1, num_channels=384, metric=16.00 vs. 
limit=22.5 2024-03-16 04:40:42,208 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.conv.8.prob, batch_count=72246.66666666667, ans=0.125 2024-03-16 04:40:53,538 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module1.balancer2.prob, batch_count=72280.0, ans=0.125 2024-03-16 04:40:58,227 INFO [train_char.py:689] (1/2) Epoch 43, batch 400, loss[loss=0.07119, simple_loss=0.1283, pruned_loss=0.007048, over 24134.00 frames. ], tot_loss[loss=0.06262, simple_loss=0.1161, pruned_loss=0.004588, over 4180133.92 frames. ], batch size: 279, lr: 7.41e-03, grad_scale: 32.0 2024-03-16 04:41:22,539 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=256, metric=14.67 vs. limit=22.5 2024-03-16 04:41:30,156 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.whiten, num_groups=1, num_channels=384, metric=3.34 vs. limit=12.0 2024-03-16 04:41:34,607 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.ff2_skip_rate, batch_count=72413.33333333333, ans=0.0 2024-03-16 04:41:37,200 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=72413.33333333333, ans=0.1 2024-03-16 04:41:44,469 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.552e+01 7.784e+01 9.416e+01 1.205e+02 2.528e+02, threshold=1.883e+02, percent-clipped=1.0 2024-03-16 04:42:03,530 INFO [train_char.py:689] (1/2) Epoch 43, batch 450, loss[loss=0.06199, simple_loss=0.1166, pruned_loss=0.003693, over 24382.00 frames. ], tot_loss[loss=0.06321, simple_loss=0.1172, pruned_loss=0.004613, over 4326757.62 frames. ], batch size: 158, lr: 7.40e-03, grad_scale: 32.0 2024-03-16 04:42:11,204 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.out_proj.dropout_p, batch_count=72480.0, ans=0.1 2024-03-16 04:42:12,939 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=8.81 vs. limit=15.0 2024-03-16 04:42:20,163 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=5.60 vs. limit=15.0 2024-03-16 04:42:33,870 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=384, metric=4.88 vs. limit=15.0 2024-03-16 04:42:44,672 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=72580.0, ans=0.1 2024-03-16 04:42:58,434 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.1.encoder.layers.1.self_attn_weights, loss-sum=0.000e+00 2024-03-16 04:43:06,838 INFO [train_char.py:689] (1/2) Epoch 43, batch 500, loss[loss=0.06379, simple_loss=0.1179, pruned_loss=0.004825, over 24156.00 frames. ], tot_loss[loss=0.06388, simple_loss=0.1184, pruned_loss=0.004694, over 4439664.24 frames. ], batch size: 188, lr: 7.40e-03, grad_scale: 32.0 2024-03-16 04:43:09,515 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward2.hidden_balancer.prob, batch_count=72646.66666666667, ans=0.125 2024-03-16 04:44:09,206 INFO [train_char.py:689] (1/2) Epoch 44, batch 0, loss[loss=0.05406, simple_loss=0.1041, pruned_loss=0.002013, over 24232.00 frames. 
], tot_loss[loss=0.05406, simple_loss=0.1041, pruned_loss=0.002013, over 24232.00 frames. ], batch size: 134, lr: 7.31e-03, grad_scale: 32.0 2024-03-16 04:44:09,206 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 04:44:22,969 INFO [train_char.py:721] (1/2) Epoch 44, validation: loss=0.05726, simple_loss=0.1064, pruned_loss=0.004049, over 657665.00 frames. 2024-03-16 04:44:22,970 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 04:44:44,550 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module1.balancer1.prob, batch_count=72703.33333333333, ans=0.125 2024-03-16 04:44:44,554 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=72703.33333333333, ans=0.1 2024-03-16 04:44:54,558 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=384, metric=4.28 vs. limit=15.0 2024-03-16 04:45:01,116 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=72736.66666666667, ans=0.125 2024-03-16 04:45:03,367 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.514e+01 7.866e+01 9.379e+01 1.188e+02 2.924e+02, threshold=1.876e+02, percent-clipped=7.0 2024-03-16 04:45:32,216 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.whiten2.whitening_limit, batch_count=72803.33333333333, ans=15.0 2024-03-16 04:45:35,505 INFO [train_char.py:689] (1/2) Epoch 44, batch 50, loss[loss=0.06122, simple_loss=0.1137, pruned_loss=0.004348, over 24385.00 frames. ], tot_loss[loss=0.06009, simple_loss=0.1106, pruned_loss=0.004784, over 1088020.84 frames. 
], batch size: 172, lr: 7.30e-03, grad_scale: 32.0 2024-03-16 04:45:35,856 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.attention_skip_rate, batch_count=72836.66666666667, ans=0.0 2024-03-16 04:45:38,668 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_skip_rate, batch_count=72836.66666666667, ans=0.0 2024-03-16 04:45:41,327 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_module2.balancer1.prob, batch_count=72836.66666666667, ans=0.125 2024-03-16 04:45:43,954 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=72836.66666666667, ans=0.1 2024-03-16 04:45:45,319 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.balancer1.prob, batch_count=72836.66666666667, ans=0.125 2024-03-16 04:45:46,649 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.attention_skip_rate, batch_count=72836.66666666667, ans=0.0 2024-03-16 04:46:07,410 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward3.hidden_balancer.prob, batch_count=72903.33333333333, ans=0.125 2024-03-16 04:46:24,532 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=72936.66666666667, ans=0.1 2024-03-16 04:46:27,213 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward3.hidden_balancer.prob, batch_count=72970.0, ans=0.125 2024-03-16 04:46:41,177 INFO [train_char.py:689] (1/2) Epoch 44, batch 100, loss[loss=0.05989, simple_loss=0.1066, pruned_loss=0.006607, over 24215.00 frames. ], tot_loss[loss=0.06056, simple_loss=0.1115, pruned_loss=0.004801, over 1914361.41 frames. 
], batch size: 344, lr: 7.30e-03, grad_scale: 32.0 2024-03-16 04:46:54,234 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_module1.balancer2.prob, batch_count=73036.66666666667, ans=0.125 2024-03-16 04:46:55,438 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward1.out_proj.dropout_p, batch_count=73036.66666666667, ans=0.1 2024-03-16 04:47:04,276 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.nonlin_attention.balancer.prob, batch_count=73036.66666666667, ans=0.125 2024-03-16 04:47:10,981 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.out_combiner.scale_min, batch_count=73070.0, ans=0.2 2024-03-16 04:47:19,291 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.038e+01 8.357e+01 1.042e+02 1.354e+02 2.440e+02, threshold=2.085e+02, percent-clipped=11.0 2024-03-16 04:47:23,379 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_proj.dropout_p, batch_count=73103.33333333333, ans=0.1 2024-03-16 04:47:37,742 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=73136.66666666667, ans=0.1 2024-03-16 04:47:42,811 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.ff2_skip_rate, batch_count=73136.66666666667, ans=0.0 2024-03-16 04:47:45,462 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.attention_skip_rate, batch_count=73136.66666666667, ans=0.0 2024-03-16 04:47:51,932 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.convnext.out_balancer.prob, batch_count=73170.0, ans=0.125 2024-03-16 04:47:52,145 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.bypass_mid.scale_min, batch_count=73170.0, ans=0.2 2024-03-16 04:47:53,101 INFO [train_char.py:689] (1/2) Epoch 44, batch 150, loss[loss=0.0684, simple_loss=0.1291, pruned_loss=0.003841, over 24131.00 frames. ], tot_loss[loss=0.06022, simple_loss=0.1108, pruned_loss=0.004822, over 2556751.49 frames. ], batch size: 279, lr: 7.29e-03, grad_scale: 32.0 2024-03-16 04:48:26,638 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module1.balancer1.prob, batch_count=73236.66666666667, ans=0.125 2024-03-16 04:48:57,173 INFO [train_char.py:689] (1/2) Epoch 44, batch 200, loss[loss=0.06909, simple_loss=0.1291, pruned_loss=0.004552, over 24161.00 frames. ], tot_loss[loss=0.06034, simple_loss=0.1113, pruned_loss=0.004668, over 3059281.02 frames. ], batch size: 199, lr: 7.28e-03, grad_scale: 32.0 2024-03-16 04:49:12,394 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=73370.0, ans=0.0 2024-03-16 04:49:19,279 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=8.36 vs. limit=15.0 2024-03-16 04:49:26,459 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward2.hidden_balancer.prob, batch_count=73403.33333333333, ans=0.125 2024-03-16 04:49:29,629 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.feed_forward2.out_whiten, num_groups=1, num_channels=192, metric=6.62 vs. 
limit=15.0 2024-03-16 04:49:35,048 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.444e+01 8.244e+01 1.074e+02 1.495e+02 2.535e+02, threshold=2.148e+02, percent-clipped=6.0 2024-03-16 04:49:41,642 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=73436.66666666667, ans=0.1 2024-03-16 04:49:53,146 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.bypass.scale_min, batch_count=73470.0, ans=0.2 2024-03-16 04:49:56,690 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_skip_rate, batch_count=73470.0, ans=0.0 2024-03-16 04:50:00,031 INFO [train_char.py:689] (1/2) Epoch 44, batch 250, loss[loss=0.07582, simple_loss=0.1428, pruned_loss=0.004429, over 24060.00 frames. ], tot_loss[loss=0.06107, simple_loss=0.1131, pruned_loss=0.004528, over 3455333.21 frames. ], batch size: 236, lr: 7.27e-03, grad_scale: 32.0 2024-03-16 04:50:00,283 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module1.balancer1.prob, batch_count=73503.33333333333, ans=0.125 2024-03-16 04:50:07,495 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.dropout.p, batch_count=73503.33333333333, ans=0.1 2024-03-16 04:51:10,171 INFO [train_char.py:689] (1/2) Epoch 44, batch 300, loss[loss=0.05953, simple_loss=0.1109, pruned_loss=0.004092, over 24294.00 frames. ], tot_loss[loss=0.06173, simple_loss=0.1143, pruned_loss=0.004586, over 3756283.14 frames. ], batch size: 146, lr: 7.27e-03, grad_scale: 32.0 2024-03-16 04:51:27,118 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_module2.balancer1.prob, batch_count=73703.33333333333, ans=0.125 2024-03-16 04:51:31,895 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=73703.33333333333, ans=0.0 2024-03-16 04:51:40,012 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=192, metric=8.49 vs. limit=15.0 2024-03-16 04:51:47,941 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.439e+01 7.602e+01 9.912e+01 1.258e+02 2.134e+02, threshold=1.982e+02, percent-clipped=0.0 2024-03-16 04:51:48,247 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward1.hidden_balancer.prob, batch_count=73770.0, ans=0.125 2024-03-16 04:52:07,526 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.balancer1.prob, batch_count=73803.33333333333, ans=0.125 2024-03-16 04:52:13,580 INFO [train_char.py:689] (1/2) Epoch 44, batch 350, loss[loss=0.07089, simple_loss=0.1319, pruned_loss=0.004955, over 24107.00 frames. ], tot_loss[loss=0.06205, simple_loss=0.1148, pruned_loss=0.004653, over 3988909.82 frames. ], batch size: 251, lr: 7.26e-03, grad_scale: 32.0 2024-03-16 04:52:28,543 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=73870.0, ans=0.0 2024-03-16 04:52:32,851 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=384, metric=7.57 vs. 
limit=10.0 2024-03-16 04:52:59,992 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.balancer1.prob, batch_count=73936.66666666667, ans=0.125 2024-03-16 04:53:06,273 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.balancer2.prob, batch_count=73936.66666666667, ans=0.125 2024-03-16 04:53:21,174 INFO [train_char.py:689] (1/2) Epoch 44, batch 400, loss[loss=0.06365, simple_loss=0.1187, pruned_loss=0.004284, over 24246.00 frames. ], tot_loss[loss=0.06287, simple_loss=0.1165, pruned_loss=0.004625, over 4176414.36 frames. ], batch size: 296, lr: 7.25e-03, grad_scale: 32.0 2024-03-16 04:53:44,056 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=74036.66666666667, ans=0.125 2024-03-16 04:53:58,878 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=5.85 vs. limit=6.0 2024-03-16 04:54:00,568 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.435e+01 7.694e+01 1.010e+02 1.300e+02 2.560e+02, threshold=2.020e+02, percent-clipped=2.0 2024-03-16 04:54:04,701 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_module2.balancer1.prob, batch_count=74103.33333333333, ans=0.125 2024-03-16 04:54:26,908 INFO [train_char.py:689] (1/2) Epoch 44, batch 450, loss[loss=0.05788, simple_loss=0.1084, pruned_loss=0.003682, over 24253.00 frames. ], tot_loss[loss=0.06358, simple_loss=0.1178, pruned_loss=0.004675, over 4321734.44 frames. ], batch size: 296, lr: 7.24e-03, grad_scale: 32.0 2024-03-16 04:54:27,631 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=5.33 vs. limit=15.0 2024-03-16 04:54:40,913 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=74203.33333333333, ans=0.125 2024-03-16 04:54:44,656 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.attention_skip_rate, batch_count=74203.33333333333, ans=0.0 2024-03-16 04:55:08,997 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.conv_module1.whiten, num_groups=1, num_channels=512, metric=4.02 vs. limit=15.0 2024-03-16 04:55:30,546 INFO [train_char.py:689] (1/2) Epoch 44, batch 500, loss[loss=0.06821, simple_loss=0.1251, pruned_loss=0.005653, over 24082.00 frames. ], tot_loss[loss=0.06441, simple_loss=0.1194, pruned_loss=0.004693, over 4434935.50 frames. ], batch size: 199, lr: 7.24e-03, grad_scale: 32.0 2024-03-16 04:56:29,229 INFO [train_char.py:689] (1/2) Epoch 45, batch 0, loss[loss=0.0646, simple_loss=0.1216, pruned_loss=0.003817, over 23860.00 frames. ], tot_loss[loss=0.0646, simple_loss=0.1216, pruned_loss=0.003817, over 23860.00 frames. ], batch size: 107, lr: 7.15e-03, grad_scale: 32.0 2024-03-16 04:56:29,230 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 04:56:43,111 INFO [train_char.py:721] (1/2) Epoch 45, validation: loss=0.05746, simple_loss=0.1068, pruned_loss=0.004041, over 657665.00 frames. 
2024-03-16 04:56:43,112 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 04:57:23,338 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.118e+01 7.188e+01 8.594e+01 1.151e+02 1.943e+02, threshold=1.719e+02, percent-clipped=0.0 2024-03-16 04:57:29,212 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_skip_rate, batch_count=74426.66666666667, ans=0.0 2024-03-16 04:57:32,787 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.feed_forward2.hidden_balancer.prob, batch_count=74460.0, ans=0.125 2024-03-16 04:57:35,933 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=5.90 vs. limit=10.0 2024-03-16 04:57:38,424 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.feed_forward3.hidden_balancer.prob, batch_count=74460.0, ans=0.125 2024-03-16 04:57:43,001 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=512, metric=18.45 vs. limit=22.5 2024-03-16 04:57:59,194 INFO [train_char.py:689] (1/2) Epoch 45, batch 50, loss[loss=0.06199, simple_loss=0.1154, pruned_loss=0.004283, over 24385.00 frames. ], tot_loss[loss=0.05958, simple_loss=0.1106, pruned_loss=0.004277, over 1081544.67 frames. ], batch size: 172, lr: 7.15e-03, grad_scale: 32.0 2024-03-16 04:58:04,725 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_module2.balancer1.prob, batch_count=74526.66666666667, ans=0.125 2024-03-16 04:58:08,653 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_module1.balancer2.prob, batch_count=74526.66666666667, ans=0.125 2024-03-16 04:58:20,264 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module1.balancer2.min_abs, batch_count=74560.0, ans=0.5 2024-03-16 04:58:23,018 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.ff3_skip_rate, batch_count=74560.0, ans=0.0 2024-03-16 04:58:29,893 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=512, metric=3.36 vs. limit=15.0 2024-03-16 04:59:03,973 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.balancer2.prob, batch_count=74660.0, ans=0.125 2024-03-16 04:59:10,219 INFO [train_char.py:689] (1/2) Epoch 45, batch 100, loss[loss=0.06311, simple_loss=0.1167, pruned_loss=0.004778, over 24445.00 frames. ], tot_loss[loss=0.06036, simple_loss=0.1119, pruned_loss=0.004427, over 1907899.12 frames. ], batch size: 165, lr: 7.14e-03, grad_scale: 32.0 2024-03-16 04:59:32,667 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=6.72 vs. 
limit=15.0 2024-03-16 04:59:45,007 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.680e+01 7.803e+01 9.889e+01 1.283e+02 2.570e+02, threshold=1.978e+02, percent-clipped=8.0 2024-03-16 04:59:45,375 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.3.encoder.layers.3.self_attn_weights, loss-sum=0.000e+00 2024-03-16 05:00:14,776 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.3.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-16 05:00:19,664 INFO [train_char.py:689] (1/2) Epoch 45, batch 150, loss[loss=0.06212, simple_loss=0.1107, pruned_loss=0.006754, over 24174.00 frames. ], tot_loss[loss=0.06076, simple_loss=0.1125, pruned_loss=0.004502, over 2552330.47 frames. ], batch size: 344, lr: 7.13e-03, grad_scale: 32.0 2024-03-16 05:00:20,595 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=11.99 vs. limit=15.0 2024-03-16 05:00:23,828 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.bypass.scale_min, batch_count=74860.0, ans=0.2 2024-03-16 05:00:29,062 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=74860.0, ans=0.1 2024-03-16 05:00:39,314 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.nonlin_attention.balancer.max_positive, batch_count=74893.33333333333, ans=0.95 2024-03-16 05:00:46,641 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=9.35 vs. limit=15.0 2024-03-16 05:01:01,218 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.out_combiner.scale_min, batch_count=74960.0, ans=0.2 2024-03-16 05:01:08,999 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.balancer1.prob, batch_count=74960.0, ans=0.125 2024-03-16 05:01:09,019 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.bypass.skip_rate, batch_count=74960.0, ans=0.07 2024-03-16 05:01:16,611 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.attention_skip_rate, batch_count=74993.33333333333, ans=0.0 2024-03-16 05:01:23,918 INFO [train_char.py:689] (1/2) Epoch 45, batch 200, loss[loss=0.05818, simple_loss=0.1109, pruned_loss=0.002717, over 24417.00 frames. ], tot_loss[loss=0.06093, simple_loss=0.1131, pruned_loss=0.004377, over 3054206.53 frames. ], batch size: 158, lr: 7.12e-03, grad_scale: 32.0 2024-03-16 05:01:30,654 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward1.out_proj.dropout_p, batch_count=75026.66666666667, ans=0.1 2024-03-16 05:01:40,294 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=256, metric=18.67 vs. 
limit=22.5 2024-03-16 05:01:53,222 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 4.313e+01 8.035e+01 1.015e+02 1.442e+02 2.562e+02, threshold=2.030e+02, percent-clipped=7.0 2024-03-16 05:01:54,032 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward1.out_whiten.whitening_limit, batch_count=75093.33333333333, ans=15.0 2024-03-16 05:01:54,730 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.balancer1.prob, batch_count=75093.33333333333, ans=0.125 2024-03-16 05:01:54,784 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=75093.33333333333, ans=0.125 2024-03-16 05:02:00,120 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.ff2_skip_rate, batch_count=75093.33333333333, ans=0.0 2024-03-16 05:02:06,417 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module2.balancer1.prob, batch_count=75126.66666666667, ans=0.125 2024-03-16 05:02:24,836 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn2.whiten, num_groups=1, num_channels=512, metric=15.43 vs. limit=22.5 2024-03-16 05:02:28,262 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_skip_rate, batch_count=75160.0, ans=0.0 2024-03-16 05:02:31,967 INFO [train_char.py:689] (1/2) Epoch 45, batch 250, loss[loss=0.05956, simple_loss=0.1058, pruned_loss=0.006668, over 24218.00 frames. ], tot_loss[loss=0.06016, simple_loss=0.1116, pruned_loss=0.004367, over 3442790.99 frames. ], batch size: 328, lr: 7.12e-03, grad_scale: 32.0 2024-03-16 05:02:45,560 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_skip_rate, batch_count=75193.33333333333, ans=0.0 2024-03-16 05:02:48,052 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_module1.balancer2.prob, batch_count=75226.66666666667, ans=0.125 2024-03-16 05:02:58,354 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.attention_skip_rate, batch_count=75226.66666666667, ans=0.0 2024-03-16 05:03:09,738 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=75260.0, ans=0.125 2024-03-16 05:03:31,826 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.ff2_skip_rate, batch_count=75326.66666666667, ans=0.0 2024-03-16 05:03:39,171 INFO [train_char.py:689] (1/2) Epoch 45, batch 300, loss[loss=0.0566, simple_loss=0.1048, pruned_loss=0.004209, over 24194.00 frames. ], tot_loss[loss=0.06053, simple_loss=0.1123, pruned_loss=0.004382, over 3753273.14 frames. ], batch size: 344, lr: 7.11e-03, grad_scale: 32.0 2024-03-16 05:03:50,996 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.self_attn_weights.pos_emb_skip_rate, batch_count=75393.33333333333, ans=0.0 2024-03-16 05:04:08,439 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.116e+01 8.004e+01 1.023e+02 1.414e+02 2.411e+02, threshold=2.046e+02, percent-clipped=6.0 2024-03-16 05:04:09,929 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.dropout.p, batch_count=75426.66666666667, ans=0.1 2024-03-16 05:04:34,252 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=11.03 vs. 
limit=15.0 2024-03-16 05:04:47,039 INFO [train_char.py:689] (1/2) Epoch 45, batch 350, loss[loss=0.05361, simple_loss=0.09544, pruned_loss=0.005884, over 24008.00 frames. ], tot_loss[loss=0.06145, simple_loss=0.1139, pruned_loss=0.004501, over 3993620.57 frames. ], batch size: 381, lr: 7.10e-03, grad_scale: 32.0 2024-03-16 05:04:47,285 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.2.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-16 05:04:57,154 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=75526.66666666667, ans=0.1 2024-03-16 05:05:01,675 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.conv_module2.whiten, num_groups=1, num_channels=192, metric=4.18 vs. limit=15.0 2024-03-16 05:05:05,255 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=512, metric=12.25 vs. limit=22.5 2024-03-16 05:05:15,976 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=75593.33333333333, ans=0.1 2024-03-16 05:05:19,674 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=75593.33333333333, ans=0.1 2024-03-16 05:05:29,995 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.self_attn1.whiten, num_groups=1, num_channels=384, metric=14.15 vs. limit=22.5 2024-03-16 05:05:35,719 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.conv.2.prob, batch_count=75660.0, ans=0.125 2024-03-16 05:05:35,808 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_module2.balancer2.prob, batch_count=75660.0, ans=0.125 2024-03-16 05:05:39,522 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=75660.0, ans=0.1 2024-03-16 05:05:49,110 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.92 vs. limit=6.0 2024-03-16 05:05:49,528 INFO [train_char.py:689] (1/2) Epoch 45, batch 400, loss[loss=0.05982, simple_loss=0.1131, pruned_loss=0.003276, over 24314.00 frames. ], tot_loss[loss=0.06197, simple_loss=0.115, pruned_loss=0.004485, over 4181183.06 frames. ], batch size: 297, lr: 7.10e-03, grad_scale: 32.0 2024-03-16 05:06:21,139 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.459e+01 7.661e+01 9.898e+01 1.194e+02 6.153e+02, threshold=1.980e+02, percent-clipped=3.0 2024-03-16 05:06:31,502 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=256, metric=7.86 vs. limit=15.0 2024-03-16 05:06:31,602 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.conv_module1.whiten, num_groups=1, num_channels=192, metric=7.01 vs. limit=15.0 2024-03-16 05:06:46,583 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=8.58 vs. 
limit=10.0 2024-03-16 05:06:49,957 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=75826.66666666667, ans=0.1 2024-03-16 05:06:54,705 INFO [train_char.py:689] (1/2) Epoch 45, batch 450, loss[loss=0.05722, simple_loss=0.1027, pruned_loss=0.005851, over 24157.00 frames. ], tot_loss[loss=0.06293, simple_loss=0.1168, pruned_loss=0.004515, over 4326591.14 frames. ], batch size: 362, lr: 7.09e-03, grad_scale: 32.0 2024-03-16 05:07:13,604 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.ff3_skip_rate, batch_count=75893.33333333333, ans=0.0 2024-03-16 05:07:47,971 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward3.hidden_balancer.prob, batch_count=75993.33333333333, ans=0.125 2024-03-16 05:07:50,895 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=75993.33333333333, ans=0.125 2024-03-16 05:07:59,731 INFO [train_char.py:689] (1/2) Epoch 45, batch 500, loss[loss=0.06678, simple_loss=0.1287, pruned_loss=0.002435, over 24160.00 frames. ], tot_loss[loss=0.0639, simple_loss=0.1187, pruned_loss=0.004561, over 4438392.51 frames. ], batch size: 236, lr: 7.08e-03, grad_scale: 32.0 2024-03-16 05:08:54,226 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.self_attn_weights.pos_emb_skip_rate, batch_count=76050.0, ans=0.0 2024-03-16 05:08:57,866 INFO [train_char.py:689] (1/2) Epoch 46, batch 0, loss[loss=0.06414, simple_loss=0.1189, pruned_loss=0.004695, over 24090.00 frames. ], tot_loss[loss=0.06414, simple_loss=0.1189, pruned_loss=0.004695, over 24090.00 frames. ], batch size: 188, lr: 7.00e-03, grad_scale: 32.0 2024-03-16 05:08:57,867 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 05:09:10,377 INFO [zipformer.py:1858] (1/2) name=encoder.encoders.4.encoder.layers.0.self_attn_weights, attn_weights_entropy = tensor([3.9692, 3.5022, 3.6223, 3.4889], device='cuda:1') 2024-03-16 05:09:11,754 INFO [train_char.py:721] (1/2) Epoch 46, validation: loss=0.05697, simple_loss=0.106, pruned_loss=0.003992, over 657665.00 frames. 
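Note on reading the loss records above: each loss[...] block is the average per-frame loss of the current batch (reported "over" that batch's frame count), while tot_loss[...] is a frame-weighted moving average whose tally decays geometrically, which is why its "over N frames" figure climbs from ~2.5M at batch 150 toward ~4.4M by batch 500, and why the validation loss is always reported over the same 657665 frames (the full dev set). Below is a minimal sketch of bookkeeping consistent with those numbers, assuming the decay is tied to the reset_interval=200 from the hyperparameter dump at the top of this log; the class name and structure are an assumed reconstruction, not the code in train_char.py.

```python
class RunningLoss:
    """Sketch of a frame-weighted moving average of the training loss."""

    def __init__(self, reset_interval: int = 200):
        self.alpha = 1.0 - 1.0 / reset_interval  # per-batch decay factor
        self.loss_sum = 0.0  # decayed sum of (per-frame loss * frames)
        self.frames = 0.0    # decayed frame count: the "over N frames" figure

    def update(self, batch_loss: float, batch_frames: float) -> float:
        # batch_loss is the batch's average per-frame loss, as in loss[...]
        self.loss_sum = self.alpha * self.loss_sum + batch_loss * batch_frames
        self.frames = self.alpha * self.frames + batch_frames
        return self.loss_sum / self.frames  # reported as tot_loss[loss=...]
```

With batches of roughly 24k frames this average saturates near 200 × 24k ≈ 4.8M frames, which matches the logged totals approaching 4.4M by batch 500.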
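Similarly, the scaling.py:214 records print ScheduledFloat values: hyperparameters such as dropout probabilities, skip rates, and balancer limits that are functions of batch_count rather than constants, with ans= giving the value currently in effect. A sketch of one way such a schedule can work, assuming piecewise-linear interpolation between (batch_count, value) breakpoints; the class name and the example breakpoints are illustrative assumptions, not the actual scaling.py implementation:

```python
class PiecewiseLinearSchedule:
    """Sketch of a batch-count-keyed schedule: linear interpolation
    between breakpoints, clamped at both ends."""

    def __init__(self, *points):
        # points: (batch_count, value) pairs
        self.points = sorted(points)

    def value(self, batch_count: float) -> float:
        pts = self.points
        if batch_count <= pts[0][0]:
            return pts[0][1]
        for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
            if batch_count <= x1:
                t = (batch_count - x0) / (x1 - x0)
                return y0 + t * (y1 - y0)
        return pts[-1][1]

# Hypothetical breakpoints: a dropout probability decaying from 0.3 to 0.1
# over the first 20k batches, then held constant -- consistent with the
# dropout_p records above showing ans=0.1 at batch_count ~75-76k.
dropout_p = PiecewiseLinearSchedule((0, 0.3), (20000, 0.1))
print(dropout_p.value(75860.0))  # -> 0.1
```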
2024-03-16 05:09:11,755 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 05:09:20,615 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_module2.balancer1.prob, batch_count=76050.0, ans=0.125 2024-03-16 05:09:24,361 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_module1.balancer1.prob, batch_count=76050.0, ans=0.125 2024-03-16 05:09:29,817 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward1.out_proj.dropout_p, batch_count=76083.33333333333, ans=0.1 2024-03-16 05:09:29,905 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=76083.33333333333, ans=0.125 2024-03-16 05:09:33,897 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=76083.33333333333, ans=0.1 2024-03-16 05:09:35,170 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=76083.33333333333, ans=0.0 2024-03-16 05:09:37,490 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.078e+01 7.476e+01 1.000e+02 1.292e+02 3.318e+02, threshold=2.000e+02, percent-clipped=4.0 2024-03-16 05:09:45,807 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_skip_rate, batch_count=76116.66666666667, ans=0.0 2024-03-16 05:10:23,229 INFO [train_char.py:689] (1/2) Epoch 46, batch 50, loss[loss=0.06271, simple_loss=0.1159, pruned_loss=0.004761, over 24290.00 frames. ], tot_loss[loss=0.06092, simple_loss=0.1128, pruned_loss=0.004526, over 1085926.58 frames. ], batch size: 296, lr: 7.00e-03, grad_scale: 32.0 2024-03-16 05:11:24,749 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.5.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-16 05:11:28,373 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module2.balancer1.prob, batch_count=76350.0, ans=0.125 2024-03-16 05:11:36,013 INFO [train_char.py:689] (1/2) Epoch 46, batch 100, loss[loss=0.07208, simple_loss=0.1342, pruned_loss=0.004961, over 24214.00 frames. ], tot_loss[loss=0.06139, simple_loss=0.114, pruned_loss=0.004408, over 1910740.29 frames. ], batch size: 224, lr: 6.99e-03, grad_scale: 32.0 2024-03-16 05:11:49,094 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.attention_skip_rate, batch_count=76416.66666666667, ans=0.0 2024-03-16 05:11:50,478 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.attention_skip_rate, batch_count=76416.66666666667, ans=0.0 2024-03-16 05:11:56,609 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.653e+01 7.572e+01 9.774e+01 1.307e+02 2.580e+02, threshold=1.955e+02, percent-clipped=6.0 2024-03-16 05:12:09,648 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.bypass.skip_rate, batch_count=76450.0, ans=0.07 2024-03-16 05:12:13,911 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.self_attn2.whiten, num_groups=1, num_channels=384, metric=16.55 vs. 
limit=22.5 2024-03-16 05:12:21,104 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module2.balancer1.prob, batch_count=76483.33333333333, ans=0.125 2024-03-16 05:12:31,320 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=76516.66666666667, ans=0.125 2024-03-16 05:12:39,924 INFO [train_char.py:689] (1/2) Epoch 46, batch 150, loss[loss=0.05589, simple_loss=0.105, pruned_loss=0.003417, over 24366.00 frames. ], tot_loss[loss=0.06111, simple_loss=0.1136, pruned_loss=0.004328, over 2551249.52 frames. ], batch size: 129, lr: 6.98e-03, grad_scale: 32.0 2024-03-16 05:12:51,857 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.nonlin_attention.balancer.max_positive, batch_count=76583.33333333333, ans=0.95 2024-03-16 05:12:53,123 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module1.balancer2.prob, batch_count=76583.33333333333, ans=0.125 2024-03-16 05:13:03,633 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.self_attn_weights.pos_emb_skip_rate, batch_count=76583.33333333333, ans=0.0 2024-03-16 05:13:09,955 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.bypass.scale_min, batch_count=76616.66666666667, ans=0.2 2024-03-16 05:13:15,920 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.self_attn_weights.pos_emb_skip_rate, batch_count=76616.66666666667, ans=0.0 2024-03-16 05:13:25,826 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.self_attn2.whiten, num_groups=1, num_channels=384, metric=19.44 vs. limit=22.5 2024-03-16 05:13:34,666 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=8.04 vs. limit=15.0 2024-03-16 05:13:47,787 INFO [train_char.py:689] (1/2) Epoch 46, batch 200, loss[loss=0.06957, simple_loss=0.1302, pruned_loss=0.004497, over 24192.00 frames. ], tot_loss[loss=0.06063, simple_loss=0.1127, pruned_loss=0.004287, over 3051536.40 frames. ], batch size: 188, lr: 6.98e-03, grad_scale: 32.0 2024-03-16 05:13:48,013 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.bypass.scale_min, batch_count=76716.66666666667, ans=0.2 2024-03-16 05:13:52,623 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=11.09 vs. limit=15.0 2024-03-16 05:13:58,412 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module1.balancer2.prob, batch_count=76716.66666666667, ans=0.125 2024-03-16 05:14:02,248 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_skip_rate, batch_count=76750.0, ans=0.0 2024-03-16 05:14:02,810 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.self_attn2.whiten, num_groups=1, num_channels=384, metric=16.90 vs. 
limit=22.5 2024-03-16 05:14:07,201 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.ff3_skip_rate, batch_count=76750.0, ans=0.0 2024-03-16 05:14:09,636 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 6.091e+01 8.061e+01 9.831e+01 1.300e+02 2.584e+02, threshold=1.966e+02, percent-clipped=3.0 2024-03-16 05:14:32,618 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.balancer2.prob, batch_count=76816.66666666667, ans=0.125 2024-03-16 05:14:35,305 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.self_attn2.whiten, num_groups=1, num_channels=256, metric=16.23 vs. limit=22.5 2024-03-16 05:14:42,702 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward2.hidden_balancer.prob, batch_count=76850.0, ans=0.125 2024-03-16 05:14:49,019 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module2.balancer2.min_abs, batch_count=76850.0, ans=0.5 2024-03-16 05:14:52,618 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module1.balancer1.prob, batch_count=76850.0, ans=0.125 2024-03-16 05:14:55,026 INFO [train_char.py:689] (1/2) Epoch 46, batch 250, loss[loss=0.07175, simple_loss=0.1344, pruned_loss=0.004523, over 24155.00 frames. ], tot_loss[loss=0.061, simple_loss=0.1133, pruned_loss=0.004325, over 3441966.06 frames. ], batch size: 223, lr: 6.97e-03, grad_scale: 16.0 2024-03-16 05:14:56,619 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.balancer.prob, batch_count=76883.33333333333, ans=0.125 2024-03-16 05:14:58,036 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=76883.33333333333, ans=0.0 2024-03-16 05:15:10,721 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=76916.66666666667, ans=0.1 2024-03-16 05:15:22,971 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=8.20 vs. limit=15.0 2024-03-16 05:15:30,933 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_skip_rate, batch_count=76950.0, ans=0.0 2024-03-16 05:15:48,649 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=77016.66666666667, ans=0.0 2024-03-16 05:15:56,273 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.nonlin_attention.balancer.prob, batch_count=77016.66666666667, ans=0.125 2024-03-16 05:15:58,641 INFO [train_char.py:689] (1/2) Epoch 46, batch 300, loss[loss=0.06273, simple_loss=0.1183, pruned_loss=0.003591, over 24255.00 frames. ], tot_loss[loss=0.06141, simple_loss=0.1142, pruned_loss=0.004289, over 3747417.81 frames. 
], batch size: 296, lr: 6.96e-03, grad_scale: 16.0 2024-03-16 05:16:17,802 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module1.balancer1.prob, batch_count=77083.33333333333, ans=0.125 2024-03-16 05:16:23,018 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_module2.balancer2.prob, batch_count=77083.33333333333, ans=0.125 2024-03-16 05:16:23,745 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 4.270e+01 7.466e+01 9.405e+01 1.135e+02 2.803e+02, threshold=1.881e+02, percent-clipped=4.0 2024-03-16 05:16:48,628 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=256, metric=10.80 vs. limit=15.0 2024-03-16 05:16:51,981 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_module2.balancer1.prob, batch_count=77183.33333333333, ans=0.125 2024-03-16 05:16:55,431 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module1.balancer2.prob, batch_count=77183.33333333333, ans=0.125 2024-03-16 05:17:07,717 INFO [train_char.py:689] (1/2) Epoch 46, batch 350, loss[loss=0.05255, simple_loss=0.09443, pruned_loss=0.00534, over 23934.00 frames. ], tot_loss[loss=0.06197, simple_loss=0.1153, pruned_loss=0.004313, over 3989026.69 frames. ], batch size: 407, lr: 6.95e-03, grad_scale: 16.0 2024-03-16 05:17:13,080 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.attention_skip_rate, batch_count=77216.66666666667, ans=0.0 2024-03-16 05:17:41,633 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=13.15 vs. limit=15.0 2024-03-16 05:17:56,570 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.bypass_mid.scale_min, batch_count=77316.66666666667, ans=0.2 2024-03-16 05:18:13,739 INFO [train_char.py:689] (1/2) Epoch 46, batch 400, loss[loss=0.06036, simple_loss=0.1018, pruned_loss=0.009481, over 23988.00 frames. ], tot_loss[loss=0.06224, simple_loss=0.1157, pruned_loss=0.004376, over 4177098.97 frames. ], batch size: 381, lr: 6.95e-03, grad_scale: 32.0 2024-03-16 05:18:36,545 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.580e+01 8.290e+01 9.923e+01 1.317e+02 2.697e+02, threshold=1.985e+02, percent-clipped=6.0 2024-03-16 05:18:45,717 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_module2.balancer1.prob, batch_count=77450.0, ans=0.125 2024-03-16 05:18:48,053 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.ff2_skip_rate, batch_count=77450.0, ans=0.0 2024-03-16 05:18:56,662 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_skip_rate, batch_count=77483.33333333333, ans=0.0 2024-03-16 05:19:07,234 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=256, metric=8.29 vs. limit=15.0 2024-03-16 05:19:18,479 INFO [train_char.py:689] (1/2) Epoch 46, batch 450, loss[loss=0.06978, simple_loss=0.1288, pruned_loss=0.005384, over 24195.00 frames. ], tot_loss[loss=0.06285, simple_loss=0.1168, pruned_loss=0.004447, over 4324463.32 frames. 
], batch size: 266, lr: 6.94e-03, grad_scale: 32.0 2024-03-16 05:19:31,044 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.ff2_skip_rate, batch_count=77583.33333333333, ans=0.0 2024-03-16 05:19:35,223 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=12.06 vs. limit=15.0 2024-03-16 05:19:47,951 INFO [scaling.py:1023] (1/2) Whitening: name=encoder_embed.convnext.out_whiten, num_groups=1, num_channels=128, metric=4.43 vs. limit=5.0 2024-03-16 05:19:48,484 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.conv_module1.balancer2.min_abs, batch_count=77616.66666666667, ans=0.5 2024-03-16 05:20:04,036 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.4.encoder.layers.1.self_attn_weights, loss-sum=0.000e+00 2024-03-16 05:20:04,315 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=256, metric=5.68 vs. limit=15.0 2024-03-16 05:20:15,295 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=256, metric=10.18 vs. limit=22.5 2024-03-16 05:20:22,005 INFO [train_char.py:689] (1/2) Epoch 46, batch 500, loss[loss=0.06978, simple_loss=0.1323, pruned_loss=0.003647, over 24074.00 frames. ], tot_loss[loss=0.0638, simple_loss=0.1187, pruned_loss=0.004444, over 4437161.77 frames. ], batch size: 236, lr: 6.93e-03, grad_scale: 16.0 2024-03-16 05:21:21,562 INFO [train_char.py:689] (1/2) Epoch 47, batch 0, loss[loss=0.05251, simple_loss=0.09677, pruned_loss=0.004124, over 24016.00 frames. ], tot_loss[loss=0.05251, simple_loss=0.09677, pruned_loss=0.004124, over 24016.00 frames. ], batch size: 381, lr: 6.86e-03, grad_scale: 32.0 2024-03-16 05:21:21,562 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 05:21:35,117 INFO [train_char.py:721] (1/2) Epoch 47, validation: loss=0.05658, simple_loss=0.1057, pruned_loss=0.003729, over 657665.00 frames. 2024-03-16 05:21:35,118 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 05:21:39,563 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=77740.0, ans=0.1 2024-03-16 05:21:39,590 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward3.hidden_balancer.prob, batch_count=77740.0, ans=0.125 2024-03-16 05:21:41,704 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=192, metric=8.82 vs. limit=15.0 2024-03-16 05:21:50,206 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.425e+01 7.279e+01 8.678e+01 1.182e+02 2.496e+02, threshold=1.736e+02, percent-clipped=5.0 2024-03-16 05:22:01,102 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.bypass_mid.scale_min, batch_count=77806.66666666667, ans=0.2 2024-03-16 05:22:23,479 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=77840.0, ans=0.125 2024-03-16 05:22:44,907 INFO [train_char.py:689] (1/2) Epoch 47, batch 50, loss[loss=0.07537, simple_loss=0.1376, pruned_loss=0.006552, over 24093.00 frames. ], tot_loss[loss=0.06046, simple_loss=0.1125, pruned_loss=0.004195, over 1090334.69 frames. 
], batch size: 223, lr: 6.85e-03, grad_scale: 32.0 2024-03-16 05:22:48,024 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_module1.balancer1.min_positive, batch_count=77906.66666666667, ans=0.025 2024-03-16 05:22:48,506 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=384, metric=19.95 vs. limit=22.5 2024-03-16 05:22:51,328 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=256, metric=9.57 vs. limit=15.0 2024-03-16 05:23:02,760 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=77940.0, ans=0.1 2024-03-16 05:23:08,288 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.attention_skip_rate, batch_count=77940.0, ans=0.0 2024-03-16 05:23:18,795 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.conv.8.prob, batch_count=77973.33333333333, ans=0.125 2024-03-16 05:23:22,639 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward1.hidden_balancer.prob, batch_count=77973.33333333333, ans=0.125 2024-03-16 05:23:22,697 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module1.balancer1.prob, batch_count=77973.33333333333, ans=0.125 2024-03-16 05:23:27,149 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=10.51 vs. limit=15.0 2024-03-16 05:23:32,624 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.28 vs. limit=6.0 2024-03-16 05:23:34,687 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_skip_rate, batch_count=78006.66666666667, ans=0.0 2024-03-16 05:23:44,011 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=7.18 vs. limit=15.0 2024-03-16 05:23:48,633 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module1.balancer2.prob, batch_count=78040.0, ans=0.125 2024-03-16 05:23:56,059 INFO [train_char.py:689] (1/2) Epoch 47, batch 100, loss[loss=0.06354, simple_loss=0.1222, pruned_loss=0.002438, over 24130.00 frames. ], tot_loss[loss=0.0596, simple_loss=0.111, pruned_loss=0.004107, over 1917303.85 frames. ], batch size: 188, lr: 6.84e-03, grad_scale: 32.0 2024-03-16 05:23:59,650 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.whiten, num_groups=1, num_channels=256, metric=4.57 vs. 
limit=12.0 2024-03-16 05:24:10,164 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 4.889e+01 7.269e+01 9.818e+01 1.349e+02 2.367e+02, threshold=1.964e+02, percent-clipped=9.0 2024-03-16 05:24:32,400 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=78140.0, ans=0.125 2024-03-16 05:24:33,612 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=78173.33333333333, ans=0.1 2024-03-16 05:24:33,664 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=78173.33333333333, ans=0.0 2024-03-16 05:24:47,130 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.balancer1.prob, batch_count=78173.33333333333, ans=0.125 2024-03-16 05:24:57,519 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module1.balancer2.min_abs, batch_count=78206.66666666667, ans=0.5 2024-03-16 05:25:03,873 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=78240.0, ans=0.125 2024-03-16 05:25:04,017 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.ff2_skip_rate, batch_count=78240.0, ans=0.0 2024-03-16 05:25:04,366 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.whiten, num_groups=1, num_channels=512, metric=5.73 vs. limit=12.0 2024-03-16 05:25:04,862 INFO [train_char.py:689] (1/2) Epoch 47, batch 150, loss[loss=0.07214, simple_loss=0.1352, pruned_loss=0.004544, over 24224.00 frames. ], tot_loss[loss=0.05939, simple_loss=0.1109, pruned_loss=0.003932, over 2558366.76 frames. ], batch size: 212, lr: 6.84e-03, grad_scale: 32.0 2024-03-16 05:25:06,417 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module1.balancer1.prob, batch_count=78240.0, ans=0.125 2024-03-16 05:25:34,672 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.out_combiner.scale_min, batch_count=78306.66666666667, ans=0.2 2024-03-16 05:25:35,298 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=8.21 vs. limit=10.0 2024-03-16 05:25:41,528 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.feed_forward3.out_whiten, num_groups=1, num_channels=256, metric=23.82 vs. limit=15.0 2024-03-16 05:25:42,476 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.balancer_ff3.min_abs, batch_count=78340.0, ans=0.2 2024-03-16 05:26:08,651 INFO [train_char.py:689] (1/2) Epoch 47, batch 200, loss[loss=0.0553, simple_loss=0.1042, pruned_loss=0.003198, over 24356.00 frames. ], tot_loss[loss=0.06047, simple_loss=0.1127, pruned_loss=0.004117, over 3061729.50 frames. 
], batch size: 129, lr: 6.83e-03, grad_scale: 32.0 2024-03-16 05:26:22,776 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.616e+01 8.335e+01 9.932e+01 1.326e+02 2.634e+02, threshold=1.986e+02, percent-clipped=5.0 2024-03-16 05:26:24,301 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward1.out_proj.dropout_p, batch_count=78440.0, ans=0.1 2024-03-16 05:26:37,573 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward2.out_whiten, num_groups=1, num_channels=512, metric=11.00 vs. limit=15.0 2024-03-16 05:26:42,119 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.bypass.scale_min, batch_count=78473.33333333333, ans=0.2 2024-03-16 05:26:50,507 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.feed_forward1.out_whiten, num_groups=1, num_channels=256, metric=13.00 vs. limit=15.0 2024-03-16 05:27:00,157 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward1.hidden_balancer.prob, batch_count=78506.66666666667, ans=0.125 2024-03-16 05:27:16,579 INFO [train_char.py:689] (1/2) Epoch 47, batch 250, loss[loss=0.064, simple_loss=0.1186, pruned_loss=0.0047, over 21540.00 frames. ], tot_loss[loss=0.0604, simple_loss=0.1126, pruned_loss=0.004102, over 3450469.97 frames. ], batch size: 85, lr: 6.82e-03, grad_scale: 32.0 2024-03-16 05:27:32,119 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.0.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-16 05:27:47,457 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.balancer1.prob, batch_count=78640.0, ans=0.125 2024-03-16 05:27:48,908 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.5.encoder.layers.1.self_attn_weights, loss-sum=0.000e+00 2024-03-16 05:27:59,447 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass.scale_min, batch_count=78673.33333333333, ans=0.2 2024-03-16 05:28:18,288 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module1.balancer1.prob, batch_count=78706.66666666667, ans=0.125 2024-03-16 05:28:24,298 INFO [train_char.py:689] (1/2) Epoch 47, batch 300, loss[loss=0.05821, simple_loss=0.1107, pruned_loss=0.002854, over 24358.00 frames. ], tot_loss[loss=0.06087, simple_loss=0.1136, pruned_loss=0.004082, over 3749971.29 frames. ], batch size: 172, lr: 6.82e-03, grad_scale: 32.0 2024-03-16 05:28:28,403 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.out_combiner.scale_min, batch_count=78740.0, ans=0.2 2024-03-16 05:28:38,181 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.527e+01 7.236e+01 9.244e+01 1.311e+02 2.742e+02, threshold=1.849e+02, percent-clipped=4.0 2024-03-16 05:28:49,870 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.bypass.scale_min, batch_count=78773.33333333333, ans=0.2 2024-03-16 05:28:51,533 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.whiten1, num_groups=1, num_channels=288, metric=7.46 vs. limit=10.0 2024-03-16 05:29:03,499 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward2.hidden_balancer.prob, batch_count=78840.0, ans=0.125 2024-03-16 05:29:11,505 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.feed_forward2.out_whiten, num_groups=1, num_channels=384, metric=10.34 vs. 
limit=15.0 2024-03-16 05:29:21,162 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=78873.33333333333, ans=0.1 2024-03-16 05:29:29,720 INFO [train_char.py:689] (1/2) Epoch 47, batch 350, loss[loss=0.05716, simple_loss=0.1086, pruned_loss=0.002864, over 24369.00 frames. ], tot_loss[loss=0.06085, simple_loss=0.1135, pruned_loss=0.004093, over 3989456.81 frames. ], batch size: 158, lr: 6.81e-03, grad_scale: 32.0 2024-03-16 05:29:34,191 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.conv_module1.whiten, num_groups=1, num_channels=384, metric=2.91 vs. limit=15.0 2024-03-16 05:29:47,493 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_module2.balancer2.prob, batch_count=78940.0, ans=0.125 2024-03-16 05:30:10,776 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=79006.66666666667, ans=0.125 2024-03-16 05:30:21,956 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.3.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00 2024-03-16 05:30:29,109 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=11.84 vs. limit=15.0 2024-03-16 05:30:33,519 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.bypass.skip_rate, batch_count=79040.0, ans=0.09899494936611666 2024-03-16 05:30:35,066 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=384, metric=13.11 vs. limit=22.5 2024-03-16 05:30:36,856 INFO [train_char.py:689] (1/2) Epoch 47, batch 400, loss[loss=0.06659, simple_loss=0.1242, pruned_loss=0.004486, over 24353.00 frames. ], tot_loss[loss=0.06127, simple_loss=0.1142, pruned_loss=0.004166, over 4177963.97 frames. ], batch size: 180, lr: 6.81e-03, grad_scale: 32.0 2024-03-16 05:30:41,017 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.attention_skip_rate, batch_count=79073.33333333333, ans=0.0 2024-03-16 05:30:50,569 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.395e+01 8.041e+01 9.960e+01 1.452e+02 2.854e+02, threshold=1.992e+02, percent-clipped=5.0 2024-03-16 05:31:13,940 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=256, metric=15.75 vs. limit=22.5 2024-03-16 05:31:23,387 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=79173.33333333333, ans=0.0 2024-03-16 05:31:23,427 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_module2.balancer1.prob, batch_count=79173.33333333333, ans=0.125 2024-03-16 05:31:29,623 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module2.balancer2.prob, batch_count=79206.66666666667, ans=0.125 2024-03-16 05:31:36,115 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.1.self_attn1.whiten, num_groups=1, num_channels=256, metric=10.06 vs. limit=22.5 2024-03-16 05:31:41,516 INFO [train_char.py:689] (1/2) Epoch 47, batch 450, loss[loss=0.06934, simple_loss=0.1286, pruned_loss=0.005023, over 24225.00 frames. ], tot_loss[loss=0.06247, simple_loss=0.1165, pruned_loss=0.004242, over 4326031.91 frames. 
], batch size: 212, lr: 6.80e-03, grad_scale: 16.0 2024-03-16 05:31:58,072 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.hidden_balancer.prob, batch_count=79273.33333333333, ans=0.125 2024-03-16 05:32:12,713 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.attention_skip_rate, batch_count=79306.66666666667, ans=0.0 2024-03-16 05:32:16,634 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass.scale_min, batch_count=79306.66666666667, ans=0.2 2024-03-16 05:32:20,769 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=79340.0, ans=0.0 2024-03-16 05:32:30,605 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.attention_skip_rate, batch_count=79340.0, ans=0.0 2024-03-16 05:32:31,873 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module2.balancer1.prob, batch_count=79373.33333333333, ans=0.125 2024-03-16 05:32:36,846 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.conv.2.prob, batch_count=79373.33333333333, ans=0.125 2024-03-16 05:32:38,230 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.balancer2.prob, batch_count=79373.33333333333, ans=0.125 2024-03-16 05:32:45,858 INFO [train_char.py:689] (1/2) Epoch 47, batch 500, loss[loss=0.06888, simple_loss=0.128, pruned_loss=0.004882, over 24037.00 frames. ], tot_loss[loss=0.06327, simple_loss=0.118, pruned_loss=0.004258, over 4439076.81 frames. ], batch size: 236, lr: 6.79e-03, grad_scale: 16.0 2024-03-16 05:33:42,236 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.nonlin_attention.balancer.prob, batch_count=79430.0, ans=0.125 2024-03-16 05:33:49,265 INFO [train_char.py:689] (1/2) Epoch 48, batch 0, loss[loss=0.06692, simple_loss=0.1244, pruned_loss=0.004704, over 24229.00 frames. ], tot_loss[loss=0.06692, simple_loss=0.1244, pruned_loss=0.004704, over 24229.00 frames. ], batch size: 212, lr: 6.72e-03, grad_scale: 32.0 2024-03-16 05:33:49,266 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 05:34:03,266 INFO [train_char.py:721] (1/2) Epoch 48, validation: loss=0.0567, simple_loss=0.106, pruned_loss=0.003686, over 657665.00 frames. 2024-03-16 05:34:03,266 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 05:34:10,092 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.334e+01 6.973e+01 8.467e+01 1.015e+02 1.727e+02, threshold=1.693e+02, percent-clipped=0.0 2024-03-16 05:34:19,878 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.self_attn_weights.pos_emb_skip_rate, batch_count=79463.33333333333, ans=0.0 2024-03-16 05:35:10,706 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.bypass_mid.scale_min, batch_count=79563.33333333333, ans=0.2 2024-03-16 05:35:13,182 INFO [train_char.py:689] (1/2) Epoch 48, batch 50, loss[loss=0.06575, simple_loss=0.1207, pruned_loss=0.005398, over 21730.00 frames. ], tot_loss[loss=0.06016, simple_loss=0.1121, pruned_loss=0.004127, over 1085858.11 frames. 
], batch size: 86, lr: 6.71e-03, grad_scale: 32.0 2024-03-16 05:35:20,141 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module1.balancer1.prob, batch_count=79596.66666666667, ans=0.125 2024-03-16 05:35:46,304 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=79663.33333333333, ans=0.1 2024-03-16 05:35:48,800 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.balancer2.prob, batch_count=79663.33333333333, ans=0.125 2024-03-16 05:36:00,595 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_module1.balancer2.prob, batch_count=79696.66666666667, ans=0.125 2024-03-16 05:36:05,640 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.attention_skip_rate, batch_count=79696.66666666667, ans=0.0 2024-03-16 05:36:05,692 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_module1.balancer2.prob, batch_count=79696.66666666667, ans=0.125 2024-03-16 05:36:06,364 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.self_attn1.whiten, num_groups=1, num_channels=192, metric=14.37 vs. limit=22.5 2024-03-16 05:36:18,168 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.ff3_skip_rate, batch_count=79730.0, ans=0.0 2024-03-16 05:36:25,416 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.feed_forward1.out_whiten, num_groups=1, num_channels=192, metric=12.77 vs. limit=15.0 2024-03-16 05:36:26,577 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.conv_module2.whiten, num_groups=1, num_channels=384, metric=7.84 vs. limit=15.0 2024-03-16 05:36:28,351 INFO [train_char.py:689] (1/2) Epoch 48, batch 100, loss[loss=0.06654, simple_loss=0.126, pruned_loss=0.003557, over 24118.00 frames. ], tot_loss[loss=0.06086, simple_loss=0.113, pruned_loss=0.004355, over 1912879.93 frames. ], batch size: 188, lr: 6.71e-03, grad_scale: 32.0 2024-03-16 05:36:34,909 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.124e+01 8.336e+01 1.011e+02 1.346e+02 2.310e+02, threshold=2.023e+02, percent-clipped=10.0 2024-03-16 05:36:39,124 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.bypass.skip_rate, batch_count=79763.33333333333, ans=0.09899494936611666 2024-03-16 05:36:42,881 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward1.hidden_balancer.prob, batch_count=79796.66666666667, ans=0.125 2024-03-16 05:36:46,837 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_skip_rate, batch_count=79796.66666666667, ans=0.0 2024-03-16 05:36:48,070 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_skip_rate, batch_count=79796.66666666667, ans=0.0 2024-03-16 05:36:52,605 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=5.13 vs. limit=6.0 2024-03-16 05:36:57,702 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=5.89 vs. 
limit=6.0 2024-03-16 05:37:16,414 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.balancer1.prob, batch_count=79863.33333333333, ans=0.125 2024-03-16 05:37:32,919 INFO [train_char.py:689] (1/2) Epoch 48, batch 150, loss[loss=0.07266, simple_loss=0.1364, pruned_loss=0.004443, over 24138.00 frames. ], tot_loss[loss=0.06097, simple_loss=0.1137, pruned_loss=0.00413, over 2556514.59 frames. ], batch size: 223, lr: 6.70e-03, grad_scale: 32.0 2024-03-16 05:37:36,310 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=5.79 vs. limit=6.0 2024-03-16 05:37:39,728 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=79930.0, ans=0.125 2024-03-16 05:37:53,970 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=79963.33333333333, ans=0.1 2024-03-16 05:37:57,613 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.ff3_skip_rate, batch_count=79996.66666666667, ans=0.0 2024-03-16 05:38:06,263 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward3.hidden_balancer.prob, batch_count=79996.66666666667, ans=0.125 2024-03-16 05:38:06,281 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_skip_rate, batch_count=79996.66666666667, ans=0.0 2024-03-16 05:38:10,048 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.ff3_skip_rate, batch_count=79996.66666666667, ans=0.0 2024-03-16 05:38:15,042 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.bypass.scale_min, batch_count=80030.0, ans=0.2 2024-03-16 05:38:42,815 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward3.hidden_balancer.prob, batch_count=80063.33333333333, ans=0.125 2024-03-16 05:38:45,101 INFO [train_char.py:689] (1/2) Epoch 48, batch 200, loss[loss=0.05461, simple_loss=0.1026, pruned_loss=0.003287, over 24144.00 frames. ], tot_loss[loss=0.06074, simple_loss=0.1133, pruned_loss=0.004071, over 3060307.50 frames. ], batch size: 362, lr: 6.69e-03, grad_scale: 32.0 2024-03-16 05:38:49,290 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module1.balancer2.prob, batch_count=80096.66666666667, ans=0.125 2024-03-16 05:38:51,309 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.008e+01 8.272e+01 1.042e+02 1.469e+02 3.123e+02, threshold=2.084e+02, percent-clipped=7.0 2024-03-16 05:39:06,112 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.self_attn2.whiten, num_groups=1, num_channels=384, metric=20.22 vs. limit=22.5 2024-03-16 05:39:52,924 INFO [train_char.py:689] (1/2) Epoch 48, batch 250, loss[loss=0.0713, simple_loss=0.1328, pruned_loss=0.004902, over 24094.00 frames. ], tot_loss[loss=0.06105, simple_loss=0.1139, pruned_loss=0.004073, over 3445971.36 frames. ], batch size: 199, lr: 6.69e-03, grad_scale: 32.0 2024-03-16 05:40:02,228 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.bypass.skip_rate, batch_count=80263.33333333333, ans=0.07 2024-03-16 05:40:06,482 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_whiten, num_groups=1, num_channels=512, metric=8.70 vs. 
limit=15.0 2024-03-16 05:40:15,173 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward2.hidden_balancer.prob, batch_count=80296.66666666667, ans=0.125 2024-03-16 05:40:15,215 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.4.encoder.layers.2.self_attn_weights, loss-sum=0.000e+00 2024-03-16 05:40:38,187 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.conv.8.prob, batch_count=80363.33333333333, ans=0.125 2024-03-16 05:40:49,528 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.bypass.skip_rate, batch_count=80396.66666666667, ans=0.07 2024-03-16 05:40:56,945 INFO [train_char.py:689] (1/2) Epoch 48, batch 300, loss[loss=0.06858, simple_loss=0.124, pruned_loss=0.006602, over 24214.00 frames. ], tot_loss[loss=0.06099, simple_loss=0.1138, pruned_loss=0.004077, over 3754637.22 frames. ], batch size: 296, lr: 6.68e-03, grad_scale: 32.0 2024-03-16 05:40:59,893 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.ff3_skip_rate, batch_count=80430.0, ans=0.0 2024-03-16 05:41:01,150 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=80430.0, ans=0.125 2024-03-16 05:41:03,444 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.399e+01 7.416e+01 8.993e+01 1.288e+02 2.345e+02, threshold=1.799e+02, percent-clipped=1.0 2024-03-16 05:41:11,366 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward2.hidden_balancer.prob, batch_count=80463.33333333333, ans=0.125 2024-03-16 05:41:33,537 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.ff2_skip_rate, batch_count=80496.66666666667, ans=0.0 2024-03-16 05:42:07,120 INFO [train_char.py:689] (1/2) Epoch 48, batch 350, loss[loss=0.05562, simple_loss=0.1039, pruned_loss=0.003677, over 24420.00 frames. ], tot_loss[loss=0.06106, simple_loss=0.114, pruned_loss=0.004041, over 3990621.54 frames. ], batch size: 152, lr: 6.67e-03, grad_scale: 32.0 2024-03-16 05:42:09,714 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=80596.66666666667, ans=0.125 2024-03-16 05:42:17,775 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=384, metric=4.85 vs. limit=15.0 2024-03-16 05:42:42,883 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_skip_rate, batch_count=80663.33333333333, ans=0.0 2024-03-16 05:42:54,655 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=5.52 vs. limit=15.0 2024-03-16 05:42:56,050 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=384, metric=25.32 vs. limit=22.5 2024-03-16 05:43:12,060 INFO [train_char.py:689] (1/2) Epoch 48, batch 400, loss[loss=0.06141, simple_loss=0.1093, pruned_loss=0.006738, over 24176.00 frames. ], tot_loss[loss=0.06152, simple_loss=0.115, pruned_loss=0.004046, over 4177298.32 frames. 
], batch size: 344, lr: 6.67e-03, grad_scale: 32.0 2024-03-16 05:43:18,383 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.522e+01 8.124e+01 1.041e+02 1.434e+02 2.698e+02, threshold=2.082e+02, percent-clipped=12.0 2024-03-16 05:43:22,524 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_skip_rate, batch_count=80763.33333333333, ans=0.0 2024-03-16 05:43:36,805 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.attention_skip_rate, batch_count=80796.66666666667, ans=0.0 2024-03-16 05:43:44,934 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.conv_module2.whiten, num_groups=1, num_channels=384, metric=5.45 vs. limit=15.0 2024-03-16 05:43:50,721 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.bypass_mid.scale_min, batch_count=80863.33333333333, ans=0.2 2024-03-16 05:43:55,615 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=80863.33333333333, ans=0.125 2024-03-16 05:44:18,028 INFO [train_char.py:689] (1/2) Epoch 48, batch 450, loss[loss=0.05886, simple_loss=0.1085, pruned_loss=0.004599, over 24236.00 frames. ], tot_loss[loss=0.06195, simple_loss=0.1158, pruned_loss=0.004029, over 4324881.22 frames. ], batch size: 328, lr: 6.66e-03, grad_scale: 32.0 2024-03-16 05:44:24,444 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.bypass.scale_min, batch_count=80930.0, ans=0.2 2024-03-16 05:44:26,836 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module2.balancer2.min_positive, batch_count=80930.0, ans=0.05 2024-03-16 05:44:54,244 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.conv_module1.balancer1.prob, batch_count=80996.66666666667, ans=0.125 2024-03-16 05:45:22,258 INFO [train_char.py:689] (1/2) Epoch 48, batch 500, loss[loss=0.0694, simple_loss=0.1314, pruned_loss=0.003681, over 24060.00 frames. ], tot_loss[loss=0.06263, simple_loss=0.1171, pruned_loss=0.004087, over 4437219.70 frames. ], batch size: 199, lr: 6.66e-03, grad_scale: 32.0 2024-03-16 05:45:26,464 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=81096.66666666667, ans=0.1 2024-03-16 05:45:28,443 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.489e+01 7.774e+01 9.184e+01 1.191e+02 2.311e+02, threshold=1.837e+02, percent-clipped=1.0 2024-03-16 05:46:21,990 INFO [train_char.py:689] (1/2) Epoch 49, batch 0, loss[loss=0.06579, simple_loss=0.1191, pruned_loss=0.006219, over 24066.00 frames. ], tot_loss[loss=0.06579, simple_loss=0.1191, pruned_loss=0.006219, over 24066.00 frames. ], batch size: 250, lr: 6.59e-03, grad_scale: 32.0 2024-03-16 05:46:21,990 INFO [train_char.py:712] (1/2) Computing validation loss 2024-03-16 05:46:35,551 INFO [train_char.py:721] (1/2) Epoch 49, validation: loss=0.05711, simple_loss=0.1065, pruned_loss=0.003837, over 657665.00 frames. 
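The recurring optim.py WARNING lines document adaptive gradient clipping: the five numbers are quartiles (min, 25%, median, 75%, max) of recent gradient norms, and with Clipping_scale=2.0 the reported threshold sits at roughly twice the median (e.g. threshold=2.082e+02 ≈ 2 × 1.041e+02 in the warning just above), while percent-clipped is the share of recent batches whose gradients were scaled down. A minimal sketch of that mechanism in plain PyTorch, with the window size and bookkeeping as illustrative assumptions rather than icefall's actual ScaledAdam logic:

```python
import torch

class QuartileGradClipper:
    """Sketch: clip gradients against a threshold derived from the
    running median of recent total gradient norms."""

    def __init__(self, clipping_scale: float = 2.0, window: int = 100):
        self.clipping_scale = clipping_scale
        self.window = window
        self.norms = []       # recent per-batch total gradient norms
        self.num_clipped = 0
        self.num_seen = 0

    def __call__(self, model: torch.nn.Module):
        params = [p for p in model.parameters() if p.grad is not None]
        # total gradient norm of this batch
        norm = torch.norm(
            torch.stack([p.grad.detach().norm() for p in params])
        ).item()
        self.norms = (self.norms + [norm])[-self.window :]
        self.num_seen += 1

        # quartiles of the recent norms, as printed in the WARNING lines
        q = torch.tensor(self.norms).quantile(
            torch.tensor([0.0, 0.25, 0.5, 0.75, 1.0])
        )
        threshold = self.clipping_scale * q[2].item()  # scale * median
        if norm > threshold:
            self.num_clipped += 1
            for p in params:
                p.grad.mul_(threshold / norm)
        return q.tolist(), threshold, 100.0 * self.num_clipped / self.num_seen
```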
2024-03-16 05:46:35,552 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB 2024-03-16 05:46:45,188 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.conv_module2.balancer2.prob, batch_count=81120.0, ans=0.125 2024-03-16 05:47:08,272 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.conv_module2.whiten, num_groups=1, num_channels=384, metric=7.31 vs. limit=15.0 2024-03-16 05:47:33,533 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.1.whiten, num_groups=1, num_channels=192, metric=3.63 vs. limit=12.0 2024-03-16 05:47:42,320 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.self_attn_weights.pos_emb_skip_rate, batch_count=81253.33333333333, ans=0.0 2024-03-16 05:47:47,398 INFO [train_char.py:689] (1/2) Epoch 49, batch 50, loss[loss=0.0496, simple_loss=0.09693, pruned_loss=0.001134, over 24253.00 frames. ], tot_loss[loss=0.05961, simple_loss=0.1118, pruned_loss=0.003691, over 1089852.87 frames. ], batch size: 134, lr: 6.58e-03, grad_scale: 32.0 2024-03-16 05:47:59,770 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.out_combiner.scale_min, batch_count=81320.0, ans=0.2 2024-03-16 05:48:10,534 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.self_attn_weights.whiten_keys, num_groups=8, num_channels=256, metric=4.57 vs. limit=6.0 2024-03-16 05:48:15,983 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=81320.0, ans=0.1 2024-03-16 05:48:16,396 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=13.41 vs. limit=15.0 2024-03-16 05:48:56,088 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.069e+01 7.760e+01 9.968e+01 1.337e+02 2.730e+02, threshold=1.994e+02, percent-clipped=13.0 2024-03-16 05:48:57,607 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.balancer1.prob, batch_count=81453.33333333333, ans=0.125 2024-03-16 05:48:58,611 INFO [train_char.py:689] (1/2) Epoch 49, batch 100, loss[loss=0.05646, simple_loss=0.105, pruned_loss=0.003963, over 24199.00 frames. ], tot_loss[loss=0.06042, simple_loss=0.1132, pruned_loss=0.00384, over 1911863.39 frames. ], batch size: 344, lr: 6.57e-03, grad_scale: 32.0 2024-03-16 05:49:01,510 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_module2.balancer1.min_positive, batch_count=81453.33333333333, ans=0.025 2024-03-16 05:49:01,531 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.balancer2.prob, batch_count=81453.33333333333, ans=0.125 2024-03-16 05:49:07,811 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.bypass.skip_rate, batch_count=81453.33333333333, ans=0.04949747468305833 2024-03-16 05:49:11,082 INFO [scaling.py:1023] (1/2) Whitening: name=encoder_embed.out_whiten, num_groups=1, num_channels=192, metric=7.25 vs. 
limit=8.0 2024-03-16 05:49:12,786 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.dropout.p, batch_count=81486.66666666667, ans=0.1 2024-03-16 05:49:28,522 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward2.hidden_balancer.prob, batch_count=81520.0, ans=0.125 2024-03-16 05:50:05,776 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward1.out_proj.dropout_p, batch_count=81620.0, ans=0.1 2024-03-16 05:50:06,846 INFO [train_char.py:689] (1/2) Epoch 49, batch 150, loss[loss=0.0674, simple_loss=0.1274, pruned_loss=0.003688, over 24291.00 frames. ], tot_loss[loss=0.05999, simple_loss=0.1123, pruned_loss=0.003841, over 2554027.77 frames. ], batch size: 180, lr: 6.57e-03, grad_scale: 32.0 2024-03-16 05:50:09,642 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.3.encoder.layers.1.self_attn_weights, loss-sum=0.000e+00 2024-03-16 05:51:08,311 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.911e+01 8.219e+01 1.077e+02 1.581e+02 3.213e+02, threshold=2.153e+02, percent-clipped=13.0 2024-03-16 05:51:15,473 INFO [train_char.py:689] (1/2) Epoch 49, batch 200, loss[loss=0.07171, simple_loss=0.1349, pruned_loss=0.004274, over 24172.00 frames. ], tot_loss[loss=0.05996, simple_loss=0.1121, pruned_loss=0.003909, over 3053729.34 frames. ], batch size: 212, lr: 6.56e-03, grad_scale: 32.0 2024-03-16 05:51:16,507 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.whiten, num_groups=1, num_channels=192, metric=4.54 vs. limit=12.0 2024-03-16 05:51:17,044 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.conv_skip_rate, batch_count=81786.66666666667, ans=0.0 2024-03-16 05:51:46,428 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.1.feed_forward1.hidden_balancer.prob, batch_count=81853.33333333333, ans=0.125 2024-03-16 05:52:00,296 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.ff2_skip_rate, batch_count=81886.66666666667, ans=0.0 2024-03-16 05:52:09,344 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.nonlin_attention.balancer.prob, batch_count=81920.0, ans=0.125 2024-03-16 05:52:19,264 INFO [train_char.py:689] (1/2) Epoch 49, batch 250, loss[loss=0.05473, simple_loss=0.09464, pruned_loss=0.007412, over 23785.00 frames. ], tot_loss[loss=0.0605, simple_loss=0.1132, pruned_loss=0.003899, over 3449725.84 frames. ], batch size: 439, lr: 6.55e-03, grad_scale: 32.0 2024-03-16 05:52:31,286 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.conv_module1.whiten, num_groups=1, num_channels=512, metric=4.10 vs. limit=15.0 2024-03-16 05:52:32,231 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.conv_skip_rate, batch_count=81986.66666666667, ans=0.0 2024-03-16 05:52:36,434 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.nonlin_attention.whiten2, num_groups=1, num_channels=512, metric=6.70 vs. 
2024-03-16 05:52:41,481 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=81986.66666666667, ans=0.125
2024-03-16 05:52:49,105 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.ff3_skip_rate, batch_count=82020.0, ans=0.0
2024-03-16 05:52:58,450 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.bypass_mid.scale_min, batch_count=82020.0, ans=0.2
2024-03-16 05:53:08,810 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.conv_module2.balancer2.prob, batch_count=82053.33333333333, ans=0.125
2024-03-16 05:53:23,612 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 4.621e+01 7.878e+01 9.786e+01 1.417e+02 2.568e+02, threshold=1.957e+02, percent-clipped=3.0
2024-03-16 05:53:26,179 INFO [train_char.py:689] (1/2) Epoch 49, batch 300, loss[loss=0.06453, simple_loss=0.1207, pruned_loss=0.004187, over 24236.00 frames. ], tot_loss[loss=0.06088, simple_loss=0.1139, pruned_loss=0.003917, over 3756067.59 frames. ], batch size: 311, lr: 6.55e-03, grad_scale: 32.0
2024-03-16 05:53:33,214 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.feed_forward2.out_whiten, num_groups=1, num_channels=512, metric=11.72 vs. limit=15.0
2024-03-16 05:53:44,247 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.ff2_skip_rate, batch_count=82153.33333333333, ans=0.0
2024-03-16 05:54:03,226 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.conv_module1.whiten, num_groups=1, num_channels=384, metric=2.77 vs. limit=15.0
2024-03-16 05:54:31,597 INFO [train_char.py:689] (1/2) Epoch 49, batch 350, loss[loss=0.05174, simple_loss=0.09966, pruned_loss=0.001908, over 24415.00 frames. ], tot_loss[loss=0.06099, simple_loss=0.1141, pruned_loss=0.003924, over 3992495.40 frames. ], batch size: 135, lr: 6.54e-03, grad_scale: 32.0
2024-03-16 05:54:38,163 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_module2.balancer2.prob, batch_count=82286.66666666667, ans=0.125
2024-03-16 05:54:46,540 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.feed_forward1.out_proj.dropout_p, batch_count=82320.0, ans=0.1
2024-03-16 05:55:13,239 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward2.hidden_balancer.prob, batch_count=82386.66666666667, ans=0.125
2024-03-16 05:55:35,153 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_skip_rate, batch_count=82420.0, ans=0.0
2024-03-16 05:55:36,085 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.473e+01 7.996e+01 9.576e+01 1.213e+02 2.192e+02, threshold=1.915e+02, percent-clipped=3.0
2024-03-16 05:55:38,685 INFO [train_char.py:689] (1/2) Epoch 49, batch 400, loss[loss=0.06877, simple_loss=0.1274, pruned_loss=0.005055, over 24147.00 frames. ], tot_loss[loss=0.06203, simple_loss=0.116, pruned_loss=0.004057, over 4173985.40 frames. ], batch size: 251, lr: 6.54e-03, grad_scale: 32.0
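
Each train_char.py:689 line reports the pruned-transducer loss components, and they satisfy loss = 0.5 * simple_loss + pruned_loss throughout, matching simple_loss_scale=0.5 in the configuration at the head of this log (for the batch-400 line above: 0.5 x 0.116 + 0.004057 ~ 0.06203). A one-line restatement with a spot-check; any warmup-dependent scaling of the pruned term has long since saturated this late in training:

def combined_loss(simple_loss, pruned_loss, simple_loss_scale=0.5):
    # Total transducer loss as printed in the log lines above.
    return simple_loss_scale * simple_loss + pruned_loss

# Spot-check against the "Epoch 49, batch 400" tot_loss values:
assert abs(combined_loss(0.116, 0.004057) - 0.06203) < 1e-4
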
2024-03-16 05:56:11,167 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder_embed.convnext.hidden_balancer.prob, batch_count=82520.0, ans=0.125
2024-03-16 05:56:36,598 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward1.hidden_balancer.prob, batch_count=82586.66666666667, ans=0.125
2024-03-16 05:56:42,696 INFO [train_char.py:689] (1/2) Epoch 49, batch 450, loss[loss=0.0669, simple_loss=0.1279, pruned_loss=0.002928, over 24207.00 frames. ], tot_loss[loss=0.0625, simple_loss=0.1169, pruned_loss=0.004031, over 4321726.32 frames. ], batch size: 212, lr: 6.53e-03, grad_scale: 32.0
2024-03-16 05:56:50,669 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.0.whiten, num_groups=1, num_channels=384, metric=5.02 vs. limit=12.0
2024-03-16 05:57:05,355 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.balancer_ff3.min_abs, batch_count=82653.33333333333, ans=0.2
2024-03-16 05:57:14,687 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.1.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00
2024-03-16 05:57:20,837 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.attention_skip_rate, batch_count=82720.0, ans=0.0
2024-03-16 05:57:28,438 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_skip_rate, batch_count=82720.0, ans=0.0
2024-03-16 05:57:30,935 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.bypass_mid.scale_min, batch_count=82720.0, ans=0.2
2024-03-16 05:57:33,660 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.bypass.skip_rate, batch_count=82753.33333333333, ans=0.04949747468305833
2024-03-16 05:57:45,186 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.454e+01 7.597e+01 9.286e+01 1.257e+02 1.873e+02, threshold=1.857e+02, percent-clipped=0.0
2024-03-16 05:57:47,741 INFO [train_char.py:689] (1/2) Epoch 49, batch 500, loss[loss=0.06714, simple_loss=0.1251, pruned_loss=0.004564, over 24021.00 frames. ], tot_loss[loss=0.06325, simple_loss=0.1184, pruned_loss=0.004033, over 4434826.36 frames. ], batch size: 250, lr: 6.52e-03, grad_scale: 32.0
2024-03-16 05:58:46,457 INFO [train_char.py:689] (1/2) Epoch 50, batch 0, loss[loss=0.06311, simple_loss=0.1204, pruned_loss=0.002914, over 24202.00 frames. ], tot_loss[loss=0.06311, simple_loss=0.1204, pruned_loss=0.002914, over 24202.00 frames. ], batch size: 212, lr: 6.46e-03, grad_scale: 32.0
2024-03-16 05:58:46,458 INFO [train_char.py:712] (1/2) Computing validation loss
2024-03-16 05:58:55,691 INFO [zipformer.py:1858] (1/2) name=encoder.encoders.5.encoder.layers.0.self_attn_weights, attn_weights_entropy = tensor([4.0112, 3.7516, 3.6279, 3.7097], device='cuda:1')
2024-03-16 05:59:03,820 INFO [train_char.py:721] (1/2) Epoch 50, validation: loss=0.05626, simple_loss=0.1051, pruned_loss=0.003704, over 657665.00 frames.
2024-03-16 05:59:03,821 INFO [train_char.py:722] (1/2) Maximum memory allocated so far is 25210MB
2024-03-16 05:59:06,078 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.1.whiten, num_groups=1, num_channels=512, metric=7.93 vs. limit=12.0
2024-03-16 05:59:11,146 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.5.encoder.layers.0.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.70 vs. limit=6.0
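
During the validation pass the model also logs attn_weights_entropy: one value per attention head (four here, matching the four heads of encoder stack 5 in this configuration). Entropies near the log of the number of attended positions indicate diffuse attention; the logged 3.6-4.0 nats would correspond to spreading over roughly e^3.6 ~ 37 to e^4.0 ~ 55 frames. A sketch of such a statistic, assuming row-normalized weights of shape (num_heads, tgt_len, src_len); the exact reduction used in zipformer.py may differ:

import torch

def attn_weights_entropy(attn_weights, eps=1e-20):
    """attn_weights: (num_heads, tgt_len, src_len), each row summing to 1.
    Returns per-head entropy in nats, averaged over query positions."""
    ent = -(attn_weights * (attn_weights + eps).log()).sum(dim=-1)
    return ent.mean(dim=-1)

w = torch.softmax(torch.randn(4, 50, 50), dim=-1)
print(attn_weights_entropy(w))  # near log(50) ~ 3.9 for diffuse weights
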
2024-03-16 05:59:46,942 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module1.balancer1.min_positive, batch_count=82910.0, ans=0.025
2024-03-16 05:59:57,595 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_skip_rate, batch_count=82943.33333333333, ans=0.0
2024-03-16 05:59:58,961 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.conv_skip_rate, batch_count=82943.33333333333, ans=0.0
2024-03-16 06:00:10,880 INFO [train_char.py:689] (1/2) Epoch 50, batch 50, loss[loss=0.06025, simple_loss=0.113, pruned_loss=0.003766, over 24390.00 frames. ], tot_loss[loss=0.05852, simple_loss=0.1094, pruned_loss=0.003806, over 1082898.52 frames. ], batch size: 180, lr: 6.45e-03, grad_scale: 32.0
2024-03-16 06:01:12,105 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 4.807e+01 7.372e+01 1.027e+02 1.338e+02 2.470e+02, threshold=2.053e+02, percent-clipped=9.0
2024-03-16 06:01:23,376 INFO [train_char.py:689] (1/2) Epoch 50, batch 100, loss[loss=0.05764, simple_loss=0.1081, pruned_loss=0.003602, over 24263.00 frames. ], tot_loss[loss=0.05913, simple_loss=0.1105, pruned_loss=0.003885, over 1915193.35 frames. ], batch size: 328, lr: 6.45e-03, grad_scale: 32.0
2024-03-16 06:01:23,685 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward2.hidden_balancer.prob, batch_count=83143.33333333333, ans=0.125
2024-03-16 06:01:30,060 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.attention_skip_rate, batch_count=83143.33333333333, ans=0.0
2024-03-16 06:01:40,390 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.balancer.min_positive, batch_count=83176.66666666667, ans=0.05
2024-03-16 06:01:51,905 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.feed_forward1.out_proj.dropout_p, batch_count=83210.0, ans=0.1
2024-03-16 06:01:53,285 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=83210.0, ans=0.0
2024-03-16 06:01:53,687 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.feed_forward1.out_whiten, num_groups=1, num_channels=384, metric=10.69 vs. limit=15.0
2024-03-16 06:01:54,582 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.ff3_skip_rate, batch_count=83210.0, ans=0.0
2024-03-16 06:02:13,845 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.bypass_mid.scale_min, batch_count=83276.66666666667, ans=0.2
2024-03-16 06:02:21,913 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.feed_forward3.out_whiten, num_groups=1, num_channels=384, metric=11.63 vs. limit=15.0
2024-03-16 06:02:24,526 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=9.67 vs. limit=15.0
2024-03-16 06:02:25,261 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.conv_module1.balancer2.min_abs, batch_count=83276.66666666667, ans=0.5
2024-03-16 06:02:27,570 INFO [train_char.py:689] (1/2) Epoch 50, batch 150, loss[loss=0.06522, simple_loss=0.1223, pruned_loss=0.004071, over 24122.00 frames. ], tot_loss[loss=0.05946, simple_loss=0.1113, pruned_loss=0.003814, over 2560024.84 frames. ], batch size: 188, lr: 6.44e-03, grad_scale: 32.0
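
The scaling.py:1023 Whitening entries compare a whitening metric of a module's output against a limit; a metric far above the limit flags a feature covariance that is far from isotropic, which the Whiten module then penalizes. One plausible reconstruction of the metric is mean(eig^2) / mean(eig)^2 over the eigenvalues of the per-group channel covariance, which is exactly 1.0 for perfectly white features. Both the formula and the group handling below are inferred, not copied from scaling.py:

import torch

def whitening_metric(x: torch.Tensor, num_groups: int) -> float:
    """x: (num_frames, num_channels). Returns mean(eig^2) / mean(eig)^2 of
    the channel covariance, computed per group of channels and averaged."""
    n, c = x.shape
    g = c // num_groups
    x = x.reshape(n, num_groups, g).permute(1, 0, 2)  # (groups, frames, g)
    x = x - x.mean(dim=1, keepdim=True)
    cov = x.transpose(1, 2) @ x / n                   # (groups, g, g)
    mean_eig_sq = (cov * cov).sum(dim=(1, 2)) / g     # trace(C @ C) / g
    mean_eig = torch.diagonal(cov, dim1=1, dim2=2).sum(dim=-1) / g
    return (mean_eig_sq / mean_eig**2).mean().item()

x = torch.randn(2000, 192)
print(whitening_metric(x, num_groups=1))  # ~1.1: near 1, up by sampling noise
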
2024-03-16 06:02:35,558 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.nonlin_attention.balancer.prob, batch_count=83310.0, ans=0.125
2024-03-16 06:02:38,504 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.bypass_mid.scale_min, batch_count=83310.0, ans=0.2
2024-03-16 06:02:52,458 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.ff3_skip_rate, batch_count=83376.66666666667, ans=0.0
2024-03-16 06:03:00,527 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.2.self_attn_weights.whiten_keys, num_groups=4, num_channels=128, metric=4.07 vs. limit=6.0
2024-03-16 06:03:03,040 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.nonlin_attention.whiten1, num_groups=1, num_channels=384, metric=7.78 vs. limit=10.0
2024-03-16 06:03:06,414 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.conv_module1.balancer1.prob, batch_count=83410.0, ans=0.125
2024-03-16 06:03:24,167 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 4.731e+01 7.051e+01 1.024e+02 1.368e+02 2.363e+02, threshold=2.048e+02, percent-clipped=8.0
2024-03-16 06:03:25,066 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.1.encoder.layers.0.self_attn1.whiten, num_groups=1, num_channels=256, metric=17.82 vs. limit=22.5
2024-03-16 06:03:35,871 INFO [train_char.py:689] (1/2) Epoch 50, batch 200, loss[loss=0.06518, simple_loss=0.1239, pruned_loss=0.003204, over 24314.00 frames. ], tot_loss[loss=0.05935, simple_loss=0.1112, pruned_loss=0.003766, over 3065513.25 frames. ], batch size: 180, lr: 6.43e-03, grad_scale: 32.0
2024-03-16 06:03:50,529 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=83510.0, ans=0.125
2024-03-16 06:03:59,438 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.balancer2.prob, batch_count=83510.0, ans=0.125
2024-03-16 06:04:08,413 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.bypass.skip_rate, batch_count=83543.33333333333, ans=0.07
2024-03-16 06:04:09,175 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.0.layers.0.conv_module2.whiten, num_groups=1, num_channels=192, metric=7.20 vs. limit=15.0
2024-03-16 06:04:09,713 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=83543.33333333333, ans=0.125
2024-03-16 06:04:12,321 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.3.encoder.layers.3.self_attn_weights, loss-sum=0.000e+00
2024-03-16 06:04:24,731 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.4.encoder.layers.2.conv_module2.whiten, num_groups=1, num_channels=384, metric=3.85 vs. limit=15.0
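
Many of the scheduled values above (min_positive, max_positive, min_abs, max_abs, prob) parameterize Balancer modules, which keep per-channel activation statistics inside a target range by modifying gradients in the backward pass, applied stochastically with the scheduled prob. A heavily simplified sketch of the mechanism, covering only the positive-fraction constraint; the class name and correction rule are illustrative, not icefall's:

import torch

class SimpleBalancer(torch.autograd.Function):
    """Identity in forward; in backward, adds a small gradient component
    pushing each channel's fraction of positive activations into
    [min_positive, max_positive]."""

    @staticmethod
    def forward(ctx, x, min_positive=0.05, max_positive=0.95, scale=0.01):
        pos_frac = (x > 0).float().mean(dim=0)            # per-channel stat
        direction = ((pos_frac < min_positive).float()    # need more positives
                     - (pos_frac > max_positive).float()) # need fewer
        ctx.save_for_backward(direction)
        ctx.scale = scale
        return x

    @staticmethod
    def backward(ctx, grad_out):
        (direction,) = ctx.saved_tensors
        # Subtracting from the gradient raises the activation under SGD.
        extra = -ctx.scale * direction * grad_out.abs().mean()
        return grad_out + extra, None, None, None

x = torch.randn(512, 384, requires_grad=True)
SimpleBalancer.apply(x).sum().backward()  # x.grad now includes the nudge
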
2024-03-16 06:04:34,345 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward1.hidden_balancer.prob, batch_count=83610.0, ans=0.125
2024-03-16 06:04:36,915 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.2.self_attn_weights.pos_emb_skip_rate, batch_count=83610.0, ans=0.0
2024-03-16 06:04:39,521 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=83610.0, ans=0.125
2024-03-16 06:04:42,935 INFO [train_char.py:689] (1/2) Epoch 50, batch 250, loss[loss=0.05519, simple_loss=0.09624, pruned_loss=0.007067, over 23789.00 frames. ], tot_loss[loss=0.05958, simple_loss=0.1114, pruned_loss=0.003856, over 3456528.48 frames. ], batch size: 439, lr: 6.43e-03, grad_scale: 32.0
2024-03-16 06:05:03,537 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_proj.dropout_p, batch_count=83676.66666666667, ans=0.1
2024-03-16 06:05:18,863 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.nonlin_attention.balancer.max_positive, batch_count=83710.0, ans=0.95
2024-03-16 06:05:20,115 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.conv_skip_rate, batch_count=83743.33333333333, ans=0.0
2024-03-16 06:05:20,188 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.1.attention_skip_rate, batch_count=83743.33333333333, ans=0.0
2024-03-16 06:05:31,576 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.5.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00
2024-03-16 06:05:35,041 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.399e+01 7.830e+01 1.003e+02 1.417e+02 3.027e+02, threshold=2.006e+02, percent-clipped=7.0
2024-03-16 06:05:45,060 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.feed_forward3.hidden_balancer.prob, batch_count=83776.66666666667, ans=0.125
2024-03-16 06:05:49,518 INFO [train_char.py:689] (1/2) Epoch 50, batch 300, loss[loss=0.0603, simple_loss=0.1123, pruned_loss=0.004139, over 24364.00 frames. ], tot_loss[loss=0.05973, simple_loss=0.1116, pruned_loss=0.003909, over 3756347.24 frames. ], batch size: 152, lr: 6.42e-03, grad_scale: 32.0
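
The tot_loss[..., over N frames] figures are frame-weighted running averages. The cumulative frame counts grow by a factor of about 0.78 every 50 batches, which matches (1 - 1/200)^50 for the reset_interval=200 in the configuration, i.e. an exponentially decayed sum rather than a plain cumulative one. A sketch of that bookkeeping, with hypothetical names:

class RunningLoss:
    """Exponentially decayed, frame-weighted loss average."""

    def __init__(self, reset_interval=200):
        self.decay = 1.0 - 1.0 / reset_interval
        self.loss_sum = 0.0  # decayed sum of loss * frames
        self.frames = 0.0    # decayed sum of frames

    def update(self, loss, num_frames):
        self.loss_sum = self.loss_sum * self.decay + loss * num_frames
        self.frames = self.frames * self.decay + num_frames
        return self.loss_sum / self.frames  # the printed tot_loss
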
2024-03-16 06:06:00,046 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward1.out_proj.dropout_p, batch_count=83810.0, ans=0.1
2024-03-16 06:06:01,369 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.nonlin_attention.balancer.prob, batch_count=83843.33333333333, ans=0.125
2024-03-16 06:06:02,725 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.balancer1.prob, batch_count=83843.33333333333, ans=0.125
2024-03-16 06:06:10,164 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.balancer2.prob, batch_count=83843.33333333333, ans=0.125
2024-03-16 06:06:10,270 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.1.conv_module1.balancer1.max_abs, batch_count=83843.33333333333, ans=10.0
2024-03-16 06:06:19,413 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.feed_forward1.out_proj.dropout_p, batch_count=83876.66666666667, ans=0.1
2024-03-16 06:06:19,437 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.0.conv_module2.balancer2.prob, batch_count=83876.66666666667, ans=0.125
2024-03-16 06:06:36,957 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.0.bypass.scale_min, batch_count=83910.0, ans=0.2
2024-03-16 06:06:55,892 INFO [train_char.py:689] (1/2) Epoch 50, batch 350, loss[loss=0.06104, simple_loss=0.115, pruned_loss=0.003561, over 24224.00 frames. ], tot_loss[loss=0.06025, simple_loss=0.1128, pruned_loss=0.003853, over 3997053.43 frames. ], batch size: 311, lr: 6.42e-03, grad_scale: 32.0
2024-03-16 06:06:58,764 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.2.feed_forward2.hidden_balancer.prob, batch_count=83976.66666666667, ans=0.125
2024-03-16 06:07:05,554 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.2.whiten.whitening_limit, batch_count=83976.66666666667, ans=12.0
2024-03-16 06:07:26,934 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.2.encoder.layers.1.nonlin_attention.whiten2, num_groups=1, num_channels=384, metric=12.76 vs. limit=15.0
2024-03-16 06:07:40,892 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.1.conv_module1.balancer2.prob, batch_count=84076.66666666667, ans=0.125
2024-03-16 06:07:44,627 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.1.ff3_skip_rate, batch_count=84076.66666666667, ans=0.0
2024-03-16 06:07:49,371 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 4.894e+01 8.440e+01 1.085e+02 1.456e+02 2.350e+02, threshold=2.171e+02, percent-clipped=3.0
2024-03-16 06:07:52,109 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.ff3_skip_rate, batch_count=84110.0, ans=0.0
2024-03-16 06:07:56,020 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.3.encoder.layers.3.feed_forward1.out_proj.dropout_p, batch_count=84110.0, ans=0.1
2024-03-16 06:08:00,929 INFO [train_char.py:689] (1/2) Epoch 50, batch 400, loss[loss=0.06928, simple_loss=0.1306, pruned_loss=0.003981, over 24086.00 frames. ], tot_loss[loss=0.06065, simple_loss=0.1135, pruned_loss=0.003886, over 4183410.45 frames. ], batch size: 236, lr: 6.41e-03, grad_scale: 32.0
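
The constant grad_scale: 32.0 in the loss lines is the mixed-precision loss scale (use_fp16=True in the configuration). The generic torch.cuda.amp pattern that produces and adapts such a value is sketched below on a toy model; icefall wraps this machinery inside its own training loop, so treat this purely as an illustration of the mechanism:

import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
model = torch.nn.Linear(80, 4).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.045)
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

for _ in range(3):
    optimizer.zero_grad()
    with torch.cuda.amp.autocast(enabled=(device == "cuda")):
        loss = model(torch.randn(8, 80, device=device)).pow(2).mean()
    scaler.scale(loss).backward()  # scale up so fp16 grads stay finite
    scaler.step(optimizer)         # unscales; skips the step on inf/nan
    scaler.update()                # grows or shrinks the scale dynamically

print(scaler.get_scale())  # the kind of value logged as grad_scale
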
2024-03-16 06:08:02,462 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.2.encoder.layers.1.conv_module2.balancer1.min_positive, batch_count=84143.33333333333, ans=0.025
2024-03-16 06:08:04,310 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.0.feed_forward3.out_whiten, num_groups=1, num_channels=512, metric=13.24 vs. limit=15.0
2024-03-16 06:08:37,072 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.2.feed_forward2.out_whiten, num_groups=1, num_channels=512, metric=11.05 vs. limit=15.0
2024-03-16 06:08:45,015 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.self_attn_weights.pos_emb_skip_rate, batch_count=84243.33333333333, ans=0.0
2024-03-16 06:09:00,860 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.ff3_skip_rate, batch_count=84276.66666666667, ans=0.0
2024-03-16 06:09:05,567 INFO [train_char.py:689] (1/2) Epoch 50, batch 450, loss[loss=0.06752, simple_loss=0.1263, pruned_loss=0.004351, over 24151.00 frames. ], tot_loss[loss=0.06172, simple_loss=0.1157, pruned_loss=0.003895, over 4327725.17 frames. ], batch size: 188, lr: 6.40e-03, grad_scale: 32.0
2024-03-16 06:09:21,238 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.1.encoder.layers.0.conv_module2.balancer1.max_abs, batch_count=84343.33333333333, ans=10.0
2024-03-16 06:09:27,927 INFO [scaling.py:1023] (1/2) Whitening: name=encoder.encoders.3.encoder.layers.3.nonlin_attention.whiten1, num_groups=1, num_channels=384, metric=6.07 vs. limit=10.0
2024-03-16 06:09:28,726 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.5.encoder.layers.0.conv_skip_rate, batch_count=84343.33333333333, ans=0.0
2024-03-16 06:09:35,872 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.0.layers.0.feed_forward1.out_proj.dropout_p, batch_count=84376.66666666667, ans=0.1
2024-03-16 06:09:57,529 WARNING [optim.py:487] (1/2) Clipping_scale=2.0, grad-norm quartiles 5.533e+01 7.607e+01 9.613e+01 1.214e+02 2.346e+02, threshold=1.923e+02, percent-clipped=1.0
2024-03-16 06:10:06,661 INFO [scaling.py:214] (1/2) ScheduledFloat: name=encoder.encoders.4.encoder.layers.0.feed_forward3.hidden_balancer.prob, batch_count=84443.33333333333, ans=0.125
2024-03-16 06:10:09,385 INFO [train_char.py:689] (1/2) Epoch 50, batch 500, loss[loss=0.06937, simple_loss=0.1291, pruned_loss=0.004811, over 24054.00 frames. ], tot_loss[loss=0.06246, simple_loss=0.117, pruned_loss=0.003972, over 4440028.49 frames. ], batch size: 199, lr: 6.40e-03, grad_scale: 32.0
2024-03-16 06:10:15,915 INFO [scaling.py:1119] (1/2) WithLoss: name=encoder.encoders.1.encoder.layers.0.self_attn_weights, loss-sum=0.000e+00
2024-03-16 06:10:18,394 INFO [train_char.py:1026] (1/2) Done!
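
With training finished, the tot_loss and lr trajectories can be recovered directly from the train_char.py:689 lines above for plotting. A small parsing sketch; the regular expression targets the line format shown in this log, and the log path is hypothetical:

import re

PATTERN = re.compile(
    r"Epoch (\d+), batch (\d+), .*?tot_loss\[loss=([\d.]+).*?lr: ([\d.e+-]+)"
)

def parse_log(path):
    rows = []
    with open(path) as f:
        for line in f:
            for m in PATTERN.finditer(line):
                epoch, batch, tot, lr = m.groups()
                rows.append((int(epoch), int(batch), float(tot), float(lr)))
    return rows

# e.g. parse_log("zipformer/exp_val/log-train") would yield rows like
# (49, 50, 0.05961, 0.00658) ... (50, 500, 0.06246, 0.0064)
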