icefall-asr-ami-pruned-transducer-stateless7/log/fast_beam_search/log-decode-epoch-14-avg-8-beam-4-max-contexts-4-max-states-8-2022-11-21-07-23-44
2022-11-21 07:23:44,328 INFO [decode.py:574] Decoding started
2022-11-21 07:23:44,329 INFO [decode.py:580] Device: cuda:0
2022-11-21 07:23:44,335 INFO [decode.py:590] {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 100, 'reset_interval': 200, 'valid_interval': 3000, 'feature_dim': 80, 'subsampling_factor': 4, 'warm_step': 2000, 'env_info': {'k2-version': '1.21', 'k2-build-type': 'Debug', 'k2-with-cuda': True, 'k2-git-sha1': 'f271e82ef30f75fecbae44b163e1244e53def116', 'k2-git-date': 'Fri Oct 28 05:02:16 2022', 'lhotse-version': '1.9.0.dev+git.97bf4b0.dirty', 'torch-version': '1.10.0+cu111', 'torch-cuda-available': True, 'torch-cuda-version': '11.1', 'python-version': '3.8', 'icefall-git-branch': 'ami_recipe', 'icefall-git-sha1': 'd1b5a16-clean', 'icefall-git-date': 'Sun Nov 20 22:32:57 2022', 'icefall-path': '/exp/draj/mini_scale_2022/icefall', 'k2-path': '/exp/draj/mini_scale_2022/k2/k2/python/k2/__init__.py', 'lhotse-path': '/exp/draj/mini_scale_2022/lhotse/lhotse/__init__.py', 'hostname': 'r3n07', 'IP address': '10.1.3.7'}, 'epoch': 14, 'iter': 0, 'avg': 8, 'use_averaged_model': True, 'exp_dir': PosixPath('pruned_transducer_stateless7/exp/v2'), 'lang_dir': PosixPath('data/lang_bpe_500'), 'decoding_method': 'fast_beam_search', 'beam_size': 4, 'beam': 4, 'ngram_lm_scale': 0.01, 'max_contexts': 4, 'max_states': 8, 'context_size': 2, 'max_sym_per_frame': 1, 'num_paths': 200, 'nbest_scale': 0.5, 'num_encoder_layers': '2,4,3,2,4', 'feedforward_dims': '1024,1024,2048,2048,1024', 'nhead': '8,8,8,8,8', 'encoder_dims': '384,384,384,384,384', 'attention_dims': '192,192,192,192,192', 'encoder_unmasked_dims': '256,256,256,256,256', 'zipformer_downsampling_factors': '1,2,4,8,2', 'cnn_module_kernels': '31,31,31,31,31', 'decoder_dim': 512, 'joiner_dim': 512, 'manifest_dir': PosixPath('data/manifests'), 'enable_musan': True, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'max_duration': 500, 'max_cuts': None, 'num_buckets': 50, 'on_the_fly_feats': False, 'shuffle': True, 'num_workers': 8, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'ihm_only': False, 'res_dir': PosixPath('pruned_transducer_stateless7/exp/v2/fast_beam_search'), 'suffix': 'epoch-14-avg-8-beam-4-max-contexts-4-max-states-8', 'blank_id': 0, 'unk_id': 2, 'vocab_size': 500}
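The configuration dump above fully determines this decode run. A minimal sketch of how it could be reproduced, assuming the usual pruned_transducer_stateless7/decode.py flag names (check them against the script's --help); the values are taken directly from the logged config:

```python
# Sketch only: flag names are assumed to follow standard icefall decode.py
# conventions; values come from the config logged above.
import subprocess

subprocess.run(
    [
        "python", "pruned_transducer_stateless7/decode.py",
        "--epoch", "14",
        "--avg", "8",
        "--use-averaged-model", "True",
        "--exp-dir", "pruned_transducer_stateless7/exp/v2",
        "--lang-dir", "data/lang_bpe_500",
        "--decoding-method", "fast_beam_search",
        "--beam", "4",
        "--max-contexts", "4",
        "--max-states", "8",
        "--max-duration", "500",
    ],
    check=True,
)
```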
2022-11-21 07:23:44,335 INFO [decode.py:592] About to create model
2022-11-21 07:23:44,944 INFO [zipformer.py:179] At encoder stack 4, which has downsampling_factor=2, we will combine the outputs of layers 1 and 3, with downsampling_factors=2 and 8.
2022-11-21 07:23:44,954 INFO [decode.py:659] Calculating the averaged model over epoch range from 6 (excluded) to 14
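"Epoch range from 6 (excluded) to 14" covers epochs 7-14, i.e. the 8 checkpoints implied by --avg 8. With use_averaged_model=True icefall interpolates the running parameter averages stored inside the checkpoints; a plain mean over the per-epoch checkpoints, sketched below with hypothetical paths, only approximates that idea:

```python
# Illustrative only: plain averaging of epoch checkpoints 7..14.
# icefall's --use-averaged-model instead interpolates the running averages
# ("model_avg") stored in the checkpoints, so results can differ slightly.
import torch

ckpt_dir = "pruned_transducer_stateless7/exp/v2"  # exp_dir from the config above
epochs = range(7, 15)                             # "6 (excluded) to 14"
avg = None
for e in epochs:
    state = torch.load(f"{ckpt_dir}/epoch-{e}.pt", map_location="cpu")["model"]
    if avg is None:
        avg = {k: v.clone().float() for k, v in state.items()}
    else:
        for k, v in state.items():
            avg[k] += v.float()
avg = {k: v / len(epochs) for k, v in avg.items()}  # averaged state_dict
```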
2022-11-21 07:24:19,994 INFO [decode.py:694] Number of model parameters: 70369391
2022-11-21 07:24:19,995 INFO [asr_datamodule.py:392] About to get AMI IHM dev cuts
2022-11-21 07:24:20,001 INFO [asr_datamodule.py:413] About to get AMI IHM test cuts
2022-11-21 07:24:20,003 INFO [asr_datamodule.py:398] About to get AMI SDM dev cuts
2022-11-21 07:24:20,005 INFO [asr_datamodule.py:419] About to get AMI SDM test cuts
2022-11-21 07:24:20,006 INFO [asr_datamodule.py:407] About to get AMI GSS-enhanced dev cuts
2022-11-21 07:24:20,008 INFO [asr_datamodule.py:428] About to get AMI GSS-enhanced test cuts
2022-11-21 07:24:22,027 INFO [decode.py:726] Decoding dev_ihm
2022-11-21 07:24:25,205 INFO [decode.py:469] batch 0/?, cuts processed until now is 72
2022-11-21 07:24:27,591 INFO [decode.py:469] batch 2/?, cuts processed until now is 537
2022-11-21 07:24:30,197 INFO [decode.py:469] batch 4/?, cuts processed until now is 689
2022-11-21 07:24:32,818 INFO [decode.py:469] batch 6/?, cuts processed until now is 823
2022-11-21 07:24:35,376 INFO [decode.py:469] batch 8/?, cuts processed until now is 985
2022-11-21 07:24:39,599 INFO [decode.py:469] batch 10/?, cuts processed until now is 1088
2022-11-21 07:24:42,143 INFO [decode.py:469] batch 12/?, cuts processed until now is 1263
2022-11-21 07:24:44,281 INFO [decode.py:469] batch 14/?, cuts processed until now is 1521
2022-11-21 07:24:46,328 INFO [decode.py:469] batch 16/?, cuts processed until now is 1903
2022-11-21 07:24:49,625 INFO [decode.py:469] batch 18/?, cuts processed until now is 2032
2022-11-21 07:24:52,952 INFO [decode.py:469] batch 20/?, cuts processed until now is 2117
2022-11-21 07:24:55,096 INFO [decode.py:469] batch 22/?, cuts processed until now is 2375
2022-11-21 07:24:57,164 INFO [decode.py:469] batch 24/?, cuts processed until now is 2824
2022-11-21 07:24:59,771 INFO [decode.py:469] batch 26/?, cuts processed until now is 2969
2022-11-21 07:25:02,174 INFO [decode.py:469] batch 28/?, cuts processed until now is 3245
2022-11-21 07:25:04,637 INFO [decode.py:469] batch 30/?, cuts processed until now is 3401
2022-11-21 07:25:07,830 INFO [decode.py:469] batch 32/?, cuts processed until now is 3519
2022-11-21 07:25:10,391 INFO [decode.py:469] batch 34/?, cuts processed until now is 3694
2022-11-21 07:25:13,058 INFO [decode.py:469] batch 36/?, cuts processed until now is 3818
2022-11-21 07:25:15,685 INFO [decode.py:469] batch 38/?, cuts processed until now is 3970
2022-11-21 07:25:17,743 INFO [decode.py:469] batch 40/?, cuts processed until now is 4750
2022-11-21 07:25:20,098 INFO [decode.py:469] batch 42/?, cuts processed until now is 5038
2022-11-21 07:25:23,210 INFO [decode.py:469] batch 44/?, cuts processed until now is 5144
2022-11-21 07:25:26,314 INFO [decode.py:469] batch 46/?, cuts processed until now is 5253
2022-11-21 07:25:29,565 INFO [decode.py:469] batch 48/?, cuts processed until now is 5672
2022-11-21 07:25:29,826 INFO [zipformer.py:1414] attn_weights_entropy = tensor([2.8204, 2.7644, 2.5354, 2.9453, 2.4156, 2.6087, 2.6265, 3.2272],
device='cuda:0'), covar=tensor([0.1307, 0.2458, 0.3112, 0.2146, 0.2234, 0.1893, 0.2297, 0.2367],
device='cuda:0'), in_proj_covar=tensor([0.0085, 0.0085, 0.0091, 0.0079, 0.0076, 0.0081, 0.0082, 0.0061],
device='cuda:0'), out_proj_covar=tensor([0.0003, 0.0003, 0.0003, 0.0003, 0.0003, 0.0003, 0.0003, 0.0002],
device='cuda:0')
2022-11-21 07:25:31,873 INFO [decode.py:469] batch 50/?, cuts processed until now is 5878
2022-11-21 07:25:33,840 INFO [decode.py:469] batch 52/?, cuts processed until now is 6260
2022-11-21 07:25:35,858 INFO [decode.py:469] batch 54/?, cuts processed until now is 6808
2022-11-21 07:25:38,267 INFO [decode.py:469] batch 56/?, cuts processed until now is 7117
2022-11-21 07:25:40,248 INFO [decode.py:469] batch 58/?, cuts processed until now is 7565
2022-11-21 07:25:42,301 INFO [decode.py:469] batch 60/?, cuts processed until now is 8078
2022-11-21 07:25:44,330 INFO [decode.py:469] batch 62/?, cuts processed until now is 8626
2022-11-21 07:25:46,604 INFO [decode.py:469] batch 64/?, cuts processed until now is 9174
2022-11-21 07:25:48,737 INFO [decode.py:469] batch 66/?, cuts processed until now is 9455
2022-11-21 07:25:50,787 INFO [decode.py:469] batch 68/?, cuts processed until now is 9968
2022-11-21 07:25:52,859 INFO [decode.py:469] batch 70/?, cuts processed until now is 10481
2022-11-21 07:25:54,965 INFO [decode.py:469] batch 72/?, cuts processed until now is 11264
2022-11-21 07:25:56,861 INFO [decode.py:469] batch 74/?, cuts processed until now is 11669
2022-11-21 07:25:58,451 INFO [decode.py:469] batch 76/?, cuts processed until now is 11761
2022-11-21 07:26:00,024 INFO [decode.py:469] batch 78/?, cuts processed until now is 11843
2022-11-21 07:26:01,765 INFO [decode.py:469] batch 80/?, cuts processed until now is 11956
2022-11-21 07:26:03,121 INFO [decode.py:469] batch 82/?, cuts processed until now is 12467
2022-11-21 07:26:07,061 INFO [decode.py:469] batch 84/?, cuts processed until now is 12586
2022-11-21 07:26:08,969 INFO [decode.py:485] The transcripts are stored in pruned_transducer_stateless7/exp/v2/fast_beam_search/recogs-dev_ihm-beam_4_max_contexts_4_max_states_8-epoch-14-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-11-21 07:26:09,125 INFO [utils.py:530] [dev_ihm-beam_4_max_contexts_4_max_states_8] %WER 19.44% [18459 / 94940, 2783 ins, 3992 del, 11684 sub ]
2022-11-21 07:26:09,814 INFO [utils.py:530] [dev_ihm-beam_4_max_contexts_4_max_states_8] %WER 12.30% [45497 / 369873, 10772 ins, 17562 del, 17163 sub ]
2022-11-21 07:26:10,877 INFO [decode.py:511] Wrote detailed error stats to pruned_transducer_stateless7/exp/v2/fast_beam_search/wers-dev_ihm-beam_4_max_contexts_4_max_states_8-epoch-14-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-11-21 07:26:10,885 INFO [decode.py:531]
For dev_ihm, WER/CER of different settings are:
beam_4_max_contexts_4_max_states_8 19.44 12.3 best for dev_ihm
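Each utils.py line above reports total errors over reference tokens, split into insertions, deletions and substitutions; the first line is word-level (WER) and the second, although also printed with a %WER label, is the character-level figure that the WER/CER summary reports as 12.3. The figures can be reproduced from the logged counts (the same check applies to every dataset summary below):

```python
# Reproduce the dev_ihm figures from the counts in the two utils.py lines above.
ins, dele, sub, ref = 2783, 3992, 11684, 94940            # word-level counts
print(f"WER = {100 * (ins + dele + sub) / ref:.2f}%")      # 19.44%

ins, dele, sub, ref = 10772, 17562, 17163, 369873          # character-level counts
print(f"CER = {100 * (ins + dele + sub) / ref:.2f}%")      # 12.30%
```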
2022-11-21 07:26:10,897 INFO [decode.py:726] Decoding test_ihm
2022-11-21 07:26:14,017 INFO [decode.py:469] batch 0/?, cuts processed until now is 69
2022-11-21 07:26:16,531 INFO [decode.py:469] batch 2/?, cuts processed until now is 555
2022-11-21 07:26:19,207 INFO [decode.py:469] batch 4/?, cuts processed until now is 703
2022-11-21 07:26:21,907 INFO [decode.py:469] batch 6/?, cuts processed until now is 830
2022-11-21 07:26:24,538 INFO [decode.py:469] batch 8/?, cuts processed until now is 987
2022-11-21 07:26:28,097 INFO [decode.py:469] batch 10/?, cuts processed until now is 1095
2022-11-21 07:26:30,701 INFO [decode.py:469] batch 12/?, cuts processed until now is 1267
2022-11-21 07:26:32,859 INFO [decode.py:469] batch 14/?, cuts processed until now is 1532
2022-11-21 07:26:34,925 INFO [decode.py:469] batch 16/?, cuts processed until now is 1931
2022-11-21 07:26:38,720 INFO [decode.py:469] batch 18/?, cuts processed until now is 2055
2022-11-21 07:26:42,970 INFO [decode.py:469] batch 20/?, cuts processed until now is 2124
2022-11-21 07:26:45,358 INFO [decode.py:469] batch 22/?, cuts processed until now is 2388
2022-11-21 07:26:47,447 INFO [decode.py:469] batch 24/?, cuts processed until now is 2856
2022-11-21 07:26:50,554 INFO [decode.py:469] batch 26/?, cuts processed until now is 2996
2022-11-21 07:26:53,323 INFO [decode.py:469] batch 28/?, cuts processed until now is 3278
2022-11-21 07:26:55,819 INFO [decode.py:469] batch 30/?, cuts processed until now is 3430
2022-11-21 07:26:59,582 INFO [decode.py:469] batch 32/?, cuts processed until now is 3535
2022-11-21 07:27:02,557 INFO [decode.py:469] batch 34/?, cuts processed until now is 3706
2022-11-21 07:27:05,346 INFO [decode.py:469] batch 36/?, cuts processed until now is 3822
2022-11-21 07:27:08,027 INFO [decode.py:469] batch 38/?, cuts processed until now is 3969
2022-11-21 07:27:11,470 INFO [decode.py:469] batch 40/?, cuts processed until now is 4411
2022-11-21 07:27:11,769 INFO [zipformer.py:1414] attn_weights_entropy = tensor([1.9139, 2.2858, 1.8880, 1.5129, 1.7399, 2.4293, 2.5477, 2.5261],
device='cuda:0'), covar=tensor([0.0995, 0.0696, 0.1996, 0.1805, 0.0938, 0.0902, 0.0388, 0.0921],
device='cuda:0'), in_proj_covar=tensor([0.0155, 0.0166, 0.0141, 0.0170, 0.0150, 0.0167, 0.0133, 0.0161],
device='cuda:0'), out_proj_covar=tensor([0.0003, 0.0003, 0.0003, 0.0003, 0.0003, 0.0003, 0.0003, 0.0003],
device='cuda:0')
2022-11-21 07:27:13,771 INFO [decode.py:469] batch 42/?, cuts processed until now is 5058
2022-11-21 07:27:15,998 INFO [decode.py:469] batch 44/?, cuts processed until now is 5544
2022-11-21 07:27:18,607 INFO [decode.py:469] batch 46/?, cuts processed until now is 5685
2022-11-21 07:27:20,960 INFO [decode.py:469] batch 48/?, cuts processed until now is 5890
2022-11-21 07:27:23,447 INFO [decode.py:469] batch 50/?, cuts processed until now is 6372
2022-11-21 07:27:25,569 INFO [decode.py:469] batch 52/?, cuts processed until now is 6706
2022-11-21 07:27:27,547 INFO [decode.py:469] batch 54/?, cuts processed until now is 7105
2022-11-21 07:27:31,263 INFO [decode.py:469] batch 56/?, cuts processed until now is 7290
2022-11-21 07:27:33,635 INFO [decode.py:469] batch 58/?, cuts processed until now is 8116
2022-11-21 07:27:37,667 INFO [decode.py:469] batch 60/?, cuts processed until now is 8258
2022-11-21 07:27:39,712 INFO [decode.py:469] batch 62/?, cuts processed until now is 8794
2022-11-21 07:27:41,794 INFO [decode.py:469] batch 64/?, cuts processed until now is 9330
2022-11-21 07:27:45,450 INFO [decode.py:469] batch 66/?, cuts processed until now is 9476
2022-11-21 07:27:48,517 INFO [decode.py:469] batch 68/?, cuts processed until now is 9921
2022-11-21 07:27:50,540 INFO [decode.py:469] batch 70/?, cuts processed until now is 10251
2022-11-21 07:27:53,295 INFO [decode.py:469] batch 72/?, cuts processed until now is 10679
2022-11-21 07:27:55,643 INFO [decode.py:469] batch 74/?, cuts processed until now is 10794
2022-11-21 07:27:57,283 INFO [decode.py:469] batch 76/?, cuts processed until now is 11039
2022-11-21 07:27:58,317 INFO [decode.py:469] batch 78/?, cuts processed until now is 11155
2022-11-21 07:27:59,870 INFO [decode.py:469] batch 80/?, cuts processed until now is 11600
2022-11-21 07:28:02,120 INFO [decode.py:469] batch 82/?, cuts processed until now is 12041
2022-11-21 07:28:03,513 INFO [decode.py:469] batch 84/?, cuts processed until now is 12110
2022-11-21 07:28:03,713 INFO [decode.py:485] The transcripts are stored in pruned_transducer_stateless7/exp/v2/fast_beam_search/recogs-test_ihm-beam_4_max_contexts_4_max_states_8-epoch-14-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-11-21 07:28:03,850 INFO [utils.py:530] [test_ihm-beam_4_max_contexts_4_max_states_8] %WER 18.04% [16174 / 89659, 1994 ins, 4043 del, 10137 sub ]
2022-11-21 07:28:04,562 INFO [utils.py:530] [test_ihm-beam_4_max_contexts_4_max_states_8] %WER 11.30% [40040 / 354205, 8698 ins, 16856 del, 14486 sub ]
2022-11-21 07:28:05,419 INFO [decode.py:511] Wrote detailed error stats to pruned_transducer_stateless7/exp/v2/fast_beam_search/wers-test_ihm-beam_4_max_contexts_4_max_states_8-epoch-14-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-11-21 07:28:05,420 INFO [decode.py:531]
For test_ihm, WER/CER of different settings are:
beam_4_max_contexts_4_max_states_8 18.04 11.3 best for test_ihm
2022-11-21 07:28:05,423 INFO [decode.py:726] Decoding dev_sdm
2022-11-21 07:28:08,299 INFO [decode.py:469] batch 0/?, cuts processed until now is 71
2022-11-21 07:28:10,682 INFO [decode.py:469] batch 2/?, cuts processed until now is 535
2022-11-21 07:28:13,287 INFO [decode.py:469] batch 4/?, cuts processed until now is 686
2022-11-21 07:28:15,897 INFO [decode.py:469] batch 6/?, cuts processed until now is 819
2022-11-21 07:28:18,416 INFO [decode.py:469] batch 8/?, cuts processed until now is 980
2022-11-21 07:28:22,683 INFO [decode.py:469] batch 10/?, cuts processed until now is 1083
2022-11-21 07:28:25,237 INFO [decode.py:469] batch 12/?, cuts processed until now is 1257
2022-11-21 07:28:27,455 INFO [decode.py:469] batch 14/?, cuts processed until now is 1513
2022-11-21 07:28:29,522 INFO [decode.py:469] batch 16/?, cuts processed until now is 1892
2022-11-21 07:28:29,759 INFO [zipformer.py:1414] attn_weights_entropy = tensor([3.2296, 5.3755, 3.8462, 5.1250, 4.2310, 3.9708, 3.6648, 4.8498],
device='cuda:0'), covar=tensor([0.1051, 0.0208, 0.0721, 0.0170, 0.0474, 0.0684, 0.1304, 0.0173],
device='cuda:0'), in_proj_covar=tensor([0.0148, 0.0117, 0.0145, 0.0122, 0.0157, 0.0156, 0.0153, 0.0133],
device='cuda:0'), out_proj_covar=tensor([0.0003, 0.0003, 0.0003, 0.0003, 0.0004, 0.0004, 0.0003, 0.0003],
device='cuda:0')
2022-11-21 07:28:32,783 INFO [decode.py:469] batch 18/?, cuts processed until now is 2020
2022-11-21 07:28:36,128 INFO [decode.py:469] batch 20/?, cuts processed until now is 2106
2022-11-21 07:28:38,309 INFO [decode.py:469] batch 22/?, cuts processed until now is 2362
2022-11-21 07:28:40,406 INFO [decode.py:469] batch 24/?, cuts processed until now is 2807
2022-11-21 07:28:43,055 INFO [decode.py:469] batch 26/?, cuts processed until now is 2952
2022-11-21 07:28:45,585 INFO [decode.py:469] batch 28/?, cuts processed until now is 3226
2022-11-21 07:28:48,094 INFO [decode.py:469] batch 30/?, cuts processed until now is 3381
2022-11-21 07:28:51,280 INFO [decode.py:469] batch 32/?, cuts processed until now is 3499
2022-11-21 07:28:53,904 INFO [decode.py:469] batch 34/?, cuts processed until now is 3673
2022-11-21 07:28:56,660 INFO [decode.py:469] batch 36/?, cuts processed until now is 3797
2022-11-21 07:28:59,290 INFO [decode.py:469] batch 38/?, cuts processed until now is 3948
2022-11-21 07:29:01,372 INFO [decode.py:469] batch 40/?, cuts processed until now is 4722
2022-11-21 07:29:03,719 INFO [decode.py:469] batch 42/?, cuts processed until now is 5007
2022-11-21 07:29:05,197 INFO [zipformer.py:1414] attn_weights_entropy = tensor([3.0721, 5.2833, 3.7184, 4.9679, 4.1797, 3.8883, 3.4355, 4.6723],
device='cuda:0'), covar=tensor([0.1258, 0.0157, 0.0812, 0.0174, 0.0420, 0.0680, 0.1516, 0.0198],
device='cuda:0'), in_proj_covar=tensor([0.0148, 0.0117, 0.0145, 0.0122, 0.0157, 0.0156, 0.0153, 0.0133],
device='cuda:0'), out_proj_covar=tensor([0.0003, 0.0003, 0.0003, 0.0003, 0.0004, 0.0004, 0.0003, 0.0003],
device='cuda:0')
2022-11-21 07:29:06,849 INFO [decode.py:469] batch 44/?, cuts processed until now is 5112
2022-11-21 07:29:10,007 INFO [decode.py:469] batch 46/?, cuts processed until now is 5219
2022-11-21 07:29:13,149 INFO [decode.py:469] batch 48/?, cuts processed until now is 5636
2022-11-21 07:29:15,458 INFO [decode.py:469] batch 50/?, cuts processed until now is 5842
2022-11-21 07:29:17,633 INFO [decode.py:469] batch 52/?, cuts processed until now is 6222
2022-11-21 07:29:19,688 INFO [decode.py:469] batch 54/?, cuts processed until now is 6766
2022-11-21 07:29:22,232 INFO [decode.py:469] batch 56/?, cuts processed until now is 7072
2022-11-21 07:29:24,213 INFO [decode.py:469] batch 58/?, cuts processed until now is 7518
2022-11-21 07:29:26,268 INFO [decode.py:469] batch 60/?, cuts processed until now is 8027
2022-11-21 07:29:28,302 INFO [decode.py:469] batch 62/?, cuts processed until now is 8571
2022-11-21 07:29:30,453 INFO [decode.py:469] batch 64/?, cuts processed until now is 9115
2022-11-21 07:29:32,592 INFO [decode.py:469] batch 66/?, cuts processed until now is 9395
2022-11-21 07:29:34,615 INFO [decode.py:469] batch 68/?, cuts processed until now is 9904
2022-11-21 07:29:36,669 INFO [decode.py:469] batch 70/?, cuts processed until now is 10413
2022-11-21 07:29:38,819 INFO [decode.py:469] batch 72/?, cuts processed until now is 11190
2022-11-21 07:29:40,798 INFO [decode.py:469] batch 74/?, cuts processed until now is 11589
2022-11-21 07:29:42,439 INFO [decode.py:469] batch 76/?, cuts processed until now is 11699
2022-11-21 07:29:44,155 INFO [decode.py:469] batch 78/?, cuts processed until now is 11799
2022-11-21 07:29:45,602 INFO [decode.py:469] batch 80/?, cuts processed until now is 11889
2022-11-21 07:29:47,113 INFO [decode.py:469] batch 82/?, cuts processed until now is 12461
2022-11-21 07:29:48,833 INFO [decode.py:469] batch 84/?, cuts processed until now is 12568
2022-11-21 07:29:53,234 INFO [decode.py:469] batch 86/?, cuts processed until now is 12601
2022-11-21 07:29:53,462 INFO [decode.py:485] The transcripts are stored in pruned_transducer_stateless7/exp/v2/fast_beam_search/recogs-dev_sdm-beam_4_max_contexts_4_max_states_8-epoch-14-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-11-21 07:29:53,616 INFO [utils.py:530] [dev_sdm-beam_4_max_contexts_4_max_states_8] %WER 31.11% [29537 / 94940, 4266 ins, 7752 del, 17519 sub ]
2022-11-21 07:29:54,425 INFO [utils.py:530] [dev_sdm-beam_4_max_contexts_4_max_states_8] %WER 22.60% [83608 / 369873, 18843 ins, 33372 del, 31393 sub ]
2022-11-21 07:29:55,361 INFO [decode.py:511] Wrote detailed error stats to pruned_transducer_stateless7/exp/v2/fast_beam_search/wers-dev_sdm-beam_4_max_contexts_4_max_states_8-epoch-14-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-11-21 07:29:55,362 INFO [decode.py:531]
For dev_sdm, WER/CER of different settings are:
beam_4_max_contexts_4_max_states_8 31.11 22.6 best for dev_sdm
2022-11-21 07:29:55,365 INFO [decode.py:726] Decoding test_sdm
2022-11-21 07:29:58,300 INFO [decode.py:469] batch 0/?, cuts processed until now is 69
2022-11-21 07:30:00,729 INFO [decode.py:469] batch 2/?, cuts processed until now is 555
2022-11-21 07:30:03,378 INFO [decode.py:469] batch 4/?, cuts processed until now is 703
2022-11-21 07:30:06,055 INFO [decode.py:469] batch 6/?, cuts processed until now is 831
2022-11-21 07:30:08,668 INFO [decode.py:469] batch 8/?, cuts processed until now is 988
2022-11-21 07:30:12,581 INFO [decode.py:469] batch 10/?, cuts processed until now is 1096
2022-11-21 07:30:15,300 INFO [decode.py:469] batch 12/?, cuts processed until now is 1268
2022-11-21 07:30:17,517 INFO [decode.py:469] batch 14/?, cuts processed until now is 1533
2022-11-21 07:30:19,563 INFO [decode.py:469] batch 16/?, cuts processed until now is 1932
2022-11-21 07:30:23,125 INFO [decode.py:469] batch 18/?, cuts processed until now is 2057
2022-11-21 07:30:27,346 INFO [decode.py:469] batch 20/?, cuts processed until now is 2126
2022-11-21 07:30:29,550 INFO [decode.py:469] batch 22/?, cuts processed until now is 2390
2022-11-21 07:30:31,498 INFO [decode.py:469] batch 24/?, cuts processed until now is 2858
2022-11-21 07:30:34,241 INFO [decode.py:469] batch 26/?, cuts processed until now is 2998
2022-11-21 07:30:36,726 INFO [decode.py:469] batch 28/?, cuts processed until now is 3280
2022-11-21 07:30:39,207 INFO [decode.py:469] batch 30/?, cuts processed until now is 3432
2022-11-21 07:30:42,945 INFO [decode.py:469] batch 32/?, cuts processed until now is 3537
2022-11-21 07:30:45,682 INFO [decode.py:469] batch 34/?, cuts processed until now is 3709
2022-11-21 07:30:48,451 INFO [decode.py:469] batch 36/?, cuts processed until now is 3825
2022-11-21 07:30:50,144 INFO [zipformer.py:1414] attn_weights_entropy = tensor([2.6736, 4.3904, 3.2157, 2.0058, 3.9507, 1.8350, 3.8480, 2.5184],
device='cuda:0'), covar=tensor([0.1310, 0.0095, 0.0785, 0.2064, 0.0167, 0.1716, 0.0194, 0.1407],
device='cuda:0'), in_proj_covar=tensor([0.0112, 0.0093, 0.0102, 0.0107, 0.0090, 0.0114, 0.0086, 0.0106],
device='cuda:0'), out_proj_covar=tensor([0.0005, 0.0003, 0.0004, 0.0004, 0.0003, 0.0004, 0.0003, 0.0004],
device='cuda:0')
2022-11-21 07:30:51,093 INFO [decode.py:469] batch 38/?, cuts processed until now is 3972
2022-11-21 07:30:53,910 INFO [decode.py:469] batch 40/?, cuts processed until now is 4410
2022-11-21 07:30:56,064 INFO [decode.py:469] batch 42/?, cuts processed until now is 5060
2022-11-21 07:30:58,290 INFO [decode.py:469] batch 44/?, cuts processed until now is 5546
2022-11-21 07:31:00,886 INFO [decode.py:469] batch 46/?, cuts processed until now is 5687
2022-11-21 07:31:03,150 INFO [decode.py:469] batch 48/?, cuts processed until now is 5893
2022-11-21 07:31:05,780 INFO [decode.py:469] batch 50/?, cuts processed until now is 6379
2022-11-21 07:31:07,885 INFO [decode.py:469] batch 52/?, cuts processed until now is 6713
2022-11-21 07:31:09,866 INFO [decode.py:469] batch 54/?, cuts processed until now is 7112
2022-11-21 07:31:13,533 INFO [decode.py:469] batch 56/?, cuts processed until now is 7298
2022-11-21 07:31:15,680 INFO [decode.py:469] batch 58/?, cuts processed until now is 8130
2022-11-21 07:31:19,632 INFO [decode.py:469] batch 60/?, cuts processed until now is 8273
2022-11-21 07:31:21,656 INFO [decode.py:469] batch 62/?, cuts processed until now is 8813
2022-11-21 07:31:23,663 INFO [decode.py:469] batch 64/?, cuts processed until now is 9353
2022-11-21 07:31:27,304 INFO [decode.py:469] batch 66/?, cuts processed until now is 9500
2022-11-21 07:31:30,529 INFO [decode.py:469] batch 68/?, cuts processed until now is 9944
2022-11-21 07:31:32,566 INFO [decode.py:469] batch 70/?, cuts processed until now is 10274
2022-11-21 07:31:35,274 INFO [decode.py:469] batch 72/?, cuts processed until now is 10711
2022-11-21 07:31:37,422 INFO [decode.py:469] batch 74/?, cuts processed until now is 10820
2022-11-21 07:31:38,986 INFO [decode.py:469] batch 76/?, cuts processed until now is 11076
2022-11-21 07:31:40,050 INFO [decode.py:469] batch 78/?, cuts processed until now is 11209
2022-11-21 07:31:41,518 INFO [decode.py:469] batch 80/?, cuts processed until now is 11651
2022-11-21 07:31:43,710 INFO [decode.py:469] batch 82/?, cuts processed until now is 12070
2022-11-21 07:31:45,094 INFO [decode.py:485] The transcripts are stored in pruned_transducer_stateless7/exp/v2/fast_beam_search/recogs-test_sdm-beam_4_max_contexts_4_max_states_8-epoch-14-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-11-21 07:31:45,239 INFO [utils.py:530] [test_sdm-beam_4_max_contexts_4_max_states_8] %WER 32.10% [28784 / 89659, 3596 ins, 8598 del, 16590 sub ]
2022-11-21 07:31:46,024 INFO [utils.py:530] [test_sdm-beam_4_max_contexts_4_max_states_8] %WER 23.50% [83238 / 354205, 17319 ins, 35917 del, 30002 sub ]
2022-11-21 07:31:46,923 INFO [decode.py:511] Wrote detailed error stats to pruned_transducer_stateless7/exp/v2/fast_beam_search/wers-test_sdm-beam_4_max_contexts_4_max_states_8-epoch-14-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-11-21 07:31:46,924 INFO [decode.py:531]
For test_sdm, WER/CER of different settings are:
beam_4_max_contexts_4_max_states_8 32.1 23.5 best for test_sdm
2022-11-21 07:31:46,927 INFO [decode.py:726] Decoding dev_gss
2022-11-21 07:31:49,806 INFO [decode.py:469] batch 0/?, cuts processed until now is 71
2022-11-21 07:31:52,154 INFO [decode.py:469] batch 2/?, cuts processed until now is 535
2022-11-21 07:31:54,746 INFO [decode.py:469] batch 4/?, cuts processed until now is 686
2022-11-21 07:31:56,352 INFO [zipformer.py:1414] attn_weights_entropy = tensor([2.9126, 2.7834, 2.6260, 2.9843, 2.3963, 2.7063, 2.7318, 3.2210],
device='cuda:0'), covar=tensor([0.1129, 0.2337, 0.2865, 0.1633, 0.2293, 0.1222, 0.1967, 0.1578],
device='cuda:0'), in_proj_covar=tensor([0.0085, 0.0085, 0.0091, 0.0079, 0.0076, 0.0081, 0.0082, 0.0061],
device='cuda:0'), out_proj_covar=tensor([0.0003, 0.0003, 0.0003, 0.0003, 0.0003, 0.0003, 0.0003, 0.0002],
device='cuda:0')
2022-11-21 07:31:57,361 INFO [decode.py:469] batch 6/?, cuts processed until now is 819
2022-11-21 07:31:59,997 INFO [decode.py:469] batch 8/?, cuts processed until now is 980
2022-11-21 07:32:04,080 INFO [decode.py:469] batch 10/?, cuts processed until now is 1083
2022-11-21 07:32:06,614 INFO [decode.py:469] batch 12/?, cuts processed until now is 1257
2022-11-21 07:32:08,800 INFO [decode.py:469] batch 14/?, cuts processed until now is 1513
2022-11-21 07:32:10,784 INFO [decode.py:469] batch 16/?, cuts processed until now is 1892
2022-11-21 07:32:14,062 INFO [decode.py:469] batch 18/?, cuts processed until now is 2020
2022-11-21 07:32:17,418 INFO [decode.py:469] batch 20/?, cuts processed until now is 2106
2022-11-21 07:32:19,570 INFO [decode.py:469] batch 22/?, cuts processed until now is 2362
2022-11-21 07:32:21,526 INFO [decode.py:469] batch 24/?, cuts processed until now is 2807
2022-11-21 07:32:24,206 INFO [decode.py:469] batch 26/?, cuts processed until now is 2952
2022-11-21 07:32:26,668 INFO [decode.py:469] batch 28/?, cuts processed until now is 3226
2022-11-21 07:32:29,184 INFO [decode.py:469] batch 30/?, cuts processed until now is 3381
2022-11-21 07:32:32,300 INFO [decode.py:469] batch 32/?, cuts processed until now is 3499
2022-11-21 07:32:35,016 INFO [decode.py:469] batch 34/?, cuts processed until now is 3673
2022-11-21 07:32:37,774 INFO [decode.py:469] batch 36/?, cuts processed until now is 3797
2022-11-21 07:32:40,404 INFO [decode.py:469] batch 38/?, cuts processed until now is 3948
2022-11-21 07:32:42,407 INFO [decode.py:469] batch 40/?, cuts processed until now is 4722
2022-11-21 07:32:44,821 INFO [decode.py:469] batch 42/?, cuts processed until now is 5007
2022-11-21 07:32:44,898 INFO [zipformer.py:1414] attn_weights_entropy = tensor([4.6719, 4.8682, 4.8357, 4.7075, 4.3808, 4.2843, 5.2940, 4.7127],
device='cuda:0'), covar=tensor([0.0299, 0.0504, 0.0213, 0.1114, 0.0385, 0.0185, 0.0502, 0.0335],
device='cuda:0'), in_proj_covar=tensor([0.0075, 0.0097, 0.0083, 0.0108, 0.0078, 0.0067, 0.0133, 0.0090],
device='cuda:0'), out_proj_covar=tensor([0.0002, 0.0002, 0.0002, 0.0002, 0.0002, 0.0001, 0.0003, 0.0002],
device='cuda:0')
2022-11-21 07:32:47,969 INFO [decode.py:469] batch 44/?, cuts processed until now is 5112
2022-11-21 07:32:51,111 INFO [decode.py:469] batch 46/?, cuts processed until now is 5219
2022-11-21 07:32:54,156 INFO [decode.py:469] batch 48/?, cuts processed until now is 5636
2022-11-21 07:32:56,833 INFO [decode.py:469] batch 50/?, cuts processed until now is 5842
2022-11-21 07:32:58,983 INFO [decode.py:469] batch 52/?, cuts processed until now is 6222
2022-11-21 07:33:01,171 INFO [decode.py:469] batch 54/?, cuts processed until now is 6766
2022-11-21 07:33:03,409 INFO [decode.py:469] batch 56/?, cuts processed until now is 7072
2022-11-21 07:33:05,639 INFO [decode.py:469] batch 58/?, cuts processed until now is 7518
2022-11-21 07:33:07,714 INFO [decode.py:469] batch 60/?, cuts processed until now is 8027
2022-11-21 07:33:09,719 INFO [decode.py:469] batch 62/?, cuts processed until now is 8571
2022-11-21 07:33:11,794 INFO [decode.py:469] batch 64/?, cuts processed until now is 9115
2022-11-21 07:33:14,178 INFO [decode.py:469] batch 66/?, cuts processed until now is 9395
2022-11-21 07:33:16,211 INFO [decode.py:469] batch 68/?, cuts processed until now is 9904
2022-11-21 07:33:18,244 INFO [decode.py:469] batch 70/?, cuts processed until now is 10413
2022-11-21 07:33:20,203 INFO [decode.py:469] batch 72/?, cuts processed until now is 11190
2022-11-21 07:33:22,234 INFO [decode.py:469] batch 74/?, cuts processed until now is 11589
2022-11-21 07:33:23,888 INFO [decode.py:469] batch 76/?, cuts processed until now is 11699
2022-11-21 07:33:25,565 INFO [decode.py:469] batch 78/?, cuts processed until now is 11799
2022-11-21 07:33:27,007 INFO [decode.py:469] batch 80/?, cuts processed until now is 11889
2022-11-21 07:33:28,516 INFO [decode.py:469] batch 82/?, cuts processed until now is 12461
2022-11-21 07:33:30,315 INFO [decode.py:469] batch 84/?, cuts processed until now is 12568
2022-11-21 07:33:34,767 INFO [decode.py:469] batch 86/?, cuts processed until now is 12601
2022-11-21 07:33:34,997 INFO [decode.py:485] The transcripts are stored in pruned_transducer_stateless7/exp/v2/fast_beam_search/recogs-dev_gss-beam_4_max_contexts_4_max_states_8-epoch-14-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-11-21 07:33:35,158 INFO [utils.py:530] [dev_gss-beam_4_max_contexts_4_max_states_8] %WER 22.21% [21087 / 94940, 2793 ins, 4898 del, 13396 sub ]
2022-11-21 07:33:36,016 INFO [utils.py:530] [dev_gss-beam_4_max_contexts_4_max_states_8] %WER 14.58% [53945 / 369873, 11680 ins, 21193 del, 21072 sub ]
2022-11-21 07:33:36,931 INFO [decode.py:511] Wrote detailed error stats to pruned_transducer_stateless7/exp/v2/fast_beam_search/wers-dev_gss-beam_4_max_contexts_4_max_states_8-epoch-14-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-11-21 07:33:36,932 INFO [decode.py:531]
For dev_gss, WER/CER of different settings are:
beam_4_max_contexts_4_max_states_8 22.21 14.58 best for dev_gss
2022-11-21 07:33:36,937 INFO [decode.py:726] Decoding test_gss
2022-11-21 07:33:39,707 INFO [decode.py:469] batch 0/?, cuts processed until now is 69
2022-11-21 07:33:42,083 INFO [decode.py:469] batch 2/?, cuts processed until now is 555
2022-11-21 07:33:44,793 INFO [decode.py:469] batch 4/?, cuts processed until now is 703
2022-11-21 07:33:47,525 INFO [decode.py:469] batch 6/?, cuts processed until now is 831
2022-11-21 07:33:50,108 INFO [decode.py:469] batch 8/?, cuts processed until now is 988
2022-11-21 07:33:53,919 INFO [decode.py:469] batch 10/?, cuts processed until now is 1096
2022-11-21 07:33:56,557 INFO [decode.py:469] batch 12/?, cuts processed until now is 1268
2022-11-21 07:33:58,734 INFO [decode.py:469] batch 14/?, cuts processed until now is 1533
2022-11-21 07:34:00,776 INFO [decode.py:469] batch 16/?, cuts processed until now is 1932
2022-11-21 07:34:04,515 INFO [decode.py:469] batch 18/?, cuts processed until now is 2057
2022-11-21 07:34:09,449 INFO [decode.py:469] batch 20/?, cuts processed until now is 2126
2022-11-21 07:34:11,973 INFO [decode.py:469] batch 22/?, cuts processed until now is 2390
2022-11-21 07:34:14,076 INFO [decode.py:469] batch 24/?, cuts processed until now is 2858
2022-11-21 07:34:16,849 INFO [decode.py:469] batch 26/?, cuts processed until now is 2998
2022-11-21 07:34:19,366 INFO [decode.py:469] batch 28/?, cuts processed until now is 3280
2022-11-21 07:34:21,883 INFO [decode.py:469] batch 30/?, cuts processed until now is 3432
2022-11-21 07:34:26,257 INFO [decode.py:469] batch 32/?, cuts processed until now is 3537
2022-11-21 07:34:29,018 INFO [decode.py:469] batch 34/?, cuts processed until now is 3709
2022-11-21 07:34:31,828 INFO [decode.py:469] batch 36/?, cuts processed until now is 3825
2022-11-21 07:34:33,635 INFO [zipformer.py:1414] attn_weights_entropy = tensor([2.7291, 2.7342, 2.5623, 2.6762, 2.4725, 2.3896, 2.7740, 2.9601],
device='cuda:0'), covar=tensor([0.1757, 0.2501, 0.3308, 0.2930, 0.2414, 0.1908, 0.2092, 0.2814],
device='cuda:0'), in_proj_covar=tensor([0.0085, 0.0085, 0.0091, 0.0079, 0.0076, 0.0081, 0.0082, 0.0061],
device='cuda:0'), out_proj_covar=tensor([0.0003, 0.0003, 0.0003, 0.0003, 0.0003, 0.0003, 0.0003, 0.0002],
device='cuda:0')
2022-11-21 07:34:34,533 INFO [decode.py:469] batch 38/?, cuts processed until now is 3972
2022-11-21 07:34:37,436 INFO [decode.py:469] batch 40/?, cuts processed until now is 4410
2022-11-21 07:34:39,586 INFO [decode.py:469] batch 42/?, cuts processed until now is 5060
2022-11-21 07:34:41,855 INFO [decode.py:469] batch 44/?, cuts processed until now is 5546
2022-11-21 07:34:44,505 INFO [decode.py:469] batch 46/?, cuts processed until now is 5687
2022-11-21 07:34:46,834 INFO [decode.py:469] batch 48/?, cuts processed until now is 5893
2022-11-21 07:34:49,536 INFO [decode.py:469] batch 50/?, cuts processed until now is 6379
2022-11-21 07:34:51,695 INFO [decode.py:469] batch 52/?, cuts processed until now is 6713
2022-11-21 07:34:53,757 INFO [decode.py:469] batch 54/?, cuts processed until now is 7112
2022-11-21 07:34:57,539 INFO [decode.py:469] batch 56/?, cuts processed until now is 7298
2022-11-21 07:34:59,735 INFO [decode.py:469] batch 58/?, cuts processed until now is 8130
2022-11-21 07:35:03,764 INFO [decode.py:469] batch 60/?, cuts processed until now is 8273
2022-11-21 07:35:05,847 INFO [decode.py:469] batch 62/?, cuts processed until now is 8813
2022-11-21 07:35:07,988 INFO [decode.py:469] batch 64/?, cuts processed until now is 9353
2022-11-21 07:35:11,702 INFO [decode.py:469] batch 66/?, cuts processed until now is 9500
2022-11-21 07:35:14,850 INFO [decode.py:469] batch 68/?, cuts processed until now is 9944
2022-11-21 07:35:16,926 INFO [decode.py:469] batch 70/?, cuts processed until now is 10274
2022-11-21 07:35:19,851 INFO [decode.py:469] batch 72/?, cuts processed until now is 10711
2022-11-21 07:35:22,031 INFO [decode.py:469] batch 74/?, cuts processed until now is 10820
2022-11-21 07:35:23,609 INFO [decode.py:469] batch 76/?, cuts processed until now is 11076
2022-11-21 07:35:24,687 INFO [decode.py:469] batch 78/?, cuts processed until now is 11209
2022-11-21 07:35:26,170 INFO [decode.py:469] batch 80/?, cuts processed until now is 11651
2022-11-21 07:35:28,374 INFO [decode.py:469] batch 82/?, cuts processed until now is 12070
2022-11-21 07:35:29,767 INFO [decode.py:485] The transcripts are stored in pruned_transducer_stateless7/exp/v2/fast_beam_search/recogs-test_gss-beam_4_max_contexts_4_max_states_8-epoch-14-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-11-21 07:35:29,910 INFO [utils.py:530] [test_gss-beam_4_max_contexts_4_max_states_8] %WER 22.83% [20466 / 89659, 2179 ins, 5438 del, 12849 sub ]
2022-11-21 07:35:30,564 INFO [utils.py:530] [test_gss-beam_4_max_contexts_4_max_states_8] %WER 15.27% [54095 / 354205, 10381 ins, 23091 del, 20623 sub ]
2022-11-21 07:35:31,560 INFO [decode.py:511] Wrote detailed error stats to pruned_transducer_stateless7/exp/v2/fast_beam_search/wers-test_gss-beam_4_max_contexts_4_max_states_8-epoch-14-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-11-21 07:35:31,561 INFO [decode.py:531]
For test_gss, WER/CER of different settings are:
beam_4_max_contexts_4_max_states_8 22.83 15.27 best for test_gss
2022-11-21 07:35:31,565 INFO [decode.py:743] Done!