icefall-asr-alimeeting-pruned-transducer-stateless7/log/modified_beam_search/log-decode-epoch-15-avg-8-modified_beam_search-beam-size-4-2022-12-09-00-24-42
2022-12-09 00:24:42,587 INFO [decode.py:551] Decoding started
2022-12-09 00:24:42,588 INFO [decode.py:557] Device: cuda:0
2022-12-09 00:24:42,654 INFO [lexicon.py:168] Loading pre-compiled data/lang_char/Linv.pt
2022-12-09 00:24:42,664 INFO [decode.py:563] {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 100, 'reset_interval': 200, 'valid_interval': 3000, 'feature_dim': 80, 'subsampling_factor': 4, 'warm_step': 2000, 'env_info': {'k2-version': '1.23', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': 'b2ce63f3940018e7b433c43fd802fc50ab006a76', 'k2-git-date': 'Wed Nov 23 08:43:43 2022', 'lhotse-version': '1.9.0.dev+git.97bf4b0.dirty', 'torch-version': '1.10.0+cu102', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'ali_meeting', 'icefall-git-sha1': 'f13cf61-dirty', 'icefall-git-date': 'Tue Dec 6 03:34:27 2022', 'icefall-path': '/exp/draj/mini_scale_2022/icefall', 'k2-path': '/exp/draj/mini_scale_2022/k2/k2/python/k2/__init__.py', 'lhotse-path': '/exp/draj/mini_scale_2022/lhotse/lhotse/__init__.py', 'hostname': 'r3n05', 'IP address': '10.1.3.5'}, 'epoch': 15, 'iter': 0, 'avg': 8, 'use_averaged_model': True, 'exp_dir': PosixPath('pruned_transducer_stateless7/exp/v1'), 'lang_dir': 'data/lang_char', 'decoding_method': 'modified_beam_search', 'beam_size': 4, 'beam': 4, 'ngram_lm_scale': 0.01, 'max_contexts': 4, 'max_states': 8, 'context_size': 2, 'max_sym_per_frame': 1, 'num_paths': 200, 'nbest_scale': 0.5, 'num_encoder_layers': '2,4,3,2,4', 'feedforward_dims': '1024,1024,2048,2048,1024', 'nhead': '8,8,8,8,8', 'encoder_dims': '384,384,384,384,384', 'attention_dims': '192,192,192,192,192', 'encoder_unmasked_dims': '256,256,256,256,256', 'zipformer_downsampling_factors': '1,2,4,8,2', 'cnn_module_kernels': '31,31,31,31,31', 'decoder_dim': 512, 'joiner_dim': 512, 'manifest_dir': PosixPath('data/manifests'), 'enable_musan': True, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'max_duration': 500, 'max_cuts': None, 'num_buckets': 50, 'on_the_fly_feats': False, 'shuffle': True, 'num_workers': 8, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'res_dir': PosixPath('pruned_transducer_stateless7/exp/v1/modified_beam_search'), 'suffix': 'epoch-15-avg-8-modified_beam_search-beam-size-4', 'blank_id': 0, 'vocab_size': 3290}
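The options dumped above describe a modified_beam_search decode of the epoch-15 checkpoint averaged over 8 epochs, with beam size 4. As a rough illustration only, the Python sketch below reconstructs a plausible command line from those options; the flag names are assumptions inferred from the parameter names in the dump, not a verified invocation of the recipe.

# Hedged sketch: rebuild a plausible decode.py command from the logged options.
# The flag spellings (--epoch, --avg, ...) are assumptions based on the
# parameter names printed above.
opts = {
    "--epoch": 15,
    "--avg": 8,
    "--use-averaged-model": True,
    "--exp-dir": "pruned_transducer_stateless7/exp/v1",
    "--lang-dir": "data/lang_char",
    "--decoding-method": "modified_beam_search",
    "--beam-size": 4,
    "--max-duration": 500,
}
cmd = ["./pruned_transducer_stateless7/decode.py"]
for flag, value in opts.items():
    cmd += [flag, str(value)]
print(" ".join(cmd))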
2022-12-09 00:24:42,664 INFO [decode.py:565] About to create model
2022-12-09 00:24:43,124 INFO [zipformer.py:179] At encoder stack 4, which has downsampling_factor=2, we will combine the outputs of layers 1 and 3, with downsampling_factors=2 and 8.
2022-12-09 00:24:43,173 INFO [decode.py:632] Calculating the averaged model over epoch range from 7 (excluded) to 15
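The "averaged model over epoch range from 7 (excluded) to 15" step combines the epoch-8 through epoch-15 checkpoints into a single set of weights. Below is a minimal sketch of plain state-dict averaging; the checkpoint file names and the "model" key are assumptions, and with use_averaged_model=True the recipe reportedly uses the running averaged weights stored in the checkpoints rather than this simple mean, so the sketch only illustrates the basic idea.

import torch

# Minimal sketch, not the recipe's exact code: simple mean of the epoch-8..15
# checkpoints (epoch 7 excluded, matching epoch=15, avg=8). File names and the
# "model" key inside each checkpoint are assumptions.
ckpts = [f"pruned_transducer_stateless7/exp/v1/epoch-{e}.pt" for e in range(8, 16)]
avg = None
for path in ckpts:
    state = torch.load(path, map_location="cpu")["model"]
    if avg is None:
        avg = {k: v.detach().clone().float() for k, v in state.items()}
    else:
        for k, v in state.items():
            avg[k] += v.float()
for k in avg:
    avg[k] /= len(ckpts)
# The result can then be loaded back with model.load_state_dict(avg).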
2022-12-09 00:25:10,741 INFO [decode.py:655] Number of model parameters: 75734561
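The parameter count reported here (75,734,561) is the kind of figure obtained by summing p.numel() over model.parameters(). A self-contained toy example of the idiom, using a hypothetical stand-in model rather than the real Zipformer transducer:

import torch.nn as nn

# Toy stand-in model; the real network is a Zipformer transducer with
# ~75.7M parameters, but the counting idiom is identical.
model = nn.Sequential(nn.Linear(80, 384), nn.ReLU(), nn.Linear(384, 512))
num_params = sum(p.numel() for p in model.parameters())
print(f"Number of model parameters: {num_params}")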
2022-12-09 00:25:10,741 INFO [asr_datamodule.py:381] About to get AliMeeting IHM eval cuts
2022-12-09 00:25:10,760 INFO [asr_datamodule.py:402] About to get AliMeeting IHM test cuts
2022-12-09 00:25:10,773 INFO [asr_datamodule.py:387] About to get AliMeeting SDM eval cuts
2022-12-09 00:25:10,784 INFO [asr_datamodule.py:408] About to get AliMeeting SDM test cuts
2022-12-09 00:25:10,803 INFO [asr_datamodule.py:396] About to get AliMeeting GSS-enhanced eval cuts
2022-12-09 00:25:10,826 INFO [asr_datamodule.py:417] About to get AliMeeting GSS-enhanced test cuts
2022-12-09 00:25:12,696 INFO [decode.py:687] Decoding eval_ihm
2022-12-09 00:25:18,820 INFO [decode.py:463] batch 0/?, cuts processed until now is 58
2022-12-09 00:25:24,241 INFO [decode.py:463] batch 2/?, cuts processed until now is 512
2022-12-09 00:25:31,979 INFO [decode.py:463] batch 4/?, cuts processed until now is 645
2022-12-09 00:25:40,148 INFO [decode.py:463] batch 6/?, cuts processed until now is 750
2022-12-09 00:25:47,934 INFO [decode.py:463] batch 8/?, cuts processed until now is 883
2022-12-09 00:25:55,121 INFO [decode.py:463] batch 10/?, cuts processed until now is 1082
2022-12-09 00:26:02,868 INFO [decode.py:463] batch 12/?, cuts processed until now is 1279
2022-12-09 00:26:02,981 INFO [zipformer.py:1414] attn_weights_entropy = tensor([4.6414, 4.6510, 4.5449, 4.8082, 4.0814, 4.1877, 4.8428, 4.4703],
device='cuda:0'), covar=tensor([0.0608, 0.0354, 0.0841, 0.0608, 0.1302, 0.0402, 0.0377, 0.0977],
device='cuda:0'), in_proj_covar=tensor([0.0122, 0.0118, 0.0125, 0.0134, 0.0129, 0.0102, 0.0145, 0.0125],
device='cuda:0'), out_proj_covar=tensor([0.0002, 0.0002, 0.0002, 0.0002, 0.0002, 0.0002, 0.0002, 0.0002],
device='cuda:0')
2022-12-09 00:26:09,417 INFO [decode.py:463] batch 14/?, cuts processed until now is 1538
2022-12-09 00:26:16,019 INFO [decode.py:463] batch 16/?, cuts processed until now is 1845
2022-12-09 00:26:22,962 INFO [decode.py:463] batch 18/?, cuts processed until now is 2084
2022-12-09 00:26:28,777 INFO [decode.py:463] batch 20/?, cuts processed until now is 2523
2022-12-09 00:26:34,867 INFO [decode.py:463] batch 22/?, cuts processed until now is 2949
2022-12-09 00:26:42,001 INFO [decode.py:463] batch 24/?, cuts processed until now is 3160
2022-12-09 00:26:48,378 INFO [decode.py:463] batch 26/?, cuts processed until now is 3586
2022-12-09 00:26:55,879 INFO [decode.py:463] batch 28/?, cuts processed until now is 3758
2022-12-09 00:27:02,020 INFO [decode.py:463] batch 30/?, cuts processed until now is 4116
2022-12-09 00:27:06,528 INFO [decode.py:463] batch 32/?, cuts processed until now is 4742
2022-12-09 00:27:11,534 INFO [decode.py:463] batch 34/?, cuts processed until now is 5368
2022-12-09 00:27:17,535 INFO [decode.py:463] batch 36/?, cuts processed until now is 5796
2022-12-09 00:27:18,924 INFO [decode.py:463] batch 38/?, cuts processed until now is 5908
2022-12-09 00:27:26,130 INFO [decode.py:463] batch 40/?, cuts processed until now is 6026
2022-12-09 00:27:31,228 INFO [decode.py:463] batch 42/?, cuts processed until now is 6171
2022-12-09 00:27:37,988 INFO [decode.py:463] batch 44/?, cuts processed until now is 6390
2022-12-09 00:27:42,164 INFO [decode.py:463] batch 46/?, cuts processed until now is 6456
2022-12-09 00:27:42,623 INFO [decode.py:479] The transcripts are stored in pruned_transducer_stateless7/exp/v1/modified_beam_search/recogs-eval_ihm-beam_size_4-epoch-15-avg-8-modified_beam_search-beam-size-4.txt
2022-12-09 00:27:42,719 INFO [utils.py:536] [eval_ihm-beam_size_4] %WER 9.58% [7771 / 81111, 935 ins, 1663 del, 5173 sub ]
2022-12-09 00:27:42,957 INFO [decode.py:492] Wrote detailed error stats to pruned_transducer_stateless7/exp/v1/modified_beam_search/errs-eval_ihm-beam_size_4-epoch-15-avg-8-modified_beam_search-beam-size-4.txt
2022-12-09 00:27:42,959 INFO [decode.py:508]
For eval_ihm, WER of different settings are:
beam_size_4 9.58 best for eval_ihm
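The WER reported above is (insertions + deletions + substitutions) divided by the number of reference words. A quick arithmetic check using the counts from the eval_ihm line:

# Worked check of the eval_ihm result: WER = (ins + del + sub) / ref_words.
ins, dels, subs = 935, 1663, 5173
ref_words = 81111
errors = ins + dels + subs                         # 7771, matching the log
print(f"%WER {100.0 * errors / ref_words:.2f}%")   # -> %WER 9.58%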
2022-12-09 00:27:42,959 INFO [decode.py:687] Decoding test_ihm
2022-12-09 00:27:48,952 INFO [decode.py:463] batch 0/?, cuts processed until now is 49
2022-12-09 00:27:54,771 INFO [decode.py:463] batch 2/?, cuts processed until now is 433
2022-12-09 00:28:03,112 INFO [decode.py:463] batch 4/?, cuts processed until now is 545
2022-12-09 00:28:11,577 INFO [decode.py:463] batch 6/?, cuts processed until now is 637
2022-12-09 00:28:20,309 INFO [decode.py:463] batch 8/?, cuts processed until now is 754
2022-12-09 00:28:29,065 INFO [decode.py:463] batch 10/?, cuts processed until now is 845
2022-12-09 00:28:36,939 INFO [decode.py:463] batch 12/?, cuts processed until now is 976
2022-12-09 00:28:44,236 INFO [decode.py:463] batch 14/?, cuts processed until now is 1175
2022-12-09 00:28:50,384 INFO [decode.py:463] batch 16/?, cuts processed until now is 1483
2022-12-09 00:28:58,746 INFO [decode.py:463] batch 18/?, cuts processed until now is 1590
2022-12-09 00:29:07,677 INFO [decode.py:463] batch 20/?, cuts processed until now is 1658
2022-12-09 00:29:07,960 INFO [zipformer.py:1414] attn_weights_entropy = tensor([3.1712, 2.8828, 2.9930, 2.1980, 2.5329, 3.0201, 3.0163, 2.7156],
device='cuda:0'), covar=tensor([0.0848, 0.1539, 0.1073, 0.1706, 0.1543, 0.0510, 0.1103, 0.1445],
device='cuda:0'), in_proj_covar=tensor([0.0124, 0.0170, 0.0124, 0.0117, 0.0121, 0.0128, 0.0106, 0.0128],
device='cuda:0'), out_proj_covar=tensor([0.0005, 0.0006, 0.0005, 0.0005, 0.0005, 0.0005, 0.0005, 0.0005],
device='cuda:0')
2022-12-09 00:29:14,837 INFO [decode.py:463] batch 22/?, cuts processed until now is 1856
2022-12-09 00:29:20,542 INFO [decode.py:463] batch 24/?, cuts processed until now is 2224
2022-12-09 00:29:28,891 INFO [decode.py:463] batch 26/?, cuts processed until now is 2325
2022-12-09 00:29:36,342 INFO [decode.py:463] batch 28/?, cuts processed until now is 2546
2022-12-09 00:29:44,445 INFO [decode.py:463] batch 30/?, cuts processed until now is 2653
2022-12-09 00:29:53,058 INFO [decode.py:463] batch 32/?, cuts processed until now is 2744
2022-12-09 00:30:01,060 INFO [decode.py:463] batch 34/?, cuts processed until now is 2875
2022-12-09 00:30:05,796 INFO [zipformer.py:1414] attn_weights_entropy = tensor([3.4639, 3.9804, 3.8866, 4.0083, 2.9221, 3.9239, 3.5020, 2.1089],
device='cuda:0'), covar=tensor([0.2868, 0.1054, 0.1046, 0.0689, 0.1069, 0.0660, 0.1511, 0.3060],
device='cuda:0'), in_proj_covar=tensor([0.0138, 0.0066, 0.0052, 0.0054, 0.0082, 0.0064, 0.0085, 0.0091],
device='cuda:0'), out_proj_covar=tensor([0.0007, 0.0004, 0.0004, 0.0004, 0.0005, 0.0004, 0.0005, 0.0005],
device='cuda:0')
2022-12-09 00:30:10,496 INFO [decode.py:463] batch 36/?, cuts processed until now is 2961
2022-12-09 00:30:18,741 INFO [decode.py:463] batch 38/?, cuts processed until now is 3072
2022-12-09 00:30:25,005 INFO [decode.py:463] batch 40/?, cuts processed until now is 3440
2022-12-09 00:30:29,753 INFO [decode.py:463] batch 42/?, cuts processed until now is 3956
2022-12-09 00:30:35,471 INFO [decode.py:463] batch 44/?, cuts processed until now is 4342
2022-12-09 00:30:43,655 INFO [decode.py:463] batch 46/?, cuts processed until now is 4443
2022-12-09 00:30:51,033 INFO [decode.py:463] batch 48/?, cuts processed until now is 4595
2022-12-09 00:30:57,408 INFO [decode.py:463] batch 50/?, cuts processed until now is 4872
2022-12-09 00:31:04,365 INFO [decode.py:463] batch 52/?, cuts processed until now is 5061
2022-12-09 00:31:12,188 INFO [decode.py:463] batch 54/?, cuts processed until now is 5219
2022-12-09 00:31:15,526 INFO [decode.py:463] batch 56/?, cuts processed until now is 5892
2022-12-09 00:31:22,589 INFO [decode.py:463] batch 58/?, cuts processed until now is 6090
2022-12-09 00:31:27,553 INFO [decode.py:463] batch 60/?, cuts processed until now is 6517
2022-12-09 00:31:34,231 INFO [decode.py:463] batch 62/?, cuts processed until now is 6715
2022-12-09 00:31:41,261 INFO [decode.py:463] batch 64/?, cuts processed until now is 6897
2022-12-09 00:31:46,410 INFO [decode.py:463] batch 66/?, cuts processed until now is 7304
2022-12-09 00:31:53,372 INFO [decode.py:463] batch 68/?, cuts processed until now is 7488
2022-12-09 00:32:00,322 INFO [decode.py:463] batch 70/?, cuts processed until now is 7720
2022-12-09 00:32:06,936 INFO [decode.py:463] batch 72/?, cuts processed until now is 7938
2022-12-09 00:32:12,023 INFO [decode.py:463] batch 74/?, cuts processed until now is 8367
2022-12-09 00:32:17,835 INFO [decode.py:463] batch 76/?, cuts processed until now is 8674
2022-12-09 00:32:23,809 INFO [decode.py:463] batch 78/?, cuts processed until now is 8982
2022-12-09 00:32:29,406 INFO [decode.py:463] batch 80/?, cuts processed until now is 9350
2022-12-09 00:32:34,331 INFO [decode.py:463] batch 82/?, cuts processed until now is 9810
2022-12-09 00:32:39,302 INFO [decode.py:463] batch 84/?, cuts processed until now is 10237
2022-12-09 00:32:43,638 INFO [decode.py:463] batch 86/?, cuts processed until now is 10755
2022-12-09 00:32:48,155 INFO [decode.py:463] batch 88/?, cuts processed until now is 11278
2022-12-09 00:32:53,182 INFO [decode.py:463] batch 90/?, cuts processed until now is 11705
2022-12-09 00:32:59,271 INFO [decode.py:463] batch 92/?, cuts processed until now is 12013
2022-12-09 00:33:05,419 INFO [decode.py:463] batch 94/?, cuts processed until now is 12290
2022-12-09 00:33:11,469 INFO [decode.py:463] batch 96/?, cuts processed until now is 12597
2022-12-09 00:33:17,185 INFO [decode.py:463] batch 98/?, cuts processed until now is 12963
2022-12-09 00:33:21,943 INFO [decode.py:463] batch 100/?, cuts processed until now is 13420
2022-12-09 00:33:27,380 INFO [decode.py:463] batch 102/?, cuts processed until now is 13877
2022-12-09 00:33:32,013 INFO [decode.py:463] batch 104/?, cuts processed until now is 14543
2022-12-09 00:33:36,461 INFO [decode.py:463] batch 106/?, cuts processed until now is 15209
2022-12-09 00:33:38,950 INFO [zipformer.py:1414] attn_weights_entropy = tensor([2.1245, 1.9127, 2.2915, 1.8888, 2.0640, 1.1755, 2.0946, 2.0218],
device='cuda:0'), covar=tensor([0.0894, 0.1076, 0.0611, 0.0984, 0.1650, 0.0887, 0.0727, 0.1079],
device='cuda:0'), in_proj_covar=tensor([0.0022, 0.0023, 0.0024, 0.0022, 0.0023, 0.0034, 0.0023, 0.0024],
device='cuda:0'), out_proj_covar=tensor([0.0001, 0.0001, 0.0001, 0.0001, 0.0001, 0.0002, 0.0001, 0.0001],
device='cuda:0')
2022-12-09 00:33:39,036 INFO [zipformer.py:1414] attn_weights_entropy = tensor([4.1049, 2.3319, 3.7990, 4.0203, 3.6045, 2.4305, 4.0939, 2.8954],
device='cuda:0'), covar=tensor([0.0258, 0.0949, 0.0516, 0.0229, 0.0445, 0.1350, 0.0229, 0.0869],
device='cuda:0'), in_proj_covar=tensor([0.0258, 0.0233, 0.0347, 0.0293, 0.0235, 0.0280, 0.0265, 0.0260],
device='cuda:0'), out_proj_covar=tensor([0.0002, 0.0002, 0.0003, 0.0003, 0.0002, 0.0003, 0.0003, 0.0002],
device='cuda:0')
2022-12-09 00:33:41,684 INFO [decode.py:463] batch 108/?, cuts processed until now is 15599
2022-12-09 00:33:45,094 INFO [decode.py:463] batch 110/?, cuts processed until now is 15787
2022-12-09 00:33:48,336 INFO [decode.py:463] batch 112/?, cuts processed until now is 15881
2022-12-09 00:33:53,053 INFO [decode.py:463] batch 114/?, cuts processed until now is 15926
2022-12-09 00:33:55,603 INFO [decode.py:463] batch 116/?, cuts processed until now is 16287
2022-12-09 00:34:01,766 INFO [decode.py:463] batch 118/?, cuts processed until now is 16357
2022-12-09 00:34:02,021 INFO [decode.py:479] The transcripts are stored in pruned_transducer_stateless7/exp/v1/modified_beam_search/recogs-test_ihm-beam_size_4-epoch-15-avg-8-modified_beam_search-beam-size-4.txt
2022-12-09 00:34:02,301 INFO [utils.py:536] [test_ihm-beam_size_4] %WER 11.53% [24194 / 209845, 2327 ins, 6065 del, 15802 sub ]
2022-12-09 00:34:02,937 INFO [decode.py:492] Wrote detailed error stats to pruned_transducer_stateless7/exp/v1/modified_beam_search/errs-test_ihm-beam_size_4-epoch-15-avg-8-modified_beam_search-beam-size-4.txt
2022-12-09 00:34:02,939 INFO [decode.py:508]
For test_ihm, WER of different settings are:
beam_size_4 11.53 best for test_ihm
2022-12-09 00:34:02,939 INFO [decode.py:687] Decoding eval_sdm
2022-12-09 00:34:08,817 INFO [decode.py:463] batch 0/?, cuts processed until now is 58
2022-12-09 00:34:14,230 INFO [decode.py:463] batch 2/?, cuts processed until now is 512
2022-12-09 00:34:14,530 INFO [zipformer.py:1414] attn_weights_entropy = tensor([3.3616, 3.0196, 3.0734, 2.1451, 2.7260, 3.2309, 3.1917, 2.7306],
device='cuda:0'), covar=tensor([0.0816, 0.1592, 0.1031, 0.1734, 0.1312, 0.0629, 0.0811, 0.1497],
device='cuda:0'), in_proj_covar=tensor([0.0124, 0.0170, 0.0124, 0.0117, 0.0121, 0.0128, 0.0106, 0.0128],
device='cuda:0'), out_proj_covar=tensor([0.0005, 0.0006, 0.0005, 0.0005, 0.0005, 0.0005, 0.0005, 0.0005],
device='cuda:0')
2022-12-09 00:34:22,052 INFO [decode.py:463] batch 4/?, cuts processed until now is 645
2022-12-09 00:34:30,387 INFO [decode.py:463] batch 6/?, cuts processed until now is 750
2022-12-09 00:34:38,010 INFO [decode.py:463] batch 8/?, cuts processed until now is 883
2022-12-09 00:34:45,164 INFO [decode.py:463] batch 10/?, cuts processed until now is 1082
2022-12-09 00:34:52,858 INFO [decode.py:463] batch 12/?, cuts processed until now is 1279
2022-12-09 00:34:59,493 INFO [decode.py:463] batch 14/?, cuts processed until now is 1538
2022-12-09 00:35:06,187 INFO [decode.py:463] batch 16/?, cuts processed until now is 1845
2022-12-09 00:35:13,032 INFO [decode.py:463] batch 18/?, cuts processed until now is 2084
2022-12-09 00:35:18,873 INFO [decode.py:463] batch 20/?, cuts processed until now is 2523
2022-12-09 00:35:25,017 INFO [decode.py:463] batch 22/?, cuts processed until now is 2949
2022-12-09 00:35:32,181 INFO [decode.py:463] batch 24/?, cuts processed until now is 3160
2022-12-09 00:35:38,571 INFO [decode.py:463] batch 26/?, cuts processed until now is 3586
2022-12-09 00:35:45,999 INFO [decode.py:463] batch 28/?, cuts processed until now is 3758
2022-12-09 00:35:52,230 INFO [decode.py:463] batch 30/?, cuts processed until now is 4116
2022-12-09 00:35:56,931 INFO [decode.py:463] batch 32/?, cuts processed until now is 4742
2022-12-09 00:36:01,998 INFO [decode.py:463] batch 34/?, cuts processed until now is 5368
2022-12-09 00:36:08,024 INFO [decode.py:463] batch 36/?, cuts processed until now is 5796
2022-12-09 00:36:09,364 INFO [decode.py:463] batch 38/?, cuts processed until now is 5908
2022-12-09 00:36:16,584 INFO [decode.py:463] batch 40/?, cuts processed until now is 6026
2022-12-09 00:36:21,870 INFO [decode.py:463] batch 42/?, cuts processed until now is 6171
2022-12-09 00:36:28,727 INFO [decode.py:463] batch 44/?, cuts processed until now is 6390
2022-12-09 00:36:32,833 INFO [decode.py:463] batch 46/?, cuts processed until now is 6456
2022-12-09 00:36:33,291 INFO [decode.py:479] The transcripts are stored in pruned_transducer_stateless7/exp/v1/modified_beam_search/recogs-eval_sdm-beam_size_4-epoch-15-avg-8-modified_beam_search-beam-size-4.txt
2022-12-09 00:36:33,393 INFO [utils.py:536] [eval_sdm-beam_size_4] %WER 23.37% [18953 / 81111, 2202 ins, 4566 del, 12185 sub ]
2022-12-09 00:36:33,653 INFO [decode.py:492] Wrote detailed error stats to pruned_transducer_stateless7/exp/v1/modified_beam_search/errs-eval_sdm-beam_size_4-epoch-15-avg-8-modified_beam_search-beam-size-4.txt
2022-12-09 00:36:33,654 INFO [decode.py:508]
For eval_sdm, WER of different settings are:
beam_size_4 23.37 best for eval_sdm
2022-12-09 00:36:33,654 INFO [decode.py:687] Decoding test_sdm
2022-12-09 00:36:39,860 INFO [decode.py:463] batch 0/?, cuts processed until now is 49
2022-12-09 00:36:45,565 INFO [decode.py:463] batch 2/?, cuts processed until now is 433
2022-12-09 00:36:53,725 INFO [decode.py:463] batch 4/?, cuts processed until now is 545
2022-12-09 00:37:02,310 INFO [decode.py:463] batch 6/?, cuts processed until now is 637
2022-12-09 00:37:10,388 INFO [decode.py:463] batch 8/?, cuts processed until now is 754
2022-12-09 00:37:18,827 INFO [decode.py:463] batch 10/?, cuts processed until now is 845
2022-12-09 00:37:26,764 INFO [decode.py:463] batch 12/?, cuts processed until now is 976
2022-12-09 00:37:33,846 INFO [decode.py:463] batch 14/?, cuts processed until now is 1175
2022-12-09 00:37:39,823 INFO [decode.py:463] batch 16/?, cuts processed until now is 1483
2022-12-09 00:37:48,201 INFO [decode.py:463] batch 18/?, cuts processed until now is 1590
2022-12-09 00:37:57,103 INFO [decode.py:463] batch 20/?, cuts processed until now is 1658
2022-12-09 00:38:04,045 INFO [decode.py:463] batch 22/?, cuts processed until now is 1856
2022-12-09 00:38:09,733 INFO [decode.py:463] batch 24/?, cuts processed until now is 2224
2022-12-09 00:38:17,904 INFO [decode.py:463] batch 26/?, cuts processed until now is 2325
2022-12-09 00:38:25,264 INFO [decode.py:463] batch 28/?, cuts processed until now is 2546
2022-12-09 00:38:33,413 INFO [decode.py:463] batch 30/?, cuts processed until now is 2653
2022-12-09 00:38:41,809 INFO [decode.py:463] batch 32/?, cuts processed until now is 2744
2022-12-09 00:38:49,880 INFO [decode.py:463] batch 34/?, cuts processed until now is 2875
2022-12-09 00:38:58,512 INFO [decode.py:463] batch 36/?, cuts processed until now is 2961
2022-12-09 00:39:06,612 INFO [decode.py:463] batch 38/?, cuts processed until now is 3072
2022-12-09 00:39:12,803 INFO [decode.py:463] batch 40/?, cuts processed until now is 3440
2022-12-09 00:39:17,555 INFO [decode.py:463] batch 42/?, cuts processed until now is 3956
2022-12-09 00:39:23,290 INFO [decode.py:463] batch 44/?, cuts processed until now is 4342
2022-12-09 00:39:31,656 INFO [decode.py:463] batch 46/?, cuts processed until now is 4443
2022-12-09 00:39:39,129 INFO [decode.py:463] batch 48/?, cuts processed until now is 4595
2022-12-09 00:39:45,308 INFO [decode.py:463] batch 50/?, cuts processed until now is 4872
2022-12-09 00:39:52,345 INFO [decode.py:463] batch 52/?, cuts processed until now is 5061
2022-12-09 00:39:56,217 INFO [zipformer.py:1414] attn_weights_entropy = tensor([1.6000, 1.4851, 1.5670, 1.7876, 2.0592, 2.2136, 2.2064, 2.2146],
device='cuda:0'), covar=tensor([0.0835, 0.1291, 0.0534, 0.0907, 0.0421, 0.0452, 0.0473, 0.0667],
device='cuda:0'), in_proj_covar=tensor([0.0013, 0.0014, 0.0012, 0.0013, 0.0013, 0.0022, 0.0018, 0.0023],
device='cuda:0'), out_proj_covar=tensor([1.0210e-04, 1.1124e-04, 9.6061e-05, 1.0387e-04, 1.0128e-04, 1.6022e-04,
1.3170e-04, 1.5378e-04], device='cuda:0')
2022-12-09 00:40:00,577 INFO [decode.py:463] batch 54/?, cuts processed until now is 5219
2022-12-09 00:40:03,868 INFO [decode.py:463] batch 56/?, cuts processed until now is 5892
2022-12-09 00:40:10,729 INFO [decode.py:463] batch 58/?, cuts processed until now is 6090
2022-12-09 00:40:15,737 INFO [decode.py:463] batch 60/?, cuts processed until now is 6517
2022-12-09 00:40:22,725 INFO [decode.py:463] batch 62/?, cuts processed until now is 6715
2022-12-09 00:40:29,584 INFO [decode.py:463] batch 64/?, cuts processed until now is 6897
2022-12-09 00:40:34,840 INFO [decode.py:463] batch 66/?, cuts processed until now is 7304
2022-12-09 00:40:38,166 INFO [zipformer.py:1414] attn_weights_entropy = tensor([5.0007, 4.7579, 4.4578, 4.4537, 4.7314, 4.8085, 5.0901, 4.9900],
device='cuda:0'), covar=tensor([0.0520, 0.0275, 0.1399, 0.3184, 0.0407, 0.0582, 0.0450, 0.0589],
device='cuda:0'), in_proj_covar=tensor([0.0334, 0.0221, 0.0402, 0.0522, 0.0289, 0.0383, 0.0353, 0.0328],
device='cuda:0'), out_proj_covar=tensor([0.0004, 0.0002, 0.0004, 0.0005, 0.0003, 0.0004, 0.0004, 0.0003],
device='cuda:0')
2022-12-09 00:40:41,753 INFO [decode.py:463] batch 68/?, cuts processed until now is 7488
2022-12-09 00:40:45,163 INFO [zipformer.py:1414] attn_weights_entropy = tensor([4.8467, 2.7370, 4.7665, 3.2192, 4.6985, 2.2952, 3.7403, 4.4403],
device='cuda:0'), covar=tensor([0.0689, 0.4731, 0.0564, 0.8768, 0.0368, 0.4544, 0.1277, 0.0518],
device='cuda:0'), in_proj_covar=tensor([0.0221, 0.0205, 0.0174, 0.0283, 0.0196, 0.0207, 0.0197, 0.0178],
device='cuda:0'), out_proj_covar=tensor([0.0004, 0.0004, 0.0003, 0.0005, 0.0004, 0.0004, 0.0004, 0.0004],
device='cuda:0')
2022-12-09 00:40:48,909 INFO [decode.py:463] batch 70/?, cuts processed until now is 7720
2022-12-09 00:40:55,528 INFO [decode.py:463] batch 72/?, cuts processed until now is 7938
2022-12-09 00:41:00,650 INFO [decode.py:463] batch 74/?, cuts processed until now is 8367
2022-12-09 00:41:06,497 INFO [decode.py:463] batch 76/?, cuts processed until now is 8674
2022-12-09 00:41:12,660 INFO [decode.py:463] batch 78/?, cuts processed until now is 8982
2022-12-09 00:41:18,265 INFO [decode.py:463] batch 80/?, cuts processed until now is 9350
2022-12-09 00:41:23,150 INFO [decode.py:463] batch 82/?, cuts processed until now is 9810
2022-12-09 00:41:28,140 INFO [decode.py:463] batch 84/?, cuts processed until now is 10237
2022-12-09 00:41:32,828 INFO [decode.py:463] batch 86/?, cuts processed until now is 10755
2022-12-09 00:41:37,127 INFO [decode.py:463] batch 88/?, cuts processed until now is 11278
2022-12-09 00:41:42,072 INFO [decode.py:463] batch 90/?, cuts processed until now is 11705
2022-12-09 00:41:48,165 INFO [decode.py:463] batch 92/?, cuts processed until now is 12013
2022-12-09 00:41:54,603 INFO [decode.py:463] batch 94/?, cuts processed until now is 12290
2022-12-09 00:42:00,600 INFO [decode.py:463] batch 96/?, cuts processed until now is 12597
2022-12-09 00:42:06,239 INFO [decode.py:463] batch 98/?, cuts processed until now is 12963
2022-12-09 00:42:10,927 INFO [decode.py:463] batch 100/?, cuts processed until now is 13420
2022-12-09 00:42:16,548 INFO [decode.py:463] batch 102/?, cuts processed until now is 13877
2022-12-09 00:42:21,141 INFO [decode.py:463] batch 104/?, cuts processed until now is 14543
2022-12-09 00:42:25,549 INFO [decode.py:463] batch 106/?, cuts processed until now is 15209
2022-12-09 00:42:30,704 INFO [decode.py:463] batch 108/?, cuts processed until now is 15599
2022-12-09 00:42:34,228 INFO [decode.py:463] batch 110/?, cuts processed until now is 15787
2022-12-09 00:42:37,414 INFO [decode.py:463] batch 112/?, cuts processed until now is 15881
2022-12-09 00:42:42,239 INFO [decode.py:463] batch 114/?, cuts processed until now is 15926
2022-12-09 00:42:44,787 INFO [decode.py:463] batch 116/?, cuts processed until now is 16287
2022-12-09 00:42:50,934 INFO [decode.py:463] batch 118/?, cuts processed until now is 16357
2022-12-09 00:42:51,160 INFO [decode.py:479] The transcripts are stored in pruned_transducer_stateless7/exp/v1/modified_beam_search/recogs-test_sdm-beam_size_4-epoch-15-avg-8-modified_beam_search-beam-size-4.txt
2022-12-09 00:42:51,439 INFO [utils.py:536] [test_sdm-beam_size_4] %WER 25.85% [54237 / 209845, 5655 ins, 15315 del, 33267 sub ]
2022-12-09 00:42:52,114 INFO [decode.py:492] Wrote detailed error stats to pruned_transducer_stateless7/exp/v1/modified_beam_search/errs-test_sdm-beam_size_4-epoch-15-avg-8-modified_beam_search-beam-size-4.txt
2022-12-09 00:42:52,131 INFO [decode.py:508]
For test_sdm, WER of different settings are:
beam_size_4 25.85 best for test_sdm
2022-12-09 00:42:52,131 INFO [decode.py:687] Decoding eval_gss
2022-12-09 00:42:57,705 INFO [decode.py:463] batch 0/?, cuts processed until now is 58
2022-12-09 00:43:02,998 INFO [decode.py:463] batch 2/?, cuts processed until now is 512
2022-12-09 00:43:10,732 INFO [decode.py:463] batch 4/?, cuts processed until now is 645
2022-12-09 00:43:18,974 INFO [decode.py:463] batch 6/?, cuts processed until now is 750
2022-12-09 00:43:26,729 INFO [decode.py:463] batch 8/?, cuts processed until now is 883
2022-12-09 00:43:34,520 INFO [decode.py:463] batch 10/?, cuts processed until now is 1082
2022-12-09 00:43:42,187 INFO [decode.py:463] batch 12/?, cuts processed until now is 1279
2022-12-09 00:43:48,868 INFO [decode.py:463] batch 14/?, cuts processed until now is 1538
2022-12-09 00:43:55,552 INFO [decode.py:463] batch 16/?, cuts processed until now is 1845
2022-12-09 00:44:02,697 INFO [decode.py:463] batch 18/?, cuts processed until now is 2084
2022-12-09 00:44:08,512 INFO [decode.py:463] batch 20/?, cuts processed until now is 2523
2022-12-09 00:44:14,694 INFO [decode.py:463] batch 22/?, cuts processed until now is 2949
2022-12-09 00:44:21,839 INFO [decode.py:463] batch 24/?, cuts processed until now is 3160
2022-12-09 00:44:28,126 INFO [decode.py:463] batch 26/?, cuts processed until now is 3586
2022-12-09 00:44:35,600 INFO [decode.py:463] batch 28/?, cuts processed until now is 3758
2022-12-09 00:44:41,779 INFO [decode.py:463] batch 30/?, cuts processed until now is 4116
2022-12-09 00:44:46,367 INFO [decode.py:463] batch 32/?, cuts processed until now is 4742
2022-12-09 00:44:51,420 INFO [decode.py:463] batch 34/?, cuts processed until now is 5368
2022-12-09 00:44:57,453 INFO [decode.py:463] batch 36/?, cuts processed until now is 5796
2022-12-09 00:44:58,784 INFO [decode.py:463] batch 38/?, cuts processed until now is 5908
2022-12-09 00:45:06,648 INFO [decode.py:463] batch 40/?, cuts processed until now is 6026
2022-12-09 00:45:11,801 INFO [decode.py:463] batch 42/?, cuts processed until now is 6171
2022-12-09 00:45:18,587 INFO [decode.py:463] batch 44/?, cuts processed until now is 6390
2022-12-09 00:45:22,679 INFO [decode.py:463] batch 46/?, cuts processed until now is 6456
2022-12-09 00:45:23,140 INFO [decode.py:479] The transcripts are stored in pruned_transducer_stateless7/exp/v1/modified_beam_search/recogs-eval_gss-beam_size_4-epoch-15-avg-8-modified_beam_search-beam-size-4.txt
2022-12-09 00:45:23,237 INFO [utils.py:536] [eval_gss-beam_size_4] %WER 11.82% [9586 / 81111, 1083 ins, 2070 del, 6433 sub ]
2022-12-09 00:45:23,529 INFO [decode.py:492] Wrote detailed error stats to pruned_transducer_stateless7/exp/v1/modified_beam_search/errs-eval_gss-beam_size_4-epoch-15-avg-8-modified_beam_search-beam-size-4.txt
2022-12-09 00:45:23,530 INFO [decode.py:508]
For eval_gss, WER of different settings are:
beam_size_4 11.82 best for eval_gss
2022-12-09 00:45:23,530 INFO [decode.py:687] Decoding test_gss
2022-12-09 00:45:29,371 INFO [decode.py:463] batch 0/?, cuts processed until now is 49
2022-12-09 00:45:35,151 INFO [decode.py:463] batch 2/?, cuts processed until now is 433
2022-12-09 00:45:43,327 INFO [decode.py:463] batch 4/?, cuts processed until now is 545
2022-12-09 00:45:51,796 INFO [decode.py:463] batch 6/?, cuts processed until now is 637
2022-12-09 00:45:59,790 INFO [decode.py:463] batch 8/?, cuts processed until now is 754
2022-12-09 00:46:08,420 INFO [decode.py:463] batch 10/?, cuts processed until now is 845
2022-12-09 00:46:16,376 INFO [decode.py:463] batch 12/?, cuts processed until now is 976
2022-12-09 00:46:23,166 INFO [decode.py:463] batch 14/?, cuts processed until now is 1175
2022-12-09 00:46:29,309 INFO [decode.py:463] batch 16/?, cuts processed until now is 1483
2022-12-09 00:46:37,642 INFO [decode.py:463] batch 18/?, cuts processed until now is 1590
2022-12-09 00:46:46,533 INFO [decode.py:463] batch 20/?, cuts processed until now is 1658
2022-12-09 00:46:53,462 INFO [decode.py:463] batch 22/?, cuts processed until now is 1856
2022-12-09 00:46:59,089 INFO [decode.py:463] batch 24/?, cuts processed until now is 2224
2022-12-09 00:47:07,379 INFO [decode.py:463] batch 26/?, cuts processed until now is 2325
2022-12-09 00:47:14,709 INFO [decode.py:463] batch 28/?, cuts processed until now is 2546
2022-12-09 00:47:22,610 INFO [decode.py:463] batch 30/?, cuts processed until now is 2653
2022-12-09 00:47:31,118 INFO [decode.py:463] batch 32/?, cuts processed until now is 2744
2022-12-09 00:47:39,205 INFO [decode.py:463] batch 34/?, cuts processed until now is 2875
2022-12-09 00:47:47,684 INFO [decode.py:463] batch 36/?, cuts processed until now is 2961
2022-12-09 00:47:55,780 INFO [decode.py:463] batch 38/?, cuts processed until now is 3072
2022-12-09 00:48:01,964 INFO [decode.py:463] batch 40/?, cuts processed until now is 3440
2022-12-09 00:48:06,693 INFO [decode.py:463] batch 42/?, cuts processed until now is 3956
2022-12-09 00:48:12,503 INFO [decode.py:463] batch 44/?, cuts processed until now is 4342
2022-12-09 00:48:20,529 INFO [decode.py:463] batch 46/?, cuts processed until now is 4443
2022-12-09 00:48:27,926 INFO [decode.py:463] batch 48/?, cuts processed until now is 4595
2022-12-09 00:48:34,365 INFO [decode.py:463] batch 50/?, cuts processed until now is 4872
2022-12-09 00:48:41,248 INFO [decode.py:463] batch 52/?, cuts processed until now is 5061
2022-12-09 00:48:49,108 INFO [decode.py:463] batch 54/?, cuts processed until now is 5219
2022-12-09 00:48:49,354 INFO [zipformer.py:1414] attn_weights_entropy = tensor([1.7659, 1.8372, 1.8194, 1.5972, 1.5217, 1.0486, 1.1054, 0.9389],
device='cuda:0'), covar=tensor([0.0083, 0.0055, 0.0058, 0.0054, 0.0177, 0.0286, 0.0174, 0.0357],
device='cuda:0'), in_proj_covar=tensor([0.0013, 0.0014, 0.0012, 0.0013, 0.0013, 0.0022, 0.0018, 0.0023],
device='cuda:0'), out_proj_covar=tensor([1.0210e-04, 1.1124e-04, 9.6061e-05, 1.0387e-04, 1.0128e-04, 1.6022e-04,
1.3170e-04, 1.5378e-04], device='cuda:0')
2022-12-09 00:48:52,461 INFO [decode.py:463] batch 56/?, cuts processed until now is 5892
2022-12-09 00:48:59,524 INFO [decode.py:463] batch 58/?, cuts processed until now is 6090
2022-12-09 00:49:04,540 INFO [decode.py:463] batch 60/?, cuts processed until now is 6517
2022-12-09 00:49:11,300 INFO [decode.py:463] batch 62/?, cuts processed until now is 6715
2022-12-09 00:49:18,348 INFO [decode.py:463] batch 64/?, cuts processed until now is 6897
2022-12-09 00:49:23,684 INFO [decode.py:463] batch 66/?, cuts processed until now is 7304
2022-12-09 00:49:30,622 INFO [decode.py:463] batch 68/?, cuts processed until now is 7488
2022-12-09 00:49:37,553 INFO [decode.py:463] batch 70/?, cuts processed until now is 7720
2022-12-09 00:49:44,162 INFO [decode.py:463] batch 72/?, cuts processed until now is 7938
2022-12-09 00:49:49,325 INFO [decode.py:463] batch 74/?, cuts processed until now is 8367
2022-12-09 00:49:55,304 INFO [decode.py:463] batch 76/?, cuts processed until now is 8674
2022-12-09 00:50:01,418 INFO [decode.py:463] batch 78/?, cuts processed until now is 8982
2022-12-09 00:50:06,983 INFO [decode.py:463] batch 80/?, cuts processed until now is 9350
2022-12-09 00:50:12,257 INFO [decode.py:463] batch 82/?, cuts processed until now is 9810
2022-12-09 00:50:17,564 INFO [decode.py:463] batch 84/?, cuts processed until now is 10237
2022-12-09 00:50:22,238 INFO [decode.py:463] batch 86/?, cuts processed until now is 10755
2022-12-09 00:50:26,646 INFO [decode.py:463] batch 88/?, cuts processed until now is 11278
2022-12-09 00:50:31,792 INFO [decode.py:463] batch 90/?, cuts processed until now is 11705
2022-12-09 00:50:37,988 INFO [decode.py:463] batch 92/?, cuts processed until now is 12013
2022-12-09 00:50:44,298 INFO [decode.py:463] batch 94/?, cuts processed until now is 12290
2022-12-09 00:50:50,344 INFO [decode.py:463] batch 96/?, cuts processed until now is 12597
2022-12-09 00:50:56,139 INFO [decode.py:463] batch 98/?, cuts processed until now is 12963
2022-12-09 00:51:00,984 INFO [decode.py:463] batch 100/?, cuts processed until now is 13420
2022-12-09 00:51:06,482 INFO [decode.py:463] batch 102/?, cuts processed until now is 13877
2022-12-09 00:51:11,124 INFO [decode.py:463] batch 104/?, cuts processed until now is 14543
2022-12-09 00:51:15,561 INFO [decode.py:463] batch 106/?, cuts processed until now is 15209
2022-12-09 00:51:20,691 INFO [decode.py:463] batch 108/?, cuts processed until now is 15599
2022-12-09 00:51:24,220 INFO [decode.py:463] batch 110/?, cuts processed until now is 15787
2022-12-09 00:51:27,406 INFO [decode.py:463] batch 112/?, cuts processed until now is 15881
2022-12-09 00:51:32,122 INFO [decode.py:463] batch 114/?, cuts processed until now is 15926
2022-12-09 00:51:34,657 INFO [decode.py:463] batch 116/?, cuts processed until now is 16287
2022-12-09 00:51:40,769 INFO [decode.py:463] batch 118/?, cuts processed until now is 16357
2022-12-09 00:51:41,004 INFO [decode.py:479] The transcripts are stored in pruned_transducer_stateless7/exp/v1/modified_beam_search/recogs-test_gss-beam_size_4-epoch-15-avg-8-modified_beam_search-beam-size-4.txt
2022-12-09 00:51:41,262 INFO [utils.py:536] [test_gss-beam_size_4] %WER 14.22% [29831 / 209845, 2727 ins, 7497 del, 19607 sub ]
2022-12-09 00:51:41,889 INFO [decode.py:492] Wrote detailed error stats to pruned_transducer_stateless7/exp/v1/modified_beam_search/errs-test_gss-beam_size_4-epoch-15-avg-8-modified_beam_search-beam-size-4.txt
2022-12-09 00:51:41,890 INFO [decode.py:508]
For test_gss, WER of different settings are:
beam_size_4 14.22 best for test_gss
2022-12-09 00:51:41,891 INFO [decode.py:703] Done!
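For convenience, the six WERs reported in this log can be collected into one summary; the values below are copied verbatim from the [utils.py:536] lines above (modified_beam_search, beam size 4, epoch 15, avg 8).

# Summary of the WERs reported in this log; values copied from the log itself.
wers = {
    "eval_ihm": 9.58,
    "test_ihm": 11.53,
    "eval_sdm": 23.37,
    "test_sdm": 25.85,
    "eval_gss": 11.82,
    "test_gss": 14.22,
}
for split, wer in wers.items():
    print(f"{split:<10s} {wer:5.2f}")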