2022-12-09 00:04:39,800 INFO [decode.py:551] Decoding started
2022-12-09 00:04:39,801 INFO [decode.py:557] Device: cuda:0
2022-12-09 00:04:39,862 INFO [lexicon.py:168] Loading pre-compiled data/lang_char/Linv.pt
2022-12-09 00:04:39,872 INFO [decode.py:563] {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 100, 'reset_interval': 200, 'valid_interval': 3000, 'feature_dim': 80, 'subsampling_factor': 4, 'warm_step': 2000, 'env_info': {'k2-version': '1.23', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': 'b2ce63f3940018e7b433c43fd802fc50ab006a76', 'k2-git-date': 'Wed Nov 23 08:43:43 2022', 'lhotse-version': '1.9.0.dev+git.97bf4b0.dirty', 'torch-version': '1.10.0+cu102', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'ali_meeting', 'icefall-git-sha1': 'f13cf61-dirty', 'icefall-git-date': 'Tue Dec 6 03:34:27 2022', 'icefall-path': '/exp/draj/mini_scale_2022/icefall', 'k2-path': '/exp/draj/mini_scale_2022/k2/k2/python/k2/__init__.py', 'lhotse-path': '/exp/draj/mini_scale_2022/lhotse/lhotse/__init__.py', 'hostname': 'r7n08', 'IP address': '10.1.7.8'}, 'epoch': 15, 'iter': 0, 'avg': 8, 'use_averaged_model': True, 'exp_dir': PosixPath('pruned_transducer_stateless7/exp/v1'), 'lang_dir': 'data/lang_char', 'decoding_method': 'fast_beam_search', 'beam_size': 4, 'beam': 4, 'ngram_lm_scale': 0.01, 'max_contexts': 4, 'max_states': 8, 'context_size': 2, 'max_sym_per_frame': 1, 'num_paths': 200, 'nbest_scale': 0.5, 'num_encoder_layers': '2,4,3,2,4', 'feedforward_dims': '1024,1024,2048,2048,1024', 'nhead': '8,8,8,8,8', 'encoder_dims': '384,384,384,384,384', 'attention_dims': '192,192,192,192,192', 'encoder_unmasked_dims': '256,256,256,256,256', 'zipformer_downsampling_factors': '1,2,4,8,2', 'cnn_module_kernels': '31,31,31,31,31', 'decoder_dim': 512, 'joiner_dim': 512, 'manifest_dir': PosixPath('data/manifests'), 'enable_musan': True, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'max_duration': 500, 'max_cuts': None, 'num_buckets': 50, 'on_the_fly_feats': False, 'shuffle': True, 'num_workers': 8, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'res_dir': PosixPath('pruned_transducer_stateless7/exp/v1/fast_beam_search'), 'suffix': 'epoch-15-avg-8-beam-4-max-contexts-4-max-states-8', 'blank_id': 0, 'vocab_size': 3290}
2022-12-09 00:04:39,872 INFO [decode.py:565] About to create model
2022-12-09 00:04:40,318 INFO [zipformer.py:179] At encoder stack 4, which has downsampling_factor=2, we will combine the outputs of layers 1 and 3, with downsampling_factors=2 and 8.
2022-12-09 00:04:40,367 INFO [decode.py:632] Calculating the averaged model over epoch range from 7 (excluded) to 15
2022-12-09 00:04:56,153 INFO [decode.py:655] Number of model parameters: 75734561
2022-12-09 00:04:56,153 INFO [asr_datamodule.py:381] About to get AliMeeting IHM eval cuts
2022-12-09 00:04:56,156 INFO [asr_datamodule.py:402] About to get AliMeeting IHM test cuts
2022-12-09 00:04:56,158 INFO [asr_datamodule.py:387] About to get AliMeeting SDM eval cuts
2022-12-09 00:04:56,159 INFO [asr_datamodule.py:408] About to get AliMeeting SDM test cuts
2022-12-09 00:04:56,161 INFO [asr_datamodule.py:396] About to get AliMeeting GSS-enhanced eval cuts
2022-12-09 00:04:56,163 INFO [asr_datamodule.py:417] About to get AliMeeting GSS-enhanced test cuts
2022-12-09 00:04:57,914 INFO [decode.py:687] Decoding eval_ihm
2022-12-09 00:05:00,542 INFO [decode.py:463] batch 0/?, cuts processed until now is 58
2022-12-09 00:05:02,549 INFO [zipformer.py:1414] attn_weights_entropy = tensor([3.0737, 1.5896, 3.0429, 1.9527, 3.1868, 3.0717, 2.1364, 3.2050],
       device='cuda:0'), covar=tensor([0.0130, 0.1163, 0.0235, 0.0949, 0.0167, 0.0218, 0.0828, 0.0144],
       device='cuda:0'), in_proj_covar=tensor([0.0160, 0.0151, 0.0146, 0.0161, 0.0155, 0.0161, 0.0120, 0.0130],
       device='cuda:0'), out_proj_covar=tensor([0.0003, 0.0004, 0.0003, 0.0004, 0.0004, 0.0004, 0.0003, 0.0003],
       device='cuda:0')
2022-12-09 00:05:03,160 INFO [decode.py:463] batch 2/?, cuts processed until now is 512
2022-12-09 00:05:05,612 INFO [decode.py:463] batch 4/?, cuts processed until now is 645
2022-12-09 00:05:08,007 INFO [decode.py:463] batch 6/?, cuts processed until now is 750
2022-12-09 00:05:10,443 INFO [decode.py:463] batch 8/?, cuts processed until now is 883
2022-12-09 00:05:13,132 INFO [decode.py:463] batch 10/?, cuts processed until now is 1082
2022-12-09 00:05:15,399 INFO [decode.py:463] batch 12/?, cuts processed until now is 1279
2022-12-09 00:05:17,544 INFO [decode.py:463] batch 14/?, cuts processed until now is 1538
2022-12-09 00:05:19,807 INFO [decode.py:463] batch 16/?, cuts processed until now is 1845
2022-12-09 00:05:22,396 INFO [decode.py:463] batch 18/?, cuts processed until now is 2084
2022-12-09 00:05:24,347 INFO [decode.py:463] batch 20/?, cuts processed until now is 2523
2022-12-09 00:05:26,412 INFO [decode.py:463] batch 22/?, cuts processed until now is 2949
2022-12-09 00:05:28,474 INFO [decode.py:463] batch 24/?, cuts processed until now is 3160
2022-12-09 00:05:30,832 INFO [decode.py:463] batch 26/?, cuts processed until now is 3586
2022-12-09 00:05:32,989 INFO [decode.py:463] batch 28/?, cuts processed until now is 3758
2022-12-09 00:05:34,898 INFO [decode.py:463] batch 30/?, cuts processed until now is 4116
2022-12-09 00:05:36,464 INFO [decode.py:463] batch 32/?, cuts processed until now is 4742
2022-12-09 00:05:38,170 INFO [decode.py:463] batch 34/?, cuts processed until now is 5368
2022-12-09 00:05:40,030 INFO [decode.py:463] batch 36/?, cuts processed until now is 5796
2022-12-09 00:05:40,607 INFO [decode.py:463] batch 38/?, cuts processed until now is 5908
2022-12-09 00:05:43,001 INFO [decode.py:463] batch 40/?, cuts processed until now is 6026
2022-12-09 00:05:44,872 INFO [decode.py:463] batch 42/?, cuts processed until now is 6171
2022-12-09 00:05:47,165 INFO [decode.py:463] batch 44/?, cuts processed until now is 6390
2022-12-09 00:05:48,661 INFO [decode.py:463] batch 46/?, cuts processed until now is 6456
2022-12-09 00:05:49,043 INFO [decode.py:479] The transcripts are stored in pruned_transducer_stateless7/exp/v1/fast_beam_search/recogs-eval_ihm-beam_4_max_contexts_4_max_states_8-epoch-15-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-12-09 00:05:49,145 INFO [utils.py:536] [eval_ihm-beam_4_max_contexts_4_max_states_8] %WER 9.92% [8049 / 81111, 835 ins, 2146 del, 5068 sub ]
2022-12-09 00:05:49,383 INFO [decode.py:492] Wrote detailed error stats to pruned_transducer_stateless7/exp/v1/fast_beam_search/errs-eval_ihm-beam_4_max_contexts_4_max_states_8-epoch-15-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-12-09 00:05:49,386 INFO [decode.py:508] 
For eval_ihm, WER of different settings are:
beam_4_max_contexts_4_max_states_8	9.92	best for eval_ihm

2022-12-09 00:05:49,386 INFO [decode.py:687] Decoding test_ihm
2022-12-09 00:05:52,047 INFO [decode.py:463] batch 0/?, cuts processed until now is 49
2022-12-09 00:05:54,386 INFO [decode.py:463] batch 2/?, cuts processed until now is 433
2022-12-09 00:05:57,029 INFO [decode.py:463] batch 4/?, cuts processed until now is 545
2022-12-09 00:05:59,525 INFO [decode.py:463] batch 6/?, cuts processed until now is 637
2022-12-09 00:06:02,173 INFO [decode.py:463] batch 8/?, cuts processed until now is 754
2022-12-09 00:06:04,845 INFO [decode.py:463] batch 10/?, cuts processed until now is 845
2022-12-09 00:06:07,294 INFO [decode.py:463] batch 12/?, cuts processed until now is 976
2022-12-09 00:06:09,802 INFO [decode.py:463] batch 14/?, cuts processed until now is 1175
2022-12-09 00:06:11,772 INFO [decode.py:463] batch 16/?, cuts processed until now is 1483
2022-12-09 00:06:14,661 INFO [decode.py:463] batch 18/?, cuts processed until now is 1590
2022-12-09 00:06:17,229 INFO [decode.py:463] batch 20/?, cuts processed until now is 1658
2022-12-09 00:06:19,732 INFO [decode.py:463] batch 22/?, cuts processed until now is 1856
2022-12-09 00:06:21,553 INFO [decode.py:463] batch 24/?, cuts processed until now is 2224
2022-12-09 00:06:24,742 INFO [decode.py:463] batch 26/?, cuts processed until now is 2325
2022-12-09 00:06:26,248 INFO [zipformer.py:1414] attn_weights_entropy = tensor([1.7884, 1.7374, 1.7931, 1.6476, 1.5451, 1.3756, 1.2392, 0.9835],
       device='cuda:0'), covar=tensor([0.0257, 0.0477, 0.0302, 0.0285, 0.0391, 0.0372, 0.0391, 0.0706],
       device='cuda:0'), in_proj_covar=tensor([0.0013, 0.0014, 0.0012, 0.0013, 0.0013, 0.0022, 0.0018, 0.0023],
       device='cuda:0'), out_proj_covar=tensor([1.0210e-04, 1.1124e-04, 9.6061e-05, 1.0387e-04, 1.0128e-04, 1.6022e-04,
        1.3170e-04, 1.5378e-04], device='cuda:0')
2022-12-09 00:06:26,881 INFO [decode.py:463] batch 28/?, cuts processed until now is 2546
2022-12-09 00:06:29,637 INFO [decode.py:463] batch 30/?, cuts processed until now is 2653
2022-12-09 00:06:32,547 INFO [decode.py:463] batch 32/?, cuts processed until now is 2744
2022-12-09 00:06:35,176 INFO [decode.py:463] batch 34/?, cuts processed until now is 2875
2022-12-09 00:06:37,477 INFO [decode.py:463] batch 36/?, cuts processed until now is 2961
2022-12-09 00:06:40,111 INFO [decode.py:463] batch 38/?, cuts processed until now is 3072
2022-12-09 00:06:42,224 INFO [decode.py:463] batch 40/?, cuts processed until now is 3440
2022-12-09 00:06:44,157 INFO [decode.py:463] batch 42/?, cuts processed until now is 3956
2022-12-09 00:06:46,278 INFO [decode.py:463] batch 44/?, cuts processed until now is 4342
2022-12-09 00:06:49,059 INFO [decode.py:463] batch 46/?, cuts processed until now is 4443
2022-12-09 00:06:51,695 INFO [decode.py:463] batch 48/?, cuts processed until now is 4595
2022-12-09 00:06:54,151 INFO [decode.py:463] batch 50/?, cuts processed until now is 4872
2022-12-09 00:06:56,640 INFO [decode.py:463] batch 52/?, cuts processed until now is 5061
2022-12-09 00:06:59,037 INFO [decode.py:463] batch 54/?, cuts processed until now is 5219
2022-12-09 00:06:59,308 INFO [zipformer.py:1414] attn_weights_entropy = tensor([3.0441, 2.9961, 2.5706, 3.2930, 3.2433, 3.2067, 2.9979, 2.6837],
       device='cuda:0'), covar=tensor([0.0658, 0.0330, 0.1699, 0.0248, 0.0410, 0.0486, 0.0532, 0.0554],
       device='cuda:0'), in_proj_covar=tensor([0.0237, 0.0272, 0.0252, 0.0220, 0.0279, 0.0268, 0.0232, 0.0237],
       device='cuda:0'), out_proj_covar=tensor([0.0003, 0.0004, 0.0004, 0.0003, 0.0004, 0.0003, 0.0003, 0.0003],
       device='cuda:0')
2022-12-09 00:07:00,786 INFO [decode.py:463] batch 56/?, cuts processed until now is 5892
2022-12-09 00:07:03,557 INFO [decode.py:463] batch 58/?, cuts processed until now is 6090
2022-12-09 00:07:05,565 INFO [decode.py:463] batch 60/?, cuts processed until now is 6517
2022-12-09 00:07:08,044 INFO [decode.py:463] batch 62/?, cuts processed until now is 6715
2022-12-09 00:07:10,550 INFO [decode.py:463] batch 64/?, cuts processed until now is 6897
2022-12-09 00:07:12,918 INFO [decode.py:463] batch 66/?, cuts processed until now is 7304
2022-12-09 00:07:15,211 INFO [decode.py:463] batch 68/?, cuts processed until now is 7488
2022-12-09 00:07:17,291 INFO [decode.py:463] batch 70/?, cuts processed until now is 7720
2022-12-09 00:07:19,822 INFO [decode.py:463] batch 72/?, cuts processed until now is 7938
2022-12-09 00:07:22,356 INFO [decode.py:463] batch 74/?, cuts processed until now is 8367
2022-12-09 00:07:24,483 INFO [decode.py:463] batch 76/?, cuts processed until now is 8674
2022-12-09 00:07:26,624 INFO [decode.py:463] batch 78/?, cuts processed until now is 8982
2022-12-09 00:07:28,566 INFO [decode.py:463] batch 80/?, cuts processed until now is 9350
2022-12-09 00:07:31,094 INFO [decode.py:463] batch 82/?, cuts processed until now is 9810
2022-12-09 00:07:33,293 INFO [decode.py:463] batch 84/?, cuts processed until now is 10237
2022-12-09 00:07:35,118 INFO [decode.py:463] batch 86/?, cuts processed until now is 10755
2022-12-09 00:07:36,950 INFO [decode.py:463] batch 88/?, cuts processed until now is 11278
2022-12-09 00:07:39,502 INFO [decode.py:463] batch 90/?, cuts processed until now is 11705
2022-12-09 00:07:41,666 INFO [decode.py:463] batch 92/?, cuts processed until now is 12013
2022-12-09 00:07:43,998 INFO [decode.py:463] batch 94/?, cuts processed until now is 12290
2022-12-09 00:07:46,153 INFO [decode.py:463] batch 96/?, cuts processed until now is 12597
2022-12-09 00:07:48,323 INFO [decode.py:463] batch 98/?, cuts processed until now is 12963
2022-12-09 00:07:50,305 INFO [decode.py:463] batch 100/?, cuts processed until now is 13420
2022-12-09 00:07:52,320 INFO [decode.py:463] batch 102/?, cuts processed until now is 13877
2022-12-09 00:07:53,978 INFO [decode.py:463] batch 104/?, cuts processed until now is 14543
2022-12-09 00:07:55,982 INFO [decode.py:463] batch 106/?, cuts processed until now is 15209
2022-12-09 00:07:57,729 INFO [decode.py:463] batch 108/?, cuts processed until now is 15599
2022-12-09 00:07:59,309 INFO [decode.py:463] batch 110/?, cuts processed until now is 15787
2022-12-09 00:08:00,476 INFO [decode.py:463] batch 112/?, cuts processed until now is 15881
2022-12-09 00:08:02,663 INFO [decode.py:463] batch 114/?, cuts processed until now is 15926
2022-12-09 00:08:03,600 INFO [decode.py:463] batch 116/?, cuts processed until now is 16287
2022-12-09 00:08:05,650 INFO [decode.py:463] batch 118/?, cuts processed until now is 16357
2022-12-09 00:08:05,950 INFO [decode.py:479] The transcripts are stored in pruned_transducer_stateless7/exp/v1/fast_beam_search/recogs-test_ihm-beam_4_max_contexts_4_max_states_8-epoch-15-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-12-09 00:08:06,244 INFO [utils.py:536] [test_ihm-beam_4_max_contexts_4_max_states_8] %WER 12.07% [25334 / 209845, 2035 ins, 7940 del, 15359 sub ]
2022-12-09 00:08:06,874 INFO [decode.py:492] Wrote detailed error stats to pruned_transducer_stateless7/exp/v1/fast_beam_search/errs-test_ihm-beam_4_max_contexts_4_max_states_8-epoch-15-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-12-09 00:08:06,875 INFO [decode.py:508] 
For test_ihm, WER of different settings are:
beam_4_max_contexts_4_max_states_8	12.07	best for test_ihm

2022-12-09 00:08:06,876 INFO [decode.py:687] Decoding eval_sdm
2022-12-09 00:08:09,951 INFO [decode.py:463] batch 0/?, cuts processed until now is 58
2022-12-09 00:08:12,839 INFO [decode.py:463] batch 2/?, cuts processed until now is 512
2022-12-09 00:08:15,558 INFO [decode.py:463] batch 4/?, cuts processed until now is 645
2022-12-09 00:08:19,252 INFO [decode.py:463] batch 6/?, cuts processed until now is 750
2022-12-09 00:08:21,932 INFO [decode.py:463] batch 8/?, cuts processed until now is 883
2022-12-09 00:08:24,663 INFO [decode.py:463] batch 10/?, cuts processed until now is 1082
2022-12-09 00:08:26,962 INFO [decode.py:463] batch 12/?, cuts processed until now is 1279
2022-12-09 00:08:29,136 INFO [decode.py:463] batch 14/?, cuts processed until now is 1538
2022-12-09 00:08:31,430 INFO [decode.py:463] batch 16/?, cuts processed until now is 1845
2022-12-09 00:08:34,125 INFO [decode.py:463] batch 18/?, cuts processed until now is 2084
2022-12-09 00:08:36,099 INFO [decode.py:463] batch 20/?, cuts processed until now is 2523
2022-12-09 00:08:38,223 INFO [decode.py:463] batch 22/?, cuts processed until now is 2949
2022-12-09 00:08:40,617 INFO [decode.py:463] batch 24/?, cuts processed until now is 3160
2022-12-09 00:08:42,897 INFO [decode.py:463] batch 26/?, cuts processed until now is 3586
2022-12-09 00:08:45,251 INFO [decode.py:463] batch 28/?, cuts processed until now is 3758
2022-12-09 00:08:47,170 INFO [decode.py:463] batch 30/?, cuts processed until now is 4116
2022-12-09 00:08:48,754 INFO [decode.py:463] batch 32/?, cuts processed until now is 4742
2022-12-09 00:08:50,490 INFO [decode.py:463] batch 34/?, cuts processed until now is 5368
2022-12-09 00:08:52,388 INFO [decode.py:463] batch 36/?, cuts processed until now is 5796
2022-12-09 00:08:52,966 INFO [decode.py:463] batch 38/?, cuts processed until now is 5908
2022-12-09 00:08:55,520 INFO [decode.py:463] batch 40/?, cuts processed until now is 6026
2022-12-09 00:08:57,418 INFO [decode.py:463] batch 42/?, cuts processed until now is 6171
2022-12-09 00:08:59,744 INFO [decode.py:463] batch 44/?, cuts processed until now is 6390
2022-12-09 00:09:01,542 INFO [decode.py:463] batch 46/?, cuts processed until now is 6456
2022-12-09 00:09:01,956 INFO [decode.py:479] The transcripts are stored in pruned_transducer_stateless7/exp/v1/fast_beam_search/recogs-eval_sdm-beam_4_max_contexts_4_max_states_8-epoch-15-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-12-09 00:09:02,058 INFO [utils.py:536] [eval_sdm-beam_4_max_contexts_4_max_states_8] %WER 23.60% [19139 / 81111, 1582 ins, 6591 del, 10966 sub ]
2022-12-09 00:09:02,314 INFO [decode.py:492] Wrote detailed error stats to pruned_transducer_stateless7/exp/v1/fast_beam_search/errs-eval_sdm-beam_4_max_contexts_4_max_states_8-epoch-15-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-12-09 00:09:02,315 INFO [decode.py:508] 
For eval_sdm, WER of different settings are:
beam_4_max_contexts_4_max_states_8	23.6	best for eval_sdm

2022-12-09 00:09:02,315 INFO [decode.py:687] Decoding test_sdm
2022-12-09 00:09:05,263 INFO [decode.py:463] batch 0/?, cuts processed until now is 49
2022-12-09 00:09:08,151 INFO [decode.py:463] batch 2/?, cuts processed until now is 433
2022-12-09 00:09:11,022 INFO [decode.py:463] batch 4/?, cuts processed until now is 545
2022-12-09 00:09:14,432 INFO [decode.py:463] batch 6/?, cuts processed until now is 637
2022-12-09 00:09:18,065 INFO [decode.py:463] batch 8/?, cuts processed until now is 754
2022-12-09 00:09:21,737 INFO [decode.py:463] batch 10/?, cuts processed until now is 845
2022-12-09 00:09:24,463 INFO [decode.py:463] batch 12/?, cuts processed until now is 976
2022-12-09 00:09:27,235 INFO [decode.py:463] batch 14/?, cuts processed until now is 1175
2022-12-09 00:09:29,431 INFO [decode.py:463] batch 16/?, cuts processed until now is 1483
2022-12-09 00:09:32,640 INFO [decode.py:463] batch 18/?, cuts processed until now is 1590
2022-12-09 00:09:35,555 INFO [decode.py:463] batch 20/?, cuts processed until now is 1658
2022-12-09 00:09:38,331 INFO [decode.py:463] batch 22/?, cuts processed until now is 1856
2022-12-09 00:09:40,406 INFO [decode.py:463] batch 24/?, cuts processed until now is 2224
2022-12-09 00:09:44,139 INFO [decode.py:463] batch 26/?, cuts processed until now is 2325
2022-12-09 00:09:46,730 INFO [decode.py:463] batch 28/?, cuts processed until now is 2546
2022-12-09 00:09:50,522 INFO [decode.py:463] batch 30/?, cuts processed until now is 2653
2022-12-09 00:09:53,962 INFO [decode.py:463] batch 32/?, cuts processed until now is 2744
2022-12-09 00:09:54,241 INFO [zipformer.py:1414] attn_weights_entropy = tensor([2.7396, 3.1001, 3.0621, 3.0092, 2.4846, 3.1160, 2.9203, 1.5742],
       device='cuda:0'), covar=tensor([0.3222, 0.1240, 0.1216, 0.1279, 0.1173, 0.0968, 0.1290, 0.3379],
       device='cuda:0'), in_proj_covar=tensor([0.0138, 0.0066, 0.0052, 0.0054, 0.0082, 0.0064, 0.0085, 0.0091],
       device='cuda:0'), out_proj_covar=tensor([0.0007, 0.0004, 0.0004, 0.0004, 0.0005, 0.0004, 0.0005, 0.0005],
       device='cuda:0')
2022-12-09 00:09:56,816 INFO [decode.py:463] batch 34/?, cuts processed until now is 2875
2022-12-09 00:10:00,313 INFO [decode.py:463] batch 36/?, cuts processed until now is 2961
2022-12-09 00:10:03,278 INFO [decode.py:463] batch 38/?, cuts processed until now is 3072
2022-12-09 00:10:05,621 INFO [decode.py:463] batch 40/?, cuts processed until now is 3440
2022-12-09 00:10:07,739 INFO [decode.py:463] batch 42/?, cuts processed until now is 3956
2022-12-09 00:10:10,439 INFO [decode.py:463] batch 44/?, cuts processed until now is 4342
2022-12-09 00:10:13,586 INFO [decode.py:463] batch 46/?, cuts processed until now is 4443
2022-12-09 00:10:16,678 INFO [decode.py:463] batch 48/?, cuts processed until now is 4595
2022-12-09 00:10:18,979 INFO [decode.py:463] batch 50/?, cuts processed until now is 4872
2022-12-09 00:10:21,564 INFO [decode.py:463] batch 52/?, cuts processed until now is 5061
2022-12-09 00:10:24,051 INFO [decode.py:463] batch 54/?, cuts processed until now is 5219
2022-12-09 00:10:25,799 INFO [decode.py:463] batch 56/?, cuts processed until now is 5892
2022-12-09 00:10:28,484 INFO [decode.py:463] batch 58/?, cuts processed until now is 6090
2022-12-09 00:10:30,576 INFO [decode.py:463] batch 60/?, cuts processed until now is 6517
2022-12-09 00:10:33,227 INFO [decode.py:463] batch 62/?, cuts processed until now is 6715
2022-12-09 00:10:36,188 INFO [decode.py:463] batch 64/?, cuts processed until now is 6897
2022-12-09 00:10:38,500 INFO [decode.py:463] batch 66/?, cuts processed until now is 7304
2022-12-09 00:10:40,913 INFO [decode.py:463] batch 68/?, cuts processed until now is 7488
2022-12-09 00:10:43,631 INFO [decode.py:463] batch 70/?, cuts processed until now is 7720
2022-12-09 00:10:46,216 INFO [decode.py:463] batch 72/?, cuts processed until now is 7938
2022-12-09 00:10:48,352 INFO [decode.py:463] batch 74/?, cuts processed until now is 8367
2022-12-09 00:10:50,400 INFO [decode.py:463] batch 76/?, cuts processed until now is 8674
2022-12-09 00:10:52,491 INFO [decode.py:463] batch 78/?, cuts processed until now is 8982
2022-12-09 00:10:54,521 INFO [decode.py:463] batch 80/?, cuts processed until now is 9350
2022-12-09 00:10:56,482 INFO [decode.py:463] batch 82/?, cuts processed until now is 9810
2022-12-09 00:10:58,784 INFO [decode.py:463] batch 84/?, cuts processed until now is 10237
2022-12-09 00:11:00,569 INFO [decode.py:463] batch 86/?, cuts processed until now is 10755
2022-12-09 00:11:02,454 INFO [decode.py:463] batch 88/?, cuts processed until now is 11278
2022-12-09 00:11:04,558 INFO [decode.py:463] batch 90/?, cuts processed until now is 11705
2022-12-09 00:11:06,662 INFO [decode.py:463] batch 92/?, cuts processed until now is 12013
2022-12-09 00:11:08,880 INFO [decode.py:463] batch 94/?, cuts processed until now is 12290
2022-12-09 00:11:11,115 INFO [decode.py:463] batch 96/?, cuts processed until now is 12597
2022-12-09 00:11:13,001 INFO [decode.py:463] batch 98/?, cuts processed until now is 12963
2022-12-09 00:11:14,951 INFO [decode.py:463] batch 100/?, cuts processed until now is 13420
2022-12-09 00:11:16,891 INFO [decode.py:463] batch 102/?, cuts processed until now is 13877
2022-12-09 00:11:18,751 INFO [decode.py:463] batch 104/?, cuts processed until now is 14543
2022-12-09 00:11:20,406 INFO [decode.py:463] batch 106/?, cuts processed until now is 15209
2022-12-09 00:11:22,213 INFO [decode.py:463] batch 108/?, cuts processed until now is 15599
2022-12-09 00:11:23,831 INFO [decode.py:463] batch 110/?, cuts processed until now is 15787
2022-12-09 00:11:24,660 INFO [zipformer.py:1414] attn_weights_entropy = tensor([4.0644, 3.4893, 2.6910, 4.0270, 4.0221, 3.9192, 3.2926, 2.7434],
       device='cuda:0'), covar=tensor([0.0469, 0.1215, 0.3648, 0.0498, 0.0629, 0.1124, 0.1476, 0.4299],
       device='cuda:0'), in_proj_covar=tensor([0.0237, 0.0272, 0.0252, 0.0220, 0.0279, 0.0268, 0.0232, 0.0237],
       device='cuda:0'), out_proj_covar=tensor([0.0003, 0.0004, 0.0004, 0.0003, 0.0004, 0.0003, 0.0003, 0.0003],
       device='cuda:0')
2022-12-09 00:11:24,993 INFO [decode.py:463] batch 112/?, cuts processed until now is 15881
2022-12-09 00:11:27,253 INFO [decode.py:463] batch 114/?, cuts processed until now is 15926
2022-12-09 00:11:28,278 INFO [decode.py:463] batch 116/?, cuts processed until now is 16287
2022-12-09 00:11:30,637 INFO [decode.py:463] batch 118/?, cuts processed until now is 16357
2022-12-09 00:11:30,938 INFO [decode.py:479] The transcripts are stored in pruned_transducer_stateless7/exp/v1/fast_beam_search/recogs-test_sdm-beam_4_max_contexts_4_max_states_8-epoch-15-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-12-09 00:11:31,232 INFO [utils.py:536] [test_sdm-beam_4_max_contexts_4_max_states_8] %WER 26.38% [55365 / 209845, 4187 ins, 20994 del, 30184 sub ]
2022-12-09 00:11:31,895 INFO [decode.py:492] Wrote detailed error stats to pruned_transducer_stateless7/exp/v1/fast_beam_search/errs-test_sdm-beam_4_max_contexts_4_max_states_8-epoch-15-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-12-09 00:11:31,896 INFO [decode.py:508] 
For test_sdm, WER of different settings are:
beam_4_max_contexts_4_max_states_8	26.38	best for test_sdm

2022-12-09 00:11:31,896 INFO [decode.py:687] Decoding eval_gss
2022-12-09 00:11:34,643 INFO [decode.py:463] batch 0/?, cuts processed until now is 58
2022-12-09 00:11:37,265 INFO [decode.py:463] batch 2/?, cuts processed until now is 512
2022-12-09 00:11:39,788 INFO [decode.py:463] batch 4/?, cuts processed until now is 645
2022-12-09 00:11:42,519 INFO [decode.py:463] batch 6/?, cuts processed until now is 750
2022-12-09 00:11:45,087 INFO [decode.py:463] batch 8/?, cuts processed until now is 883
2022-12-09 00:11:45,431 INFO [zipformer.py:1414] attn_weights_entropy = tensor([4.4707, 2.5249, 4.5708, 2.7089, 4.2849, 2.1961, 3.3505, 4.2979],
       device='cuda:0'), covar=tensor([0.0574, 0.4876, 0.0310, 1.1301, 0.0766, 0.4357, 0.1456, 0.0313],
       device='cuda:0'), in_proj_covar=tensor([0.0221, 0.0205, 0.0174, 0.0283, 0.0196, 0.0207, 0.0197, 0.0178],
       device='cuda:0'), out_proj_covar=tensor([0.0004, 0.0004, 0.0003, 0.0005, 0.0004, 0.0004, 0.0004, 0.0004],
       device='cuda:0')
2022-12-09 00:11:48,123 INFO [decode.py:463] batch 10/?, cuts processed until now is 1082
2022-12-09 00:11:48,459 INFO [zipformer.py:1414] attn_weights_entropy = tensor([5.2040, 2.9190, 5.3102, 3.1520, 4.9151, 2.4271, 3.8951, 4.8336],
       device='cuda:0'), covar=tensor([0.0427, 0.4875, 0.0264, 1.0490, 0.0479, 0.4513, 0.1411, 0.0262],
       device='cuda:0'), in_proj_covar=tensor([0.0221, 0.0205, 0.0174, 0.0283, 0.0196, 0.0207, 0.0197, 0.0178],
       device='cuda:0'), out_proj_covar=tensor([0.0004, 0.0004, 0.0003, 0.0005, 0.0004, 0.0004, 0.0004, 0.0004],
       device='cuda:0')
2022-12-09 00:11:50,504 INFO [decode.py:463] batch 12/?, cuts processed until now is 1279
2022-12-09 00:11:52,727 INFO [decode.py:463] batch 14/?, cuts processed until now is 1538
2022-12-09 00:11:55,047 INFO [decode.py:463] batch 16/?, cuts processed until now is 1845
2022-12-09 00:11:57,873 INFO [decode.py:463] batch 18/?, cuts processed until now is 2084
2022-12-09 00:11:59,885 INFO [decode.py:463] batch 20/?, cuts processed until now is 2523
2022-12-09 00:12:02,047 INFO [decode.py:463] batch 22/?, cuts processed until now is 2949
2022-12-09 00:12:04,385 INFO [decode.py:463] batch 24/?, cuts processed until now is 3160
2022-12-09 00:12:06,707 INFO [decode.py:463] batch 26/?, cuts processed until now is 3586
2022-12-09 00:12:09,309 INFO [decode.py:463] batch 28/?, cuts processed until now is 3758
2022-12-09 00:12:11,407 INFO [decode.py:463] batch 30/?, cuts processed until now is 4116
2022-12-09 00:12:13,136 INFO [decode.py:463] batch 32/?, cuts processed until now is 4742
2022-12-09 00:12:14,909 INFO [decode.py:463] batch 34/?, cuts processed until now is 5368
2022-12-09 00:12:16,897 INFO [decode.py:463] batch 36/?, cuts processed until now is 5796
2022-12-09 00:12:17,494 INFO [decode.py:463] batch 38/?, cuts processed until now is 5908
2022-12-09 00:12:20,025 INFO [decode.py:463] batch 40/?, cuts processed until now is 6026
2022-12-09 00:12:21,986 INFO [decode.py:463] batch 42/?, cuts processed until now is 6171
2022-12-09 00:12:24,359 INFO [decode.py:463] batch 44/?, cuts processed until now is 6390
2022-12-09 00:12:25,982 INFO [decode.py:463] batch 46/?, cuts processed until now is 6456
2022-12-09 00:12:26,413 INFO [decode.py:479] The transcripts are stored in pruned_transducer_stateless7/exp/v1/fast_beam_search/recogs-eval_gss-beam_4_max_contexts_4_max_states_8-epoch-15-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-12-09 00:12:26,516 INFO [utils.py:536] [eval_gss-beam_4_max_contexts_4_max_states_8] %WER 12.30% [9980 / 81111, 904 ins, 2805 del, 6271 sub ]
2022-12-09 00:12:26,761 INFO [decode.py:492] Wrote detailed error stats to pruned_transducer_stateless7/exp/v1/fast_beam_search/errs-eval_gss-beam_4_max_contexts_4_max_states_8-epoch-15-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-12-09 00:12:26,762 INFO [decode.py:508] 
For eval_gss, WER of different settings are:
beam_4_max_contexts_4_max_states_8	12.3	best for eval_gss

2022-12-09 00:12:26,763 INFO [decode.py:687] Decoding test_gss
2022-12-09 00:12:29,434 INFO [decode.py:463] batch 0/?, cuts processed until now is 49
2022-12-09 00:12:31,943 INFO [decode.py:463] batch 2/?, cuts processed until now is 433
2022-12-09 00:12:34,693 INFO [decode.py:463] batch 4/?, cuts processed until now is 545
2022-12-09 00:12:37,478 INFO [decode.py:463] batch 6/?, cuts processed until now is 637
2022-12-09 00:12:40,171 INFO [decode.py:463] batch 8/?, cuts processed until now is 754
2022-12-09 00:12:43,222 INFO [decode.py:463] batch 10/?, cuts processed until now is 845
2022-12-09 00:12:45,929 INFO [decode.py:463] batch 12/?, cuts processed until now is 976
2022-12-09 00:12:48,699 INFO [decode.py:463] batch 14/?, cuts processed until now is 1175
2022-12-09 00:12:50,887 INFO [decode.py:463] batch 16/?, cuts processed until now is 1483
2022-12-09 00:12:54,090 INFO [decode.py:463] batch 18/?, cuts processed until now is 1590
2022-12-09 00:12:56,948 INFO [decode.py:463] batch 20/?, cuts processed until now is 1658
2022-12-09 00:12:59,715 INFO [decode.py:463] batch 22/?, cuts processed until now is 1856
2022-12-09 00:13:01,765 INFO [decode.py:463] batch 24/?, cuts processed until now is 2224
2022-12-09 00:13:05,033 INFO [decode.py:463] batch 26/?, cuts processed until now is 2325
2022-12-09 00:13:07,441 INFO [decode.py:463] batch 28/?, cuts processed until now is 2546
2022-12-09 00:13:10,575 INFO [decode.py:463] batch 30/?, cuts processed until now is 2653
2022-12-09 00:13:13,637 INFO [decode.py:463] batch 32/?, cuts processed until now is 2744
2022-12-09 00:13:16,282 INFO [decode.py:463] batch 34/?, cuts processed until now is 2875
2022-12-09 00:13:18,734 INFO [decode.py:463] batch 36/?, cuts processed until now is 2961
2022-12-09 00:13:21,537 INFO [decode.py:463] batch 38/?, cuts processed until now is 3072
2022-12-09 00:13:23,898 INFO [decode.py:463] batch 40/?, cuts processed until now is 3440
2022-12-09 00:13:26,091 INFO [decode.py:463] batch 42/?, cuts processed until now is 3956
2022-12-09 00:13:28,620 INFO [decode.py:463] batch 44/?, cuts processed until now is 4342
2022-12-09 00:13:31,701 INFO [decode.py:463] batch 46/?, cuts processed until now is 4443
2022-12-09 00:13:34,892 INFO [decode.py:463] batch 48/?, cuts processed until now is 4595
2022-12-09 00:13:37,348 INFO [decode.py:463] batch 50/?, cuts processed until now is 4872
2022-12-09 00:13:40,042 INFO [decode.py:463] batch 52/?, cuts processed until now is 5061
2022-12-09 00:13:42,670 INFO [decode.py:463] batch 54/?, cuts processed until now is 5219
2022-12-09 00:13:44,416 INFO [decode.py:463] batch 56/?, cuts processed until now is 5892
2022-12-09 00:13:47,372 INFO [decode.py:463] batch 58/?, cuts processed until now is 6090
2022-12-09 00:13:49,548 INFO [decode.py:463] batch 60/?, cuts processed until now is 6517
2022-12-09 00:13:52,266 INFO [decode.py:463] batch 62/?, cuts processed until now is 6715
2022-12-09 00:13:55,249 INFO [decode.py:463] batch 64/?, cuts processed until now is 6897
2022-12-09 00:13:57,716 INFO [decode.py:463] batch 66/?, cuts processed until now is 7304
2022-12-09 00:14:00,289 INFO [decode.py:463] batch 68/?, cuts processed until now is 7488
2022-12-09 00:14:02,611 INFO [decode.py:463] batch 70/?, cuts processed until now is 7720
2022-12-09 00:14:05,168 INFO [decode.py:463] batch 72/?, cuts processed until now is 7938
2022-12-09 00:14:07,503 INFO [decode.py:463] batch 74/?, cuts processed until now is 8367
2022-12-09 00:14:09,647 INFO [decode.py:463] batch 76/?, cuts processed until now is 8674
2022-12-09 00:14:11,923 INFO [decode.py:463] batch 78/?, cuts processed until now is 8982
2022-12-09 00:14:14,092 INFO [decode.py:463] batch 80/?, cuts processed until now is 9350
2022-12-09 00:14:16,181 INFO [decode.py:463] batch 82/?, cuts processed until now is 9810
2022-12-09 00:14:18,361 INFO [decode.py:463] batch 84/?, cuts processed until now is 10237
2022-12-09 00:14:20,381 INFO [decode.py:463] batch 86/?, cuts processed until now is 10755
2022-12-09 00:14:22,209 INFO [decode.py:463] batch 88/?, cuts processed until now is 11278
2022-12-09 00:14:24,483 INFO [decode.py:463] batch 90/?, cuts processed until now is 11705
2022-12-09 00:14:26,641 INFO [decode.py:463] batch 92/?, cuts processed until now is 12013
2022-12-09 00:14:29,060 INFO [decode.py:463] batch 94/?, cuts processed until now is 12290
2022-12-09 00:14:31,239 INFO [decode.py:463] batch 96/?, cuts processed until now is 12597
2022-12-09 00:14:33,271 INFO [decode.py:463] batch 98/?, cuts processed until now is 12963
2022-12-09 00:14:35,307 INFO [decode.py:463] batch 100/?, cuts processed until now is 13420
2022-12-09 00:14:37,571 INFO [decode.py:463] batch 102/?, cuts processed until now is 13877
2022-12-09 00:14:39,261 INFO [decode.py:463] batch 104/?, cuts processed until now is 14543
2022-12-09 00:14:40,975 INFO [decode.py:463] batch 106/?, cuts processed until now is 15209
2022-12-09 00:14:42,696 INFO [decode.py:463] batch 108/?, cuts processed until now is 15599
2022-12-09 00:14:44,299 INFO [decode.py:463] batch 110/?, cuts processed until now is 15787
2022-12-09 00:14:45,434 INFO [decode.py:463] batch 112/?, cuts processed until now is 15881
2022-12-09 00:14:47,672 INFO [decode.py:463] batch 114/?, cuts processed until now is 15926
2022-12-09 00:14:48,671 INFO [decode.py:463] batch 116/?, cuts processed until now is 16287
2022-12-09 00:14:50,930 INFO [decode.py:463] batch 118/?, cuts processed until now is 16357
2022-12-09 00:14:51,229 INFO [decode.py:479] The transcripts are stored in pruned_transducer_stateless7/exp/v1/fast_beam_search/recogs-test_gss-beam_4_max_contexts_4_max_states_8-epoch-15-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-12-09 00:14:51,493 INFO [utils.py:536] [test_gss-beam_4_max_contexts_4_max_states_8] %WER 14.98% [31430 / 209845, 2279 ins, 10211 del, 18940 sub ]
2022-12-09 00:14:52,136 INFO [decode.py:492] Wrote detailed error stats to pruned_transducer_stateless7/exp/v1/fast_beam_search/errs-test_gss-beam_4_max_contexts_4_max_states_8-epoch-15-avg-8-beam-4-max-contexts-4-max-states-8.txt
2022-12-09 00:14:52,138 INFO [decode.py:508] 
For test_gss, WER of different settings are:
beam_4_max_contexts_4_max_states_8	14.98	best for test_gss

2022-12-09 00:14:52,138 INFO [decode.py:703] Done!