wgb14 committed
Commit e08bf6b
1 Parent(s): bcee600

Upload log/log-decode-2022-04-08-22-02-12

Files changed (1)
log/log-decode-2022-04-08-22-02-12 +778 -0
log/log-decode-2022-04-08-22-02-12 ADDED
@@ -0,0 +1,778 @@
+ 2022-04-08 22:02:12,850 INFO [decode.py:583] Decoding started
+ 2022-04-08 22:02:12,851 INFO [decode.py:584] {'subsampling_factor': 4, 'vgg_frontend': False, 'use_feat_batchnorm': True, 'feature_dim': 80, 'nhead': 8, 'attention_dim': 512, 'num_decoder_layers': 6, 'search_beam': 20, 'output_beam': 8, 'min_active_states': 30, 'max_active_states': 10000, 'use_double_scores': True, 'env_info': {'k2-version': '1.14', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '6833270cb228aba7bf9681fccd41e2b52f7d984c', 'k2-git-date': 'Wed Mar 16 11:16:05 2022', 'lhotse-version': '1.0.0.dev+git.d917411.clean', 'torch-cuda-available': True, 'torch-cuda-version': '11.1', 'python-version': '3.7', 'icefall-git-branch': 'gigaspeech_recipe', 'icefall-git-sha1': 'c3993a5-dirty', 'icefall-git-date': 'Mon Mar 21 13:49:39 2022', 'icefall-path': '/userhome/user/guanbo/icefall_decode', 'k2-path': '/opt/conda/lib/python3.7/site-packages/k2-1.14.dev20220408+cuda11.1.torch1.10.0-py3.7-linux-x86_64.egg/k2/__init__.py', 'lhotse-path': '/userhome/user/guanbo/lhotse/lhotse/__init__.py', 'hostname': 'd7b02ab00b70c011ec0a3ee069db84328338-chenx8564-0', 'IP address': '10.9.150.18'}, 'epoch': 18, 'avg': 6, 'method': 'attention-decoder', 'num_paths': 1000, 'nbest_scale': 0.5, 'exp_dir': PosixPath('conformer_ctc/exp_500_8_2'), 'lang_dir': PosixPath('data/lang_bpe_500'), 'lm_dir': PosixPath('data/lm'), 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 20, 'bucketing_sampler': True, 'num_buckets': 30, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 1.0, 'on_the_fly_feats': False, 'shuffle': True, 'return_cuts': True, 'num_workers': 1, 'enable_spec_aug': True, 'spec_aug_time_warp_factor': 80, 'enable_musan': True, 'subset': 'XL', 'lazy_load': True, 'small_dev': False}
+ 2022-04-08 22:02:13,611 INFO [lexicon.py:176] Loading pre-compiled data/lang_bpe_500/Linv.pt
+ 2022-04-08 22:02:13,897 INFO [decode.py:594] device: cuda:0
+ 2022-04-08 22:02:19,463 INFO [decode.py:656] Loading pre-compiled G_4_gram.pt
+ 2022-04-08 22:02:23,064 INFO [decode.py:692] averaging ['conformer_ctc/exp_500_8_2/epoch-13.pt', 'conformer_ctc/exp_500_8_2/epoch-14.pt', 'conformer_ctc/exp_500_8_2/epoch-15.pt', 'conformer_ctc/exp_500_8_2/epoch-16.pt', 'conformer_ctc/exp_500_8_2/epoch-17.pt', 'conformer_ctc/exp_500_8_2/epoch-18.pt']
+ 2022-04-08 22:04:17,302 INFO [decode.py:699] Number of model parameters: 109226120
+ 2022-04-08 22:04:17,303 INFO [asr_datamodule.py:372] About to get dev cuts
+ 2022-04-08 22:04:21,114 INFO [decode.py:497] batch 0/?, cuts processed until now is 3
+ 2022-04-08 22:06:56,367 INFO [decode.py:497] batch 100/?, cuts processed until now is 243
+ 2022-04-08 22:09:33,967 INFO [decode.py:497] batch 200/?, cuts processed until now is 464
+ 2022-04-08 22:12:05,730 INFO [decode.py:497] batch 300/?, cuts processed until now is 665
+ 2022-04-08 22:13:23,989 INFO [decode.py:736] Caught exception:
+ CUDA out of memory. Tried to allocate 4.93 GiB (GPU 0; 31.75 GiB total capacity; 24.54 GiB already allocated; 3.87 GiB free; 26.53 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
+
+ 2022-04-08 22:13:23,989 INFO [decode.py:743] num_arcs before pruning: 333034
+ 2022-04-08 22:13:23,989 INFO [decode.py:746] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 22:13:24,010 INFO [decode.py:757] num_arcs after pruning: 7258
+ 2022-04-08 22:14:38,171 INFO [decode.py:497] batch 400/?, cuts processed until now is 891
+ 2022-04-08 22:17:05,640 INFO [decode.py:497] batch 500/?, cuts processed until now is 1098
+ 2022-04-08 22:19:29,901 INFO [decode.py:497] batch 600/?, cuts processed until now is 1363
+ 2022-04-08 22:20:05,953 INFO [decode.py:736] Caught exception:
+ CUDA out of memory. Tried to allocate 8.00 GiB (GPU 0; 31.75 GiB total capacity; 19.51 GiB already allocated; 7.07 GiB free; 23.32 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
+
+ 2022-04-08 22:20:05,954 INFO [decode.py:743] num_arcs before pruning: 514392
+ 2022-04-08 22:20:05,954 INFO [decode.py:746] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 22:20:05,966 INFO [decode.py:757] num_arcs after pruning: 13888
+ 2022-04-08 22:22:02,765 INFO [decode.py:497] batch 700/?, cuts processed until now is 1626
+ 2022-04-08 22:24:05,393 INFO [decode.py:736] Caught exception:
+ CUDA out of memory. Tried to allocate 8.00 GiB (GPU 0; 31.75 GiB total capacity; 14.24 GiB already allocated; 7.07 GiB free; 23.33 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
+
+ 2022-04-08 22:24:05,393 INFO [decode.py:743] num_arcs before pruning: 164808
+ 2022-04-08 22:24:05,393 INFO [decode.py:746] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 22:24:05,404 INFO [decode.py:757] num_arcs after pruning: 8771
+ 2022-04-08 22:24:40,652 INFO [decode.py:497] batch 800/?, cuts processed until now is 1870
+ 2022-04-08 22:25:03,574 INFO [decode.py:736] Caught exception:
+ CUDA out of memory. Tried to allocate 8.00 GiB (GPU 0; 31.75 GiB total capacity; 14.28 GiB already allocated; 7.07 GiB free; 23.32 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
+
+ 2022-04-08 22:25:03,575 INFO [decode.py:743] num_arcs before pruning: 267824
+ 2022-04-08 22:25:03,575 INFO [decode.py:746] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 22:25:03,582 INFO [decode.py:757] num_arcs after pruning: 9250
+ 2022-04-08 22:27:25,872 INFO [decode.py:497] batch 900/?, cuts processed until now is 2134
+ 2022-04-08 22:29:45,824 INFO [decode.py:736] Caught exception:
+ CUDA out of memory. Tried to allocate 8.00 GiB (GPU 0; 31.75 GiB total capacity; 14.45 GiB already allocated; 7.06 GiB free; 23.33 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
+
+ 2022-04-08 22:29:45,825 INFO [decode.py:743] num_arcs before pruning: 236799
+ 2022-04-08 22:29:45,825 INFO [decode.py:746] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 22:29:45,837 INFO [decode.py:757] num_arcs after pruning: 7885
+ 2022-04-08 22:30:03,747 INFO [decode.py:497] batch 1000/?, cuts processed until now is 2380
+ 2022-04-08 22:30:44,532 INFO [decode.py:736] Caught exception:
+
+ Some bad things happened. Please read the above error messages and stack
+ trace. If you are using Python, the following command may be helpful:
+
+ gdb --args python /path/to/your/code.py
+
+ (You can use `gdb` to debug the code. Please consider compiling
+ a debug version of k2.).
+
+ If you are unable to fix it, please open an issue at:
+
+ https://github.com/k2-fsa/k2/issues/new
+
+
+ 2022-04-08 22:30:44,532 INFO [decode.py:743] num_arcs before pruning: 632546
+ 2022-04-08 22:30:44,533 INFO [decode.py:746] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 22:30:44,585 INFO [decode.py:757] num_arcs after pruning: 10602
+ 2022-04-08 22:32:41,978 INFO [decode.py:497] batch 1100/?, cuts processed until now is 2624
+ 2022-04-08 22:34:54,199 INFO [decode.py:736] Caught exception:
+ CUDA out of memory. Tried to allocate 8.00 GiB (GPU 0; 31.75 GiB total capacity; 19.67 GiB already allocated; 5.68 GiB free; 24.72 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
+
+ 2022-04-08 22:34:54,200 INFO [decode.py:743] num_arcs before pruning: 227558
+ 2022-04-08 22:34:54,200 INFO [decode.py:746] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 22:34:54,218 INFO [decode.py:757] num_arcs after pruning: 8505
+ 2022-04-08 22:35:25,806 INFO [decode.py:497] batch 1200/?, cuts processed until now is 2889
+ 2022-04-08 22:38:28,827 INFO [decode.py:497] batch 1300/?, cuts processed until now is 3182
+ 2022-04-08 22:39:35,318 INFO [decode.py:736] Caught exception:
+ CUDA out of memory. Tried to allocate 2.65 GiB (GPU 0; 31.75 GiB total capacity; 27.28 GiB already allocated; 1.20 GiB free; 29.19 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
+
+ 2022-04-08 22:39:35,318 INFO [decode.py:743] num_arcs before pruning: 348294
+ 2022-04-08 22:39:35,318 INFO [decode.py:746] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 22:39:35,324 INFO [decode.py:757] num_arcs after pruning: 4422
+ 2022-04-08 22:41:48,886 INFO [decode.py:497] batch 1400/?, cuts processed until now is 3491
+ 2022-04-08 22:42:03,583 INFO [decode.py:736] Caught exception:
+ CUDA out of memory. Tried to allocate 4.53 GiB (GPU 0; 31.75 GiB total capacity; 24.43 GiB already allocated; 1.20 GiB free; 29.19 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
+
+ 2022-04-08 22:42:03,584 INFO [decode.py:743] num_arcs before pruning: 446338
+ 2022-04-08 22:42:03,584 INFO [decode.py:746] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 22:42:03,592 INFO [decode.py:757] num_arcs after pruning: 13422
+ 2022-04-08 22:44:41,081 INFO [decode.py:497] batch 1500/?, cuts processed until now is 3738
+ 2022-04-08 22:44:48,819 INFO [decode.py:736] Caught exception:
+ CUDA out of memory. Tried to allocate 1.94 GiB (GPU 0; 31.75 GiB total capacity; 29.06 GiB already allocated; 231.75 MiB free; 30.17 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
+
+ 2022-04-08 22:44:48,820 INFO [decode.py:743] num_arcs before pruning: 263598
+ 2022-04-08 22:44:48,820 INFO [decode.py:746] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 22:44:48,833 INFO [decode.py:757] num_arcs after pruning: 7847
+ 2022-04-08 22:47:10,728 INFO [decode.py:497] batch 1600/?, cuts processed until now is 3970
+ 2022-04-08 22:47:52,235 INFO [decode.py:736] Caught exception:
+ CUDA out of memory. Tried to allocate 5.20 GiB (GPU 0; 31.75 GiB total capacity; 24.71 GiB already allocated; 231.75 MiB free; 30.17 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
+
+ 2022-04-08 22:47:52,236 INFO [decode.py:743] num_arcs before pruning: 317009
+ 2022-04-08 22:47:52,236 INFO [decode.py:746] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 22:47:52,252 INFO [decode.py:757] num_arcs after pruning: 9354
+ 2022-04-08 22:49:32,370 INFO [decode.py:736] Caught exception:
+ CUDA out of memory. Tried to allocate 4.55 GiB (GPU 0; 31.75 GiB total capacity; 24.05 GiB already allocated; 231.75 MiB free; 30.17 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
+
+ 2022-04-08 22:49:32,371 INFO [decode.py:743] num_arcs before pruning: 136624
+ 2022-04-08 22:49:32,371 INFO [decode.py:746] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 22:49:32,402 INFO [decode.py:757] num_arcs after pruning: 5456
+ 2022-04-08 22:49:36,398 INFO [decode.py:497] batch 1700/?, cuts processed until now is 4192
+ 2022-04-08 22:50:50,382 INFO [decode.py:736] Caught exception:
+ CUDA out of memory. Tried to allocate 8.00 GiB (GPU 0; 31.75 GiB total capacity; 19.56 GiB already allocated; 2.10 GiB free; 28.29 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
+
+ 2022-04-08 22:50:50,383 INFO [decode.py:743] num_arcs before pruning: 303893
+ 2022-04-08 22:50:50,383 INFO [decode.py:746] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 22:50:50,400 INFO [decode.py:757] num_arcs after pruning: 9312
+ 2022-04-08 22:52:09,335 INFO [decode.py:497] batch 1800/?, cuts processed until now is 4416
+ 2022-04-08 22:52:51,744 INFO [decode.py:736] Caught exception:
+ CUDA out of memory. Tried to allocate 5.02 GiB (GPU 0; 31.75 GiB total capacity; 26.25 GiB already allocated; 2.10 GiB free; 28.29 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
+
+ 2022-04-08 22:52:51,745 INFO [decode.py:743] num_arcs before pruning: 379292
+ 2022-04-08 22:52:51,745 INFO [decode.py:746] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 22:52:51,751 INFO [decode.py:757] num_arcs after pruning: 14317
+ 2022-04-08 22:54:33,478 INFO [decode.py:497] batch 1900/?, cuts processed until now is 4619
+ 2022-04-08 22:56:34,371 INFO [decode.py:736] Caught exception:
+ CUDA out of memory. Tried to allocate 8.00 GiB (GPU 0; 31.75 GiB total capacity; 19.32 GiB already allocated; 3.07 GiB free; 27.33 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
+
+ 2022-04-08 22:56:34,372 INFO [decode.py:743] num_arcs before pruning: 294097
+ 2022-04-08 22:56:34,372 INFO [decode.py:746] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 22:56:34,389 INFO [decode.py:757] num_arcs after pruning: 5895
+ 2022-04-08 22:56:47,967 INFO [decode.py:497] batch 2000/?, cuts processed until now is 4816
+ 2022-04-08 22:58:06,236 INFO [decode.py:736] Caught exception:
+ CUDA out of memory. Tried to allocate 8.00 GiB (GPU 0; 31.75 GiB total capacity; 19.41 GiB already allocated; 3.06 GiB free; 27.33 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
+
+ 2022-04-08 22:58:06,236 INFO [decode.py:743] num_arcs before pruning: 253855
+ 2022-04-08 22:58:06,236 INFO [decode.py:746] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 22:58:06,253 INFO [decode.py:757] num_arcs after pruning: 9191
+ 2022-04-08 22:58:17,534 INFO [decode.py:736] Caught exception:
+ CUDA out of memory. Tried to allocate 2.17 GiB (GPU 0; 31.75 GiB total capacity; 26.06 GiB already allocated; 1.56 GiB free; 28.83 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
+
+ 2022-04-08 22:58:17,535 INFO [decode.py:743] num_arcs before pruning: 242689
+ 2022-04-08 22:58:17,535 INFO [decode.py:746] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 22:58:17,549 INFO [decode.py:757] num_arcs after pruning: 4733
+ 2022-04-08 22:58:32,154 INFO [decode.py:736] Caught exception:
+ CUDA out of memory. Tried to allocate 2.38 GiB (GPU 0; 31.75 GiB total capacity; 26.65 GiB already allocated; 1.57 GiB free; 28.82 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
+
+ 2022-04-08 22:58:32,155 INFO [decode.py:743] num_arcs before pruning: 288302
+ 2022-04-08 22:58:32,155 INFO [decode.py:746] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 22:58:32,164 INFO [decode.py:757] num_arcs after pruning: 5472
+ 2022-04-08 22:59:15,988 INFO [decode.py:497] batch 2100/?, cuts processed until now is 4981
+ 2022-04-08 23:00:31,937 INFO [decode.py:736] Caught exception:
+
+ Some bad things happened. Please read the above error messages and stack
+ trace. If you are using Python, the following command may be helpful:
+
+ gdb --args python /path/to/your/code.py
+
+ (You can use `gdb` to debug the code. Please consider compiling
+ a debug version of k2.).
+
+ If you are unable to fix it, please open an issue at:
+
+ https://github.com/k2-fsa/k2/issues/new
+
+
+ 2022-04-08 23:00:31,937 INFO [decode.py:743] num_arcs before pruning: 745182
+ 2022-04-08 23:00:31,937 INFO [decode.py:746] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 23:00:31,989 INFO [decode.py:757] num_arcs after pruning: 13933
+ 2022-04-08 23:01:49,408 INFO [decode.py:497] batch 2200/?, cuts processed until now is 5132
+ 2022-04-08 23:04:08,911 INFO [decode.py:497] batch 2300/?, cuts processed until now is 5273
+ 2022-04-08 23:06:50,854 INFO [decode.py:497] batch 2400/?, cuts processed until now is 5388
+ 2022-04-08 23:06:53,493 INFO [decode.py:736] Caught exception:
+
+ Some bad things happened. Please read the above error messages and stack
+ trace. If you are using Python, the following command may be helpful:
+
+ gdb --args python /path/to/your/code.py
+
+ (You can use `gdb` to debug the code. Please consider compiling
+ a debug version of k2.).
+
+ If you are unable to fix it, please open an issue at:
+
+ https://github.com/k2-fsa/k2/issues/new
+
+
+ 2022-04-08 23:06:53,493 INFO [decode.py:743] num_arcs before pruning: 203946
+ 2022-04-08 23:06:53,493 INFO [decode.py:746] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 23:06:53,545 INFO [decode.py:757] num_arcs after pruning: 7172
+ 2022-04-08 23:09:08,764 INFO [decode.py:497] batch 2500/?, cuts processed until now is 5488
+ 2022-04-08 23:10:26,345 INFO [decode.py:841] Caught exception:
+ CUDA out of memory. Tried to allocate 5.79 GiB (GPU 0; 31.75 GiB total capacity; 24.31 GiB already allocated; 1.58 GiB free; 28.82 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
+
+ 2022-04-08 23:10:26,346 INFO [decode.py:843] num_paths before decreasing: 1000
+ 2022-04-08 23:10:26,346 INFO [decode.py:852] This OOM is not an error. You can ignore it. If your model does not converge well, or --max-duration is too large, or the input sound file is difficult to decode, you will meet this exception.
+ 2022-04-08 23:10:26,346 INFO [decode.py:858] num_paths after decreasing: 500
+ 2022-04-08 23:11:31,973 INFO [decode.py:497] batch 2600/?, cuts processed until now is 5588
+ 2022-04-08 23:13:41,208 INFO [decode.py:497] batch 2700/?, cuts processed until now is 5688
+ 2022-04-08 23:20:49,158 INFO [decode.py:567]
+ For dev, WER of different settings are:
+ ngram_lm_scale_0.6_attention_scale_1.5 10.46 best for dev
+ ngram_lm_scale_0.6_attention_scale_1.7 10.46
+ ngram_lm_scale_0.5_attention_scale_0.9 10.47
+ ngram_lm_scale_0.5_attention_scale_1.0 10.47
+ ngram_lm_scale_0.5_attention_scale_1.1 10.47
+ ngram_lm_scale_0.5_attention_scale_1.2 10.47
+ ngram_lm_scale_0.5_attention_scale_1.3 10.47
+ ngram_lm_scale_0.5_attention_scale_1.5 10.47
+ ngram_lm_scale_0.5_attention_scale_1.7 10.47
+ ngram_lm_scale_0.6_attention_scale_1.3 10.47
+ ngram_lm_scale_0.6_attention_scale_1.9 10.47
+ ngram_lm_scale_0.6_attention_scale_2.0 10.47
+ ngram_lm_scale_0.6_attention_scale_2.1 10.47
+ ngram_lm_scale_0.7_attention_scale_1.9 10.47
+ ngram_lm_scale_0.7_attention_scale_2.0 10.47
+ ngram_lm_scale_0.7_attention_scale_2.1 10.47
+ ngram_lm_scale_0.7_attention_scale_2.2 10.47
+ ngram_lm_scale_0.5_attention_scale_1.9 10.48
+ ngram_lm_scale_0.6_attention_scale_1.1 10.48
+ ngram_lm_scale_0.6_attention_scale_1.2 10.48
+ ngram_lm_scale_0.6_attention_scale_2.2 10.48
+ ngram_lm_scale_0.6_attention_scale_2.3 10.48
+ ngram_lm_scale_0.7_attention_scale_1.5 10.48
+ ngram_lm_scale_0.7_attention_scale_1.7 10.48
+ ngram_lm_scale_0.7_attention_scale_2.3 10.48
+ ngram_lm_scale_0.7_attention_scale_2.5 10.48
+ ngram_lm_scale_0.9_attention_scale_4.0 10.48
+ ngram_lm_scale_0.3_attention_scale_1.1 10.49
+ ngram_lm_scale_0.5_attention_scale_0.6 10.49
+ ngram_lm_scale_0.5_attention_scale_0.7 10.49
+ ngram_lm_scale_0.5_attention_scale_2.0 10.49
+ ngram_lm_scale_0.5_attention_scale_2.1 10.49
+ ngram_lm_scale_0.5_attention_scale_2.5 10.49
+ ngram_lm_scale_0.5_attention_scale_3.0 10.49
+ ngram_lm_scale_0.6_attention_scale_1.0 10.49
+ ngram_lm_scale_0.6_attention_scale_2.5 10.49
+ ngram_lm_scale_0.6_attention_scale_3.0 10.49
+ ngram_lm_scale_0.7_attention_scale_1.3 10.49
+ ngram_lm_scale_0.7_attention_scale_3.0 10.49
+ ngram_lm_scale_0.7_attention_scale_4.0 10.49
+ ngram_lm_scale_0.9_attention_scale_3.0 10.49
+ ngram_lm_scale_0.9_attention_scale_5.0 10.49
+ ngram_lm_scale_1.0_attention_scale_4.0 10.49
+ ngram_lm_scale_1.0_attention_scale_5.0 10.49
+ ngram_lm_scale_1.1_attention_scale_4.0 10.49
+ ngram_lm_scale_1.1_attention_scale_5.0 10.49
+ ngram_lm_scale_1.2_attention_scale_4.0 10.49
+ ngram_lm_scale_1.2_attention_scale_5.0 10.49
+ ngram_lm_scale_1.3_attention_scale_5.0 10.49
+ ngram_lm_scale_1.5_attention_scale_5.0 10.49
+ ngram_lm_scale_0.3_attention_scale_0.7 10.5
+ ngram_lm_scale_0.3_attention_scale_0.9 10.5
+ ngram_lm_scale_0.3_attention_scale_1.0 10.5
+ ngram_lm_scale_0.3_attention_scale_1.2 10.5
+ ngram_lm_scale_0.3_attention_scale_1.3 10.5
+ ngram_lm_scale_0.3_attention_scale_1.5 10.5
+ ngram_lm_scale_0.5_attention_scale_2.2 10.5
+ ngram_lm_scale_0.5_attention_scale_2.3 10.5
+ ngram_lm_scale_0.6_attention_scale_0.7 10.5
+ ngram_lm_scale_0.6_attention_scale_0.9 10.5
+ ngram_lm_scale_0.7_attention_scale_1.0 10.5
+ ngram_lm_scale_0.7_attention_scale_1.1 10.5
+ ngram_lm_scale_0.7_attention_scale_5.0 10.5
+ ngram_lm_scale_0.9_attention_scale_2.1 10.5
+ ngram_lm_scale_1.0_attention_scale_3.0 10.5
+ ngram_lm_scale_1.3_attention_scale_4.0 10.5
+ ngram_lm_scale_1.5_attention_scale_4.0 10.5
+ ngram_lm_scale_0.3_attention_scale_1.7 10.51
+ ngram_lm_scale_0.3_attention_scale_1.9 10.51
+ ngram_lm_scale_0.3_attention_scale_2.0 10.51
+ ngram_lm_scale_0.3_attention_scale_2.1 10.51
+ ngram_lm_scale_0.3_attention_scale_2.2 10.51
+ ngram_lm_scale_0.3_attention_scale_2.3 10.51
+ ngram_lm_scale_0.3_attention_scale_2.5 10.51
+ ngram_lm_scale_0.3_attention_scale_3.0 10.51
+ ngram_lm_scale_0.3_attention_scale_4.0 10.51
+ ngram_lm_scale_0.5_attention_scale_0.5 10.51
+ ngram_lm_scale_0.5_attention_scale_4.0 10.51
+ ngram_lm_scale_0.5_attention_scale_5.0 10.51
+ ngram_lm_scale_0.6_attention_scale_4.0 10.51
+ ngram_lm_scale_0.6_attention_scale_5.0 10.51
+ ngram_lm_scale_0.7_attention_scale_1.2 10.51
+ ngram_lm_scale_0.9_attention_scale_2.0 10.51
+ ngram_lm_scale_0.9_attention_scale_2.2 10.51
+ ngram_lm_scale_0.9_attention_scale_2.3 10.51
+ ngram_lm_scale_0.9_attention_scale_2.5 10.51
+ ngram_lm_scale_1.0_attention_scale_2.2 10.51
+ ngram_lm_scale_1.0_attention_scale_2.3 10.51
+ ngram_lm_scale_1.0_attention_scale_2.5 10.51
+ ngram_lm_scale_1.1_attention_scale_2.5 10.51
+ ngram_lm_scale_1.2_attention_scale_3.0 10.51
+ ngram_lm_scale_1.7_attention_scale_5.0 10.51
+ ngram_lm_scale_0.05_attention_scale_2.5 10.52
+ ngram_lm_scale_0.05_attention_scale_3.0 10.52
+ ngram_lm_scale_0.08_attention_scale_2.5 10.52
+ ngram_lm_scale_0.08_attention_scale_4.0 10.52
+ ngram_lm_scale_0.08_attention_scale_5.0 10.52
+ ngram_lm_scale_0.1_attention_scale_2.5 10.52
+ ngram_lm_scale_0.1_attention_scale_3.0 10.52
+ ngram_lm_scale_0.1_attention_scale_4.0 10.52
+ ngram_lm_scale_0.1_attention_scale_5.0 10.52
+ ngram_lm_scale_0.3_attention_scale_0.5 10.52
+ ngram_lm_scale_0.3_attention_scale_0.6 10.52
+ ngram_lm_scale_0.3_attention_scale_5.0 10.52
+ ngram_lm_scale_0.6_attention_scale_0.6 10.52
+ ngram_lm_scale_0.7_attention_scale_0.9 10.52
+ ngram_lm_scale_0.9_attention_scale_1.7 10.52
+ ngram_lm_scale_0.9_attention_scale_1.9 10.52
+ ngram_lm_scale_1.0_attention_scale_2.0 10.52
+ ngram_lm_scale_1.0_attention_scale_2.1 10.52
+ ngram_lm_scale_1.1_attention_scale_2.3 10.52
+ ngram_lm_scale_1.1_attention_scale_3.0 10.52
+ ngram_lm_scale_1.9_attention_scale_5.0 10.52
+ ngram_lm_scale_0.01_attention_scale_2.5 10.53
+ ngram_lm_scale_0.01_attention_scale_3.0 10.53
+ ngram_lm_scale_0.01_attention_scale_4.0 10.53
+ ngram_lm_scale_0.01_attention_scale_5.0 10.53
+ ngram_lm_scale_0.05_attention_scale_1.9 10.53
+ ngram_lm_scale_0.05_attention_scale_2.1 10.53
+ ngram_lm_scale_0.05_attention_scale_2.3 10.53
+ ngram_lm_scale_0.05_attention_scale_4.0 10.53
+ ngram_lm_scale_0.05_attention_scale_5.0 10.53
+ ngram_lm_scale_0.08_attention_scale_1.9 10.53
+ ngram_lm_scale_0.08_attention_scale_2.1 10.53
+ ngram_lm_scale_0.08_attention_scale_2.2 10.53
+ ngram_lm_scale_0.08_attention_scale_2.3 10.53
+ ngram_lm_scale_0.08_attention_scale_3.0 10.53
+ ngram_lm_scale_0.1_attention_scale_2.2 10.53
+ ngram_lm_scale_0.1_attention_scale_2.3 10.53
+ ngram_lm_scale_0.3_attention_scale_0.3 10.53
+ ngram_lm_scale_0.9_attention_scale_1.5 10.53
+ ngram_lm_scale_1.0_attention_scale_1.9 10.53
+ ngram_lm_scale_1.1_attention_scale_2.1 10.53
+ ngram_lm_scale_1.1_attention_scale_2.2 10.53
+ ngram_lm_scale_1.2_attention_scale_2.5 10.53
+ ngram_lm_scale_1.3_attention_scale_3.0 10.53
+ ngram_lm_scale_1.7_attention_scale_4.0 10.53
+ ngram_lm_scale_2.0_attention_scale_5.0 10.53
+ ngram_lm_scale_0.01_attention_scale_2.2 10.54
+ ngram_lm_scale_0.01_attention_scale_2.3 10.54
+ ngram_lm_scale_0.05_attention_scale_1.7 10.54
+ ngram_lm_scale_0.05_attention_scale_2.0 10.54
+ ngram_lm_scale_0.05_attention_scale_2.2 10.54
+ ngram_lm_scale_0.08_attention_scale_1.2 10.54
+ ngram_lm_scale_0.08_attention_scale_1.3 10.54
+ ngram_lm_scale_0.08_attention_scale_1.7 10.54
+ ngram_lm_scale_0.08_attention_scale_2.0 10.54
+ ngram_lm_scale_0.1_attention_scale_1.5 10.54
+ ngram_lm_scale_0.1_attention_scale_1.7 10.54
+ ngram_lm_scale_0.1_attention_scale_1.9 10.54
+ ngram_lm_scale_0.1_attention_scale_2.0 10.54
+ ngram_lm_scale_0.1_attention_scale_2.1 10.54
+ ngram_lm_scale_0.9_attention_scale_1.2 10.54
+ ngram_lm_scale_1.0_attention_scale_1.7 10.54
+ ngram_lm_scale_1.2_attention_scale_2.3 10.54
+ ngram_lm_scale_1.3_attention_scale_2.3 10.54
+ ngram_lm_scale_1.5_attention_scale_3.0 10.54
+ ngram_lm_scale_0.01_attention_scale_1.9 10.55
+ ngram_lm_scale_0.01_attention_scale_2.0 10.55
+ ngram_lm_scale_0.01_attention_scale_2.1 10.55
+ ngram_lm_scale_0.05_attention_scale_1.2 10.55
+ ngram_lm_scale_0.05_attention_scale_1.3 10.55
+ ngram_lm_scale_0.08_attention_scale_1.1 10.55
+ ngram_lm_scale_0.08_attention_scale_1.5 10.55
+ ngram_lm_scale_0.1_attention_scale_1.1 10.55
+ ngram_lm_scale_0.1_attention_scale_1.2 10.55
+ ngram_lm_scale_0.1_attention_scale_1.3 10.55
+ ngram_lm_scale_0.6_attention_scale_0.5 10.55
+ ngram_lm_scale_0.7_attention_scale_0.7 10.55
+ ngram_lm_scale_0.9_attention_scale_1.3 10.55
+ ngram_lm_scale_1.0_attention_scale_1.5 10.55
+ ngram_lm_scale_1.1_attention_scale_2.0 10.55
+ ngram_lm_scale_1.2_attention_scale_2.0 10.55
+ ngram_lm_scale_1.2_attention_scale_2.1 10.55
+ ngram_lm_scale_1.2_attention_scale_2.2 10.55
+ ngram_lm_scale_1.3_attention_scale_2.2 10.55
+ ngram_lm_scale_1.3_attention_scale_2.5 10.55
+ ngram_lm_scale_2.1_attention_scale_5.0 10.55
+ ngram_lm_scale_0.01_attention_scale_1.1 10.56
+ ngram_lm_scale_0.01_attention_scale_1.3 10.56
+ ngram_lm_scale_0.01_attention_scale_1.7 10.56
+ ngram_lm_scale_0.05_attention_scale_1.1 10.56
+ ngram_lm_scale_0.05_attention_scale_1.5 10.56
+ ngram_lm_scale_0.08_attention_scale_1.0 10.56
+ ngram_lm_scale_0.1_attention_scale_1.0 10.56
+ ngram_lm_scale_0.7_attention_scale_0.6 10.56
+ ngram_lm_scale_0.9_attention_scale_1.1 10.56
+ ngram_lm_scale_1.0_attention_scale_1.3 10.56
+ ngram_lm_scale_1.1_attention_scale_1.7 10.56
+ ngram_lm_scale_1.1_attention_scale_1.9 10.56
+ ngram_lm_scale_1.2_attention_scale_1.9 10.56
+ ngram_lm_scale_1.3_attention_scale_2.0 10.56
+ ngram_lm_scale_1.9_attention_scale_4.0 10.56
+ ngram_lm_scale_2.2_attention_scale_5.0 10.56
+ ngram_lm_scale_0.01_attention_scale_1.2 10.57
+ ngram_lm_scale_0.01_attention_scale_1.5 10.57
+ ngram_lm_scale_0.05_attention_scale_1.0 10.57
+ ngram_lm_scale_0.1_attention_scale_0.5 10.57
+ ngram_lm_scale_0.1_attention_scale_0.7 10.57
+ ngram_lm_scale_0.1_attention_scale_0.9 10.57
+ ngram_lm_scale_0.5_attention_scale_0.3 10.57
+ ngram_lm_scale_0.9_attention_scale_1.0 10.57
+ ngram_lm_scale_1.1_attention_scale_1.5 10.57
+ ngram_lm_scale_1.2_attention_scale_1.7 10.57
+ ngram_lm_scale_1.3_attention_scale_2.1 10.57
+ ngram_lm_scale_0.01_attention_scale_1.0 10.58
+ ngram_lm_scale_0.05_attention_scale_0.9 10.58
+ ngram_lm_scale_0.08_attention_scale_0.7 10.58
+ ngram_lm_scale_0.08_attention_scale_0.9 10.58
+ ngram_lm_scale_0.1_attention_scale_0.6 10.58
+ ngram_lm_scale_0.3_attention_scale_0.1 10.58
+ ngram_lm_scale_0.9_attention_scale_0.9 10.58
+ ngram_lm_scale_1.0_attention_scale_1.2 10.58
+ ngram_lm_scale_1.3_attention_scale_1.9 10.58
+ ngram_lm_scale_1.5_attention_scale_2.5 10.58
+ ngram_lm_scale_2.0_attention_scale_4.0 10.58
+ ngram_lm_scale_0.01_attention_scale_0.9 10.59
+ ngram_lm_scale_0.08_attention_scale_0.5 10.59
+ ngram_lm_scale_0.08_attention_scale_0.6 10.59
+ ngram_lm_scale_0.1_attention_scale_0.3 10.59
+ ngram_lm_scale_0.3_attention_scale_0.08 10.59
+ ngram_lm_scale_0.6_attention_scale_0.3 10.59
+ ngram_lm_scale_0.7_attention_scale_0.5 10.59
+ ngram_lm_scale_1.7_attention_scale_3.0 10.59
+ ngram_lm_scale_2.3_attention_scale_5.0 10.59
+ ngram_lm_scale_0.05_attention_scale_0.6 10.6
+ ngram_lm_scale_0.05_attention_scale_0.7 10.6
+ ngram_lm_scale_0.08_attention_scale_0.3 10.6
+ ngram_lm_scale_0.3_attention_scale_0.05 10.6
+ ngram_lm_scale_1.0_attention_scale_1.1 10.6
+ ngram_lm_scale_1.1_attention_scale_1.3 10.6
+ ngram_lm_scale_1.2_attention_scale_1.5 10.6
+ ngram_lm_scale_1.5_attention_scale_2.3 10.6
+ ngram_lm_scale_0.01_attention_scale_0.7 10.61
+ ngram_lm_scale_1.3_attention_scale_1.7 10.61
+ ngram_lm_scale_0.01_attention_scale_0.6 10.62
+ ngram_lm_scale_0.05_attention_scale_0.3 10.62
+ ngram_lm_scale_0.05_attention_scale_0.5 10.62
+ ngram_lm_scale_0.1_attention_scale_0.1 10.62
+ ngram_lm_scale_2.1_attention_scale_4.0 10.62
+ ngram_lm_scale_0.01_attention_scale_0.5 10.63
+ ngram_lm_scale_1.0_attention_scale_1.0 10.63
+ ngram_lm_scale_1.5_attention_scale_2.2 10.63
+ ngram_lm_scale_2.5_attention_scale_5.0 10.63
+ ngram_lm_scale_0.08_attention_scale_0.1 10.64
+ ngram_lm_scale_0.1_attention_scale_0.08 10.64
+ ngram_lm_scale_0.3_attention_scale_0.01 10.64
+ ngram_lm_scale_1.1_attention_scale_1.2 10.64
+ ngram_lm_scale_0.01_attention_scale_0.3 10.65
+ ngram_lm_scale_0.5_attention_scale_0.1 10.65
+ ngram_lm_scale_0.7_attention_scale_0.3 10.65
+ ngram_lm_scale_1.5_attention_scale_2.1 10.65
+ ngram_lm_scale_0.08_attention_scale_0.08 10.66
+ ngram_lm_scale_0.1_attention_scale_0.05 10.66
+ ngram_lm_scale_0.5_attention_scale_0.08 10.66
+ ngram_lm_scale_0.9_attention_scale_0.7 10.66
+ ngram_lm_scale_2.2_attention_scale_4.0 10.66
+ ngram_lm_scale_0.1_attention_scale_0.01 10.67
+ ngram_lm_scale_1.0_attention_scale_0.9 10.67
+ ngram_lm_scale_1.1_attention_scale_1.1 10.67
+ ngram_lm_scale_1.7_attention_scale_2.5 10.67
+ ngram_lm_scale_0.05_attention_scale_0.1 10.68
+ ngram_lm_scale_0.5_attention_scale_0.05 10.68
+ ngram_lm_scale_1.5_attention_scale_2.0 10.68
+ ngram_lm_scale_0.05_attention_scale_0.08 10.69
+ ngram_lm_scale_0.08_attention_scale_0.05 10.69
+ ngram_lm_scale_1.2_attention_scale_1.3 10.69
+ ngram_lm_scale_1.9_attention_scale_3.0 10.69
+ ngram_lm_scale_0.08_attention_scale_0.01 10.7
+ ngram_lm_scale_0.6_attention_scale_0.1 10.7
+ ngram_lm_scale_1.3_attention_scale_1.5 10.7
+ ngram_lm_scale_2.3_attention_scale_4.0 10.7
+ ngram_lm_scale_0.05_attention_scale_0.05 10.71
+ ngram_lm_scale_0.5_attention_scale_0.01 10.71
+ ngram_lm_scale_0.9_attention_scale_0.6 10.71
+ ngram_lm_scale_1.1_attention_scale_1.0 10.71
+ ngram_lm_scale_1.5_attention_scale_1.9 10.71
+ ngram_lm_scale_0.01_attention_scale_0.1 10.72
+ ngram_lm_scale_0.01_attention_scale_0.08 10.73
+ ngram_lm_scale_0.05_attention_scale_0.01 10.73
+ ngram_lm_scale_0.6_attention_scale_0.08 10.73
+ ngram_lm_scale_1.2_attention_scale_1.2 10.73
+ ngram_lm_scale_0.01_attention_scale_0.05 10.75
+ ngram_lm_scale_0.9_attention_scale_0.5 10.75
+ ngram_lm_scale_1.0_attention_scale_0.7 10.75
+ ngram_lm_scale_1.1_attention_scale_0.9 10.75
+ ngram_lm_scale_1.2_attention_scale_1.1 10.75
+ ngram_lm_scale_1.3_attention_scale_1.3 10.76
+ ngram_lm_scale_1.7_attention_scale_2.3 10.76
+ ngram_lm_scale_2.0_attention_scale_3.0 10.77
+ ngram_lm_scale_0.6_attention_scale_0.05 10.78
+ ngram_lm_scale_0.01_attention_scale_0.01 10.79
+ ngram_lm_scale_1.5_attention_scale_1.7 10.79
+ ngram_lm_scale_1.7_attention_scale_2.2 10.79
+ ngram_lm_scale_1.2_attention_scale_1.0 10.8
+ ngram_lm_scale_1.3_attention_scale_1.2 10.8
+ ngram_lm_scale_2.5_attention_scale_4.0 10.81
+ ngram_lm_scale_1.7_attention_scale_2.1 10.82
+ ngram_lm_scale_1.0_attention_scale_0.6 10.83
+ ngram_lm_scale_2.1_attention_scale_3.0 10.84
+ ngram_lm_scale_0.6_attention_scale_0.01 10.85
+ ngram_lm_scale_1.7_attention_scale_2.0 10.85
+ ngram_lm_scale_1.9_attention_scale_2.5 10.85
+ ngram_lm_scale_3.0_attention_scale_5.0 10.86
+ ngram_lm_scale_1.3_attention_scale_1.1 10.87
+ ngram_lm_scale_0.7_attention_scale_0.1 10.88
+ ngram_lm_scale_1.5_attention_scale_1.5 10.88
+ ngram_lm_scale_1.2_attention_scale_0.9 10.89
+ ngram_lm_scale_1.7_attention_scale_1.9 10.89
+ ngram_lm_scale_2.2_attention_scale_3.0 10.9
+ ngram_lm_scale_1.1_attention_scale_0.7 10.91
+ ngram_lm_scale_1.9_attention_scale_2.3 10.91
+ ngram_lm_scale_2.0_attention_scale_2.5 10.91
+ ngram_lm_scale_0.7_attention_scale_0.08 10.92
+ ngram_lm_scale_0.7_attention_scale_0.05 10.96
+ ngram_lm_scale_1.0_attention_scale_0.5 10.96
+ ngram_lm_scale_1.9_attention_scale_2.2 10.97
+ ngram_lm_scale_2.3_attention_scale_3.0 10.97
+ ngram_lm_scale_1.3_attention_scale_1.0 10.99
+ ngram_lm_scale_1.7_attention_scale_1.7 11.01
+ ngram_lm_scale_2.1_attention_scale_2.5 11.02
+ ngram_lm_scale_0.9_attention_scale_0.3 11.03
+ ngram_lm_scale_1.9_attention_scale_2.1 11.03
+ ngram_lm_scale_0.7_attention_scale_0.01 11.04
+ ngram_lm_scale_1.5_attention_scale_1.3 11.04
+ ngram_lm_scale_2.0_attention_scale_2.3 11.04
+ ngram_lm_scale_1.1_attention_scale_0.6 11.05
+ ngram_lm_scale_1.9_attention_scale_2.0 11.1
+ ngram_lm_scale_2.0_attention_scale_2.2 11.1
+ ngram_lm_scale_1.3_attention_scale_0.9 11.11
+ ngram_lm_scale_1.2_attention_scale_0.7 11.14
+ ngram_lm_scale_1.5_attention_scale_1.2 11.15
+ ngram_lm_scale_2.2_attention_scale_2.5 11.16
+ ngram_lm_scale_2.1_attention_scale_2.3 11.17
+ ngram_lm_scale_3.0_attention_scale_4.0 11.17
+ ngram_lm_scale_1.9_attention_scale_1.9 11.18
+ ngram_lm_scale_2.0_attention_scale_2.1 11.18
+ ngram_lm_scale_1.1_attention_scale_0.5 11.19
+ ngram_lm_scale_2.5_attention_scale_3.0 11.19
+ ngram_lm_scale_1.7_attention_scale_1.5 11.21
+ ngram_lm_scale_2.1_attention_scale_2.2 11.25
+ ngram_lm_scale_1.2_attention_scale_0.6 11.26
+ ngram_lm_scale_1.5_attention_scale_1.1 11.26
+ ngram_lm_scale_2.0_attention_scale_2.0 11.26
+ ngram_lm_scale_1.0_attention_scale_0.3 11.29
+ ngram_lm_scale_2.3_attention_scale_2.5 11.3
+ ngram_lm_scale_2.2_attention_scale_2.3 11.31
+ ngram_lm_scale_2.1_attention_scale_2.1 11.32
+ ngram_lm_scale_2.0_attention_scale_1.9 11.34
+ ngram_lm_scale_1.3_attention_scale_0.7 11.36
+ ngram_lm_scale_1.9_attention_scale_1.7 11.37
+ ngram_lm_scale_1.5_attention_scale_1.0 11.4
+ ngram_lm_scale_2.2_attention_scale_2.2 11.4
+ ngram_lm_scale_2.1_attention_scale_2.0 11.41
+ ngram_lm_scale_0.9_attention_scale_0.1 11.42
+ ngram_lm_scale_1.7_attention_scale_1.3 11.44
+ ngram_lm_scale_1.2_attention_scale_0.5 11.45
+ ngram_lm_scale_0.9_attention_scale_0.08 11.47
+ ngram_lm_scale_2.3_attention_scale_2.3 11.48
+ ngram_lm_scale_2.2_attention_scale_2.1 11.51
+ ngram_lm_scale_2.1_attention_scale_1.9 11.54
+ ngram_lm_scale_1.3_attention_scale_0.6 11.55
+ ngram_lm_scale_1.5_attention_scale_0.9 11.56
+ ngram_lm_scale_0.9_attention_scale_0.05 11.57
+ ngram_lm_scale_2.0_attention_scale_1.7 11.57
+ ngram_lm_scale_2.3_attention_scale_2.2 11.58
+ ngram_lm_scale_1.1_attention_scale_0.3 11.59
+ ngram_lm_scale_1.7_attention_scale_1.2 11.59
+ ngram_lm_scale_1.9_attention_scale_1.5 11.63
+ ngram_lm_scale_2.2_attention_scale_2.0 11.63
+ ngram_lm_scale_2.5_attention_scale_2.5 11.63
+ ngram_lm_scale_4.0_attention_scale_5.0 11.67
+ ngram_lm_scale_2.3_attention_scale_2.1 11.7
+ ngram_lm_scale_0.9_attention_scale_0.01 11.71
+ ngram_lm_scale_2.2_attention_scale_1.9 11.73
+ ngram_lm_scale_1.3_attention_scale_0.5 11.76
+ ngram_lm_scale_1.7_attention_scale_1.1 11.76
+ ngram_lm_scale_1.0_attention_scale_0.1 11.78
+ ngram_lm_scale_2.1_attention_scale_1.7 11.8
+ ngram_lm_scale_2.3_attention_scale_2.0 11.8
+ ngram_lm_scale_2.5_attention_scale_2.3 11.83
+ ngram_lm_scale_2.0_attention_scale_1.5 11.86
+ ngram_lm_scale_1.0_attention_scale_0.08 11.89
+ ngram_lm_scale_1.9_attention_scale_1.3 11.93
+ ngram_lm_scale_3.0_attention_scale_3.0 11.94
+ ngram_lm_scale_1.2_attention_scale_0.3 11.95
+ ngram_lm_scale_1.7_attention_scale_1.0 11.95
+ ngram_lm_scale_2.3_attention_scale_1.9 11.95
+ ngram_lm_scale_2.5_attention_scale_2.2 11.96
+ ngram_lm_scale_1.5_attention_scale_0.7 11.98
+ ngram_lm_scale_1.0_attention_scale_0.05 12.0
+ ngram_lm_scale_2.2_attention_scale_1.7 12.02
+ ngram_lm_scale_2.1_attention_scale_1.5 12.09
+ ngram_lm_scale_2.5_attention_scale_2.1 12.09
+ ngram_lm_scale_1.9_attention_scale_1.2 12.12
+ ngram_lm_scale_1.7_attention_scale_0.9 12.16
+ ngram_lm_scale_1.0_attention_scale_0.01 12.19
+ ngram_lm_scale_2.0_attention_scale_1.3 12.2
+ ngram_lm_scale_2.5_attention_scale_2.0 12.22
+ ngram_lm_scale_1.5_attention_scale_0.6 12.24
+ ngram_lm_scale_2.3_attention_scale_1.7 12.24
+ ngram_lm_scale_1.1_attention_scale_0.1 12.27
+ ngram_lm_scale_1.9_attention_scale_1.1 12.3
+ ngram_lm_scale_4.0_attention_scale_4.0 12.31
+ ngram_lm_scale_2.2_attention_scale_1.5 12.32
+ ngram_lm_scale_2.5_attention_scale_1.9 12.35
+ ngram_lm_scale_1.1_attention_scale_0.08 12.36
+ ngram_lm_scale_2.0_attention_scale_1.2 12.37
+ ngram_lm_scale_1.3_attention_scale_0.3 12.4
+ ngram_lm_scale_2.1_attention_scale_1.3 12.43
+ ngram_lm_scale_3.0_attention_scale_2.5 12.46
+ ngram_lm_scale_1.1_attention_scale_0.05 12.51
+ ngram_lm_scale_1.9_attention_scale_1.0 12.52
+ ngram_lm_scale_2.3_attention_scale_1.5 12.53
+ ngram_lm_scale_1.5_attention_scale_0.5 12.54
+ ngram_lm_scale_2.0_attention_scale_1.1 12.58
+ ngram_lm_scale_5.0_attention_scale_5.0 12.62
+ ngram_lm_scale_2.1_attention_scale_1.2 12.63
+ ngram_lm_scale_2.5_attention_scale_1.7 12.64
+ ngram_lm_scale_1.7_attention_scale_0.7 12.68
+ ngram_lm_scale_2.2_attention_scale_1.3 12.68
+ ngram_lm_scale_1.1_attention_scale_0.01 12.72
+ ngram_lm_scale_3.0_attention_scale_2.3 12.72
+ ngram_lm_scale_1.9_attention_scale_0.9 12.78
+ ngram_lm_scale_1.2_attention_scale_0.1 12.79
+ ngram_lm_scale_2.0_attention_scale_1.0 12.82
+ ngram_lm_scale_2.1_attention_scale_1.1 12.86
+ ngram_lm_scale_3.0_attention_scale_2.2 12.87
+ ngram_lm_scale_1.2_attention_scale_0.08 12.88
+ ngram_lm_scale_2.2_attention_scale_1.2 12.92
+ ngram_lm_scale_2.3_attention_scale_1.3 12.97
+ ngram_lm_scale_1.7_attention_scale_0.6 12.98
+ ngram_lm_scale_3.0_attention_scale_2.1 13.03
+ ngram_lm_scale_2.5_attention_scale_1.5 13.04
+ ngram_lm_scale_1.2_attention_scale_0.05 13.05
+ ngram_lm_scale_2.0_attention_scale_0.9 13.11
+ ngram_lm_scale_2.1_attention_scale_1.0 13.17
+ ngram_lm_scale_2.2_attention_scale_1.1 13.2
+ ngram_lm_scale_3.0_attention_scale_2.0 13.2
+ ngram_lm_scale_2.3_attention_scale_1.2 13.24
+ ngram_lm_scale_1.2_attention_scale_0.01 13.27
+ ngram_lm_scale_1.3_attention_scale_0.1 13.3
+ ngram_lm_scale_1.5_attention_scale_0.3 13.32
+ ngram_lm_scale_1.7_attention_scale_0.5 13.33
+ ngram_lm_scale_1.3_attention_scale_0.08 13.4
+ ngram_lm_scale_4.0_attention_scale_3.0 13.41
+ ngram_lm_scale_1.9_attention_scale_0.7 13.42
+ ngram_lm_scale_3.0_attention_scale_1.9 13.42
+ ngram_lm_scale_2.1_attention_scale_0.9 13.45
+ ngram_lm_scale_2.2_attention_scale_1.0 13.46
+ ngram_lm_scale_2.3_attention_scale_1.1 13.47
+ ngram_lm_scale_2.5_attention_scale_1.3 13.53
+ ngram_lm_scale_1.3_attention_scale_0.05 13.56
+ ngram_lm_scale_5.0_attention_scale_4.0 13.57
+ ngram_lm_scale_2.0_attention_scale_0.7 13.73
+ ngram_lm_scale_2.2_attention_scale_0.9 13.74
+ ngram_lm_scale_1.9_attention_scale_0.6 13.75
+ ngram_lm_scale_2.3_attention_scale_1.0 13.75
+ ngram_lm_scale_2.5_attention_scale_1.2 13.78
+ ngram_lm_scale_1.3_attention_scale_0.01 13.81
+ ngram_lm_scale_3.0_attention_scale_1.7 13.84
+ ngram_lm_scale_2.5_attention_scale_1.1 14.05
+ ngram_lm_scale_2.1_attention_scale_0.7 14.07
+ ngram_lm_scale_2.3_attention_scale_0.9 14.07
+ ngram_lm_scale_2.0_attention_scale_0.6 14.1
+ ngram_lm_scale_1.9_attention_scale_0.5 14.14
+ ngram_lm_scale_1.7_attention_scale_0.3 14.18
+ ngram_lm_scale_4.0_attention_scale_2.5 14.2
+ ngram_lm_scale_3.0_attention_scale_1.5 14.28
+ ngram_lm_scale_1.5_attention_scale_0.1 14.3
+ ngram_lm_scale_2.5_attention_scale_1.0 14.35
+ ngram_lm_scale_1.5_attention_scale_0.08 14.41
+ ngram_lm_scale_2.2_attention_scale_0.7 14.42
+ ngram_lm_scale_2.1_attention_scale_0.6 14.47
+ ngram_lm_scale_2.0_attention_scale_0.5 14.51
+ ngram_lm_scale_4.0_attention_scale_2.3 14.56
+ ngram_lm_scale_1.5_attention_scale_0.05 14.57
+ ngram_lm_scale_2.5_attention_scale_0.9 14.66
+ ngram_lm_scale_2.3_attention_scale_0.7 14.72
+ ngram_lm_scale_4.0_attention_scale_2.2 14.75
+ ngram_lm_scale_2.2_attention_scale_0.6 14.76
+ ngram_lm_scale_3.0_attention_scale_1.3 14.76
+ ngram_lm_scale_2.1_attention_scale_0.5 14.8
+ ngram_lm_scale_1.5_attention_scale_0.01 14.82
+ ngram_lm_scale_5.0_attention_scale_3.0 14.84
+ ngram_lm_scale_4.0_attention_scale_2.1 14.9
+ ngram_lm_scale_1.9_attention_scale_0.3 14.93
+ ngram_lm_scale_3.0_attention_scale_1.2 14.98
+ ngram_lm_scale_2.3_attention_scale_0.6 15.04
+ ngram_lm_scale_4.0_attention_scale_2.0 15.07
+ ngram_lm_scale_2.2_attention_scale_0.5 15.13
+ ngram_lm_scale_1.7_attention_scale_0.1 15.2
+ ngram_lm_scale_3.0_attention_scale_1.1 15.24
+ ngram_lm_scale_4.0_attention_scale_1.9 15.25
+ ngram_lm_scale_2.5_attention_scale_0.7 15.26
+ ngram_lm_scale_1.7_attention_scale_0.08 15.3
+ ngram_lm_scale_2.0_attention_scale_0.3 15.31
+ ngram_lm_scale_2.3_attention_scale_0.5 15.41
+ ngram_lm_scale_1.7_attention_scale_0.05 15.48
+ ngram_lm_scale_3.0_attention_scale_1.0 15.54
+ ngram_lm_scale_2.5_attention_scale_0.6 15.59
+ ngram_lm_scale_5.0_attention_scale_2.5 15.61
+ ngram_lm_scale_2.1_attention_scale_0.3 15.62
+ ngram_lm_scale_4.0_attention_scale_1.7 15.66
+ ngram_lm_scale_1.7_attention_scale_0.01 15.73
+ ngram_lm_scale_3.0_attention_scale_0.9 15.8
+ ngram_lm_scale_5.0_attention_scale_2.3 15.9
+ ngram_lm_scale_1.9_attention_scale_0.1 15.91
+ ngram_lm_scale_2.2_attention_scale_0.3 15.93
+ ngram_lm_scale_2.5_attention_scale_0.5 15.96
+ ngram_lm_scale_1.9_attention_scale_0.08 16.02
+ ngram_lm_scale_4.0_attention_scale_1.5 16.04
+ ngram_lm_scale_5.0_attention_scale_2.2 16.04
+ ngram_lm_scale_1.9_attention_scale_0.05 16.18
+ ngram_lm_scale_5.0_attention_scale_2.1 16.2
+ ngram_lm_scale_2.3_attention_scale_0.3 16.21
+ ngram_lm_scale_2.0_attention_scale_0.1 16.25
+ ngram_lm_scale_3.0_attention_scale_0.7 16.34
+ ngram_lm_scale_2.0_attention_scale_0.08 16.35
+ ngram_lm_scale_5.0_attention_scale_2.0 16.37
+ ngram_lm_scale_1.9_attention_scale_0.01 16.42
+ ngram_lm_scale_4.0_attention_scale_1.3 16.45
+ ngram_lm_scale_2.0_attention_scale_0.05 16.5
+ ngram_lm_scale_5.0_attention_scale_1.9 16.52
+ ngram_lm_scale_2.1_attention_scale_0.1 16.55
+ ngram_lm_scale_4.0_attention_scale_1.2 16.62
+ ngram_lm_scale_2.1_attention_scale_0.08 16.64
+ ngram_lm_scale_3.0_attention_scale_0.6 16.64
+ ngram_lm_scale_2.5_attention_scale_0.3 16.67
+ ngram_lm_scale_2.0_attention_scale_0.01 16.71
+ ngram_lm_scale_2.1_attention_scale_0.05 16.77
+ ngram_lm_scale_2.2_attention_scale_0.1 16.8
+ ngram_lm_scale_5.0_attention_scale_1.7 16.82
+ ngram_lm_scale_4.0_attention_scale_1.1 16.84
+ ngram_lm_scale_2.2_attention_scale_0.08 16.89
+ ngram_lm_scale_3.0_attention_scale_0.5 16.95
+ ngram_lm_scale_2.1_attention_scale_0.01 16.99
+ ngram_lm_scale_2.2_attention_scale_0.05 17.02
+ ngram_lm_scale_2.3_attention_scale_0.1 17.02
+ ngram_lm_scale_4.0_attention_scale_1.0 17.07
+ ngram_lm_scale_2.3_attention_scale_0.08 17.09
+ ngram_lm_scale_5.0_attention_scale_1.5 17.16
+ ngram_lm_scale_2.2_attention_scale_0.01 17.18
+ ngram_lm_scale_2.3_attention_scale_0.05 17.2
+ ngram_lm_scale_4.0_attention_scale_0.9 17.24
+ ngram_lm_scale_2.3_attention_scale_0.01 17.38
+ ngram_lm_scale_2.5_attention_scale_0.1 17.4
+ ngram_lm_scale_5.0_attention_scale_1.3 17.45
+ ngram_lm_scale_2.5_attention_scale_0.08 17.47
+ ngram_lm_scale_3.0_attention_scale_0.3 17.53
+ ngram_lm_scale_2.5_attention_scale_0.05 17.58
+ ngram_lm_scale_5.0_attention_scale_1.2 17.63
+ ngram_lm_scale_2.5_attention_scale_0.01 17.7
+ ngram_lm_scale_4.0_attention_scale_0.7 17.7
+ ngram_lm_scale_5.0_attention_scale_1.1 17.8
+ ngram_lm_scale_4.0_attention_scale_0.6 17.89
+ ngram_lm_scale_5.0_attention_scale_1.0 17.94
+ ngram_lm_scale_3.0_attention_scale_0.1 18.09
+ ngram_lm_scale_4.0_attention_scale_0.5 18.09
+ ngram_lm_scale_5.0_attention_scale_0.9 18.09
+ ngram_lm_scale_3.0_attention_scale_0.08 18.14
+ ngram_lm_scale_3.0_attention_scale_0.05 18.21
+ ngram_lm_scale_3.0_attention_scale_0.01 18.31
+ ngram_lm_scale_5.0_attention_scale_0.7 18.41
+ ngram_lm_scale_4.0_attention_scale_0.3 18.49
+ ngram_lm_scale_5.0_attention_scale_0.6 18.57
+ ngram_lm_scale_5.0_attention_scale_0.5 18.71
+ ngram_lm_scale_4.0_attention_scale_0.1 18.85
+ ngram_lm_scale_4.0_attention_scale_0.08 18.88
+ ngram_lm_scale_4.0_attention_scale_0.05 18.95
+ ngram_lm_scale_5.0_attention_scale_0.3 19.01
+ ngram_lm_scale_4.0_attention_scale_0.01 19.02
+ ngram_lm_scale_5.0_attention_scale_0.1 19.3
+ ngram_lm_scale_5.0_attention_scale_0.08 19.32
+ ngram_lm_scale_5.0_attention_scale_0.05 19.37
+ ngram_lm_scale_5.0_attention_scale_0.01 19.43
+
+ 2022-04-08 23:20:49,165 INFO [decode.py:730] Done!