Whisper fine-tuning recipes for CHiME-4 and WSJ #5342
Conversation
sw005320 left a comment
LGTM.
Please clarify whether you use an LM or not in README.md.
egs2/chime4/asr1/README.md
Outdated
|decode_asr_lm_lm_train_lm_transformer_en_char_valid.loss.ave_asr_model_valid.acc.ave/et05_simu_beamformit_5mics|1320|126812|94.4|2.8|2.8|1.5|7.2|66.1|

## Whisper [medium_finetuning](conf/tuning/train_asr_whisper_full_warmup1500.yaml)
Is it with an LM?
Please remark on it in the README.
If this one is not used, you can remove it from this PR.
## Whisper [medium_finetuning](conf/tuning/train_asr_whisper_full_warmup3000.yaml)
## Environments
Ditto
Please clarify that this does not use an LM.
To clarify the point that the performance is with
Codecov Report
@@ Coverage Diff @@
## master #5342 +/- ##
=======================================
Coverage 76.11% 76.11%
=======================================
Files 672 672
Lines 59859 59859
=======================================
Hits 45563 45563
Misses 14296 14296
Flags with carried forward coverage won't be shown.
Thanks, @YoshikiMas!
This PR adds Whisper fine-tuning configurations for CHiME-4 and WSJ. I'll upload the pre-trained model later.
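The exact contents of the tuning YAMLs are not shown in this thread. As a rough illustration of what a full-model Whisper fine-tuning config with warmup might look like in ESPnet2-style conventions (all field values here are illustrative assumptions, not the actual contents of `train_asr_whisper_full_warmup1500.yaml`):

```yaml
# Sketch of an ESPnet2-style Whisper fine-tuning config.
# Values are placeholders for illustration only; see the
# actual conf/tuning/*.yaml files in this PR for the real settings.
encoder: whisper
encoder_conf:
    whisper_model: medium        # pre-trained Whisper size (assumed)
    dropout_rate: 0.0
decoder: whisper
decoder_conf:
    whisper_model: medium
    dropout_rate: 0.0
optim: adam
optim_conf:
    lr: 1.0e-05                  # small LR typical for full fine-tuning
scheduler: warmuplr
scheduler_conf:
    warmup_steps: 1500           # the "warmup1500" variant; 3000 in the other config
```

The two configs in the PR appear to differ only in the warmup length (1500 vs. 3000 steps), per their filenames.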