Conversation
Codecov Report
All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

```
@@            Coverage Diff             @@
##           master    #5866       +/-   ##
==========================================
+ Coverage   47.84%   51.79%    +3.94%
==========================================
  Files         501      818      +317
  Lines       44796    75376    +30580
==========================================
+ Hits        21432    39039    +17607
- Misses      23364    36337    +12973
```

Flags with carried forward coverage won't be shown.
The error seems to be unrelated to my PR. Can someone help me fix this?
We're working on fixing it.
ftshijt
left a comment
Many thanks! The whole recipe seems to be good, especially the readme is very instructive. I left a few minor comments for fixes and software-level suggestions.
@@ -0,0 +1,93 @@
# Trained with A100 (40 GB) x 1 GPUs for Kmeans1K+nbpe5K. It takes 32 minutes per epoch.
Hi, since the two configs are very similar, we may simply keep the one with the best performance.
egs2/ml_superb/asr2/local/data.sh
Outdated
@@ -0,0 +1,145 @@
#!/usr/bin/env bash
If the data preparation is identical to asr1, you may use a symlink.
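The reviewer's symlink suggestion could look like this sketch (run from `egs2/ml_superb/asr2`; the relative paths assume the standard egs2 layout and are illustrative, not taken from the PR):

```shell
# Sketch: reuse asr1's data preparation in asr2 via a symlink
# instead of keeping a near-identical copy of the script.
mkdir -p local
ln -sf ../../asr1/local/data.sh local/data.sh
```

This way any future fix to asr1's `data.sh` is picked up by asr2 automatically.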
@@ -0,0 +1,398 @@
import argparse
egs2/ml_superb/asr2/run.sh
Outdated
# Process Pipeline
source /home/stan/miniconda3/envs/espnet_discrete/etc/profile.d/conda.sh
conda activate /home/stan/miniconda3/envs/espnet_discrete
export NCCL_P2P_DISABLE=1
We can safely remove the part with the local setup.
egs2/ml_superb/asr2/run.sh
Outdated
stage=15
stop_stage=15
Please initialize them to stage=1 and stop_stage=100 for the default setup.
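The suggested defaults, so that a fresh checkout runs the whole pipeline end to end while still allowing `--stage`/`--stop_stage` overrides on the command line:

```shell
# Default stage range: run every stage unless the caller overrides these.
stage=1
stop_stage=100
```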
egs2/ml_superb/asr2/run.sh
Outdated
inference_config=conf/decode_ctc0.3.yaml
asr_config=conf/tuning/train_discrete_asr_e_branchformer1_1gpu_lr5e-4_warmup5k.yaml
Considering our convention, you may name your configs conf/decode.yaml and conf/train.yaml for the first setup.
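One way to follow that convention while keeping the tuned config file is to expose it under the conventional name via a symlink (a sketch; run from `egs2/ml_superb/asr2`, with the filenames taken from the PR's run.sh):

```shell
# Sketch: point the conventional names at the actual configs,
# so run.sh can default to conf/train.yaml and conf/decode.yaml.
ln -sf tuning/train_discrete_asr_e_branchformer1_1gpu_lr5e-4_warmup5k.yaml conf/train.yaml
ln -sf decode_ctc0.3.yaml conf/decode.yaml
```

run.sh can then default to `asr_config=conf/train.yaml` and `inference_config=conf/decode.yaml`.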
@ftshijt
Thanks for your contribution~
ml_superb asr2 recipe
What?
Add an asr2 recipe to egs2/ml_superb
Why?
I adapted the run.sh and local/data.sh of egs2/interspeech2024_dsu_challenge to fit the format of ml_superb.