MT Task in SpeechLM #5899

Merged

sw005320 merged 6 commits into espnet:speechlm from ftshijt:speechlm on Sep 16, 2024

Conversation

@ftshijt (Collaborator) commented on Sep 12, 2024:
The full MT-task implementation for speechlm.


```python
elif modality in ["text_bpe"]:
    detokenized = detokenized.strip()
    writers[name].write(f"{example_name} {detokenized}")
```
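A minimal, self-contained sketch of the write pattern discussed in this hunk, assuming an in-memory `StringIO` stand-in for the real file writers; the variable names and values here are hypothetical, not taken from the espnet codebase:

```python
import io

# Hypothetical stand-ins for the real speechlm writer setup.
writers = {"text": io.StringIO()}
example_name = "utt1"
detokenized = " hello world \n"

modality = "text_bpe"
if modality in ["text_bpe"]:
    # Strip surrounding whitespace before writing "<example_name> <text>".
    detokenized = detokenized.strip()
    writers["text"].write(f"{example_name} {detokenized}\n")

print(writers["text"].getvalue())  # → "utt1 hello world"
```

Stripping before writing keeps the Kaldi-style "key value" line format clean, which matters when the output is later consumed by scoring tools that split on the first space.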
@ftshijt (Collaborator, Author) commented:
@jctian98 Please let me know if this is fixed in your latest PR, so that I can remove this temporary modification.

```diff
     --inference_nj ${inference_nj} \
     --gpu_inference ${gpu_inference} \
-    --nbest ${nbest}
+    --nbest ${nbest} ${scoring_args}
```
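A minimal sketch of what the `${scoring_args}` addition enables: a single wrapper can forward task-specific scoring options without hard-coding them per task. The function name and the example value are hypothetical, not from the actual espnet recipe:

```shell
# Hypothetical task-specific scoring options (e.g. for an MT task).
scoring_args="--metric bleu"

# Stand-in for the real inference/scoring entry point.
run_inference() {
    echo "nbest=$1 extra=$2"
}

# ${nbest:-1} defaults to 1 when nbest is unset; scoring_args passes through.
run_inference "${nbest:-1}" "${scoring_args}"
```

Leaving `${scoring_args}` unquoted in the original recipe lets it expand to zero or more extra flags, so tasks that need no scoring options can simply leave the variable empty.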
@ftshijt (Collaborator, Author) commented:
@jctian98 I intentionally added this to incorporate scoring-related arguments for different tasks.

@ftshijt (Collaborator, Author) commented on Sep 12, 2024:

@jctian98 This is a follow-up fix for the MT task of the speechlm project, mostly covering evaluation.

@jctian98 (Collaborator) commented:

It's OK to merge this into the speechlm branch for now. I'll take care of the compatibility issues.

@sw005320 sw005320 added this to the v.202405 milestone Sep 16, 2024
@sw005320 sw005320 merged commit ca07db9 into espnet:speechlm Sep 16, 2024