TransformerDecoder forward_one_step with memory_mask#5679

Merged
mergify[bot] merged 7 commits into espnet:master from albertz:albert-forward-step-masked
Feb 26, 2024
Conversation

@albertz
Contributor

@albertz albertz commented Feb 26, 2024

What?

I added the memory_mask arg to BaseTransformerDecoder.forward_one_step.

Alongside, I made the cache arg a keyword-only arg, so that the user cannot rely on the specific order of positional args. I think that's cleaner.
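For readers unfamiliar with keyword-only parameters: a bare `*` in a Python signature forces everything after it to be passed by keyword. A minimal sketch of the idea (illustrative signature only, not necessarily ESPnet's exact one):

```python
def forward_one_step(tgt, tgt_mask, memory, memory_mask=None, *, cache=None):
    # Parameters after the bare `*` (here: cache) are keyword-only;
    # passing them positionally raises a TypeError.
    return {"memory_mask": memory_mask, "cache": cache}

# OK: cache passed by keyword
out = forward_one_step("tgt", "tgt_mask", "memory", cache=[])

# TypeError: forward_one_step("tgt", "tgt_mask", "memory", None, [])
# would fail, because cache cannot be given positionally.
```

This way, callers who previously passed cache positionally get a loud error instead of silently binding their value to the new memory_mask parameter.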

Why?

I needed the memory_mask arg for TransformerDecoder.forward_one_step because I wanted to perform batched beam search (i.e. on multiple sequences at once, i.e. the batch dim in memory is >1).
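With a batch dim > 1, the encoder memories have different lengths and are padded to a common maximum, so the decoder's cross-attention needs a mask marking which memory frames are real. A pure-Python sketch of building such a mask from per-utterance encoder lengths (the `(batch, 1, max_len)` shape and True-means-valid convention are my assumptions here; in practice this would be a torch.BoolTensor):

```python
def make_memory_mask(memory_lengths, max_len):
    """Boolean mask of shape (batch, 1, max_len).

    True  = real encoder frame (attend to it)
    False = padding (mask it out in cross-attention)
    """
    return [[[t < n for t in range(max_len)]] for n in memory_lengths]

# Two utterances with encoder lengths 2 and 4, padded to max_len=4:
mask = make_memory_mask([2, 4], 4)
# mask[0] -> [[True, True, False, False]]  (frames 2-3 are padding)
# mask[1] -> [[True, True, True, True]]    (no padding)
```

Passing such a mask as memory_mask keeps each beam-search hypothesis from attending to another utterance's padding frames.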

@codecov

codecov bot commented Feb 26, 2024

Codecov Report

Attention: Patch coverage is 80.00000%, with 1 line in your changes missing coverage. Please review.

Project coverage is 76.12%. Comparing base (0854d6f) to head (2ee26ec).

Files                                       Patch %  Lines
espnet2/asr/decoder/transformer_decoder.py  0.00%    1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master    #5679      +/-   ##
==========================================
+ Coverage   70.27%   76.12%   +5.84%     
==========================================
  Files         746      747       +1     
  Lines       69369    69409      +40     
==========================================
+ Hits        48752    52836    +4084     
+ Misses      20617    16573    -4044     
Flag                        Coverage Δ
test_configuration_espnet2  ∅ <ø> (∅)
test_integration_espnet1    62.92% <100.00%> (ø)
test_integration_espnet2    48.92% <75.00%> (?)
test_python_espnet1         18.32% <40.00%> (ø)
test_python_espnet2         52.70% <60.00%> (+0.64%) ⬆️
test_utils                  22.15% <100.00%> (+0.25%) ⬆️

Flags with carried forward coverage won't be shown.


@sw005320 sw005320 added the Enhancement label Feb 26, 2024
@sw005320 sw005320 added this to the v.202405 milestone Feb 26, 2024
@sw005320 sw005320 requested a review from pyf98 February 26, 2024 15:00
@sw005320
Contributor

Thanks, @albertz!
@pyf98, can you review this PR?

@pyf98
Collaborator

pyf98 commented Feb 26, 2024

Looks good to me! Since we do not implement batched inference right now, it should not affect the current behavior.

I also found this issue when preparing the assignment for our speech course. In that assignment, I did a similar thing to add the mask so that we can use batched inference.

@sw005320 sw005320 added the auto-merge label Feb 26, 2024
@sw005320
Contributor

OK, thanks!
After CI is finished, I'll merge this PR.

@mergify mergify bot merged commit 08b54d6 into espnet:master Feb 26, 2024
