Forward Sum Loss
Bases: Module
Computes the forward sum loss for sequence-to-sequence models with attention. The loss uses the CTC forward algorithm to sum the probability of all monotonic alignments between input positions and output frames, encouraging the attention to converge to a monotonic alignment.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
blank_logprob | float | The log probability of the blank symbol. | -1 |
Attributes:

Name | Type | Description |
---|---|---|
log_softmax | LogSoftmax | The log softmax function. |
ctc_loss | CTCLoss | The CTC loss function. |
blank_logprob | float | The log probability of the blank symbol. |
Methods:

Name | Description |
---|---|
forward | Computes the forward sum loss for sequence-to-sequence models with attention. |
Source code in training/loss/forward_sum_loss.py
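The attributes above (a `LogSoftmax`, a `CTCLoss`, and a blank log probability) suggest the usual CTC-based formulation of the forward sum loss used for attention alignment. The following is a minimal sketch of how such a module could be written, not the actual contents of `forward_sum_loss.py`: the class name `ForwardSumLoss` is inferred from the heading, and it assumes `attn_logprob` is unnormalized over the input axis, index 0 is reserved for the CTC blank, and the loss is averaged over the batch.

```python
import torch
import torch.nn.functional as F
from torch import nn


class ForwardSumLoss(nn.Module):
    """Sketch: sums over all monotonic alignments via the CTC forward algorithm."""

    def __init__(self, blank_logprob: float = -1):
        super().__init__()
        self.log_softmax = nn.LogSoftmax(dim=-1)
        self.ctc_loss = nn.CTCLoss(blank=0, zero_infinity=True)
        self.blank_logprob = blank_logprob

    def forward(
        self,
        attn_logprob: torch.Tensor,  # (batch_size, max_out_len, max_in_len)
        in_lens: torch.Tensor,       # (batch_size,)
        out_lens: torch.Tensor,      # (batch_size,)
    ) -> torch.Tensor:
        # Prepend a column of blank_logprob so index 0 becomes the CTC blank symbol.
        attn_logprob = F.pad(attn_logprob, (1, 0), value=self.blank_logprob)

        total_loss = attn_logprob.new_zeros(())
        for b in range(attn_logprob.size(0)):
            t_in, t_out = int(in_lens[b]), int(out_lens[b])
            # CTC target: visit every input position 1..t_in in order,
            # i.e. the only valid alignments are monotonic.
            target = torch.arange(1, t_in + 1, device=attn_logprob.device).unsqueeze(0)
            # Slice to the valid region and normalize over the input axis;
            # CTCLoss expects log probabilities of shape (T, N, C).
            logprob = self.log_softmax(attn_logprob[b, :t_out, : t_in + 1]).unsqueeze(1)
            total_loss = total_loss + self.ctc_loss(
                logprob,
                target,
                input_lengths=out_lens[b : b + 1],
                target_lengths=in_lens[b : b + 1],
            )
        return total_loss / attn_logprob.size(0)
```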
forward(attn_logprob, in_lens, out_lens)
Computes the forward sum loss for sequence-to-sequence models with attention.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
attn_logprob | Tensor | The attention log probabilities of shape (batch_size, max_out_len, max_in_len). | required |
in_lens | Tensor | The input lengths of shape (batch_size,). | required |
out_lens | Tensor | The output lengths of shape (batch_size,). | required |
Returns:

Name | Type | Description |
---|---|---|
float | float | The forward sum loss. |
Source code in training/loss/forward_sum_loss.py
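Assuming the sketch above, a call could look like the following; the shapes follow the documented (batch_size, max_out_len, max_in_len) convention, and each output length should be at least the corresponding input length so a monotonic alignment exists.

```python
import torch

loss_fn = ForwardSumLoss(blank_logprob=-1)

# Two sequences; attention log probabilities over (batch, max_out_len, max_in_len).
attn_logprob = torch.randn(2, 12, 5)
in_lens = torch.tensor([5, 3])    # valid input (e.g. text token) lengths
out_lens = torch.tensor([12, 9])  # valid output (e.g. frame) lengths

loss = loss_fn(attn_logprob, in_lens, out_lens)  # scalar tensor
```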