Scheduled Optim Pretraining
ScheduledOptimPretraining
Bases: Optimizer
DEPRECATED: moved to AcousticModule.
A custom optimizer that uses AdamW for optimization and a LambdaLR scheduler for learning rate scheduling.
Source code in notebooks/experiments/optimizer/scheduled_optim_pretraining.py
__init__(train_config, model_config, parameters, defaults={}, step=0)
Initializes the ScheduledOptimPretraining optimizer.
Parameters:

Name | Type | Description | Default
---|---|---|---
`train_config` | `AcousticPretrainingConfig` | The training configuration. | *required*
`model_config` | `AcousticModelConfigType` | The model configuration. | *required*
`parameters` | `Iterable` | The model parameters to optimize. | *required*
`defaults` | `Dict[str, Any]` | Default optimization options. Defaults to an empty dictionary. | `{}`
`step` | `int` | The current training step. Defaults to `0`. | `0`
Source code in notebooks/experiments/optimizer/scheduled_optim_pretraining.py
get_lr()
Returns the current learning rate.
Source code in notebooks/experiments/optimizer/scheduled_optim_pretraining.py
load_state_dict(state_dict)
Loads the optimizer state dictionary.
Parameters:

Name | Type | Description | Default
---|---|---|---
`state_dict` | `Dict[str, Any]` | The optimizer state dictionary. | *required*
Source code in notebooks/experiments/optimizer/scheduled_optim_pretraining.py
step(closure)
Performs a single optimization step.
Source code in notebooks/experiments/optimizer/scheduled_optim_pretraining.py
zero_grad()
Zeroes the gradients of the optimizer.
Source code in notebooks/experiments/optimizer/scheduled_optim_pretraining.py
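The methods above suggest a thin wrapper that delegates parameter updates to AdamW while a `LambdaLR` scheduler rescales the learning rate after every step. The listing below is a minimal sketch of that pattern, not the actual implementation (the class name `ScheduledOptimSketch`, the `lr_lambda`/`lr`/`betas` parameters, and the `betas` values are assumptions for illustration):

```python
import torch
from torch.optim import AdamW
from torch.optim.lr_scheduler import LambdaLR


class ScheduledOptimSketch:
    """Sketch: AdamW wrapped with a LambdaLR learning rate schedule."""

    def __init__(self, parameters, lr_lambda, lr=1.0, betas=(0.9, 0.98)):
        # The base optimizer performs the parameter updates ...
        self._optimizer = AdamW(parameters, lr=lr, betas=betas)
        # ... while LambdaLR scales `lr` by lr_lambda(step) as training advances.
        self._scheduler = LambdaLR(self._optimizer, lr_lambda)

    def step(self, closure=None):
        """Performs a single optimization step, then advances the schedule."""
        self._optimizer.step(closure)
        self._scheduler.step()

    def zero_grad(self):
        """Zeroes the gradients of the wrapped optimizer."""
        self._optimizer.zero_grad()

    def get_lr(self):
        """Returns the current (scheduled) learning rate."""
        return self._scheduler.get_last_lr()[0]

    def load_state_dict(self, state_dict):
        """Loads the wrapped optimizer's state dictionary."""
        self._optimizer.load_state_dict(state_dict)
```

Wrapping rather than subclassing keeps the schedule and the update rule in lockstep: each `step()` both applies gradients and moves the schedule forward one position.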
get_lr_lambda(model_config, train_config, current_step=0)
DEPRECATED: moved to AcousticModule. Returns the custom lambda function for the learning rate schedule.
Returns:

`function`: The custom lambda function for the learning rate schedule.
Source code in notebooks/experiments/optimizer/scheduled_optim_pretraining.py
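Acoustic-model pretraining commonly uses a Transformer-style warmup-then-decay schedule, so a plausible shape for the returned lambda is sketched below. This is an assumption about the formula, not the source file's actual schedule; `model_dim` and `warmup_steps` are hypothetical parameters standing in for values drawn from `model_config` and `train_config`:

```python
def get_lr_lambda_sketch(model_dim, warmup_steps=4000):
    """Hypothetical warmup/inverse-sqrt-decay schedule for illustration."""
    init_lr = model_dim ** -0.5

    def lr_lambda(step):
        step = max(step, 1)  # guard against 0 ** negative power at step 0
        # Linear warmup for `warmup_steps` steps, then inverse-sqrt decay;
        # both branches meet exactly at step == warmup_steps.
        return init_lr * min(step ** -0.5, step * warmup_steps ** -1.5)

    return lr_lambda
```

The returned closure maps the current step to a multiplier on the optimizer's base learning rate, which is exactly the contract `LambdaLR` expects.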