Anthony Larcher / sidekit / Commits / e1f5d369

Commit e1f5d369 authored May 05, 2021 by Anthony Larcher
remove cyclic 1
parent 3cf12ccc

Changes: 1 file (nnet/xvector.py)
...
@@ -1216,18 +1216,6 @@ def get_optimizer(model, model_opts, train_opts, training_loader):
                                                           cycle_momentum=cycle_momentum,
                                                           mode=train_opts["scheduler"]["mode"])
-    #elif train_opts["scheduler"]["type"] == 'CyclicLR1':
-    #    cycle_momentum = True
-    #    if train_opts["optimizer"]["type"] == "adam":
-    #        cycle_momentum = False
-    #    scheduler = torch.optim.lr_scheduler.CyclicLR(optimizer=optimizer,
-    #                                                  base_lr=1e-8,
-    #                                                  max_lr=train_opts["lr"],
-    #                                                  step_size_up=model_opts["speaker_number"] * 4,
-    #                                                  step_size_down=None,
-    #                                                  cycle_momentum=cycle_momentum,
-    #                                                  mode="triangular")
     elif train_opts["scheduler"]["type"] == "MultiStepLR":
         scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer=optimizer,
                                                          milestones=[10000, 50000, 100000],
...
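
For context, the commit deletes a commented-out 'CyclicLR1' variant and keeps the surrounding scheduler branches: the context lines above the removed block (ending in cycle_momentum=... and mode=...) presumably belong to a CyclicLR call, followed by the MultiStepLR branch. Below is a minimal, self-contained sketch of that selection pattern under assumed configuration values; the model, optimizer, and the train_opts / model_opts dictionaries are illustrative placeholders, not sidekit's actual objects or defaults.

# Sketch of the scheduler-selection pattern visible in the diff (illustrative only;
# the configuration values and model below are placeholders, not sidekit defaults).
import torch

model = torch.nn.Linear(10, 2)                       # placeholder model
train_opts = {"lr": 1e-3,
              "optimizer": {"type": "adam"},
              "scheduler": {"type": "MultiStepLR", "mode": "triangular2"}}
model_opts = {"speaker_number": 100}                 # placeholder speaker count

optimizer = torch.optim.Adam(model.parameters(), lr=train_opts["lr"])

if train_opts["scheduler"]["type"] == "CyclicLR":
    # Adam keeps no classical momentum buffer, so momentum cycling must be disabled.
    cycle_momentum = train_opts["optimizer"]["type"] != "adam"
    scheduler = torch.optim.lr_scheduler.CyclicLR(optimizer=optimizer,
                                                  base_lr=1e-8,
                                                  max_lr=train_opts["lr"],
                                                  step_size_up=model_opts["speaker_number"] * 4,
                                                  cycle_momentum=cycle_momentum,
                                                  mode=train_opts["scheduler"]["mode"])
elif train_opts["scheduler"]["type"] == "MultiStepLR":
    # Drop the learning rate by a factor of 10 at fixed iteration milestones.
    scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer=optimizer,
                                                     milestones=[10000, 50000, 100000],
                                                     gamma=0.1)

Disabling cycle_momentum for Adam matters because PyTorch's CyclicLR refuses to cycle momentum on optimizers that do not expose a "momentum" parameter (Adam uses betas instead), which is the same guard the removed commented-out block had implemented.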