Ambuj Mehrish / sidekit · Commits · 84c4972a

Commit 84c4972a authored Mar 11, 2019 by Anthony Larcher

new definition of embeddings

parent ceee256b
Changes: 1 file — nnet/xvector.py
...
@@ -98,14 +98,14 @@ class Xtractor(torch.nn.Module):
         seg_emb_0 = torch.cat([mean, std], dim=1)
         # No batch-normalisation after this layer
         seg_emb_1 = self.dropout_lin0(seg_emb_0)
-        seg_emb_1 = self.activation(self.seg_lin0(seg_emb_1))
+        seg_emb_2 = self.activation(self.seg_lin0(seg_emb_1))
         # new layer with batch Normalization
-        seg_emb_2 = self.dropout_lin1(seg_emb_1)
-        seg_emb_3 = self.norm6(self.activation(self.seg_lin1(seg_emb_2)))
+        seg_emb_3 = self.dropout_lin1(seg_emb_2)
+        seg_emb_4 = self.norm6(self.activation(self.seg_lin1(seg_emb_3)))
         # No batch-normalisation after this layer
-        seg_emb_4 = self.activation(self.seg_lin2(seg_emb_3))
-        # seg_emb_3 = self.seg_lin2(seg_emb_2)
-        return seg_emb_4
+        seg_emb_5 = self.activation(self.seg_lin2(seg_emb_4))
+        seg_output = torch.nn.functional.softmax(seg_emb_5, dim=1)
+        return seg_output

     def LossFN(self, x, lable):
         loss = -torch.trace(torch.mm(torch.log10(x), torch.t(lable)))
...
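For readers skimming the hunk, here is a minimal self-contained sketch of the segment-level head as it reads after this commit. The constructor below is an assumption: the hunk never shows __init__, so the layer sizes, dropout probability, and choice of ReLU are illustrative placeholders; only the layer names (seg_lin0/1/2, norm6, dropout_lin0/1, activation) and the forward logic come from the diff itself.

import torch


class XtractorHead(torch.nn.Module):
    # Sketch of the post-pooling head after this commit. Dimensions
    # (stat_dim, emb_dim, n_classes), the dropout rate, and ReLU are
    # assumed, not taken from the repository.
    def __init__(self, stat_dim=3000, emb_dim=512, n_classes=1951):
        super().__init__()
        self.activation = torch.nn.ReLU()
        self.dropout_lin0 = torch.nn.Dropout(p=0.25)
        self.seg_lin0 = torch.nn.Linear(stat_dim * 2, emb_dim)
        self.dropout_lin1 = torch.nn.Dropout(p=0.25)
        self.seg_lin1 = torch.nn.Linear(emb_dim, emb_dim)
        self.norm6 = torch.nn.BatchNorm1d(emb_dim)
        self.seg_lin2 = torch.nn.Linear(emb_dim, n_classes)

    def forward(self, mean, std):
        # Concatenate the pooled mean and std statistics along features
        seg_emb_0 = torch.cat([mean, std], dim=1)
        # No batch-normalisation after this layer
        seg_emb_1 = self.dropout_lin0(seg_emb_0)
        seg_emb_2 = self.activation(self.seg_lin0(seg_emb_1))
        # New layer with batch normalisation
        seg_emb_3 = self.dropout_lin1(seg_emb_2)
        seg_emb_4 = self.norm6(self.activation(self.seg_lin1(seg_emb_3)))
        # No batch-normalisation after this layer
        seg_emb_5 = self.activation(self.seg_lin2(seg_emb_4))
        # Posterior over classes; each row sums to 1
        return torch.nn.functional.softmax(seg_emb_5, dim=1)


# Usage: a batch of 4 pooled statistic vectors of the assumed size
head = XtractorHead()
posteriors = head(torch.randn(4, 3000), torch.randn(4, 3000))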
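One observation on the new LossFN: for softmax posteriors x of shape (batch, n_classes) and one-hot labels of the same shape, -trace(mm(log(x), t(label))) sums -log p(true class) over the batch, i.e. a cross-entropy. As committed, though, the method assigns loss and never returns it, log10 only rescales the natural-log value by a constant, and "lable" is a typo for "label". A hedged sketch with those points adjusted (the return statement, torch.log, and the renaming are my changes, not the author's):

import torch


def loss_fn(x, label):
    # x: (batch, n_classes) softmax outputs; label: (batch, n_classes) one-hot.
    # mm(log(x), label.T) is (batch, batch); diagonal entry i equals
    # sum_c label[i, c] * log(x[i, c]) = log p_i(true class), so the
    # negated trace is the cross-entropy summed over the batch.
    loss = -torch.trace(torch.mm(torch.log(x), torch.t(label)))
    return loss

In practice, torch.nn.CrossEntropyLoss applied to the pre-softmax logits (seg_emb_5 before the softmax above) computes the same quantity more stably, since it fuses the log and the softmax.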