Franck Dary / macaon

Commit 1473579c authored Mar 25, 2021 by Franck Dary
Added sanity check when loading pretrained word embeddings
Parent: 7e1a6789
Changes: 1 file
torch_modules/src/Submodule.cpp
@@ -72,6 +72,8 @@ void Submodule::loadPretrainedW2vEmbeddings(torch::nn::Embedding embeddings, std
   if (dictIndex >= embeddings->weight.size(0))
   {
+    if ((unsigned long)dictIndex != embeddings->weight.size(0)+toAdd.size())
+      util::myThrow(fmt::format("dictIndex == {}, weight.size == {}, toAdd.size == {}", dictIndex, embeddings->weight.size(0), toAdd.size()));
     toAdd.emplace_back();
     for (unsigned int i = 1; i < splited.size(); i++)
       toAdd.back().emplace_back(std::stof(splited[i]));
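The added sanity check covers the case where a word from the pretrained-embeddings file maps to a dictionary index beyond the current embedding matrix: such out-of-range indices are only valid if they arrive densely, meaning each one must equal the number of existing rows plus the number of rows already queued in toAdd. Below is a minimal, self-contained sketch of that invariant, with the torch tensor replaced by a plain std::vector; the names checkAndQueueRow and weightRows are hypothetical and not part of the macaon codebase.

#include <cstddef>
#include <stdexcept>
#include <string>
#include <vector>

// Hypothetical stand-ins: weightRows plays the role of
// embeddings->weight.size(0); toAdd queues embedding rows parsed from
// the pretrained file that are not yet appended to the weight matrix.
void checkAndQueueRow(std::size_t dictIndex, std::size_t weightRows,
                      std::vector<std::vector<float>> & toAdd,
                      const std::vector<std::string> & splited)
{
  if (dictIndex >= weightRows)
  {
    // Sanity check: an out-of-range index is only legal if it is the
    // very next row to be appended; anything else means the dictionary
    // and the embeddings file are out of sync.
    if (dictIndex != weightRows + toAdd.size())
      throw std::runtime_error("dictIndex / weight.size / toAdd.size mismatch");
    toAdd.emplace_back();
    // splited[0] is the word itself; the remaining tokens are floats.
    for (std::size_t i = 1; i < splited.size(); i++)
      toAdd.back().emplace_back(std::stof(splited[i]));
  }
}

int main()
{
  std::vector<std::vector<float>> toAdd;
  // The matrix currently has 3 rows; index 3 is the next row, so this passes.
  checkAndQueueRow(3, 3, toAdd, {"cat", "0.1", "0.2"});
  // Index 5 would skip a row and trigger the check:
  // checkAndQueueRow(5, 3, toAdd, {"dog", "0.3", "0.4"});
}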
@@ -166,7 +168,7 @@ std::function<std::string(const std::string &)> Submodule::getFunction(const std
   return [sequence](const std::string & s)
   {
-    auto result = s;
+    auto result = s;
     for (auto & f : sequence)
       result = f(result);
     return result;
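In this second hunk the old and new versions of the `auto result = s;` line read identically once whitespace is flattened by the page scrape, so the exact change is not recoverable here. The surrounding lambda returned by getFunction composes a sequence of string transformations: it copies its input and threads the copy through every function in sequence, left to right. The following self-contained sketch shows the same composition pattern; the two sample transformations are hypothetical.

#include <cctype>
#include <functional>
#include <iostream>
#include <string>
#include <vector>

int main()
{
  // A pipeline of string transformations, applied left to right,
  // mirroring the lambda returned by Submodule::getFunction.
  std::vector<std::function<std::string(const std::string &)>> sequence;
  sequence.emplace_back([](const std::string & s)
  {
    std::string r = s;
    for (auto & c : r)
      c = std::tolower(static_cast<unsigned char>(c));
    return r;
  });
  sequence.emplace_back([](const std::string & s) { return s + "</s>"; });

  auto composed = [sequence](const std::string & s)
  {
    auto result = s;
    for (auto & f : sequence)
      result = f(result);
    return result;
  };

  std::cout << composed("Hello World") << std::endl; // prints "hello world</s>"
}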