Name : Tokenizer, Tagger, Morpho, Lemmatizer, Parser and Segmenter Machine
Classifier : tokeparser
{
  Transitions : {tokenizer,data/tokenizer.ts tagger,data/tagger.ts morpho,data/morpho_whole.ts lemmatizer_rules,data/lemmatizer_rules.ts parser,data/parser_eager_rel_strict.ts segmenter,data/segmenter.ts}
  LossMultiplier : {}
  Network type : Modular
  Context : Targets{b.-3 b.-2 b.-1 b.0 s.0 s.1 s.2 b.0.0 s.0.0 s.0.-1 s.1.0 s.1.-1 s.2.0 s.2.-1} Columns{FORM} LSTM{1 1 0.0 1} In{128} Out{128} w2v{FORM,data/FORM.w2v}
  Context : Targets{b.-3 b.-2 b.-1 b.0 s.0 s.1 s.2 b.0.0 s.0.0 s.0.-1 s.1.0 s.1.-1 s.2.0 s.2.-1} Columns{EOS ID UPOS FEATS DEPREL} LSTM{1 1 0 1} In{128} Out{64} w2v{}
  Focused : Column{prefix3:FORM} NbElem{3} Buffer{0} Stack{} LSTM{1 1 0 1} In{64} Out{64} w2v{}
  Focused : Column{suffix3:FORM} NbElem{3} Buffer{0} Stack{} LSTM{1 1 0 1} In{64} Out{64} w2v{}
  RawInput : Left{5} Right{10} LSTM{1 1 0.0 1} In{32} Out{32}
  History : NbElem{10} LSTM{1 1 0 1} In{128} Out{64}
  HistoryMine : NbElem{4} LSTM{1 1 0 1} In{128} Out{64}
  StateName : Out{64}
  Distance : FromBuffer{} FromStack{0 1 2} ToBuffer{0} ToStack{} Threshold{15} LSTM{1 1 0.0 1} In{128} Out{64}
  SplitTrans : LSTM{1 1 0.0 1} In{128} Out{64}
  InputDropout : 0.5
  MLP : {3200 0.4 1600 0.4}
  End
  Optimizer : Adagrad {0.01 0.000001 0 0.0000000001}
  Type : classification
  Loss : crossentropy
}
Splitwords : data/splitwords.ts
Predictions : ID FORM UPOS FEATS LEMMA HEAD DEPREL EOS
Strategy
{
  Block : End{cannotMove}
  tokenizer tagger ENDWORD 0
  tokenizer tagger SPLIT 0
  tokenizer tokenizer * 0
  tagger morpho * 0
  morpho lemmatizer_rules * 0
  lemmatizer_rules parser * 0
  parser segmenter eager_SHIFT 0
  parser segmenter eager_RIGHT_rel 0
  parser parser * 0
  segmenter tokenizer * 1
}