Branches:
  • loss
  • master (default, protected)
  • producer
Commit history (newest first):
  • Added possibility to print special columns in printForDebug
  • Added special dict value for when a feature targets the child of a node without one
  • Fixed test in SplitTransition
  • Added const qualifier to operator ==
  • When doing a splitTransition, add to Form the content of the rawInput and not the first word of the splitTransition
  • SplitTransition now ignores case
  • Fixed a bug where EOS features were not taken into account
  • Improved error message for failed stof in NumericColumnModule
  • Added -x (extra columns) option when invoking the eval script
  • All losses are reduced to sum instead of mean (to give consistent values regardless of batch size)
  • Fixed L1 loss
  • Added L1 loss and removed mean from regression losses
  • Removed factor 100 for loss print
  • Added optional instruction BanExamples in Classifier's definition
  • Removed useless line
  • Added argument --oracleMode for macaon train, to transform a corpus into a list of transitions
  • Throw error if column not found in NumericColumnModule
  • Added default value for NumericColumnModule
  • Updated doc
  • Surprisal for each word can now be computed by adding SURPRISAL to mcd
  • Made entropy positive
  • Merge branch 'leftRightEntropy' of https://gitlab.lis-lab.fr/franck.dary/macaon into leftRightEntropy
  • Parser entropy is only counted for a word being attached
  • Corrected error in computation of entropy (changed dot product to Hadamard product)
  • Parser entropy is only counted for a word being attached
  • Corrected bug where text metadata would not contain the first word of a sentence if it was a multiword
  • Entropy is divided by states
  • Removed NaN and clamping of entropy
  • Using torch functions to compute entropy
  • Output entropy when there is a column named 'ENTROPY'
  • Dict is open during pretrained embeddings loading
  • Added option to reload pretrained embeddings during decoding
  • Added program argument to lock pretrained embeddings
  • Fixed rawRange for multiwords
  • Metadata like #text= are now updated correctly
  • Lowered minimum CMake requirement
  • Lowered requirement on CMake version
  • Improved print
  • Improved print
  • Fixed min float value
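The switch from mean to sum reduction in the loss commits can be illustrated with a minimal sketch. This is hypothetical plain-Python code, not macaon's actual libtorch implementation: with mean reduction the reported loss is divided by the batch size, so the same total error yields different values for different batch sizes, while sum reduction reports a consistent total.

```python
def l1_loss(predictions, targets, reduction="sum"):
    """L1 loss over a batch; `reduction` mimics the sum/mean choice
    discussed in the commit messages (illustrative helper, not macaon code)."""
    total = sum(abs(p - t) for p, t in zip(predictions, targets))
    if reduction == "mean":
        return total / len(predictions)  # value shrinks as the batch grows
    return total  # consistent regardless of batch size

preds = [2.0, 3.0, 5.0, 8.0]
golds = [1.0, 3.0, 4.0, 6.0]
print(l1_loss(preds, golds, reduction="sum"))   # 4.0
print(l1_loss(preds, golds, reduction="mean"))  # 1.0
```

Summing makes loss values comparable across runs with different batch sizes, which is exactly the motivation stated in the commit message.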
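Several commits concern parser entropy (making it positive, dividing it by the number of states, computing it with torch functions). A hedged sketch of the underlying computation, in plain Python rather than the project's libtorch code, with a hypothetical `action_entropy` helper:

```python
import math

def action_entropy(scores, normalize=True):
    """Positive Shannon entropy of a softmax distribution over action scores.

    Illustrative only: the sign convention keeps the result positive, and
    `normalize` divides by the number of states, echoing the
    "Entropy is divided by states" commit. The real code uses tensor ops.
    """
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]  # numerically stable softmax
    total = sum(exps)
    probs = [e / total for e in exps]
    h = -sum(p * math.log(p) for p in probs if p > 0.0)  # positive entropy
    return h / len(scores) if normalize else h
```

A uniform score vector gives the maximal entropy log(n), while a sharply peaked one gives a value near zero, so the quantity tracks the parser's uncertainty about which transition to take.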