/// @brief Make each Classifier go over every example.
/// Depending on getScoreOnBatch, the parameters may or may not be updated.
/// @param examples Maps each trainable Classifier to a set of examples.
/// @param batchSize The batch size to use.
/// @param nbExamples Maps each trainable Classifier to a count of how many examples it has seen during this epoch and a count of how many of these examples it has correctly classified. This map is filled by this function.
/// @param getScoreOnBatch The MLP function that must be called to get the score of a classifier on a given batch.
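// A minimal sketch of the batching loop described above, under assumed type
// names (Example, Classifier, the pair-of-counts container and the callback
// signature are placeholders, not the project's actual declarations):
#include <algorithm>
#include <cstddef>
#include <functional>
#include <map>
#include <utility>
#include <vector>

struct Example {};
struct Classifier {};

using Examples = std::vector<Example>;
// first = number of examples seen, second = number correctly classified
using Counts = std::pair<int, int>;

void processAllExamples(std::map<Classifier *, Examples> & examples, int batchSize,
  std::map<Classifier *, Counts> & nbExamples,
  const std::function<Counts(Classifier &, const Examples &)> & getScoreOnBatch)
{
  for (auto & [classifier, classifierExamples] : examples)
    for (std::size_t start = 0; start < classifierExamples.size(); start += batchSize)
    {
      std::size_t end = std::min(classifierExamples.size(), start + batchSize);
      Examples batch(classifierExamples.begin() + start, classifierExamples.begin() + end);

      // Depending on which MLP function was passed as getScoreOnBatch,
      // this call may or may not update the classifier's parameters.
      Counts batchCounts = getScoreOnBatch(*classifier, batch);
      nbExamples[classifier].first += batchCounts.first;
      nbExamples[classifier].second += batchCounts.second;
    }
}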
/// @brief Print the scores obtained by all Classifiers during this epoch.
///
/// @param output Where to print the output.
/// @param nbExamplesTrain Maps each trainable Classifier to a count of how many train examples it has seen during this epoch and a count of how many of these examples it has correctly classified.
/// @param nbExamplesDev Maps each trainable Classifier to a count of how many dev examples it has seen during this epoch and a count of how many of these examples it has correctly classified.
/// @param trainScores The scores obtained by each Classifier on the train set.
/// @param devScores The scores obtained by each Classifier on the dev set.
/// @param bestIter Maps each Classifier to its best epoch. This map is updated by this function.
/// @param nbIter The total number of epochs of the training.
/// @param curIter The current epoch of the training.
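// A hedged sketch of the score printing and best-epoch tracking described
// above; the function name, the map key type (the Classifier's name as a
// std::string) and the score containers are assumptions, not the project's
// actual API:
#include <algorithm>
#include <cstdio>
#include <map>
#include <string>
#include <utility>
#include <vector>

using Counts = std::pair<int, int>; // examples seen / correctly classified

void printScores(FILE * output,
  const std::map<std::string, Counts> & nbExamplesTrain,
  const std::map<std::string, Counts> & nbExamplesDev,
  std::map<std::string, std::vector<float>> & trainScores,
  std::map<std::string, std::vector<float>> & devScores,
  std::map<std::string, int> & bestIter, int nbIter, int curIter)
{
  for (const auto & [name, devCounts] : nbExamplesDev)
  {
    const Counts & trainCounts = nbExamplesTrain.at(name);
    float trainScore = 100.0f * trainCounts.second / std::max(1, trainCounts.first);
    float devScore = 100.0f * devCounts.second / std::max(1, devCounts.first);

    // The best epoch is the one with the highest dev score seen so far.
    bool isBest = devScores[name].empty()
      || devScore > *std::max_element(devScores[name].begin(), devScores[name].end());
    trainScores[name].push_back(trainScore);
    devScores[name].push_back(devScore);
    if (isBest)
      bestIter[name] = curIter;

    fprintf(output, "[%d/%d] %s : train = %.2f%% dev = %.2f%%%s\n",
      curIter, nbIter, name.c_str(), trainScore, devScore,
      bestIter[name] == curIter ? " (best)" : "");
  }
}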