Conversation
// save the final model.
System.out.println("Writing " + getName());
learner.save(); // Doesn't write .lex if lexicon is empty.
@cowchipkid can you clarify the change here? Confused about what's happening here ....
I moved the accuracy reporting outside the block to ensure doneTraining() gets called before accuracy is reported. doneTraining() applies the feature optimization, and we want the score after that. If accuracy is reported before doneTraining(), we get the score of the un-optimized models, which would be for naught.
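The ordering issue described above can be sketched as follows. This is a minimal stand-in, not the actual LBJava API: the `Learner` class, its fields, and `score()` are hypothetical, and only the `doneTraining()` name comes from the discussion.

```java
public class LifecycleOrder {
    /** Hypothetical stand-in for a learner whose weights change in doneTraining(). */
    static class Learner {
        double weight = 0.4;

        /** Applies feature optimization (e.g. pruning); scores change after this. */
        void doneTraining() { weight = 0.5; }

        double score() { return weight; }
    }

    public static void main(String[] args) {
        Learner learner = new Learner();
        // Wrong order: reading accuracy here reflects the un-optimized model.
        double before = learner.score();
        // Right order: finish training first, then report accuracy.
        learner.doneTraining();
        double after = learner.score();
        System.out.println(before + " " + after);
    }
}
```

The point is only the call ordering: any accuracy measured before `doneTraining()` describes a model that no longer exists after optimization runs.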
continue;
}
double wt = ltu.getWeightVector().getRawWeights().get(fi);
Ya, this is wrong. We need to call hasWeight here, I think; this won't work for SparseAveragedPerceptron, where we need to sum the past average with the actual weight.
Also, it doesn't even address absolute zero.
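A rough sketch of both objections, assuming generic names (`effectiveWeight`, `averagedSum`, `exampleCount`, `shouldPrune` are all illustrative, not actual LBJava fields or methods): for an averaged perceptron, the weight used at prediction time is the raw weight plus the accumulated average, so pruning on the raw weight alone can discard a feature that actually matters; and the comparison should be on the absolute value, so large negative weights are not pruned.

```java
public class EffectiveWeight {
    /**
     * Illustrative effective weight for an averaged perceptron:
     * raw weight plus the running sum of past weights divided by the
     * number of examples seen.
     */
    static double effectiveWeight(double raw, double averagedSum, int exampleCount) {
        return raw + averagedSum / exampleCount;
    }

    /** Compare the absolute value so negative weights are treated symmetrically. */
    static boolean shouldPrune(double weight, double threshold) {
        return Math.abs(weight) < threshold;
    }

    public static void main(String[] args) {
        // Raw weight is exactly zero, but the averaged component is not.
        double w = effectiveWeight(0.0, -2.0, 100); // effective weight: -0.02
        System.out.println(shouldPrune(0.0, 1e-6)); // raw weight alone: pruned
        System.out.println(shouldPrune(w, 1e-6));   // effective weight: kept
    }
}
```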
The documentation for the feature pruning is in package-info.java, in the package with all the optimization code. It was already in place; I enhanced it a bit and added some links into the life cycle methods. I also fixed one bug that likely had no impact: it would not disable pruning when you set the threshold to zero.
for (int i = 0; i < indexes.length; ++i) {
    Feature f = inverse.get(indexes[i]);
    previousClassName =
/** Default for {@link #bias}. */
public static final double defaultBias = 1.0;
/** Any weight less than this is considered irrelevant. This is for pruning. */
public static final double defaultFeaturePruningThreshold = 0.000001;
Is this always used in an absolute sense? I.e., does it always have to be positive?
In other words, if it's zero, no pruning will be done.
@danyaljj setting the threshold to zero is how you disable pruning, effectively.
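The disable-at-zero behavior falls out naturally from a strict comparison on the absolute value. This is a hedged sketch of that semantics, not the actual LBJava code; `shouldPrune` is an illustrative name.

```java
public class PruneThreshold {
    /**
     * Strict comparison on the absolute value: with threshold == 0,
     * Math.abs(weight) < 0.0 is never true, so nothing is ever pruned.
     */
    static boolean shouldPrune(double weight, double threshold) {
        return Math.abs(weight) < threshold;
    }

    public static void main(String[] args) {
        System.out.println(shouldPrune(1e-9, 1e-6)); // tiny weight: pruned
        System.out.println(shouldPrune(1e-9, 0.0));  // threshold 0: pruning disabled
        System.out.println(shouldPrune(0.0, 0.0));   // even an exact-zero weight survives
    }
}
```

Note the bug mentioned above: if the comparison were non-strict (`<=`), a zero threshold would still prune exact-zero weights instead of disabling pruning.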
A minor comment; looks good to me!

I am still in the thick of post-move activity; feel free to merge.
The feature pruning implementation prunes low-value features for SVM, LTU subclasses, and sparse networks.