AdaGrad does not increase the size of the weight vector while learning. The weight vector's dimensionality should grow whenever new features are seen during feature extraction on previously unseen training examples.
Example feature:
discrete MyTestFeature(MyData d) <- {
    return d.isCapitalized() ? "YES" : "NO";
}
For this example, the weight vector should have size 3 (one weight each for YES and NO, plus the bias term). But each training example activates only one of the two feature values, so exampleFeatures.length is only 1 here.
Compare with the implementation of StochasticGradientDescent.
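A minimal sketch of the expected behavior, assuming a sparse representation where each example is given as parallel arrays exampleFeatures (feature indices) and exampleValues. The class, field, and method names below are illustrative assumptions, not the actual LBJava API; the point is only that both the weight array and the AdaGrad accumulator must grow whenever a previously unseen feature index appears:

import java.util.Arrays;

public class AdaGradSketch {
    private double[] weights = new double[0];          // learned weights, grown lazily
    private double[] gradientSquares = new double[0];  // per-feature AdaGrad accumulators
    private final double learningRate = 0.1;

    // Grow both arrays so that feature index 'index' has a slot.
    private void ensureCapacity(int index) {
        if (index >= weights.length) {
            int newSize = Math.max(index + 1, weights.length * 2);
            weights = Arrays.copyOf(weights, newSize);
            gradientSquares = Arrays.copyOf(gradientSquares, newSize);
        }
    }

    // One AdaGrad update for a sparse example; gradientWrtScore is the gradient
    // of the loss with respect to the example's score.
    public void learn(int[] exampleFeatures, double[] exampleValues, double gradientWrtScore) {
        for (int j = 0; j < exampleFeatures.length; j++) {
            int f = exampleFeatures[j];
            ensureCapacity(f);
            double g = gradientWrtScore * exampleValues[j];
            gradientSquares[f] += g * g;
            weights[f] -= learningRate * g / (Math.sqrt(gradientSquares[f]) + 1e-8);
        }
    }
}

With lazy growth like this, the weights for YES, NO, and the bias are allocated the first time their indices appear, regardless of how many features any single example activates.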