Add CrossEntropyLoss to classifier #1053
Conversation
Thanks for proposing the PR. A few points:
@BenjaminBossan Hi, thanks for your reply.
Could you please provide an example to reproduce this error? `CrossEntropyLoss` should absolutely work in skorch as is, e.g.:

```python
import numpy as np
from sklearn.datasets import make_classification
from torch import nn

from skorch import NeuralNetClassifier

X, y = make_classification(1000, 20, n_informative=10, random_state=0)
X = X.astype(np.float32)
y = y.astype(np.int64)

class MyModule(nn.Module):
    def __init__(self, num_units=10, nonlin=nn.ReLU()):
        super().__init__()
        self.dense0 = nn.Linear(20, num_units)
        self.nonlin = nonlin
        self.dropout = nn.Dropout(0.5)
        self.dense1 = nn.Linear(num_units, num_units)
        self.output = nn.Linear(num_units, 2)

    def forward(self, X, **kwargs):
        X = self.nonlin(self.dense0(X))
        X = self.dropout(X)
        X = self.nonlin(self.dense1(X))
        X = self.output(X)
        return X

net = NeuralNetClassifier(
    MyModule,
    max_epochs=10,
    criterion=nn.CrossEntropyLoss(),
    lr=0.1,
    # Shuffle training data on each epoch
    iterator_train__shuffle=True,
)

net.fit(X, y)
y_proba = net.predict_proba(X)
```
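As a side note on why either criterion works here: in PyTorch, `nn.CrossEntropyLoss` applied to raw logits is mathematically the same as `nn.LogSoftmax` followed by `nn.NLLLoss`, so a module that returns logits pairs with the former while a module that returns log-probabilities pairs with the latter. A minimal sketch verifying that equivalence (standalone, not from this PR):

```python
import torch
from torch import nn

torch.manual_seed(0)
logits = torch.randn(4, 2)            # raw, unnormalized scores for 4 samples, 2 classes
targets = torch.tensor([0, 1, 1, 0])  # integer class labels

# CrossEntropyLoss consumes raw logits directly...
ce = nn.CrossEntropyLoss()(logits, targets)

# ...and is equivalent to LogSoftmax followed by NLLLoss.
log_probs = nn.LogSoftmax(dim=1)(logits)
nll = nn.NLLLoss()(log_probs, targets)

print(torch.allclose(ce, nll))  # True
```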
This got a bit stale :) @amine759 is this something you're still working on?
I had to use `CrossEntropyLoss` for my use case, and I thought of creating this PR since `NLLLoss` is the only loss function supported :).