
Conversation

@mxmasterton

Implements exp and log operations as well as softmax

Makes it possible to use softmax with micrograd
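For reference, an exp op in the usual micrograd style looks roughly like the sketch below. The Value stub here is a hypothetical stand-in with just enough plumbing to run, not the PR's actual diff.

import math

class Value:  # hypothetical minimal stand-in for micrograd's Value, not the PR's class
    def __init__(self, data, _children=(), _op=''):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)
        self._op = _op

    def exp(self):
        out = Value(math.exp(self.data), (self,), "exp")

        def _backward():
            # d/dx exp(x) = exp(x), which is already stored in out.data
            self.grad += out.data * out.grad

        out._backward = _backward
        return out

x = Value(1.0)
y = x.exp()
y.grad = 1.0
y._backward()
print(x.grad)  # ~2.718, i.e. d/dx exp(x) evaluated at x = 1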

@54g0 54g0 left a comment

The backward function inside the log function is wrong, because the derivative of log(x) is 1/x; it should be self.grad += (1 / self.data) * out.grad:


def log(self):
    out = Value(math.log(self.data), (self,), "log")
    def _backward():
        self.grad += (1 / self.data) * out.grad
    out._backward = _backward
    return out
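A quick sanity check of the corrected backward, assuming this PR's branch where Value has a .log() method and micrograd's usual backward():

from micrograd.engine import Value  # assumes this PR's branch, where Value gains .log()

x = Value(2.0)
y = x.log()
y.backward()       # seeds y.grad = 1 and propagates through _backward
print(x.grad)      # expected 0.5, i.e. 1 / x.data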

And one more bug: by default, Python's sum() starts counting from the integer 0, so it tries to do 0 + Value(). The Value class doesn't know how to add a plain integer from the left side, so this would fail. The correct way is denom = sum(exps, Value(0.0)):

@staticmethod
def softmax(logits):
    exps = [logit.exp() for logit in logits]
    denom = sum(exps, Value(0.0))
    return [e / denom for e in exps]
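With the Value(0.0) start value in place, the suggested softmax can be exercised like this (again assuming this PR's branch, where Value has .exp() and the softmax staticmethod):

from micrograd.engine import Value  # assumes this PR's branch

logits = [Value(1.0), Value(2.0), Value(3.0)]
probs = Value.softmax(logits)
print([p.data for p in probs])        # roughly [0.090, 0.245, 0.665]
print(sum(p.data for p in probs))     # ~1.0, as a softmax should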

@mxmasterton
Author

mxmasterton commented Sep 25, 2025

Will fix the first bug in a second. The second one is not an issue because __radd__ is implemented, and there is also a check right at the start of __add__ to catch this.
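For context, upstream micrograd handles the sum() case with a type check in __add__ plus an __radd__ that delegates to it; a stripped-down, hypothetical illustration (autograd bookkeeping omitted):

class Value:  # hypothetical minimal Value, mirroring upstream micrograd's add behaviour
    def __init__(self, data):
        self.data = data

    def __add__(self, other):
        # the check the author refers to: wrap plain ints/floats in a Value
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data)

    def __radd__(self, other):  # handles 0 + Value(...) from sum()'s default start value
        return self + other

print(sum([Value(1.0), Value(2.0)]).data)  # 3.0, no TypeError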
