
Are the variables of batch norm layers folded during inference? #12

@jakc4103

Description


Hi, thanks again for sharing this repo for reproducing the awesome results.

I am curious whether the BatchNorm layers are folded into the preceding Conv or FC layers during inference.
I ran both static and retrain modes for MobileNetV2. During inference, I found that the BatchNorm variables (mean/var/gamma/beta) are filled with non-trivial values instead of 1s and 0s, and are still involved in the computation graph. Is this working as intended? A sketch of the folding I expected is below.
(I load the quantized model with .ckpt and .pb files.)
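For reference, here is a minimal NumPy sketch of the folding I expected to happen before inference (the function name, tensor layout, and eps value are my own assumptions for illustration, not taken from this repo):

```python
import numpy as np

def fold_batch_norm(conv_weight, conv_bias, gamma, beta, mean, var, eps=1e-5):
    """Fold BatchNorm parameters into the preceding conv's weights and bias.

    Assumes TF weight layout (kh, kw, in_ch, out_ch); the BN parameters
    are per-output-channel vectors of shape (out_ch,).
    """
    scale = gamma / np.sqrt(var + eps)        # per-output-channel scale
    folded_weight = conv_weight * scale       # broadcasts over the last axis
    folded_bias = beta + (conv_bias - mean) * scale
    return folded_weight, folded_bias
```

After folding like this, I would expect the BatchNorm op to either disappear from the graph or reduce to an identity (gamma=1, beta=0, mean=0, var=1), which is why the non-trivial values surprised me.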
