
How to modify in optimizer_config #39

@svamsip

Description


Hi team,
Great work, and thanks for your contribution!

I'm working on training mask_rcnn_focalnet_small_patch4_mstrain_480-800_adamw_3x_coco_lrf.py with a custom dataset in COCO format.

  • DistOptimizerHook has been deprecated and removed from mmcv.runner.hooks.optimizer, and OptimizerHook is recommended as its replacement. The original config was:
    # do not use mmdet version fp16
    fp16 = None
    optimizer_config = dict(
        type="DistOptimizerHook",
        update_interval=1,
        grad_clip=None,
        coalesce=True,
        bucket_size_mb=-1,
        use_fp16=True,
    )
  • I've replaced it with the code below, which uses the default OptimizerHook:
    # do not use mmdet version fp16
    fp16 = None
    optimizer_config = dict(
        grad_clip=None,
    )
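For reference: if mixed-precision training is still wanted after dropping DistOptimizerHook, mmcv 1.x provides Fp16OptimizerHook, which mmdet-style configs typically enable through a top-level `fp16` dict rather than `fp16 = None`. A minimal sketch, assuming an mmdet 2.x-style config; the `loss_scale` value here is illustrative, not a recommendation:

```python
# Sketch: enable mmcv's Fp16OptimizerHook in place of the removed
# DistOptimizerHook(use_fp16=True). With a top-level `fp16` dict present,
# mmdet 2.x builds an Fp16OptimizerHook from optimizer_config.
fp16 = dict(loss_scale=512.0)  # illustrative static loss scale; dict(...) dynamic scaling is also supported
optimizer_config = dict(grad_clip=None)
```

This keeps fp16 behavior without referencing the removed hook; for pure fp32 distributed training, `fp16 = None` with the plain `optimizer_config = dict(grad_clip=None)` above should suffice.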
  • I installed mmcv from source:
    git clone https://github.com/open-mmlab/mmcv.git -b 1.x /openmmlab/mmcv \
    && cd /openmmlab/mmcv \
    && MMCV_WITH_OPS=1 pip install --no-cache-dir -e . -v

I'd like to know the recommended `fp16` and `optimizer_config` settings to use in place of DistOptimizerHook for distributed training.
