BaseCompose support#390

Merged
dtronmans merged 6 commits into main from feat/nested-transform-compositions
Feb 17, 2026

Conversation

@dtronmans
Contributor

Purpose

Adds support for nested Albumentations compositions (BaseCompose subclasses such as OneOf, SomeOf, and Sequential) in the augmentation config.

Tested that training runs with this config:

model:
  name: detection_nested_augs_test
  predefined_model:
    name: DetectionModel
    variant: light
    params:
      loss_params:
        iou_type: "siou"
        iou_loss_weight: 20
        class_loss_weight: 8

loader:
  params:
    dataset_name: fruit_dataset

trainer:
  precision: "16-mixed"
  preprocessing:
    train_image_size: [384, 512]
    keep_aspect_ratio: true
    normalize:
      active: true
      params:
        mean: [0., 0., 0.]
        std: [1, 1, 1]

    augmentations:
      # --- Regular transforms ---
      - name: HorizontalFlip
        params:
          p: 0.5

      - name: RandomRotate90
        params:
          p: 0.3

      # --- OneOf ---
      - name: OneOf
        params:
          transforms:
            - name: GaussianBlur
              params:
                blur_limit: [3, 7]
            - name: MotionBlur
              params:
                blur_limit: [3, 7]
            - name: Defocus
              params:
                radius: [3, 5]
          p: 0.4

      # --- SomeOf ---
      - name: SomeOf
        params:
          n: 2
          transforms:
            - name: ColorJitter
              params:
                brightness: 0.3
                contrast: 0.3
                saturation: 0.3
                hue: 0.05
            - name: CLAHE
              params:
                clip_limit: [1, 4]
            - name: Sharpen
              params:
                alpha: [0.2, 0.5]
                lightness: [0.5, 1.0]
            - name: Posterize
              params:
                num_bits: [4, 6]
          p: 0.5

      # --- Sequential ---
      - name: Sequential
        params:
          transforms:
            - name: GaussNoise
              params:
                std_range: [0.01, 0.05]
            - name: Sharpen
              params:
                alpha: [0.1, 0.3]
                lightness: [0.7, 1.0]
          p: 0.3

      # --- Batch augmentations ---
      - name: Mosaic4
        params:
          out_height: 384
          out_width: 512
          p: 0.5

      - name: MixUp
        params:
          alpha: [0.3, 0.7]
          p: 0.3

  batch_size: 8
  epochs: 50
  accumulate_grad_batches: 8
  n_workers: 4
  validation_interval: 10
  n_log_images: 8

  callbacks:
    - name: EMACallback
      params:
        decay: 0.9999
        use_dynamic_decay: true
        decay_tau: 2000

  training_strategy:
    name: "TripleLRSGDStrategy"
    params:
      warmup_epochs: 2
      warmup_bias_lr: 0.05
      warmup_momentum: 0.5
      lr: 0.0032
      lre: 0.000384
      momentum: 0.843
      weight_decay: 0.00036
      nesterov: true
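
For reference, the composition semantics the config above relies on can be sketched in plain Python. This is a simplified model of how Sequential, OneOf, and SomeOf compose their children, not the actual luxonis-train or Albumentations implementation:

```python
import random


class Sequential:
    """Apply all child transforms in order, gated by probability p."""

    def __init__(self, transforms, p=1.0):
        self.transforms, self.p = transforms, p

    def __call__(self, x):
        if random.random() < self.p:
            for t in self.transforms:
                x = t(x)
        return x


class OneOf:
    """With probability p, apply exactly one randomly chosen child."""

    def __init__(self, transforms, p=1.0):
        self.transforms, self.p = transforms, p

    def __call__(self, x):
        if random.random() < self.p:
            x = random.choice(self.transforms)(x)
        return x


class SomeOf:
    """With probability p, apply n children sampled without replacement."""

    def __init__(self, transforms, n, p=1.0):
        self.transforms, self.n, self.p = transforms, n, p

    def __call__(self, x):
        if random.random() < self.p:
            for t in random.sample(self.transforms, self.n):
                x = t(x)
        return x
```

For example, `SomeOf([lambda x: x + 1, lambda x: x + 10], n=2, p=1)(3)` returns 14 regardless of which order the two children are drawn in, while `Sequential` always preserves the listed order.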


@dtronmans dtronmans requested a review from a team as a code owner February 13, 2026 13:44
@dtronmans dtronmans requested review from conorsim, klemen1999, kozlov721 and tersekmatija and removed request for a team February 13, 2026 13:44
@github-actions github-actions bot added the enhancement New feature or request label Feb 13, 2026
@klemen1999
Collaborator

I found some strange behaviour. We should check whether this is due to our implementation or a bug in the original Albumentations library by creating an MRE outside of the luxonis-train/luxonis-ml stack.

  • (Sequential or OneOf or SomeOf) + HorizontalFlip produces a mask that is not flipped
augmentations:
  - name: Sequential
    params:
      transforms:
        - name: HorizontalFlip
          params:
            p: 1
      p: 1
(screenshot: mask not flipped)
  • (Sequential or OneOf or SomeOf) + BatchTransform (e.g. Mosaic or MixUp) don't play well together. The config below results in an error:
- name: OneOf
  params:
    transforms:
      - name: Mosaic4
        params:
          out_height: 384
          out_width: 512
          p: 1

Error:

ValueError: could not broadcast input array from shape (384,4,1) into shape (384,4)

(and a different issue if MixUp is used).
IMO it's fine that this combination is not supported. But in that case we should try to catch it earlier, in some validation step, and error out in a more informative way (e.g. "X and Y combination is not supported").
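
The suggested validation step could look something like this. This is a hypothetical sketch: the set names, function name, and config shape are assumptions for illustration, not the actual luxonis-train code:

```python
# Hypothetical names; which transforms count as "batch" transforms would
# come from the real registry, not a hard-coded set.
BATCH_TRANSFORMS = {"Mosaic4", "MixUp"}
COMPOSITIONS = {"OneOf", "SomeOf", "Sequential"}


def validate_augmentations(augmentations: list[dict]) -> None:
    """Fail fast on unsupported nesting instead of deep inside Albumentations."""
    for aug in augmentations:
        if aug["name"] not in COMPOSITIONS:
            continue
        for child in aug.get("params", {}).get("transforms", []):
            if child["name"] in BATCH_TRANSFORMS:
                raise ValueError(
                    f"Batch transform '{child['name']}' is not supported "
                    f"inside '{aug['name']}'; move it to the top level of "
                    "the augmentation list."
                )
```

This turns the opaque broadcasting error into an explicit message at config-parsing time.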

@dtronmans
Contributor Author


Instance segmentation augmentations were handled in a special way in the albumentations engine that I hadn't noticed, and it is indeed true that batch transforms like Mosaic4 cannot be handled by nested augmentations.

I actually didn't know what to add as tests before, so I just had "passing augmentations does not fail" as a test, but your comment gave me the idea to include tests that assert that a single transformation inside SomeOf/Sequential/OneOf gives the same result as the augmentation applied alone (both with p=1). Otherwise the default Albumentations library handles this correctly; this (and the batch augmentations not working) were just oversights on my part.

@klemen1999 klemen1999 left a comment
Collaborator

LGTM

@dtronmans dtronmans merged commit fa31aa7 into main Feb 17, 2026
13 checks passed
@dtronmans dtronmans deleted the feat/nested-transform-compositions branch February 17, 2026 08:57