I compared the .txt model files of the single-task LightGBM model and the multi-task MT-GBM model in example 1. Although both models set `num_boost_round` to 200 in the params, the LightGBM model file contains 200 trees while the MT-GBM model file contains 400. Moreover, each pair of trees (Tree0 and Tree1, Tree2 and Tree3, etc.) does not share the same structure. Could you please explain this phenomenon?
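
For reference, here is a minimal sketch of how I counted the trees, assuming the usual LightGBM text-dump format in which each tree section starts with a line like `Tree=0` (the `sample` string below is a made-up excerpt, not a real dump):

```python
import re

def count_trees(model_txt: str) -> int:
    """Count tree sections in a LightGBM-style text model dump.

    Assumes each tree section begins with a line of the form 'Tree=<n>'.
    """
    return len(re.findall(r"^Tree=\d+", model_txt, flags=re.MULTILINE))

# Tiny synthetic excerpt mimicking a model.txt dump (hypothetical content):
sample = "tree\nTree=0\nnum_leaves=3\n\nTree=1\nnum_leaves=3\n"
print(count_trees(sample))  # 2
```

Running the same count over the two saved model files is what gave me 200 trees for the LightGBM model and 400 for the MT-GBM one.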