- Clone using:

```sh
git clone --recurse-submodules https://github.com/TomasGadea/MLP-NAS.git
cd MLP-NAS
```
- Create a `python3.9.13` environment named `environ` (the python3 version of scar):

```sh
python3 -m venv environ
source environ/bin/activate
pip install -r requirements.txt
```
- Ask @TomasGadea for the config file `config.json` and add your wandb API key to it (this last step is optional).
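As a rough illustration, the snippet below shows one way the optional wandb key could be wired up from the config; the `wandb_api_key` field name is hypothetical, since the actual layout of `config.json` is only available from @TomasGadea.

```python
import json
import os

# Load the project config (ask @TomasGadea for the actual file).
with open("config.json") as f:
    config = json.load(f)

# Hypothetical field name: export the key so wandb can authenticate.
api_key = config.get("wandb_api_key")  # assumption, not the confirmed schema
if api_key:
    os.environ["WANDB_API_KEY"] = api_key
```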
(Open a tmux session rooted in MLP-NAS)

```sh
sh execute.sh
```
To see all available params, check `main.py`.
- `--use-amp` stores `True` when added and uses `torch.cuda.amp.autocast` and `torch.cuda.amp.GradScaler` (see the sketch below).
- `--wandb` stores `True` and logs into your wandb account using your API key (optional).
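For reference, this is the standard `torch.cuda.amp` pattern those two objects imply. It is a minimal, self-contained sketch of a generic training step, not the actual loop in `main.py`; the toy model, optimizer, and dummy batch are placeholders.

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Linear(32, 10).to(device)          # toy stand-in for the real network
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))  # no-op on CPU

inputs = torch.randn(8, 32, device=device)               # dummy batch
targets = torch.randint(0, 10, (8,), device=device)

optimizer.zero_grad()
with torch.cuda.amp.autocast(enabled=(device == "cuda")):  # mixed-precision forward
    loss = criterion(model(inputs), targets)
scaler.scale(loss).backward()  # backward on the scaled loss
scaler.step(optimizer)         # unscales gradients, then steps the optimizer
scaler.update()                # adjusts the loss scale for the next step
```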
(Open a tmux session rooted in MLP-NAS)

```sh
sh fixed_execute.sh
```
To see all available params, check `fixed_main.py`.
`--path-to-supernet` is the output path of any past experiment of `execute.sh`. Check the example in `fixed_execute.sh`.
Output files for Train Search are:
- `flops_table.txt`: string-formatted table of the model's n_params and flops.
- `log.csv`: metrics such as acc, F, mmc, along epochs.
- `params.json`: parameters that include all arguments called in `execute.sh` and other extra info.
- `W.pt`: last version of the model saved in PyTorch format after all training epochs.
- `W_test.pt`: best version of the model saved in PyTorch format after all training epochs.
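As a quick sanity check on a finished run, those files can be inspected as below. The experiment directory name is a placeholder, and whether `W.pt`/`W_test.pt` hold a state dict or a full pickled model is an assumption to verify against the saving code.

```python
import json
import pandas as pd
import torch

run_dir = "out/my_experiment"  # placeholder: use your actual output directory

with open(f"{run_dir}/params.json") as f:
    params = json.load(f)                    # arguments the run was launched with
log = pd.read_csv(f"{run_dir}/log.csv")      # per-epoch metrics (acc, F, mmc, ...)
checkpoint = torch.load(f"{run_dir}/W_test.pt", map_location="cpu")  # best checkpoint

print(params)
print(log.tail())        # last few epochs of metrics
print(type(checkpoint))  # state dict or full model, depending on how it was saved
```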
Output files for Retrain Fixed are:
- `flops_table.txt`: string-formatted table of the model's n_params and flops.
- `log.csv`: metrics such as acc, mmc, along epochs.
- `params.json`: parameters that include all arguments called in `fixed_execute.sh` and other extra info.
- `W.pt`: last version of the model saved in PyTorch format after all training epochs.
Retrain Fixed files are stored in the `out/retrain/` dir, unlike Train Search files, which go directly into the `out/` directory. These locations can be changed with the `--output` arg if desired.
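If you keep the default locations, a small helper like this can line up the search and retrain logs for the same experiment; the per-experiment subdirectory names and the `acc` column name are assumptions based on the metric list above, since the exact layout under `out/` depends on the arguments passed to each script.

```python
import pandas as pd

# Placeholders: substitute the directories produced by your own runs.
search_log = pd.read_csv("out/my_experiment/log.csv")
retrain_log = pd.read_csv("out/retrain/my_experiment/log.csv")

print("search epochs:", len(search_log), "retrain epochs:", len(retrain_log))
print("best retrain acc:", retrain_log["acc"].max())  # column name assumed from the metrics above
```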