
Capstone Project: Evaluating Neural Network Architectures for RL Agent in PEARL #5

Open

MusaToTheMoon wants to merge 33 commits into Modern-Compilers-Lab:ablation_study from MusaToTheMoon:ablation_study

Conversation

@MusaToTheMoon

Summary: Merges the finalized repo for PEARL pretraining, with modular GNN architectures (GAT, GIN, GCN) and pooling options.

Key changes:

  • neural_nets module with the GAT, GIN, and GCN architectures; modular pooling (Global and SAG, in pooling.py). A sketch of this layout follows the list.
  • Pretraining scripts in the repo root, named pretrain_[MODEL_TYPE]_network.py. Up-to-date scripts: pretrain_GCN_network.py and pretrain_GCN_sag_network.py (GCN), pretrain_GIN_network.py (GIN), and pretrain_GAT_network_with_dataloader.py (GAT).
  • Bash scripts and W&B sweep configurations in playground/.
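
For reference, a minimal sketch of how such a modular encoder could be wired up, assuming PyTorch Geometric. The class name, constructor arguments, and pooling flag here are illustrative and are not the repo's actual API:

```python
# Hypothetical sketch of a modular GNN encoder with swappable
# convolution type and pooling, assuming PyTorch Geometric.
import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv, GINConv, GATConv
from torch_geometric.nn import global_mean_pool, SAGPooling

CONVS = {
    "GCN": lambda d_in, d_out: GCNConv(d_in, d_out),
    "GAT": lambda d_in, d_out: GATConv(d_in, d_out),
    "GIN": lambda d_in, d_out: GINConv(
        nn.Sequential(nn.Linear(d_in, d_out), nn.ReLU(), nn.Linear(d_out, d_out))
    ),
}

class GNNEncoder(nn.Module):
    """Stack of GNN layers followed by a graph-level pooling step."""

    def __init__(self, model_type, in_dim, hidden_dim, num_layers=3, pooling="global"):
        super().__init__()
        conv = CONVS[model_type]
        dims = [in_dim] + [hidden_dim] * num_layers
        self.layers = nn.ModuleList(
            [conv(d_in, d_out) for d_in, d_out in zip(dims, dims[1:])]
        )
        # SAGPooling keeps a learned subset of nodes; global_mean_pool then
        # aggregates whatever nodes remain into one vector per graph.
        self.sag = SAGPooling(hidden_dim) if pooling == "sag" else None

    def forward(self, x, edge_index, batch):
        for layer in self.layers:
            x = torch.relu(layer(x, edge_index))
        if self.sag is not None:
            x, edge_index, _, batch, _, _ = self.sag(x, edge_index, batch=batch)
        return global_mean_pool(x, batch)
```

Keeping the convolution choice behind a small registry like CONVS is what makes it cheap to add a new architecture: register one constructor and reuse the same pretraining loop.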

Results structure: results/ (.out/.err logs), sweep_results/ (W&B sweep outputs), saved_weights/, wandb/, mlruns/.
Usage: Run the appropriate pretrain_[MODEL_TYPE]_network.py; adjust the W&B/MLflow settings in the scripts as needed; use the bash scripts in playground/ for sweeps. A hypothetical sweep-launch sketch is shown below.
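
As one possible shape for such a sweep, here is a minimal W&B launcher. The hyperparameter names, project name, and trial count are assumptions for illustration and need not match the actual configs under playground/:

```python
# Hypothetical W&B sweep launcher; parameter and project names are
# illustrative and do not necessarily match the configs in playground/.
import wandb

sweep_config = {
    "method": "bayes",
    "metric": {"name": "val_loss", "goal": "minimize"},
    "parameters": {
        "learning_rate": {"min": 1e-5, "max": 1e-2},
        "hidden_dim": {"values": [64, 128, 256]},
        "num_layers": {"values": [2, 3, 4]},
    },
}

def train():
    # Stand-in for the real entry point: in this repo it would invoke the
    # matching pretrain_[MODEL_TYPE]_network.py logic with wandb.config values.
    with wandb.init() as run:
        lr = run.config.learning_rate  # hyperparameters chosen by the sweep
        ...

sweep_id = wandb.sweep(sweep_config, project="pearl-pretraining")
wandb.agent(sweep_id, function=train, count=20)
```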

Impact: No breaking changes; adds a clear workflow for adding new architectures and running hyperparameter sweeps.

BrouthenKamel and others added 30 commits on October 14, 2024. Recent commit messages include:

  • …onfigurable-pretraining
  • Update config (pretrain, hyperparameters), Save training logs, Setup all embedding types