```
(methylbert_39) python3 test_deconvolute.py
Building Vocab
Total number of sequences : 612
# of reads in each label: [319. 293.]
No detected GPU device. Load the model on CPU
The model is loaded on CPU
Restore the pretrained model tmp/bert.model/
You passed along `num_labels=20` with an incompatible id to label map: {'0': 'LABEL_0', '1': 'LABEL_1'}. The number of labels wil be overwritten to -1.
Cross entropy loss assigned
Traceback (most recent call last):
  File "/omics/groups/OE0219/internal/Yunhee/DL_project/methylbert/test/test_deconvolute.py", line 48, in <module>
    trainer.load(model_dir)
  File "/omics/groups/OE0219/internal/Yunhee/anaconda3/envs/methylbert_39/lib/python3.9/site-packages/methylbert/trainer.py", line 646, in load
    self.bert = MethylBertEmbeddedDMR.from_pretrained(dir_path,
  File "/omics/groups/OE0219/internal/Yunhee/anaconda3/envs/methylbert_39/lib/python3.9/site-packages/transformers/modeling_utils.py", line 3960, in from_pretrained
    ) = cls._load_pretrained_model(
  File "/omics/groups/OE0219/internal/Yunhee/anaconda3/envs/methylbert_39/lib/python3.9/site-packages/transformers/modeling_utils.py", line 4492, in _load_pretrained_model
    raise RuntimeError(f"Error(s) in loading state_dict for {model.__class__.__name__}:\n\t{error_msg}")
RuntimeError: Error(s) in loading state_dict for MethylBertEmbeddedDMR:
	size mismatch for dmr_encoder.0.weight: copying a param with shape torch.Size([20, 151]) from checkpoint, the shape in current model is torch.Size([10, 151]).
	You may consider adding `ignore_mismatched_sizes=True` in the model `from_pretrained` method.
```
This error occurs when running "test_deconvolute.py" with the previous model directory.
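For context, the failure above is a parameter-shape mismatch: the checkpoint's `dmr_encoder.0.weight` was saved with shape `(20, 151)` (trained on 20 DMRs), while the freshly constructed model expects `(10, 151)`. A minimal sketch of the kind of shape comparison `transformers` performs when restoring a `state_dict` (plain Python with shapes as tuples instead of tensors; the function name is illustrative, not the actual library API):

```python
def find_shape_mismatches(checkpoint, model):
    """Return (name, checkpoint_shape, model_shape) for every parameter
    present in both state dicts whose shapes disagree."""
    mismatches = []
    for name, ckpt_shape in checkpoint.items():
        model_shape = model.get(name)
        if model_shape is not None and model_shape != ckpt_shape:
            mismatches.append((name, ckpt_shape, model_shape))
    return mismatches

# Simulated state dicts: checkpoint trained with 20 DMR labels,
# current model configured for only 10 (the situation in the traceback).
checkpoint = {"dmr_encoder.0.weight": (20, 151)}
model = {"dmr_encoder.0.weight": (10, 151)}

print(find_shape_mismatches(checkpoint, model))
# → [('dmr_encoder.0.weight', (20, 151), (10, 151))]
```

This suggests the `num_labels` (number of DMRs) passed when reconstructing the model does not match the value the checkpoint was trained with; making the two agree should be preferable to `ignore_mismatched_sizes=True`, which would discard the trained encoder weights.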