parlange committed
Commit 4638404 · verified · 1 Parent(s): 24f29d8

Upload DeiT3 model from experiment s3

This view is limited to 50 files because it contains too many changes.
Files changed (50)
  1. .gitattributes +2 -0
  2. README.md +165 -0
  3. config.json +76 -0
  4. confusion_matrices/DeiT3_Confusion_Matrix_a.png +0 -0
  5. confusion_matrices/DeiT3_Confusion_Matrix_b.png +0 -0
  6. confusion_matrices/DeiT3_Confusion_Matrix_c.png +0 -0
  7. confusion_matrices/DeiT3_Confusion_Matrix_d.png +0 -0
  8. confusion_matrices/DeiT3_Confusion_Matrix_e.png +0 -0
  9. confusion_matrices/DeiT3_Confusion_Matrix_f.png +0 -0
  10. confusion_matrices/DeiT3_Confusion_Matrix_g.png +0 -0
  11. confusion_matrices/DeiT3_Confusion_Matrix_h.png +0 -0
  12. confusion_matrices/DeiT3_Confusion_Matrix_i.png +0 -0
  13. confusion_matrices/DeiT3_Confusion_Matrix_j.png +0 -0
  14. confusion_matrices/DeiT3_Confusion_Matrix_k.png +0 -0
  15. confusion_matrices/DeiT3_Confusion_Matrix_l.png +0 -0
  16. deit3-gravit-s3.pth +3 -0
  17. evaluation_results.csv +145 -0
  18. model.safetensors +3 -0
  19. pytorch_model.bin +3 -0
  20. roc_confusion_matrix/DeiT3_roc_confusion_matrix_a.png +0 -0
  21. roc_confusion_matrix/DeiT3_roc_confusion_matrix_b.png +0 -0
  22. roc_confusion_matrix/DeiT3_roc_confusion_matrix_c.png +0 -0
  23. roc_confusion_matrix/DeiT3_roc_confusion_matrix_d.png +0 -0
  24. roc_confusion_matrix/DeiT3_roc_confusion_matrix_e.png +0 -0
  25. roc_confusion_matrix/DeiT3_roc_confusion_matrix_f.png +0 -0
  26. roc_confusion_matrix/DeiT3_roc_confusion_matrix_g.png +0 -0
  27. roc_confusion_matrix/DeiT3_roc_confusion_matrix_h.png +0 -0
  28. roc_confusion_matrix/DeiT3_roc_confusion_matrix_i.png +0 -0
  29. roc_confusion_matrix/DeiT3_roc_confusion_matrix_j.png +0 -0
  30. roc_confusion_matrix/DeiT3_roc_confusion_matrix_k.png +0 -0
  31. roc_confusion_matrix/DeiT3_roc_confusion_matrix_l.png +0 -0
  32. roc_curves/DeiT3_ROC_a.png +0 -0
  33. roc_curves/DeiT3_ROC_b.png +0 -0
  34. roc_curves/DeiT3_ROC_c.png +0 -0
  35. roc_curves/DeiT3_ROC_d.png +0 -0
  36. roc_curves/DeiT3_ROC_e.png +0 -0
  37. roc_curves/DeiT3_ROC_f.png +0 -0
  38. roc_curves/DeiT3_ROC_g.png +0 -0
  39. roc_curves/DeiT3_ROC_h.png +0 -0
  40. roc_curves/DeiT3_ROC_i.png +0 -0
  41. roc_curves/DeiT3_ROC_j.png +0 -0
  42. roc_curves/DeiT3_ROC_k.png +0 -0
  43. roc_curves/DeiT3_ROC_l.png +0 -0
  44. training_curves/DeiT3_accuracy.png +0 -0
  45. training_curves/DeiT3_auc.png +0 -0
  46. training_curves/DeiT3_combined_metrics.png +3 -0
  47. training_curves/DeiT3_f1.png +0 -0
  48. training_curves/DeiT3_loss.png +0 -0
  49. training_curves/DeiT3_metrics.csv +28 -0
  50. training_metrics.csv +28 -0
.gitattributes CHANGED
@@ -33,3 +33,5 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
+ training_curves/DeiT3_combined_metrics.png filter=lfs diff=lfs merge=lfs -text
+ training_notebook_s3.ipynb filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,165 @@
+ ---
+ license: apache-2.0
+ tags:
+ - vision-transformer
+ - image-classification
+ - pytorch
+ - timm
+ - deit3
+ - gravitational-lensing
+ - strong-lensing
+ - astronomy
+ - astrophysics
+ datasets:
+ - parlange/gravit-c21
+ metrics:
+ - accuracy
+ - auc
+ - f1
+ paper:
+ - title: "GraViT: A Gravitational Lens Discovery Toolkit with Vision Transformers"
+   url: "https://arxiv.org/abs/2509.00226"
+   authors: "Parlange et al."
+ model-index:
+ - name: DeiT3-s3
+   results:
+   - task:
+       type: image-classification
+       name: Strong Gravitational Lens Discovery
+     dataset:
+       type: common-test-sample
+       name: Common Test Sample (More et al. 2024)
+     metrics:
+     - type: accuracy
+       value: 0.8804
+       name: Average Accuracy
+     - type: auc
+       value: 0.8794
+       name: Average AUC-ROC
+     - type: f1
+       value: 0.6576
+       name: Average F1-Score
+ ---
+
+ # 🌌 deit3-gravit-s3
+
+ 🔭 This model is part of **GraViT**: Transfer Learning with Vision Transformers and MLP-Mixer for Strong Gravitational Lens Discovery
+
+ 🔗 **GitHub Repository**: [https://github.com/parlange/gravit](https://github.com/parlange/gravit)
+
+ ## 🛰️ Model Details
+
+ - **🤖 Model Type**: DeiT3
+ - **🧪 Experiment**: S3 - C21-all-blocks-ResNet18-18660
+ - **🌌 Dataset**: C21
+ - **🪐 Fine-tuning Strategy**: all-blocks
+
+ - **🎲 Random Seed**: 18660
+
+ ## 💻 Quick Start
+
+ ```python
+ import torch
+ import timm
+
+ # Load the model directly from the Hub
+ model = timm.create_model(
+     'hf-hub:parlange/deit3-gravit-s3',
+     pretrained=True
+ )
+ model.eval()
+
+ # Example inference
+ dummy_input = torch.randn(1, 3, 224, 224)
+ with torch.no_grad():
+     output = model(dummy_input)
+     predictions = torch.softmax(output, dim=1)
+ print(f"Lens probability: {predictions[0][1]:.4f}")
+ ```
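The snippet above runs on a random tensor. For a real survey cutout, the matching preprocessing (224x224, bicubic resize, center crop, ImageNet normalization as stored in `config.json`) can be resolved from the model itself. A minimal sketch, assuming a recent timm (>= 0.9) and a hypothetical image path `cutout.png`:

```python
import timm
import torch
from PIL import Image

model = timm.create_model('hf-hub:parlange/deit3-gravit-s3', pretrained=True)
model.eval()

# Build the eval-time transform that matches the model's pretrained_cfg
data_config = timm.data.resolve_model_data_config(model)
transform = timm.data.create_transform(**data_config, is_training=False)

img = Image.open('cutout.png').convert('RGB')  # hypothetical path to a survey cutout
x = transform(img).unsqueeze(0)                # shape (1, 3, 224, 224)

with torch.no_grad():
    probs = torch.softmax(model(x), dim=1)
print(f"Lens probability: {probs[0, 1]:.4f}")  # class index 1 = lens, as in the card's example
```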
+
+ ## ⚡️ Training Configuration
+
+ **Training Dataset:** C21 (Cañameras et al. 2021)
+ **Fine-tuning Strategy:** all-blocks
+
+
+ | 🔧 Parameter | 📝 Value |
+ |--------------|----------|
+ | Batch Size | 192 |
+ | Learning Rate | AdamW with ReduceLROnPlateau |
+ | Epochs | 100 |
+ | Patience | 10 |
+ | Optimizer | AdamW |
+ | Scheduler | ReduceLROnPlateau |
+ | Image Size | 224x224 |
+ | Fine Tune Mode | all_blocks |
+ | Stochastic Depth Probability | 0.1 |
+
+
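The table records the optimizer, scheduler, epoch budget, and patience, but not the training code or the initial learning rate. A minimal, runnable sketch of how these settings typically fit together (toy model and data, illustrative learning rate and scheduler settings; not the GraViT implementation):

```python
import torch
import torch.nn as nn
from torch.optim import AdamW
from torch.optim.lr_scheduler import ReduceLROnPlateau

# Toy stand-ins so the loop runs end to end; the real experiment fine-tunes
# all DeiT3 blocks on C21 with batch size 192.
model = nn.Linear(16, 2)
train_x, train_y = torch.randn(192, 16), torch.randint(0, 2, (192,))
val_x, val_y = torch.randn(64, 16), torch.randint(0, 2, (64,))
criterion = nn.CrossEntropyLoss()

optimizer = AdamW(model.parameters(), lr=1e-4)        # illustrative initial LR (not stated on the card)
scheduler = ReduceLROnPlateau(optimizer, mode='min')  # lowers the LR when validation loss plateaus

best_val, epochs_without_improvement = float('inf'), 0
for epoch in range(100):                              # Epochs = 100
    model.train()
    optimizer.zero_grad()
    loss = criterion(model(train_x), train_y)
    loss.backward()
    optimizer.step()

    model.eval()
    with torch.no_grad():
        val_loss = criterion(model(val_x), val_y).item()
    scheduler.step(val_loss)

    if val_loss < best_val:
        best_val, epochs_without_improvement = val_loss, 0
    else:
        epochs_without_improvement += 1
        if epochs_without_improvement >= 10:          # Patience = 10 -> stop early
            break
```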
+ ## 📈 Training Curves
+
+ ![Combined Training Metrics](https://huggingface.co/parlange/deit3-gravit-s3/resolve/main/training_curves/DeiT3_combined_metrics.png)
+
+
+ ## 🏁 Final Epoch Training Metrics
+
+ | Metric | Training | Validation |
+ |:---------:|:-----------:|:-------------:|
+ | 📉 Loss | 0.0048 | 0.0500 |
+ | 🎯 Accuracy | 0.9981 | 0.9910 |
+ | 📊 AUC-ROC | 1.0000 | 0.9986 |
+ | ⚖️ F1 Score | 0.9981 | 0.9910 |
+
+
+ ## ☑️ Evaluation Results
+
+ ### ROC Curves and Confusion Matrices
+
+ Performance across all test datasets (a through l) in the Common Test Sample (More et al. 2024):
+
+ ![ROC + Confusion Matrix - Dataset A](https://huggingface.co/parlange/deit3-gravit-s3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_a.png)
+ ![ROC + Confusion Matrix - Dataset B](https://huggingface.co/parlange/deit3-gravit-s3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_b.png)
+ ![ROC + Confusion Matrix - Dataset C](https://huggingface.co/parlange/deit3-gravit-s3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_c.png)
+ ![ROC + Confusion Matrix - Dataset D](https://huggingface.co/parlange/deit3-gravit-s3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_d.png)
+ ![ROC + Confusion Matrix - Dataset E](https://huggingface.co/parlange/deit3-gravit-s3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_e.png)
+ ![ROC + Confusion Matrix - Dataset F](https://huggingface.co/parlange/deit3-gravit-s3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_f.png)
+ ![ROC + Confusion Matrix - Dataset G](https://huggingface.co/parlange/deit3-gravit-s3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_g.png)
+ ![ROC + Confusion Matrix - Dataset H](https://huggingface.co/parlange/deit3-gravit-s3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_h.png)
+ ![ROC + Confusion Matrix - Dataset I](https://huggingface.co/parlange/deit3-gravit-s3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_i.png)
+ ![ROC + Confusion Matrix - Dataset J](https://huggingface.co/parlange/deit3-gravit-s3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_j.png)
+ ![ROC + Confusion Matrix - Dataset K](https://huggingface.co/parlange/deit3-gravit-s3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_k.png)
+ ![ROC + Confusion Matrix - Dataset L](https://huggingface.co/parlange/deit3-gravit-s3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_l.png)
+
+ ### 📋 Performance Summary
+
+ Average performance across 12 test datasets from the Common Test Sample (More et al. 2024):
+
+ | Metric | Value |
+ |-----------|----------|
+ | 🎯 Average Accuracy | 0.8804 |
+ | 📈 Average AUC-ROC | 0.8794 |
+ | ⚖️ Average F1-Score | 0.6576 |
+
+
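These averages can be re-derived from `evaluation_results.csv` in this repository, which stores per-dataset Loss, Accuracy, AUC-ROC, and F1 for every GraViT backbone. A short sketch with pandas (assumed installed):

```python
import pandas as pd

# Columns in the shipped file: Model, Dataset, Loss, Accuracy, AUCROC, F1
df = pd.read_csv("evaluation_results.csv")
deit3 = df[df["Model"] == "DeiT3"]

# Mean over the 12 Common Test Sample datasets (a-l); should match the summary table above.
print(deit3[["Accuracy", "AUCROC", "F1"]].mean().round(4))
```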
+ ## 📘 Citation
+
+ If you use this model in your research, please cite:
+
+ ```bibtex
+ @misc{parlange2025gravit,
+       title={GraViT: Transfer Learning with Vision Transformers and MLP-Mixer for Strong Gravitational Lens Discovery},
+       author={René Parlange and Juan C. Cuevas-Tello and Octavio Valenzuela and Omar de J. Cabrera-Rosas and Tomás Verdugo and Anupreeta More and Anton T. Jaelani},
+       year={2025},
+       eprint={2509.00226},
+       archivePrefix={arXiv},
+       primaryClass={cs.CV},
+       url={https://arxiv.org/abs/2509.00226},
+ }
+ ```
+
+ ---
+
+
+ ## Model Card Contact
+
+ For questions about this model, please contact the author via GitHub: https://github.com/parlange/
config.json ADDED
@@ -0,0 +1,76 @@
+ {
+   "architecture": "deit3_base_patch16_224",
+   "num_classes": 2,
+   "num_features": 1000,
+   "global_pool": "avg",
+   "crop_pct": 0.875,
+   "interpolation": "bicubic",
+   "mean": [
+     0.485,
+     0.456,
+     0.406
+   ],
+   "std": [
+     0.229,
+     0.224,
+     0.225
+   ],
+   "first_conv": "conv1",
+   "classifier": "fc",
+   "input_size": [
+     3,
+     224,
+     224
+   ],
+   "pool_size": [
+     7,
+     7
+   ],
+   "pretrained_cfg": {
+     "tag": "gravit_s3",
+     "custom_load": false,
+     "input_size": [
+       3,
+       224,
+       224
+     ],
+     "fixed_input_size": true,
+     "interpolation": "bicubic",
+     "crop_pct": 0.875,
+     "crop_mode": "center",
+     "mean": [
+       0.485,
+       0.456,
+       0.406
+     ],
+     "std": [
+       0.229,
+       0.224,
+       0.225
+     ],
+     "num_classes": 2,
+     "pool_size": [
+       7,
+       7
+     ],
+     "first_conv": "conv1",
+     "classifier": "fc"
+   },
+   "model_name": "deit3_gravit_s3",
+   "experiment": "s3",
+   "training_strategy": "all-blocks",
+   "dataset": "C21",
+   "hyperparameters": {
+     "batch_size": "192",
+     "learning_rate": "AdamW with ReduceLROnPlateau",
+     "epochs": "100",
+     "patience": "10",
+     "optimizer": "AdamW",
+     "scheduler": "ReduceLROnPlateau",
+     "image_size": "224x224",
+     "fine_tune_mode": "all_blocks",
+     "stochastic_depth_probability": "0.1"
+   },
+   "hf_hub_id": "parlange/deit3-gravit-s3",
+   "license": "apache-2.0"
+ }
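The `pretrained_cfg` block above fully specifies eval-time preprocessing. A rough illustration of how those fields map onto a torchvision pipeline (timm's `resolve_model_data_config` / `create_transform` do the same thing automatically; `cutout.png` is a hypothetical image path):

```python
import json

from PIL import Image
from torchvision import transforms

with open("config.json") as f:
    cfg = json.load(f)["pretrained_cfg"]

_, height, width = cfg["input_size"]                 # [3, 224, 224]
resize = int(round(height / cfg["crop_pct"]))        # 224 / 0.875 = 256

preprocess = transforms.Compose([
    transforms.Resize(resize, interpolation=transforms.InterpolationMode.BICUBIC),
    transforms.CenterCrop((height, width)),          # crop_mode = "center"
    transforms.ToTensor(),
    transforms.Normalize(mean=cfg["mean"], std=cfg["std"]),
])

x = preprocess(Image.open("cutout.png").convert("RGB")).unsqueeze(0)
```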
confusion_matrices/DeiT3_Confusion_Matrix_a.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_b.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_c.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_d.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_e.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_f.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_g.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_h.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_i.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_j.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_k.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_l.png ADDED
deit3-gravit-s3.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:00177f54a9da39283b65f7fec299d1c44587607a588742d1dc1ed945f37b57bd
+ size 343337390
evaluation_results.csv ADDED
@@ -0,0 +1,145 @@
+ Model,Dataset,Loss,Accuracy,AUCROC,F1
+ ViT,a,0.23330061458559465,0.904432568374725,0.9365773480662983,0.4950166112956811
+ ViT,b,0.15167975368101766,0.9421565545425966,0.9638158379373848,0.6182572614107884
+ ViT,c,0.42172509137656994,0.8267840301791889,0.9025782688766114,0.3510011778563015
+ ViT,d,0.1090005352573176,0.95881798176674,0.974342541436464,0.6946386946386947
+ ViT,e,0.2932905108554434,0.8902305159165752,0.9406039506546583,0.7487437185929648
+ ViT,f,0.21606165350946946,0.9115482921539773,0.9441149153910586,0.20694444444444443
+ ViT,g,0.07253240664303302,0.9705,0.998692111111111,0.971111473804472
+ ViT,h,0.21570144249498843,0.9093333333333333,0.9954923333333334,0.9162303664921466
+ ViT,i,0.04990530861914158,0.9793333333333333,0.9991746666666667,0.9795851168916694
+ ViT,j,3.397079090178013,0.5168333333333334,0.5124903333333333,0.1486049926578561
+ ViT,k,3.3744520025253295,0.5256666666666666,0.6147927777777777,0.1509546539379475
+ ViT,l,1.2076301042454038,0.7930305113426048,0.7260690116291557,0.6331083614548182
+ MLP-Mixer,a,0.6226771516981308,0.8201823325998113,0.8910414364640883,0.33796296296296297
+ MLP-Mixer,b,0.32969332946167546,0.908519333542911,0.9376252302025783,0.5008576329331046
+ MLP-Mixer,c,1.2415813063720755,0.6787173844702924,0.8285883977900552,0.2222222222222222
+ MLP-Mixer,d,0.07694695194780883,0.9795661741590694,0.9830349907918968,0.8179271708683473
+ MLP-Mixer,e,0.6463492494383183,0.8342480790340285,0.8990236887913419,0.6591422121896162
+ MLP-Mixer,f,0.5533021807061205,0.8481140113081869,0.9094389205470177,0.12960497114957834
+ MLP-Mixer,g,0.14915568167157472,0.9568333333333333,0.9983734444444444,0.9585798816568047
+ MLP-Mixer,h,0.6326082771439105,0.835,0.9925535,0.8582474226804123
+ MLP-Mixer,i,0.015157968224957585,0.9945,0.9999197777777777,0.9945246391239423
+ MLP-Mixer,j,4.768675954580307,0.48783333333333334,0.32178805555555556,0.10642628671125327
+ MLP-Mixer,k,4.634678235054016,0.5255,0.5500302222222222,0.11391223155929038
+ MLP-Mixer,l,1.844831031485805,0.7471841785204378,0.6553877700624002,0.5818245429895915
+ CvT,a,0.21327115955104695,0.9280100597296448,0.9095662983425414,0.5410821643286573
+ CvT,b,0.2245168307551945,0.9261238604212512,0.9276169429097607,0.5346534653465347
+ CvT,c,0.5436479192752952,0.8308707953473751,0.8541436464088398,0.3341584158415842
+ CvT,d,0.07627428377018489,0.9767368751964791,0.9801915285451197,0.7848837209302325
+ CvT,e,0.4199431847400644,0.8594950603732162,0.8942859305229698,0.678391959798995
+ CvT,f,0.23622647001237154,0.9209975989466347,0.9165266282718422,0.20930232558139536
+ CvT,g,0.0921408845228143,0.9676666666666667,0.9993117777777779,0.9686287192755498
+ CvT,h,0.2613335499805398,0.9171666666666667,0.9978183333333334,0.9233852320024665
+ CvT,i,0.013547633681911975,0.9945,0.9999296666666666,0.9945210028225137
+ CvT,j,4.21272634255886,0.4856666666666667,0.24911683333333334,0.06257594167679223
+ CvT,k,4.134133087083697,0.5125,0.6382155555555556,0.06579367614180773
+ CvT,l,1.4700100416854185,0.7926074771297129,0.6455889580274956,0.6224489795918368
+ Swin,a,0.5396662080149275,0.7937755422823012,0.9155405156537753,0.32510288065843623
+ Swin,b,0.3355165186101856,0.8814838101226029,0.9462909760589319,0.455988455988456
+ Swin,c,1.0126445727766553,0.657340458975165,0.871803867403315,0.22475106685633
+ Swin,d,0.04055070694646055,0.986167871738447,0.9945414364640884,0.8777777777777778
+ Swin,e,0.6789207920415734,0.7760702524698134,0.9016120487398773,0.6076923076923076
+ Swin,f,0.5000679909827477,0.8234838509797847,0.9302990716669634,0.12177263969171484
+ Swin,g,0.1668860674443422,0.9405,0.999325,0.9437883797827114
+ Swin,h,0.5258767666163622,0.8216666666666667,0.9975895555555556,0.8485277463193658
+ Swin,i,0.010505019271629862,0.996,0.9999704444444445,0.996011964107677
+ Swin,j,4.138313320279122,0.45866666666666667,0.14908555555555558,0.06127167630057803
+ Swin,k,3.9819322906583547,0.5141666666666667,0.5644793333333333,0.06779661016949153
+ Swin,l,1.6023035252941213,0.726296864258897,0.6168375948237592,0.5575312019148573
+ CaiT,a,0.18350773630736947,0.945300220056586,0.9521998158379374,0.634453781512605
+ CaiT,b,0.2112922705674801,0.9320968248978309,0.9511565377532228,0.583011583011583
+ CaiT,c,0.47731829819458904,0.8654511160012575,0.9114604051565378,0.4136986301369863
+ CaiT,d,0.07184898189050302,0.9808236403646652,0.979158379373849,0.8319559228650137
+ CaiT,e,0.66179066170191,0.8068057080131723,0.8871868614243549,0.6317991631799164
+ CaiT,f,0.23348395380409856,0.9275811323677484,0.9449781479343615,0.24413904607922393
+ CaiT,g,0.09087410058942623,0.9678333333333333,0.9993048888888888,0.9687651723579868
+ CaiT,h,0.23191223003831693,0.9325,0.9980312777777778,0.9366296354248161
+ CaiT,i,0.016945911412360147,0.9936666666666667,0.999897111111111,0.9936918990703851
+ CaiT,j,3.6651160932779314,0.5033333333333333,0.5270411111111111,0.1214622641509434
+ CaiT,k,3.591187914278358,0.5291666666666667,0.5760749444444444,0.12727834414581402
+ CaiT,l,1.295636463950478,0.8024430225794511,0.7398510347639352,0.6420084323495592
+ DeiT,a,0.16014718334959085,0.9569317824583464,0.940755985267035,0.6745843230403801
+ DeiT,b,0.10635868036350676,0.9729644765796919,0.9633609576427257,0.7675675675675676
+ DeiT,c,0.23710613171471923,0.9355548569632192,0.9248029465930019,0.5807770961145194
+ DeiT,d,0.08718779183401884,0.9779943414020749,0.9719484346224677,0.8022598870056498
+ DeiT,e,0.36636867216337393,0.9099890230515917,0.9294634072504351,0.7759562841530054
+ DeiT,f,0.10625621084368816,0.9671597862287972,0.9490269646243918,0.4011299435028249
+ DeiT,g,0.023581442082300782,0.9911666666666666,0.9998664444444445,0.9912266181095845
+ DeiT,h,0.09289937312342227,0.9713333333333334,0.9993322222222223,0.9720779220779221
+ DeiT,i,0.013417671525850891,0.9938333333333333,0.9999291111111112,0.9938589211618257
+ DeiT,j,5.060567226946354,0.519,0.5434004444444445,0.10037406483790523
+ DeiT,k,5.050403485819698,0.5216666666666666,0.5780161111111111,0.10087719298245613
+ DeiT,l,1.6720809529578171,0.8271376447570197,0.7329687320683346,0.6685592618878637
+ DeiT3,a,0.09279773294391665,0.9742219427852876,0.956814917127072,0.7670454545454546
+ DeiT3,b,0.1237781981520966,0.9585036152153411,0.9581307550644568,0.6716417910447762
+ DeiT3,c,0.14906807608975062,0.949386985224772,0.9397476979742174,0.6264501160092807
+ DeiT3,d,0.076721701036737,0.9783087079534738,0.9859097605893186,0.7964601769911505
+ DeiT3,e,0.33115524258035206,0.9066959385290889,0.9261711950351927,0.7605633802816901
+ DeiT3,f,0.07180493832847527,0.9732785996437147,0.9582022281728896,0.43902439024390244
+ DeiT3,g,0.03650352464616299,0.9841666666666666,0.999683,0.9843672864900445
+ DeiT3,h,0.04991138017177582,0.9793333333333333,0.9995745555555556,0.9796921061251228
+ DeiT3,i,0.011555743247270584,0.9946666666666667,0.9999258888888889,0.9946790821416694
+ DeiT3,j,4.905326539263129,0.5121666666666667,0.3997841666666667,0.09799691833590139
+ DeiT3,k,4.880378704622388,0.5226666666666666,0.7140411111111111,0.09993714644877436
+ DeiT3,l,1.595974028533837,0.8310507112262704,0.7152346731164213,0.6728110599078341
+ Twins_SVT,a,0.5287571287837204,0.8016347060672745,0.9145736648250459,0.3322751322751323
+ Twins_SVT,b,0.33979776653431004,0.86765168186105,0.9400736648250463,0.4272108843537415
+ Twins_SVT,c,0.8760767604374579,0.6592266582835586,0.8834567219152852,0.2246065808297568
+ Twins_SVT,d,0.0414358896034407,0.9867966048412449,0.9934677716390424,0.8820224719101124
+ Twins_SVT,e,1.014912523631861,0.6597145993413831,0.864413834859608,0.5032051282051282
+ Twins_SVT,f,0.485403974522426,0.8147316241964216,0.9289660305625117,0.11603843311160385
+ Twins_SVT,g,0.16855271980445832,0.9326666666666666,0.9987118333333335,0.9367762128325509
+ Twins_SVT,h,0.4528699638573453,0.8221666666666667,0.9961345,0.8487168580745782
+ Twins_SVT,i,0.010371196451596915,0.9958333333333333,0.9999524444444444,0.9958409582432207
+ Twins_SVT,j,4.296078189373016,0.4568333333333333,0.19758433333333336,0.07807637906647807
+ Twins_SVT,k,4.137896645441652,0.52,0.6393722222222222,0.08745247148288973
+ Twins_SVT,l,1.6425652151218086,0.7218021257469198,0.6391880993028709,0.5555461687927684
+ Twins_PCPVT,a,0.28459295154517267,0.8962590380383527,0.9146933701657459,0.4745222929936306
+ Twins_PCPVT,b,0.1722448716166727,0.9421565545425966,0.9457421731123388,0.6182572614107884
+ Twins_PCPVT,c,0.5418229629617936,0.8047783715812638,0.8814585635359117,0.32426550598476606
+ Twins_PCPVT,d,0.07013078457132774,0.9820811065702609,0.9903885819521179,0.8394366197183099
+ Twins_PCPVT,e,0.479874432446797,0.8375411635565313,0.8898963142359798,0.6681614349775785
+ Twins_PCPVT,f,0.25029015365415846,0.9061265587483541,0.930595278912214,0.19735099337748344
+ Twins_PCPVT,g,0.07097410994302482,0.9728333333333333,0.999268111111111,0.9734570916788796
+ Twins_PCPVT,h,0.26691202929150315,0.9,0.9973623888888888,0.9087868653086044
+ Twins_PCPVT,i,0.016836636100895704,0.994,0.9998691111111111,0.9940139674093781
+ Twins_PCPVT,j,7.1517002888023855,0.5098333333333334,0.18946605555555554,0.1254831995242343
+ Twins_PCPVT,k,7.097562807479873,0.531,0.5933782777777777,0.13040791100123608
+ Twins_PCPVT,l,2.4205206177914396,0.7878483422346783,0.6371767534317276,0.6253968253968254
+ PiT,a,2.0392126046198413,0.4360264067903175,0.8411418047882135,0.15853658536585366
+ PiT,b,1.2754779040757234,0.6463376296762025,0.8967909760589319,0.23103212576896787
+ PiT,c,3.5646744424987236,0.255580006287331,0.7408324125230203,0.12490761271249076
+ PiT,d,0.03054899423822634,0.9880540710468406,0.9968987108655616,0.898936170212766
+ PiT,e,2.2146248903808425,0.4774972557628979,0.806800121092863,0.4152334152334152
+ PiT,f,1.8453932350700364,0.5544109673921462,0.8653539513829515,0.05549170907896897
+ PiT,g,0.6705343906283379,0.8145,0.9932486111111111,0.8435259384226065
+ PiT,h,1.8841899921298026,0.6073333333333333,0.9759541111111112,0.7180469123982767
+ PiT,i,0.010514569073915481,0.9956666666666667,0.9999776666666667,0.9956853634251577
+ PiT,j,4.467936363220215,0.3516666666666667,0.16883433333333334,0.10285977859778597
+ PiT,k,3.807916547060013,0.5328333333333334,0.7121261111111112,0.13727300707910126
+ PiT,l,2.465662835393678,0.5489397705039395,0.6250971701663347,0.4429933394279744
+ ResNet-18,a,0.6459113768308052,0.7654825526563973,0.9383268876611418,0.3079777365491651
+ ResNet-18,b,0.6718155814768645,0.7790003143665514,0.9370386740331492,0.3207729468599034
+ ResNet-18,c,1.1330159351684057,0.6526249607041811,0.9005441988950276,0.23103688239387613
+ ResNet-18,d,0.014098176388939534,0.9949701351776171,0.9996151012891344,0.9540229885057471
+ ResNet-18,e,0.727007691933478,0.7058177826564215,0.9220616059940968,0.5533333333333333
+ ResNet-18,f,0.6458914898100172,0.7848346371311284,0.9426299731351963,0.1067524115755627
+ ResNet-18,g,0.3519982568551786,0.8841666666666667,0.9969418333333333,0.8959736566382278
+ ResNet-18,h,0.5965127636720426,0.8171666666666667,0.9945367777777777,0.845122123394042
+ ResNet-18,i,0.0032961710044764913,0.9986666666666667,0.9999976666666667,0.9986653319986654
+ ResNet-18,j,7.148829643726349,0.38666666666666666,0.04898066666666667,0.004329004329004329
+ ResNet-18,k,6.800127555849089,0.5011666666666666,0.5824981111111112,0.005317381189764041
+ ResNet-18,l,2.5991157615762135,0.6945164190153879,0.5908983361702036,0.5229956238130625
+ Ensemble,a,,0.9289531593838416,0.9334871086556169,0.5735849056603773
+ Ensemble,b,,0.9462433197107828,0.9550589318600368,0.64
+ Ensemble,c,,0.840301791889343,0.8989116022099447,0.37438423645320196
+ Ensemble,d,,0.9874253379440427,0.992403314917127,0.8837209302325582
+ Ensemble,e,,0.862788144895719,0.9159312798001968,0.7086247086247086
+ Ensemble,f,,0.9261095190147935,0.9433002912162074,0.24165341812400637
+ Ensemble,g,,0.9761666666666666,0.9997736666666667,0.9767138902458883
+ Ensemble,h,,0.92,0.9990904444444445,0.9259030564989195
+ Ensemble,i,,0.998,0.9999855555555555,0.9980033277870216
+ Ensemble,j,,0.4985,0.25344022222222223,0.08122137404580153
+ Ensemble,k,,0.5203333333333333,0.6153003333333333,0.0846055979643766
+ Ensemble,l,,0.7978954047908624,0.6487850790241945,0.6321462945139558
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:394b7674c4d7157193f9c8a3b055fb9b024c6435c98dfe461299e0dc6da097cf
+ size 343287616
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:00177f54a9da39283b65f7fec299d1c44587607a588742d1dc1ed945f37b57bd
+ size 343337390
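`model.safetensors`, `pytorch_model.bin`, and `deit3-gravit-s3.pth` all carry the fine-tuned weights (the `.bin` and `.pth` pointers share the same SHA256). For offline use, a minimal sketch that rebuilds the architecture named in `config.json` and loads the safetensors file, assuming the state-dict keys follow timm's `deit3_base_patch16_224` layout:

```python
import timm
import torch
from safetensors.torch import load_file

model = timm.create_model("deit3_base_patch16_224", num_classes=2)

state_dict = load_file("model.safetensors")  # or: torch.load("pytorch_model.bin", map_location="cpu")
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print("missing:", missing, "unexpected:", unexpected)  # sanity check: ideally both lists are empty

model.eval()
```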
roc_confusion_matrix/DeiT3_roc_confusion_matrix_a.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_b.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_c.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_d.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_e.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_f.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_g.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_h.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_i.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_j.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_k.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_l.png ADDED
roc_curves/DeiT3_ROC_a.png ADDED
roc_curves/DeiT3_ROC_b.png ADDED
roc_curves/DeiT3_ROC_c.png ADDED
roc_curves/DeiT3_ROC_d.png ADDED
roc_curves/DeiT3_ROC_e.png ADDED
roc_curves/DeiT3_ROC_f.png ADDED
roc_curves/DeiT3_ROC_g.png ADDED
roc_curves/DeiT3_ROC_h.png ADDED
roc_curves/DeiT3_ROC_i.png ADDED
roc_curves/DeiT3_ROC_j.png ADDED
roc_curves/DeiT3_ROC_k.png ADDED
roc_curves/DeiT3_ROC_l.png ADDED
training_curves/DeiT3_accuracy.png ADDED
training_curves/DeiT3_auc.png ADDED
training_curves/DeiT3_combined_metrics.png ADDED

Git LFS Details

  • SHA256: 6c6773230a673cf1d69a4742d27852a4f1b547fca19743d5bb0e454cd4aa566b
  • Pointer size: 131 Bytes
  • Size of remote file: 148 kB
training_curves/DeiT3_f1.png ADDED
training_curves/DeiT3_loss.png ADDED
training_curves/DeiT3_metrics.csv ADDED
@@ -0,0 +1,28 @@
+ epoch,train_loss,val_loss,train_accuracy,val_accuracy,train_auc,val_auc,train_f1,val_f1
+ 1,0.13692867085577207,0.07421052139997482,0.9407556270096463,0.978,0.9887223029239888,0.997632,0.9407572144369122,0.9783464566929134
+ 2,0.048061912743991596,0.05094329964928329,0.9823954983922829,0.982,0.9984246124879234,0.9985300000000001,0.9823921957494707,0.9818548387096774
+ 3,0.034636192352861354,0.04595610737428069,0.9877813504823151,0.983,0.9991717887876814,0.99875,0.9877820052515942,0.9829488465396189
+ 4,0.029761615775262044,0.06541032767249272,0.9893354769560557,0.985,0.9993711192789341,0.998534,0.9893377625375054,0.9849548645937813
+ 5,0.030002773200967305,0.051245969947427514,0.9892818863879957,0.984,0.999385977307008,0.99814,0.9892824607470125,0.9839679358717435
+ 6,0.02682357684221011,0.04745743262767792,0.9903536977491961,0.985,0.9995302841276569,0.9979460000000001,0.9903516295025729,0.9850448654037887
+ 7,0.022328507840345912,0.04253761661052704,0.9917738478027867,0.988,0.9996601780723249,0.9993879999999999,0.9917762717312689,0.9879518072289156
+ 8,0.023209826517969944,0.04362523984909058,0.9920685959271168,0.989,0.9995784854834466,0.9991639999999999,0.9920719948575102,0.9890329012961117
+ 9,0.01983714324614584,0.07314276767149568,0.9930064308681672,0.984,0.9997082315227417,0.997778,0.9930054937692617,0.9839679358717435
+ 10,0.023987467304902255,0.05695145654678345,0.9913183279742765,0.986,0.9995943573095121,0.998852,0.9913146043319752,0.9860557768924303
+ 11,0.019595004033321258,0.062166427850723266,0.992550911039657,0.984,0.9997504003496885,0.9977739999999999,0.9925465172395302,0.9840637450199203
+ 12,0.018815073947095796,0.053746583104133605,0.9929796355841372,0.99,0.9997343002036786,0.999192,0.992978883052846,0.9899799599198397
+ 13,0.019448585342915305,0.08425082373619079,0.9929260450160772,0.984,0.9997465720416916,0.9977720000000001,0.9929294552466656,0.9839034205231388
+ 14,0.011912179231745493,0.06962954658264062,0.9956323687031082,0.985,0.9999023422467143,0.9982420000000001,0.9956324857318936,0.9849246231155779
+ 15,0.010103805389343625,0.061363389253616334,0.9965434083601287,0.989,0.9999240972602755,0.998372,0.9965425745758624,0.9889447236180905
+ 16,0.008275224358574083,0.05774303518238594,0.9969453376205788,0.991,0.9999478827417694,0.9984360000000001,0.9969451739107134,0.9909729187562688
+ 17,0.008736756973947815,0.0483376273214817,0.9969721329046088,0.99,0.9999491435273737,0.9985139999999999,0.9969712401833338,0.99
+ 18,0.0076623801350851105,0.04974857234954834,0.9971061093247588,0.99,0.9999615905542746,0.99844,0.9971067295327904,0.99
+ 19,0.00820424718388515,0.055063622951536675,0.9970257234726688,0.992,0.9999551243611351,0.998472,0.997025643773949,0.9919678714859438
+ 20,0.007627792189001043,0.049032895147800445,0.9970525187566989,0.99,0.9999623789042711,0.998566,0.9970514126413982,0.9900199600798403
+ 21,0.006668012922197742,0.04988277526572347,0.997534833869239,0.99,0.9999717730494009,0.998564,0.9975352301344907,0.99
+ 22,0.005706650657029792,0.05048745417594819,0.9976956055734191,0.989,0.999980499466392,0.998592,0.9976965931004929,0.988988988988989
+ 23,0.00603126482562674,0.05044751685857773,0.9977224008574491,0.99,0.999977414993182,0.9985660000000001,0.9977222177559826,0.99
+ 24,0.006448185958810093,0.05043740690127015,0.9978295819935691,0.991,0.9999700197245457,0.998554,0.9978294075086421,0.991008991008991
+ 25,0.0067558092912254775,0.05029597575962543,0.997668810289389,0.99,0.9999410776586493,0.998568,0.9976688727526057,0.99
+ 26,0.006298534143184554,0.04814575481414795,0.997508038585209,0.991,0.999974746952575,0.9985980000000001,0.9975078382506632,0.991008991008991
+ 27,0.004799275206745575,0.04999193596467376,0.9980975348338692,0.991,0.9999868723211895,0.998586,0.9980974838554088,0.991008991008991
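The per-epoch log above (identical in content to `training_metrics.csv` below) is enough to re-plot the training curves. A minimal sketch with pandas and matplotlib (assumed installed):

```python
import matplotlib.pyplot as plt
import pandas as pd

df = pd.read_csv("training_curves/DeiT3_metrics.csv")

fig, axes = plt.subplots(2, 2, figsize=(10, 8))
for ax, metric in zip(axes.flat, ["loss", "accuracy", "auc", "f1"]):
    ax.plot(df["epoch"], df[f"train_{metric}"], label="train")
    ax.plot(df["epoch"], df[f"val_{metric}"], label="validation")
    ax.set_xlabel("epoch")
    ax.set_title(metric)
    ax.legend()
fig.tight_layout()
plt.show()
```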
training_metrics.csv ADDED
@@ -0,0 +1,28 @@
+ epoch,train_loss,val_loss,train_accuracy,val_accuracy,train_auc,val_auc,train_f1,val_f1
+ 1,0.13692867085577207,0.07421052139997482,0.9407556270096463,0.978,0.9887223029239888,0.997632,0.9407572144369122,0.9783464566929134
+ 2,0.048061912743991596,0.05094329964928329,0.9823954983922829,0.982,0.9984246124879234,0.9985300000000001,0.9823921957494707,0.9818548387096774
+ 3,0.034636192352861354,0.04595610737428069,0.9877813504823151,0.983,0.9991717887876814,0.99875,0.9877820052515942,0.9829488465396189
+ 4,0.029761615775262044,0.06541032767249272,0.9893354769560557,0.985,0.9993711192789341,0.998534,0.9893377625375054,0.9849548645937813
+ 5,0.030002773200967305,0.051245969947427514,0.9892818863879957,0.984,0.999385977307008,0.99814,0.9892824607470125,0.9839679358717435
+ 6,0.02682357684221011,0.04745743262767792,0.9903536977491961,0.985,0.9995302841276569,0.9979460000000001,0.9903516295025729,0.9850448654037887
+ 7,0.022328507840345912,0.04253761661052704,0.9917738478027867,0.988,0.9996601780723249,0.9993879999999999,0.9917762717312689,0.9879518072289156
+ 8,0.023209826517969944,0.04362523984909058,0.9920685959271168,0.989,0.9995784854834466,0.9991639999999999,0.9920719948575102,0.9890329012961117
+ 9,0.01983714324614584,0.07314276767149568,0.9930064308681672,0.984,0.9997082315227417,0.997778,0.9930054937692617,0.9839679358717435
+ 10,0.023987467304902255,0.05695145654678345,0.9913183279742765,0.986,0.9995943573095121,0.998852,0.9913146043319752,0.9860557768924303
+ 11,0.019595004033321258,0.062166427850723266,0.992550911039657,0.984,0.9997504003496885,0.9977739999999999,0.9925465172395302,0.9840637450199203
+ 12,0.018815073947095796,0.053746583104133605,0.9929796355841372,0.99,0.9997343002036786,0.999192,0.992978883052846,0.9899799599198397
+ 13,0.019448585342915305,0.08425082373619079,0.9929260450160772,0.984,0.9997465720416916,0.9977720000000001,0.9929294552466656,0.9839034205231388
+ 14,0.011912179231745493,0.06962954658264062,0.9956323687031082,0.985,0.9999023422467143,0.9982420000000001,0.9956324857318936,0.9849246231155779
+ 15,0.010103805389343625,0.061363389253616334,0.9965434083601287,0.989,0.9999240972602755,0.998372,0.9965425745758624,0.9889447236180905
+ 16,0.008275224358574083,0.05774303518238594,0.9969453376205788,0.991,0.9999478827417694,0.9984360000000001,0.9969451739107134,0.9909729187562688
+ 17,0.008736756973947815,0.0483376273214817,0.9969721329046088,0.99,0.9999491435273737,0.9985139999999999,0.9969712401833338,0.99
+ 18,0.0076623801350851105,0.04974857234954834,0.9971061093247588,0.99,0.9999615905542746,0.99844,0.9971067295327904,0.99
+ 19,0.00820424718388515,0.055063622951536675,0.9970257234726688,0.992,0.9999551243611351,0.998472,0.997025643773949,0.9919678714859438
+ 20,0.007627792189001043,0.049032895147800445,0.9970525187566989,0.99,0.9999623789042711,0.998566,0.9970514126413982,0.9900199600798403
+ 21,0.006668012922197742,0.04988277526572347,0.997534833869239,0.99,0.9999717730494009,0.998564,0.9975352301344907,0.99
+ 22,0.005706650657029792,0.05048745417594819,0.9976956055734191,0.989,0.999980499466392,0.998592,0.9976965931004929,0.988988988988989
+ 23,0.00603126482562674,0.05044751685857773,0.9977224008574491,0.99,0.999977414993182,0.9985660000000001,0.9977222177559826,0.99
+ 24,0.006448185958810093,0.05043740690127015,0.9978295819935691,0.991,0.9999700197245457,0.998554,0.9978294075086421,0.991008991008991
+ 25,0.0067558092912254775,0.05029597575962543,0.997668810289389,0.99,0.9999410776586493,0.998568,0.9976688727526057,0.99
+ 26,0.006298534143184554,0.04814575481414795,0.997508038585209,0.991,0.999974746952575,0.9985980000000001,0.9975078382506632,0.991008991008991
+ 27,0.004799275206745575,0.04999193596467376,0.9980975348338692,0.991,0.9999868723211895,0.998586,0.9980974838554088,0.991008991008991