---
library_name: transformers
license: apache-2.0
base_model: PekingU/rtdetr_v2_r101vd
tags:
- generated_from_trainer
model-index:
- name: rtdetr-v2-r101-cppe5-finetune-2
  results: []
---


# rtdetr-v2-r101-cppe5-finetune-2

This model is a fine-tuned version of [PekingU/rtdetr_v2_r101vd](https://huggingface.co/PekingU/rtdetr_v2_r101vd), most likely on the CPPE-5 dataset (inferred from the model name; the Trainer did not record the dataset).
It achieves the following results on the evaluation set (the best checkpoint, epoch 34 in the table below):
- Loss: 9.5258
- Map: 0.3781
- Map 50: 0.9125
- Map 75: 0.2023
- Map Small: 0.2722
- Map Medium: 0.4358
- Map Large: 0.4439
- Mar 1: 0.4279
- Mar 10: 0.6136
- Mar 100: 0.6664
- Mar Small: 0.6158
- Mar Medium: 0.701
- Mar Large: 0.6333
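
The `Map`/`Mar` entries above are COCO-style detection metrics: `Map` is mean average precision averaged over IoU thresholds 0.50:0.95, `Map 50` and `Map 75` are AP at fixed IoU 0.50 and 0.75, `Mar 1/10/100` are mean average recall with at most 1/10/100 detections per image, and the small/medium/large variants bucket objects by pixel area. Metrics with exactly these names can be computed with `torchmetrics`; the sketch below uses made-up boxes and is not the actual evaluation pipeline:

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# COCO-style mAP/mAR; compute() returns map, map_50, map_75, map_small,
# map_medium, map_large, mar_1, mar_10, mar_100, mar_small, ... keys
# matching the metric names reported in this card.
metric = MeanAveragePrecision(box_format="xyxy", iou_type="bbox")

preds = [{  # one dict per image; values here are illustrative only
    "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),
    "scores": torch.tensor([0.92]),
    "labels": torch.tensor([1]),
}]
targets = [{
    "boxes": torch.tensor([[12.0, 18.0, 108.0, 215.0]]),
    "labels": torch.tensor([1]),
}]

metric.update(preds, targets)
print(metric.compute())
```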

## Model description

This checkpoint fine-tunes RT-DETRv2, a real-time, end-to-end detection transformer that needs no NMS at inference, starting from the `PekingU/rtdetr_v2_r101vd` weights with a ResNet-101-vd backbone. Nothing beyond the base architecture is documented for this particular fine-tune.

## Intended uses & limitations

The model performs object detection. If the training data is indeed CPPE-5, it detects that dataset's five medical PPE categories (coveralls, face shields, gloves, goggles, masks) and should not be expected to generalize beyond that domain without further fine-tuning. Note the gap between `Map 50` (0.9125) and `Map 75` (0.2023): the model usually finds the right objects but localizes them loosely, so downstream uses that need tight boxes should validate carefully.
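
A minimal inference sketch using the standard `transformers` object-detection API; the repo id below is assumed from the card name and may need to be replaced with the actual Hub path:

```python
import requests
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Repo id assumed from the card name; replace with the actual Hub path.
model_id = "rtdetr-v2-r101-cppe5-finetune-2"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForObjectDetection.from_pretrained(model_id)

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw outputs to thresholded (score, label, box) detections in
# absolute pixel coordinates; target_sizes expects (height, width).
results = processor.post_process_object_detection(
    outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.5
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} {box.tolist()}")
```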

## Training and evaluation data

The dataset was not recorded by the Trainer. The model name points to CPPE-5, a medical PPE detection dataset with five categories. Note, however, that 313 optimizer steps per epoch at batch size 24 implies roughly 7,500 training samples per epoch, larger than the standard CPPE-5 train split (~1,000 images), so the data may have been augmented or extended.
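
Assuming the data is the public CPPE-5 dataset on the Hub, it can be loaded as follows (a sketch; this may not match the exact split or preprocessing used for this checkpoint):

```python
from datasets import load_dataset

# Assumes the public CPPE-5 dataset on the Hub; the actual training data
# for this checkpoint is not documented and may differ.
dataset = load_dataset("cppe-5")
print(dataset)                          # train/test splits
print(dataset["train"][0]["objects"])   # per-image bboxes and category ids
```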

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 50
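
These map onto the `Trainer` API roughly as follows, assuming the standard `TrainingArguments` were used; `output_dir` is a placeholder and other settings (eval/save strategy, mixed precision, etc.) are unknown:

```python
from transformers import TrainingArguments

# Sketch of the configuration listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="rtdetr-v2-r101-cppe5-finetune-2",
    learning_rate=5e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",          # AdamW with default betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    warmup_steps=300,
    num_train_epochs=50,
)
```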

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Map    | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1  | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|
| No log        | 1.0   | 313   | 18.7992         | 0.0102 | 0.037  | 0.0016 | 0.0193    | 0.023      | 0.0       | 0.0444 | 0.1863 | 0.2535  | 0.253     | 0.2555     | 0.0       |
| 112.163       | 2.0   | 626   | 14.0671         | 0.0806 | 0.265  | 0.0204 | 0.0711    | 0.0931     | 0.1998    | 0.1649 | 0.3478 | 0.4882  | 0.4476    | 0.5154     | 0.5667    |
| 112.163       | 3.0   | 939   | 13.0106         | 0.1549 | 0.5077 | 0.0356 | 0.1316    | 0.1788     | 0.2388    | 0.2393 | 0.4002 | 0.4952  | 0.4637    | 0.5172     | 0.4       |
| 12.4154       | 4.0   | 1252  | 12.8674         | 0.1861 | 0.578  | 0.0506 | 0.1487    | 0.2151     | 0.3809    | 0.2786 | 0.4329 | 0.5228  | 0.5074    | 0.5334     | 0.5       |
| 10.9002       | 5.0   | 1565  | 12.7759         | 0.1994 | 0.6122 | 0.0592 | 0.1562    | 0.2337     | 0.4435    | 0.2923 | 0.4511 | 0.5369  | 0.5092    | 0.5555     | 0.5667    |
| 10.9002       | 6.0   | 1878  | 12.9410         | 0.2477 | 0.6845 | 0.1046 | 0.1891    | 0.2878     | 0.4869    | 0.3295 | 0.4896 | 0.5643  | 0.5301    | 0.5872     | 0.6333    |
| 10.0597       | 7.0   | 2191  | 12.0453         | 0.2511 | 0.7021 | 0.092  | 0.1894    | 0.2916     | 0.4639    | 0.3144 | 0.4984 | 0.5683  | 0.5179    | 0.6024     | 0.6       |
| 9.4738        | 8.0   | 2504  | 11.6443         | 0.262  | 0.6911 | 0.1112 | 0.2077    | 0.3015     | 0.2426    | 0.3408 | 0.5317 | 0.5898  | 0.5452    | 0.6209     | 0.4667    |
| 9.4738        | 9.0   | 2817  | 11.4795         | 0.3091 | 0.7994 | 0.1487 | 0.2497    | 0.3488     | 0.3159    | 0.3764 | 0.5331 | 0.611   | 0.5664    | 0.6427     | 0.4       |
| 9.0887        | 10.0  | 3130  | 11.0460         | 0.3147 | 0.8058 | 0.1438 | 0.2394    | 0.365      | 0.469     | 0.3837 | 0.5331 | 0.5933  | 0.5301    | 0.6366     | 0.5333    |
| 9.0887        | 11.0  | 3443  | 10.8509         | 0.3038 | 0.7871 | 0.1337 | 0.214     | 0.3601     | 0.3414    | 0.3804 | 0.5429 | 0.5945  | 0.5435    | 0.6304     | 0.4       |
| 8.7187        | 12.0  | 3756  | 11.2568         | 0.2689 | 0.7366 | 0.1041 | 0.1987    | 0.3154     | 0.3722    | 0.3456 | 0.5245 | 0.5972  | 0.5402    | 0.6362     | 0.5667    |
| 8.5299        | 13.0  | 4069  | 10.6576         | 0.2818 | 0.7436 | 0.1189 | 0.2007    | 0.3285     | 0.2467    | 0.3433 | 0.5424 | 0.6164  | 0.5732    | 0.6466     | 0.5       |
| 8.5299        | 14.0  | 4382  | 10.3815         | 0.2688 | 0.6636 | 0.1296 | 0.1923    | 0.3198     | 0.395     | 0.3618 | 0.5586 | 0.6085  | 0.5518    | 0.6482     | 0.4333    |
| 8.2714        | 15.0  | 4695  | 10.2251         | 0.2846 | 0.7427 | 0.1324 | 0.1992    | 0.3356     | 0.3412    | 0.3408 | 0.5538 | 0.6187  | 0.5554    | 0.6623     | 0.5333    |
| 8.0708        | 16.0  | 5008  | 10.4468         | 0.2912 | 0.7585 | 0.1371 | 0.188     | 0.3503     | 0.2716    | 0.3442 | 0.5606 | 0.6252  | 0.5839    | 0.6551     | 0.3333    |
| 8.0708        | 17.0  | 5321  | 10.0924         | 0.3174 | 0.8149 | 0.1384 | 0.2167    | 0.3764     | 0.4464    | 0.3772 | 0.5462 | 0.6038  | 0.5426    | 0.6462     | 0.5       |
| 7.8789        | 18.0  | 5634  | 10.1511         | 0.292  | 0.7808 | 0.1069 | 0.1984    | 0.3441     | 0.4159    | 0.3466 | 0.5514 | 0.6103  | 0.5554    | 0.6484     | 0.5       |
| 7.8789        | 19.0  | 5947  | 10.0809         | 0.2872 | 0.7423 | 0.1304 | 0.1846    | 0.3477     | 0.3738    | 0.3546 | 0.558  | 0.6196  | 0.556     | 0.6638     | 0.4667    |
| 7.7461        | 20.0  | 6260  | 9.7884          | 0.3327 | 0.8237 | 0.1515 | 0.2486    | 0.3814     | 0.4109    | 0.396  | 0.579  | 0.6268  | 0.586     | 0.6555     | 0.4667    |
| 7.6557        | 21.0  | 6573  | 9.9407          | 0.3006 | 0.7914 | 0.1273 | 0.1849    | 0.3632     | 0.442     | 0.3455 | 0.5555 | 0.6118  | 0.5673    | 0.6429     | 0.4667    |
| 7.6557        | 22.0  | 6886  | 9.7508          | 0.3278 | 0.8365 | 0.1451 | 0.2254    | 0.3855     | 0.4052    | 0.372  | 0.5611 | 0.6179  | 0.567     | 0.6524     | 0.6333    |
| 7.4889        | 23.0  | 7199  | 9.9115          | 0.3163 | 0.8201 | 0.1201 | 0.2166    | 0.3695     | 0.4521    | 0.3826 | 0.566  | 0.6277  | 0.5869    | 0.6561     | 0.5333    |
| 7.4078        | 24.0  | 7512  | 9.9078          | 0.3068 | 0.7968 | 0.1297 | 0.1912    | 0.3724     | 0.5057    | 0.3533 | 0.5677 | 0.6291  | 0.5634    | 0.6743     | 0.5333    |
| 7.4078        | 25.0  | 7825  | 9.9138          | 0.2985 | 0.7844 | 0.1291 | 0.1905    | 0.3604     | 0.499     | 0.3514 | 0.5645 | 0.6385  | 0.5744    | 0.683      | 0.5       |
| 7.3325        | 26.0  | 8138  | 9.6711          | 0.3194 | 0.8042 | 0.1664 | 0.2146    | 0.3795     | 0.3995    | 0.3785 | 0.5969 | 0.6546  | 0.5905    | 0.6992     | 0.5       |
| 7.3325        | 27.0  | 8451  | 10.1850         | 0.2888 | 0.7886 | 0.1142 | 0.1822    | 0.3504     | 0.4257    | 0.3229 | 0.5677 | 0.6307  | 0.5616    | 0.6781     | 0.5667    |
| 7.2804        | 28.0  | 8764  | 10.1133         | 0.3185 | 0.8264 | 0.1482 | 0.2248    | 0.3728     | 0.4663    | 0.3688 | 0.5832 | 0.6509  | 0.5872    | 0.6953     | 0.4667    |
| 7.2029        | 29.0  | 9077  | 9.5707          | 0.3228 | 0.8097 | 0.1547 | 0.2271    | 0.3774     | 0.4653    | 0.3957 | 0.6044 | 0.6565  | 0.6071    | 0.6913     | 0.4667    |
| 7.2029        | 30.0  | 9390  | 10.1563         | 0.3377 | 0.8745 | 0.1421 | 0.2499    | 0.3885     | 0.3663    | 0.3789 | 0.5962 | 0.648   | 0.5988    | 0.683      | 0.4       |
| 7.0825        | 31.0  | 9703  | 9.8796          | 0.3364 | 0.8697 | 0.1503 | 0.242     | 0.3886     | 0.4629    | 0.3754 | 0.606  | 0.6556  | 0.6006    | 0.6939     | 0.5       |
| 7.026         | 32.0  | 10016 | 9.8638          | 0.345  | 0.8834 | 0.1628 | 0.2468    | 0.4006     | 0.4779    | 0.3856 | 0.6067 | 0.6615  | 0.6149    | 0.6937     | 0.5667    |
| 7.026         | 33.0  | 10329 | 9.6067          | 0.3561 | 0.8764 | 0.1902 | 0.2453    | 0.4162     | 0.399     | 0.3958 | 0.608  | 0.6585  | 0.6074    | 0.6947     | 0.4       |
| 6.8795        | 34.0  | 10642 | 9.5258          | 0.3781 | 0.9125 | 0.2023 | 0.2722    | 0.4358     | 0.4439    | 0.4279 | 0.6136 | 0.6664  | 0.6158    | 0.701      | 0.6333    |
| 6.8795        | 35.0  | 10955 | 9.4449          | 0.368  | 0.9007 | 0.188  | 0.2706    | 0.4204     | 0.5       | 0.4066 | 0.6088 | 0.6685  | 0.6253    | 0.699      | 0.5       |
| 6.7461        | 36.0  | 11268 | 9.2644          | 0.3658 | 0.8989 | 0.1945 | 0.2671    | 0.419      | 0.4653    | 0.4136 | 0.6142 | 0.6635  | 0.6202    | 0.6941     | 0.4667    |
| 6.64          | 37.0  | 11581 | 9.2426          | 0.3645 | 0.9017 | 0.1894 | 0.2627    | 0.4184     | 0.3975    | 0.4108 | 0.6078 | 0.6507  | 0.5997    | 0.6866     | 0.4333    |
| 6.64          | 38.0  | 11894 | 9.3995          | 0.3721 | 0.899  | 0.1976 | 0.2764    | 0.4243     | 0.4664    | 0.4192 | 0.6066 | 0.6544  | 0.6068    | 0.6877     | 0.5       |
| 6.4988        | 39.0  | 12207 | 9.4093          | 0.3769 | 0.9078 | 0.1964 | 0.2918    | 0.4253     | 0.4664    | 0.4288 | 0.6064 | 0.6531  | 0.6021    | 0.6887     | 0.5       |
| 6.4018        | 40.0  | 12520 | 9.3955          | 0.3615 | 0.8954 | 0.1759 | 0.2773    | 0.4089     | 0.4663    | 0.4116 | 0.6028 | 0.6546  | 0.5982    | 0.6941     | 0.4667    |
| 6.4018        | 41.0  | 12833 | 9.3966          | 0.3711 | 0.8973 | 0.1989 | 0.2758    | 0.4238     | 0.4993    | 0.421  | 0.6102 | 0.6635  | 0.6152    | 0.6968     | 0.6       |
| 6.3285        | 42.0  | 13146 | 9.4656          | 0.3655 | 0.8967 | 0.1968 | 0.2785    | 0.4172     | 0.4694    | 0.4245 | 0.6163 | 0.6701  | 0.6131    | 0.7097     | 0.5333    |
| 6.3285        | 43.0  | 13459 | 9.4390          | 0.3728 | 0.9149 | 0.1933 | 0.2914    | 0.419      | 0.4663    | 0.4295 | 0.6202 | 0.6675  | 0.6187    | 0.7018     | 0.4667    |
| 6.2397        | 44.0  | 13772 | 9.3682          | 0.3636 | 0.8795 | 0.202  | 0.2785    | 0.4146     | 0.4664    | 0.428  | 0.6148 | 0.6627  | 0.6065    | 0.7018     | 0.5       |
| 6.1381        | 45.0  | 14085 | 9.2842          | 0.3692 | 0.8998 | 0.2041 | 0.279     | 0.4205     | 0.4663    | 0.4279 | 0.6116 | 0.6625  | 0.6074    | 0.7012     | 0.4667    |
| 6.1381        | 46.0  | 14398 | 9.2333          | 0.3686 | 0.8987 | 0.2006 | 0.2758    | 0.4211     | 0.4663    | 0.428  | 0.6155 | 0.6621  | 0.6095    | 0.699      | 0.4667    |
| 6.0162        | 47.0  | 14711 | 9.2259          | 0.366  | 0.9051 | 0.1986 | 0.2756    | 0.4155     | 0.4663    | 0.4245 | 0.6157 | 0.6641  | 0.6128    | 0.7002     | 0.4667    |
| 5.9406        | 48.0  | 15024 | 9.2568          | 0.3632 | 0.902  | 0.1941 | 0.2729    | 0.4133     | 0.4733    | 0.4198 | 0.6145 | 0.6623  | 0.606     | 0.7014     | 0.5333    |
| 5.9406        | 49.0  | 15337 | 9.2850          | 0.3635 | 0.904  | 0.196  | 0.2733    | 0.4147     | 0.479     | 0.4223 | 0.616  | 0.6669  | 0.6134    | 0.704      | 0.5333    |
| 5.8549        | 50.0  | 15650 | 9.2647          | 0.3663 | 0.8999 | 0.1997 | 0.2746    | 0.4177     | 0.4663    | 0.4232 | 0.6149 | 0.664   | 0.6107    | 0.7014     | 0.4667    |


### Framework versions

- Transformers 4.51.3
- PyTorch 2.7.0+cu126
- Datasets 3.6.0
- Tokenizers 0.21.1