Unverified commit 3349722d authored by i-robot, committed by Gitee

!3728 [Northeastern University][University Contribution][Mindspore][yolov3_tiny] - Transfer learning model submission

Merge pull request !3728 from 冷情灬/yolov3_tiny
parents c2ab2515 4cf4ce59
@@ -44,7 +44,8 @@ YOLOv3-tiny is a lightweight variant of YOLOv3 that uses pooling layers and fewer convolutional layers
# Dataset
Dataset used:
[COCO 2017](<http://images.cocodataset.org/>)
- Dataset size: 19 GB
- Training set: 18 GB, 118,000 images
@@ -53,6 +54,13 @@ YOLOv3-tiny is a lightweight variant of YOLOv3 that uses pooling layers and fewer convolutional layers
- Data format: images and JSON files
- Annotations: data are processed in dataset.py.
[face-mask-detection](<https://www.kaggle.com/datasets/andrewmvd/face-mask-detection>) (used for transfer learning)
- Dataset size: 397.65 MB, 853 color images in 3 classes
- Data format: RGB images
- Note: data will be processed in src/dataset.py
- Dataset
1. The directory structure is as follows:
@@ -431,4 +439,120 @@ YOLOv3-tiny applied to 5000 images (annotations and data format must be the same as COCO val 20
# ModelZoo Homepage
Please visit the official [homepage](https://gitee.com/mindspore/models).
## [Transfer Learning](#content)
### [Transfer Learning Training Process](#content)
#### Dataset Preparation
[Dataset download link](https://www.kaggle.com/datasets/andrewmvd/face-mask-detection)
After downloading the dataset, extract it into the dataset directory of yolov3_tiny. Then go to the src directory, use the data_split script to split out an 80% training set and a 20% test set, and use the voc script to generate the JSON label files for the training and test sets.
```bash
# Example: run the split and conversion scripts
cd src
python data_split.py
python voc.py
```
```text
Dataset structure
└─dataset
├─train
├─val
├─annotations
├─images
├─facemask
├─annotations_json
```
```text
Before training, configure the facemask dataset path in the yaml file
# your dataset dir
dataset_root: /home/mindspore/yolov3_tiny/dataset/
```
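Before training, you can optionally check that the converted annotations load correctly. A minimal sketch using pycocotools, assuming the default paths produced by data_split.py and voc.py above:

```python
# Optional sanity check (sketch): verify the generated COCO-style JSON files load.
# Paths assume the default layout created by data_split.py and voc.py above.
from pycocotools.coco import COCO

for split in ("train", "val"):
    coco = COCO(f"./dataset/annotations_json/{split}.json")
    cats = [c["name"] for c in coco.loadCats(coco.getCatIds())]
    print(f"{split}: {len(coco.getImgIds())} images, {len(coco.getAnnIds())} boxes, classes: {cats}")
```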
#### Transfer Learning Training
First download the pretrained ckpt from [Mindspore Hub](https://www.mindspore.cn/resources/hub/details?MindSpore/1.8/yolov3tiny_coco2017).
```text
# Set the ckpt of the pretrained model in finetune_config.yaml
finetune_path: "/home/mindspore/yolov3_tiny/LoadPretrainedModel/yolov3tiny_ascend_v180_coco2017_research_cv_mAP17.5_AP50acc36.0.ckpt"
```
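When `finetune: 1` is set, train.py loads this checkpoint itself and drops the two detection-head layers, whose output channels change from 255 (COCO, 80 classes) to 24 (3 classes, see `out_channel` in finetune_config.yaml). A simplified sketch of that loading logic (the helper name `load_pretrained_without_heads` is illustrative):

```python
import mindspore as ms

def load_pretrained_without_heads(network, ckpt_path):
    """Sketch: load a COCO-pretrained checkpoint, skipping the detection heads."""
    param_dict = ms.load_checkpoint(ckpt_path)
    filtered = {}
    for name, value in param_dict.items():
        if name.startswith("moments."):           # optimizer state is not needed for fine-tuning
            continue
        if name.startswith("yolo_network."):      # strip the training-wrapper prefix
            name = name[len("yolo_network."):]
        if "feature_map.head1." in name or "feature_map.head2." in name:
            continue                              # head shapes (255 vs. 24 channels) do not match
        filtered[name] = value
    ms.load_param_into_net(network, filtered)
```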
```bash
# Run the transfer learning training script
python train.py --config_path './config/finetune_config.yaml'
```
**Results**
The training results are stored in the example path. Checkpoints are saved under `./outputs/%Y-%m-%d_time_%H_%M_%S/ckpt_0`; a sample of the training loss output follows:
```text
2022-09-28 00:12:14,528:INFO:epoch[0], iter[0], loss:7826.489258, fps:0.38 imgs/sec, epoch time:84333.98 ms, per step time:84333.98 ms, lr:7.142857043618278e-07
2022-09-28 00:12:25,968:INFO:epoch[1], iter[21], loss:1231.710783, fps:58.77 imgs/sec, epoch time:11435.29 ms, per step time:544.54 ms, lr:1.5714285837020725e-05
2022-09-28 00:12:36,381:INFO:epoch[2], iter[42], loss:185.153160, fps:64.54 imgs/sec, epoch time:10411.45 ms, per step time:495.78 ms, lr:3.0714287277078256e-05
2022-09-28 00:12:46,107:INFO:epoch[3], iter[63], loss:150.027618, fps:69.12 imgs/sec, epoch time:9722.47 ms, per step time:462.97 ms, lr:4.571428507915698e-05
2022-09-28 00:12:55,684:INFO:epoch[4], iter[84], loss:128.305368, fps:70.22 imgs/sec, epoch time:9569.61 ms, per step time:455.70 ms, lr:6.071428651921451e-05
2022-09-28 00:13:05,645:INFO:epoch[5], iter[105], loss:123.486393, fps:67.46 imgs/sec, epoch time:9960.92 ms, per step time:474.33 ms, lr:7.571428432129323e-05
...
...
```
#### Transfer Learning Inference
```bash
# Run the transfer learning evaluation script
python eval.py --config_path './config/finetune_config.yaml'
```
**Results**
```text
=============coco eval reulst=========
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.476
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=100 ] = 0.769
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=100 ] = 0.560
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.366
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.642
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.588
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets= 1 ] = 0.297
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets= 10 ] = 0.512
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.548
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.461
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.680
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.601
======================================
mAP: 0.47645057246310263
2022-09-27 23:31:00,800:INFO:testing cost time 0.01h
```
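The final `mAP` line is simply the first entry of COCOeval's stats vector (AP @ IoU=0.50:0.95). A minimal sketch of how such a score can be recomputed from the ground-truth and prediction files (paths illustrative):

```python
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

# Paths are illustrative; point them at the val.json written by voc.py and the
# predictions.json written by eval.py for your run.
coco_gt = COCO("./dataset/annotations_json/val.json")
coco_dt = coco_gt.loadRes("./predictions.json")

coco_eval = COCOeval(coco_gt, coco_dt, "bbox")
coco_eval.evaluate()
coco_eval.accumulate()
coco_eval.summarize()
print("mAP:", coco_eval.stats[0])  # stats[0] = AP @ IoU=0.50:0.95, area=all, maxDets=100
```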
#### Transfer Learning quick_start
Running the eval script generates a `predictions.json` file. Set the path of this `predictions.json` in finetune_config.yaml as shown below, then run quick_start:
```text
predict_path: "/home/mindspore/yolov3_tiny/outputs/%Y-%m-%d_time_%H_%M_%S/predict_%Y-%m-%d_time_%H_%M_%S.json"
```
```bash
# Example: run the quick_start script
python quick_start.py --config_path './config/finetune_config.yaml'
```
**Result description**
The box colors in the figure have the following meanings:
- Light blue: ground-truth with_mask
- Orange: ground-truth without_mask
- Dark green: ground-truth mask_weared_incorrect
- Blue: predicted with_mask
- Red: predicted without_mask
- Green: predicted mask_weared_incorrect
# Builtin Configurations(DO NOT CHANGE THESE CONFIGURATIONS unless you know exactly what you are doing)
enable_modelarts: False
# Url for modelarts
data_url: ""
train_url: ""
checkpoint_url: ""
# Path for local
data_path: "/cache/data"
output_path: "/cache/train"
load_path: "/cache/checkpoint_path"
device_target: "CPU" # ['Ascend', 'GPU', 'CPU']
need_modelarts_dataset_unzip: True
modelarts_dataset_unzip_name: "coco2017"
# ==============================================================================
# Training options
finetune: 1
# dataset related
data_dir: "./dataset/"
finetune_path: "./LoadPretrainedModel/yolov3tiny_ascend_v180_coco2017_research_cv_mAP17.5_AP50acc36.0.ckpt"
predict_path: "./outputs/2022-09-19_time_03_45_17/predict_2022_09_19_03_45_36.json"
per_batch_size: 32
# network related
pretrained_backbone: ""
resume_yolo: ""
resume_epoch: 0
# optimizer and lr related
lr_scheduler: "cosine_annealing"
lr: 0.0003
lr_epochs: "220,250"
lr_gamma: 0.1
eta_min: 0.0
t_max: 300
max_epoch: 300
warmup_epochs: 20
weight_decay: 0.016
momentum: 0.9
# loss related
loss_scale: 1
label_smooth: 0
label_smooth_factor: 0.1
# logging related
log_interval: 21
ckpt_path: "./outputs/"
ckpt_interval: -1
is_save_on_master: 1
# distributed related
is_distributed: 0
rank: 0
group_size: 1
# profiler init
need_profiler: 0
# reset default config
training_shape: 640
# Eval option
pretrained: "./outputs/2022-09-19_time_02_05_24/ckpt_0/yolov3_tiny-300_6300.ckpt"
log_path: "./outputs/"
nms_thresh: 0.5
ann_file: ""
testing_shape: 640
eval_ignore_threshold: 0.001
result_path: ""
img_path: ""
# Export option
device_id: 0
batch_size: 1
ckpt_file: ""
file_name: "yolov3_tiny"
file_format: "MINDIR" # ["AIR", "ONNX", "MINDIR"]
keep_detect: True
# convert weight option
input_file: ""
output_file: ""
# Other default config
num_parallel_workers: 8
hue: 0.015
saturation: 1.5
value: 0.4
jitter: 0.3
resize_rate: 10
multi_scale: [[320, 320],
[352, 352],
[384, 384],
[416, 416],
[448, 448],
[480, 480],
[512, 512],
[544, 544],
[576, 576],
[608, 608]
]
num_classes: 3
out_channel: 24 #3 * (num_classes + 5)
max_box: 90
coord_scale: 4
large_scale: 4
# confidence under ignore_threshold means no object when training
ignore_threshold: 0.7
# h->w
anchor_scales: [[10, 14],
[23, 27],
[37, 58],
[81, 82],
[135, 169],
[344, 319]]
# test_param
test_img_shape: [640, 640]
multi_label: True
multi_label_thresh: 0.15
---
# Help description for each configuration
data_dir: "Train dataset directory."
per_batch_size: "Batch size for Training."
pretrained_backbone: "The ckpt file of backbone."
resume_yolo: "The ckpt file of YOLO, which used to fine tune."
lr_scheduler: "Learning rate scheduler, options: exponential, cosine_annealing."
lr: "Learning rate."
lr_epochs: "Epoch of changing of lr changing, split with ',' ."
lr_gamma: "Decrease lr by a factor of exponential lr_scheduler."
eta_min: "Eta_min in cosine_annealing scheduler."
t_max: "t-max in cosine_annealing scheduler."
max_epoch: "Max epoch num to train the model."
warmup_epochs: "Warmup epochs."
weight_decay: "Weight decay factor."
momentum: "Momentum."
loss_scale: "Static loss scale."
label_smooth: "Whether to use label smooth in CE."
label_smooth_factor: "Smooth strength of original one-hot."
log_interval: "Logging interval steps."
ckpt_path: "Checkpoint save location."
ckpt_interval: "Save checkpoint interval."
is_save_on_master: "Save ckpt on master or all rank, 1 for master, 0 for all ranks."
is_distributed: "Distribute train or not, 1 for yes, 0 for no."
rank: "Local rank of distributed."
group_size: "World size of device."
need_profiler: "Whether use profiler. 0 for no, 1 for yes."
training_shape: "Fix training shape."
resize_rate: "Resize rate for multi-scale training."
# eval option
pretrained: "model_path, local pretrained model to load."
log_path: "checkpoint save location."
nms_thresh: "threshold for NMS."
ann_file: "path to annotation."
testing_shape: "shape for test."
eval_ignore_threshold: "threshold to throw low quality boxes for eval."
multi_label: "whether to use multi label."
multi_label_thresh: "threshold to throw low quality boxes."
# export option
device_id: "Device id"
batch_size: "batch size"
ckpt_file: "Checkpoint file path."
file_name: "output file name."
file_format: "file format choices in ['AIR', 'ONNX', 'MINDIR']"
device_target: "device target. choices in ['Ascend', 'GPU'] for train. choices in ['Ascend', 'GPU', 'CPU'] for export."
keep_detect: "keep the detect module or not, default: True"
# convert weight option
input_file: "input file path."
output_file: "output file path."
(This commit also adds the 853 face-mask-detection dataset images, maksssksksss0 through maksssksksss852.)
@@ -20,12 +20,11 @@ import time
from collections import defaultdict
import numpy as np
import mindspore as ms
from mindspore import Tensor
from mindspore import context
from mindspore import dtype as mstype
from mindspore.context import ParallelMode
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval
@@ -57,16 +56,19 @@ class DetectionEngine:
def __init__(self, config_detection):
self.eval_ignore_threshold = config_detection.eval_ignore_threshold
if config.finetune:
self.labels = ['without_mask', 'with_mask', 'mask_weared_incorrect']
else:
self.labels = ['person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat',
'traffic light', 'fire hydrant', 'stop sign', 'parking meter', 'bench', 'bird', 'cat',
'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack',
'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sports ball',
'kite', 'baseball bat', 'baseball glove', 'skateboard', 'surfboard', 'tennis racket',
'bottle', 'wine glass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple',
'sandwich', 'orange', 'broccoli', 'carrot', 'hot dog', 'pizza', 'donut', 'cake', 'chair',
'couch', 'potted plant', 'bed', 'dining table', 'toilet', 'tv', 'laptop', 'mouse', 'remote',
'keyboard', 'cell phone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book',
'clock', 'vase', 'scissors', 'teddy bear', 'hair drier', 'toothbrush']
self.num_classes = len(self.labels)
self.results = {}
self.file_path = ''
@@ -193,6 +195,9 @@ class DetectionEngine:
stdout = sys.stdout
sys.stdout = rdct
coco_eval.summarize()
mAP = coco_eval.stats[0]
print("\n======================================\n")
print(f"mAP: {mAP}")
sys.stdout = stdout
return rdct.content
@@ -341,8 +346,12 @@ def modelarts_pre_process():
def run_test():
"""The function of eval."""
start_time = time.time()
if config.finetune:
config.data_root = os.path.join(config.data_dir, 'val/images')
config.ann_file = os.path.join(config.data_dir, 'annotations_json/val.json')
else:
config.data_root = os.path.join(config.data_dir, 'val2017')
config.ann_file = os.path.join(config.data_dir, 'annotations/instances_val2017.json')
device_id = int(os.getenv('DEVICE_ID')) if os.getenv('DEVICE_ID') else 0
# device_id = 1
@@ -363,7 +372,7 @@ def run_test():
config.logger.info(config.pretrained)
if os.path.isfile(config.pretrained):
param_dict = ms.load_checkpoint(config.pretrained)
param_dict_new = {}
for key, values in param_dict.items():
if key.startswith('moments.'):
@@ -372,7 +381,7 @@ def run_test():
param_dict_new[key[13:]] = values
else:
param_dict_new[key] = values
ms.load_param_into_net(network, param_dict_new)
config.logger.info('load_model %s success', config.pretrained)
else:
config.logger.info('%s not exists or not a pre-trained file', config.pretrained)
# Copyright 2022 Huawei Technologies Co., Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ============================================================================
"""visualize for yolov3_tiny"""
import random
import cv2
import matplotlib.pyplot as plt
from pycocotools.coco import COCO
from model_utils.config import config
random.seed(11)
pred_blue = (0, 0, 255) # 'with_mask': 1
true_blue = (0, 191, 255)
pred_red = (255, 0, 0) # 'without_mask': 2,
true_red = (255, 140, 0)
pred_green = (0, 255, 0) # 'mask_weared_incorrect': 3
true_green = (0, 128, 0)
def visualize_model():
# load best ckpt to generate instances_val.json and predictions.json
dataset_dir = r'./dataset/val/images/'
ann_file = './dataset/annotations_json/val.json'
coco = COCO(ann_file)
catids = coco.getCatIds()
imgids = coco.getImgIds()
img_list = random.sample(imgids, 8)
coco_res = coco.loadRes(config.predict_path)
catids_res = coco_res.getCatIds()
for i in img_list:
img = coco.loadImgs(i)[0]
image = cv2.imread(dataset_dir + img['file_name'])
        image_res = image.copy()  # copy the array so prediction boxes are not drawn onto the ground-truth image
annids = coco.getAnnIds(imgIds=img['id'], catIds=catids, iscrowd=None)
annos = coco.loadAnns(annids)
annids_res = coco_res.getAnnIds(imgIds=img['id'], catIds=catids_res, iscrowd=None)
annos_res = coco_res.loadAnns(annids_res)
plt.figure(figsize=(6, 6))
for anno in annos:
bbox = anno['bbox']
x, y, w, h = bbox
if anno['category_id'] == 1: # with_mask
anno_image = cv2.rectangle(image, (int(x), int(y)), (int(x + w), int(y + h)), true_blue, 2)
elif anno['category_id'] == 2: # without_mask
anno_image = cv2.rectangle(image, (int(x), int(y)), (int(x + w), int(y + h)), true_red, 2)
else: # mask_weared_incorrect
anno_image = cv2.rectangle(image, (int(x), int(y)), (int(x + w), int(y + h)), true_green, 2)
plt.subplot(1, 2, 1)
plt.title('true')
plt.imshow(anno_image)
for anno_res in annos_res:
bbox_res = anno_res['bbox']
x, y, w, h = bbox_res
if anno_res['category_id'] == 1:
res_image = cv2.rectangle(image_res, (int(x), int(y)), (int(x + w), int(y + h)), pred_blue, 2)
elif anno_res['category_id'] == 2:
res_image = cv2.rectangle(image_res, (int(x), int(y)), (int(x + w), int(y + h)), pred_red, 2)
else:
res_image = cv2.rectangle(image_res, (int(x), int(y)), (int(x + w), int(y + h)), pred_green, 2)
plt.subplot(1, 2, 2)
plt.title('pred')
plt.imshow(res_image)
plt.show()
if __name__ == '__main__':
visualize_model()
# Copyright 2022 Huawei Technologies Co., Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ============================================================================
"""data_split"""
import os
import shutil
image_original_path = '../dataset/images/'
label_original_path = '../dataset/annotations/'
train_image_path = '../dataset/train/images/'
train_label_path = '../dataset/train/annotations/'
val_image_path = '../dataset/val/images/'
val_label_path = '../dataset/val/annotations/'
def mkdir():
if not os.path.exists(train_image_path):
os.makedirs(train_image_path)
if not os.path.exists(train_label_path):
os.makedirs(train_label_path)
if not os.path.exists(val_image_path):
os.makedirs(val_image_path)
if not os.path.exists(val_label_path):
os.makedirs(val_label_path)
def main():
mkdir()
with open("../dataset/facemask/train.txt", 'r') as f:
for line in f:
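            # note: the source images are .png; they are copied unchanged under a .jpg name to match
            # the file names written by voc.py (image readers detect the format from the file content)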
dst_train_image = train_image_path + line[:-1] + '.jpg'
dst_train_label = train_label_path + line[:-1] + '.xml'
shutil.copyfile(image_original_path + line[:-1] + '.png', dst_train_image)
shutil.copyfile(label_original_path + line[:-1] + '.xml', dst_train_label)
with open("../dataset/facemask/val.txt", 'r') as f:
for line in f:
dst_val_image = val_image_path + line[:-1] + '.jpg'
dst_val_label = val_label_path + line[:-1] + '.xml'
shutil.copyfile(image_original_path + line[:-1] + '.png', dst_val_image)
shutil.copyfile(label_original_path + line[:-1] + ".xml", dst_val_label)
if __name__ == '__main__':
main()
# Copyright 2022 Huawei Technologies Co., Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ============================================================================
"""xml covert to coco"""
import glob
import json
import os
import xml.etree.ElementTree as ET
import numpy as np
define_categories = {'with_mask': 1, 'without_mask': 2, 'mask_weared_incorrect': 3}
anno_train_dir = "../dataset/train/annotations"
anno_val_dir = "../dataset/val/annotations"
save_dir = "../dataset/annotations_json"
if not os.path.exists(save_dir):
os.makedirs(save_dir)
def convert(xml_list, json_file):
json_dict = {"images": [], "type": "instances", "annotations": [], "categories": []}
categories = define_categories.copy()
bnd_id = 1
for line in xml_list:
xml_f = line
tree = ET.parse(xml_f)
root = tree.getroot()
filename = root.find('filename').text[:-4] + '.jpg'
        image_id = int(filename[12:-4])  # the 'maksssksksss' prefix is 12 characters; the rest of the stem is the numeric image id
size = root.find('size')
width = int(size.find('width').text)
height = int(size.find('height').text)
image = {'file_name': filename, 'height': height, 'width': width, 'id': image_id}
json_dict['images'].append(image)
for obj in root.findall('object'):
category = obj.find('name').text
if category not in categories:
continue
category_id = categories[category]
bndbox = obj.find('bndbox')
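            # PASCAL VOC boxes are 1-based; subtract 1 to convert to 0-based pixel coordinates for COCO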
xmin = int(float(bndbox.find('xmin').text)) - 1
ymin = int(float(bndbox.find('ymin').text)) - 1
xmax = int(float(bndbox.find('xmax').text)) - 1
ymax = int(float(bndbox.find('ymax').text)) - 1
assert (xmax > xmin), "xmax <= xmin, {}".format(line)
assert (ymax > ymin), "ymax <= ymin, {}".format(line)
o_width = abs(xmax - xmin)
o_height = abs(ymax - ymin)
ann = {'area': o_width * o_height,
'iscrowd': 0,
'image_id': image_id,
'bbox': [xmin, ymin, o_width, o_height],
'category_id': category_id,
'id': bnd_id,
'segmentation': []} # Currently we do not support segmentation
json_dict['annotations'].append(ann)
bnd_id = bnd_id + 1
for cate_name, cid in categories.items():
cat = {'supercategory': 'none',
'id': cid,
'name': cate_name}
json_dict['categories'].append(cat)
json_fp = open(json_file, 'w')
json_str = json.dumps(json_dict)
json_fp.write(json_str)
json_fp.close()
print("------------create {} done--------------".format(json_file))
print("category: id --> {}".format(categories))
print(categories.keys())
print(categories.values())
if __name__ == '__main__':
anno_train_list = glob.glob(anno_train_dir + "/*.xml")
anno_train_list = np.sort(anno_train_list)
anno_val_list = glob.glob(anno_val_dir + "/*.xml")
anno_val_list = np.sort(anno_val_list)
# save json files
save_anno_train = save_dir + "/train.json"
save_anno_val = save_dir + "/val.json"
convert(anno_train_list, save_anno_train)
convert(anno_val_list, save_anno_val)
@@ -47,8 +47,8 @@ class YOLOv3Tiny(nn.Cell):
super(YOLOv3Tiny, self).__init__()
self.backbone = backbone
self.head1 = nn.Conv2d(512, config.out_channel, kernel_size=1, stride=1, has_bias=True)
self.head2 = nn.Conv2d(256, config.out_channel, kernel_size=1, stride=1, has_bias=True)
self.concat = P.Concat(axis=1)
def construct(self, x):
@@ -55,11 +55,12 @@ def set_default():
config.t_max = config.max_epoch
config.lr_epochs = list(map(int, config.lr_epochs.split(',')))
config.data_val_root = os.path.join(config.data_dir, 'val2017')
config.ann_val_file = os.path.join(config.data_dir, 'annotations/instances_val2017.json')
if config.finetune:
config.data_root = os.path.join(config.data_dir, 'train/images')
config.ann_file = os.path.join(config.data_dir, 'annotations_json/train.json')
else:
config.data_root = os.path.join(config.data_dir, 'train2017')
config.ann_file = os.path.join(config.data_dir, 'annotations/instances_train2017.json')
context.set_context(mode=context.GRAPH_MODE, device_target=config.device_target, device_id=get_device_id())
@@ -183,7 +184,28 @@ def prepare_network():
network = YOLOv3(is_training=True)
# default is kaiming-normal
default_recurisive_init(network)
if config.finetune:
param_dict = ms.load_checkpoint(config.finetune_path)
param_dict_new = {}
for key, values in param_dict.items():
if key.startswith('moments.'):
continue
elif key.startswith('yolo_network.'):
param_dict_new[key[13:]] = values
else:
param_dict_new[key] = values
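        # drop the two detection-head layers: their output channels change from 255 (COCO, 80 classes)
        # to 24 (3 classes), so the pretrained shapes no longer match the new network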
for key in list(param_dict_new.keys()):
if 'feature_map.head1.weight' in key:
del param_dict_new[key]
if 'feature_map.head1.bias' in key:
del param_dict_new[key]
if 'feature_map.head2.weight' in key:
del param_dict_new[key]
if 'feature_map.head2.bias' in key:
del param_dict_new[key]
ms.load_param_into_net(network, param_dict_new)
else:
load_yolo_params(config, network)
network = YOLOWithLossCell(network)
return network
@@ -235,10 +257,10 @@ def run_train():
if config.rank_save_ckpt_flag:
# checkpoint save
ckpt_max_num = 5 * config.steps_per_epoch // config.ckpt_interval
ckpt_config = CheckpointConfig(save_checkpoint_steps=config.ckpt_interval, keep_checkpoint_max=ckpt_max_num)
save_ckpt_path = os.path.join(config.outputs_dir, f'ckpt_{config.rank}/')
ckpt_cb = ModelCheckpoint(config=ckpt_config, directory=save_ckpt_path, prefix='yolov3_tiny')
cb_params = _InternalCallbackParam()
cb_params.train_network = network
cb_params.epoch_num = ckpt_max_num