
Commit 51ff1f3 (1 parent: 62b27e9)
Commit message: example update

1 file changed: examples/example_patch_based_inference.ipynb (1 addition, 19 deletions)
@@ -148,7 +148,6 @@
 " conf=0.5,\n",
 " iou=0.7,\n",
 " classes_list=[0, 1, 2, 3, 5, 7],\n",
-" resize_initial_size=True,\n",
 ")\n",
 "result = CombineDetections(element_crops, nms_threshold=0.05)"
 ]
@@ -401,7 +400,6 @@
 " conf=0.5,\n",
 " iou=0.7,\n",
 " classes_list=[0, 1, 2, 3, 5, 7],\n",
-" resize_initial_size=True,\n",
 ")\n",
 "result = CombineDetections(element_crops, nms_threshold=0.05)\n",
 "\n",
@@ -480,7 +478,6 @@
 " conf=0.5,\n",
 " iou=0.7,\n",
 " classes_list=[0, 1, 2, 3, 5, 7],\n",
-" resize_initial_size=True,\n",
 ")\n",
 "result = CombineDetections(element_crops, nms_threshold=0.5)"
 ]
@@ -592,7 +589,6 @@
 " conf=0.5,\n",
 " iou=0.7,\n",
 " classes_list=[0, 1, 2, 3, 5, 7],\n",
-" resize_initial_size=True,\n",
 ")\n",
 "result = CombineDetections(element_crops, nms_threshold=0.5)"
 ]
@@ -751,7 +747,6 @@
 " conf=0.5,\n",
 " iou=0.8,\n",
 " classes_list=[0, 1, 2, 3, 5, 7],\n",
-" resize_initial_size=True,\n",
 ")\n",
 "result = CombineDetections(element_crops, nms_threshold=0.25)\n",
 "\n",
@@ -887,7 +882,6 @@
 " overlap_y=50,\n",
 " conf=0.3,\n",
 " iou=0.8,\n",
-" resize_initial_size=True,\n",
 ")\n",
 "result = CombineDetections(element_crops, nms_threshold=0.40)\n",
 "\n",
@@ -1032,7 +1026,6 @@
 " iou=0.7,\n",
 " imgsz=416,\n",
 " classes_list=[0, 1, 2, 3, 5, 7],\n",
-" resize_initial_size=True,\n",
 " memory_optimize=False,\n",
 " inference_extra_args={'retina_masks':True}\n",
 ")\n",
@@ -1233,7 +1226,6 @@
 " iou=0.7,\n",
 " imgsz=1024,\n",
 " classes_list=[0, 39, 40, 41, 42, 43, 44, 45, 56, 60],\n",
-" resize_initial_size=True,\n",
 " memory_optimize=False,\n",
 " inference_extra_args={'retina_masks': True}\n",
 " \n",
@@ -1279,16 +1271,7 @@
 "\n",
 "1. **Resolution-Based Analysis**: This mode evaluates the resolution of the source images to determine the optimal patch sizes and overlaps. It is faster but may not yield the highest quality results because it does not take into account the actual objects present in the images.\n",
 "\n",
-"2. **Neural Network-Based Analysis**: This advanced mode employs a neural network to analyze the images. The algorithm performs a standard inference of the network on the entire image and identifies the largest detected objects. Based on the sizes of these objects, the algorithm selects patch parameters to ensure that the largest objects are fully contained within a patch, and overlapping patches ensure comprehensive coverage. In this mode, it is necessary to input the model that will be used for patch-based inference in the subsequent steps.\n",
-"\n",
-"Possible arguments of the ```auto_calculate_crop_values``` function:\n",
-"| **Argument** | **Type** | **Default** | **Description** |\n",
-"|-----------------------|------------------------|--------------|----------------------------------------------------------------------------------------------------------------|\n",
-"| image | np.ndarray | | The input image in BGR format. |\n",
-"| mode | str | \"network_based\" | The type of analysis to perform. Can be \"resolution_based\" for Resolution-Based Analysis or \"network_based\" for Neural Network-Based Analysis.|\n",
-"| model | ultralytics model | YOLO(\"yolov8m.pt\") | Pre-initialized model object for \"network_based\" mode. If not provided, the default YOLOv8m model will be used.|\n",
-"| classes_list | list | None | A list of class indices to consider for object detection in \"network_based\" mode. If None, all classes will be considered. |\n",
-"| conf | float | 0.25 | The confidence threshold for detection in \"network_based\" mode. |"
+"2. **Neural Network-Based Analysis**: This advanced mode employs a neural network to analyze the images. The algorithm performs a standard inference of the network on the entire image and identifies the largest detected objects. Based on the sizes of these objects, the algorithm selects patch parameters to ensure that the largest objects are fully contained within a patch, and overlapping patches ensure comprehensive coverage. In this mode, it is necessary to input the model that will be used for patch-based inference in the subsequent steps."
 ]
 },
 {
@@ -1364,7 +1347,6 @@
 " conf=0.5,\n",
 " iou=0.7,\n",
 " classes_list=[0, 1, 2, 3, 5, 7],\n",
-" resize_initial_size=True,\n",
 ")\n",
 "result = CombineDetections(element_crops, nms_threshold=0.35, sorter_bins=4)"
 ]
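The notebook cells touched above all lay a grid of overlapping patches over the image (the overlap_x/overlap_y arguments, given as a percentage of the patch size) before running inference on each patch. As a rough illustration of how such a grid can be computed, here is a minimal sketch; the patch_origins helper and its edge-snapping rule are assumptions for illustration, not the library's actual code:

```python
def patch_origins(size: int, patch: int, overlap_pct: int) -> list[int]:
    """Return start offsets of patches along one image axis.

    overlap_pct is the overlap between neighbouring patches as a
    percentage of the patch size (e.g. 50 -> half-patch stride).
    """
    # Stride shrinks as overlap grows; never step by less than 1 px.
    stride = max(1, int(patch * (1 - overlap_pct / 100)))
    origins = list(range(0, max(size - patch, 0) + 1, stride))
    # Snap a final patch to the edge so the right/bottom is covered.
    if origins[-1] + patch < size:
        origins.append(size - patch)
    return origins

# A 1280-px axis with 640-px patches and 50 % overlap -> 320-px stride.
print(patch_origins(1280, 640, 50))  # [0, 320, 640]
```

With 50 % overlap every point of the axis is covered by at least two patches except at the borders, which is what lets CombineDetections later merge duplicate boxes with its nms_threshold.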

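The markdown cell edited in the last hunk describes how the network-based mode of ```auto_calculate_crop_values``` picks patch parameters so that the largest detected object is fully contained within a patch, with overlaps guaranteeing coverage. A minimal hypothetical sketch of that selection rule follows; the pick_patch_params name and the 2x / 50 % factors are assumptions for illustration, not the library's actual algorithm:

```python
def pick_patch_params(largest_w: int, largest_h: int) -> tuple[int, int, int]:
    """Choose (shape_x, shape_y, overlap_pct) from the largest
    detected object's width and height."""
    # Make the patch twice the object so the object fits with margin
    # (assumed factor, chosen for illustration).
    shape_x = 2 * largest_w
    shape_y = 2 * largest_h
    # An overlap at least as large as the object guarantees every
    # object lies entirely inside at least one patch.
    overlap_pct = 50
    return shape_x, shape_y, overlap_pct

# Largest detected object is 300x200 px.
print(pick_patch_params(300, 200))  # (600, 400, 50)
```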