Commit ffac767: "fix info new release"

1 parent 8606b53

3 files changed (16 additions & 11 deletions)
README.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -115,7 +115,7 @@ Class implementing cropping and passing crops through a neural network for detec
 | model | ultralytics model | None | Pre-initialized model object. If provided, the model will be used directly instead of loading from model_path. |
 | imgsz | int | 640 | Size of the input image for inference YOLO. |
 | conf | float | 0.25 | Confidence threshold for detections YOLO. |
-| iou | float | 0.7 | IoU threshold for non-maximum suppression YOLOv8 of single crop. |
+| iou | float | 0.7 | IoU threshold for non-maximum suppression YOLO of single crop. |
 | classes_list | List[int] or None | None | List of classes to filter detections. If None, all classes are considered. |
 | segment | bool | False | Whether to perform segmentation (if the model supports it). |
 | shape_x | int | 700 | Size of the crop in the x-coordinate. |
```

patched_yolo_infer/README.md

Lines changed: 3 additions & 3 deletions
```diff
@@ -27,7 +27,7 @@ __Check this Colab examples:__
 
 Patch-Based-Inference Example - [**Open in Colab**](https://colab.research.google.com/drive/1XCpIYLMFEmGSO0XCOkSD7CcD9SFHSJPA?usp=sharing)
 
-Example of using various functions for visualizing basic YOLOv8/v9 inference results - [**Open in Colab**](https://colab.research.google.com/drive/1eM4o1e0AUQrS1mLDpcgK9HKInWEvnaMn?usp=sharing)
+Example of using various functions for visualizing basic YOLO inference results - [**Open in Colab**](https://colab.research.google.com/drive/1eM4o1e0AUQrS1mLDpcgK9HKInWEvnaMn?usp=sharing)
 
 
 ## Usage
```
## Usage
```diff
@@ -89,9 +89,9 @@ Class implementing cropping and passing crops through a neural network for detec
 - **model** (*ultralytics model*) Pre-initialized model object. If provided, the model will be used directly instead of loading from model_path.
 - **imgsz** (*int*): Size of the input image for inference YOLO.
 - **conf** (*float*): Confidence threshold for detections YOLO.
-- **iou** (*float*): IoU threshold for non-maximum suppression YOLOv8 of single crop.
+- **iou** (*float*): IoU threshold for non-maximum suppression YOLO of single crop.
 - **classes_list** (*List[int] or None*): List of classes to filter detections. If None, all classes are considered. Defaults to None.
-- **segment** (*bool*): Whether to perform segmentation (YOLOv8-seg).
+- **segment** (*bool*): Whether to perform segmentation (YOLO-seg).
 - **shape_x** (*int*): Size of the crop in the x-coordinate.
 - **shape_y** (*int*): Size of the crop in the y-coordinate.
 - **overlap_x** (*float*): Percentage of overlap along the x-axis.
```
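The `shape_x`/`shape_y` sizes and the `overlap_x` percentage described above define a sliding-window grid of crops over the input image. A minimal sketch of that geometry in plain Python (illustrative only; the `crop_boxes` helper and the symmetric `overlap_y` parameter are assumptions for this example, not the library's actual API):

```python
def crop_boxes(img_w, img_h, shape_x=700, shape_y=600, overlap_x=25, overlap_y=25):
    """Return (x1, y1, x2, y2) crop windows covering an img_w x img_h image.

    shape_x/shape_y are the crop size in pixels; overlap_x/overlap_y are the
    overlap between neighbouring crops as a percentage of the crop size.
    Hypothetical sketch of the patch-based idea, not the library's internals.
    """
    # Stride = crop size minus the overlapping portion.
    step_x = max(1, int(shape_x * (1 - overlap_x / 100)))
    step_y = max(1, int(shape_y * (1 - overlap_y / 100)))
    boxes = []
    y = 0
    while True:
        x = 0
        while True:
            # Clamp each window to the image border.
            boxes.append((x, y, min(x + shape_x, img_w), min(y + shape_y, img_h)))
            if x + shape_x >= img_w:
                break
            x += step_x
        if y + shape_y >= img_h:
            break
        y += step_y
    return boxes

# A 1000x600 image with 700x600 crops and 25% x-overlap needs two crops:
print(crop_boxes(1000, 600))  # [(0, 0, 700, 600), (525, 0, 1000, 600)]
```

Because neighbouring crops overlap, the same object can be detected in two windows; that is why the `iou` parameter above applies non-maximum suppression within a single crop, with duplicates across crop seams resolved afterwards.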

setup.py

Lines changed: 12 additions & 7 deletions
```diff
@@ -8,7 +8,7 @@
     long_description = "\n" + fh.read()
 
 
-VERSION = '1.3.4'
+VERSION = '1.3.5'
 DESCRIPTION = '''Patch-Based-Inference for detection/segmentation of small objects in images.'''
 
 setup(
```
```diff
@@ -30,25 +30,30 @@
     ],
     keywords=[
         "python",
-        "yolov8",
-        "yolov9",
-        "yolov10",
-        "yolo11",
+        "YOLOv8",
+        "YOLOv9",
+        "YOLOv10",
+        "YOLO11",
+        "YOLO-seg",
+        "YOLO-pose",
         "rtdetr",
         "fastsam",
         "sahi",
         "object detection",
         "instance segmentation",
         "patch-based inference",
+        "patch-based",
         "small object detection",
-        "yolov8-seg",
         "image patching",
-        "yolo visualization",
+        "YOLO visualization",
+        "YOLO-seg visualization",
+        "YOLO-pose visualization",
         "slice-based inference",
         "slicing inference",
         "inference visualization",
         "patchify",
         "ultralytics",
+        "computer-vision"
     ],
     classifiers=[
         "Development Status :: 5 - Production/Stable",
```
