Commit 282a78d

Enhance documentation for machine learning algorithms
1 parent a71618f commit 282a78d

3 files changed

Lines changed: 89 additions & 5 deletions

File tree

machine_learning/decision_tree.py

Lines changed: 41 additions & 0 deletions
@@ -1,3 +1,44 @@
+"""
+Decision Tree Regression Implementation
+
+This module implements a basic regression decision tree using Python and NumPy.
+
+📄 Overview:
+Decision Trees are a type of supervised learning algorithm used for both
+classification and regression tasks. In this implementation, the tree
+is designed specifically for regression: mapping continuous input features
+to continuous outputs.
+
+📌 Features:
+- Supports one-dimensional datasets with continuous labels.
+- Splits data recursively by minimizing mean squared error.
+- Can specify maximum tree depth and minimum leaf size.
+- Predicts outputs for unseen data using the trained tree.
+
+🌟 Educational Value:
+This implementation is lightweight and designed for learning purposes.
+It demonstrates:
+- The core concept of recursive partitioning in decision trees.
+- How mean squared error guides splitting decisions.
+- The trade-off between tree depth and model complexity.
+
+💡 Use Cases:
+- Understanding how regression trees work internally.
+- Testing and experimenting with small datasets.
+- Educational projects, tutorials, or self-learning exercises.
+
+Audience:
+- Beginners learning machine learning concepts
+- Students studying regression algorithms
+- Anyone interested in understanding decision tree internals
+
+📦 Dependencies:
+- numpy
+
+Note: This implementation is meant for demonstration and learning purposes
+and may not be optimized for large-scale production datasets.
+"""
+
 """
 Implementation of a basic regression decision tree.
 Input data set: The input data set must be 1-dimensional with continuous labels.
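The new docstring says the tree "splits data recursively based on minimizing mean squared error". As a minimal sketch of that split-selection step for 1-D data (illustrative names, not the repository's actual code), choosing a single threshold looks like this:

```python
import numpy as np

def leaf_mse(values: np.ndarray) -> float:
    # Error of a leaf that predicts the mean of its labels.
    return float(np.mean((values - values.mean()) ** 2))

def best_split(x: np.ndarray, y: np.ndarray) -> float:
    """Return the threshold minimizing the size-weighted MSE of the two halves."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    best_err, best_thr = float("inf"), float(x[0])
    for i in range(1, len(x)):
        # Candidate threshold: midpoint between consecutive sorted values.
        thr = (x[i - 1] + x[i]) / 2
        left, right = y[:i], y[i:]
        err = (len(left) * leaf_mse(left) + len(right) * leaf_mse(right)) / len(y)
        if err < best_err:
            best_err, best_thr = err, float(thr)
    return best_thr
```

A full regression tree applies this search recursively to each half until the maximum depth or minimum leaf size is reached, which is exactly the depth/complexity trade-off the docstring mentions.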

machine_learning/linear_regression.py

Lines changed: 27 additions & 1 deletion
@@ -1,3 +1,30 @@
+"""
+Linear Regression Algorithm - Predictive Analysis (Enhanced Documentation)
+
+This file implements linear regression using gradient descent to predict
+values based on a dataset. The example dataset here is from CSGO (ADR vs Rating).
+
+Purpose:
+- Demonstrates basic linear regression with a small dataset.
+- Shows how to collect data from an online source.
+- Illustrates calculation of errors (sum of squares and mean absolute error).
+- Provides a step-by-step gradient descent implementation.
+
+Audience:
+- Beginners learning machine learning algorithms.
+- Students who want to understand linear regression implementation in Python.
+
+Dependencies:
+- Python >= 3.13
+- httpx
+- numpy
+
+Notes:
+- The algorithm iteratively updates feature weights to minimize error.
+- Output includes the feature vector representing the best-fit line.
+- Designed for educational purposes; can be adapted for other datasets.
+"""
+
 """
 Linear regression is the most basic type of regression commonly used for
 predictive analysis. The idea is pretty simple: we have a dataset and we have
@@ -15,7 +42,6 @@
 # "numpy",
 # ]
 # ///
-
 import httpx
 import numpy as np
 
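The docstring promises a "step-by-step gradient descent implementation" that "iteratively updates feature weights to minimize error". A hedged sketch of that update loop (illustrative function name and toy data, not the file's actual API) under the usual bias-column convention:

```python
import numpy as np

def run_gradient_descent(x: np.ndarray, y: np.ndarray,
                         alpha: float = 0.1, iterations: int = 5000) -> np.ndarray:
    # x: (n, d) design matrix whose first column is all ones (the bias term).
    # Each step moves theta against the gradient of the mean squared error.
    theta = np.zeros(x.shape[1])
    n = len(y)
    for _ in range(iterations):
        grad = x.T @ (x @ theta - y) / n  # gradient of 0.5 * mean((x@theta - y)**2)
        theta -= alpha * grad
    return theta

# Fit y = 1 + 2*x on a tiny synthetic dataset.
features = np.column_stack([np.ones(5), np.arange(5.0)])
targets = 1.0 + 2.0 * np.arange(5.0)
theta = run_gradient_descent(features, targets)
```

The returned `theta` is the "feature vector representing the best-fit line" the docstring refers to: here it converges to approximately `[1.0, 2.0]` (intercept and slope).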

machine_learning/logistic_regression.py

Lines changed: 21 additions & 4 deletions
@@ -9,10 +9,28 @@
 # importing all the required libraries
 
 """
-Implementing logistic regression for classification problem
+Implementation of Logistic Regression from scratch.
+
+Audience:
+- Beginners exploring supervised learning and classification algorithms
+- Students learning how logistic regression works mathematically and programmatically
+- Developers wanting to understand how to implement gradient descent for classification
+
+Dependencies:
+- numpy
+- matplotlib
+- scikit-learn (for dataset demonstration)
+
+Notes:
+- This implementation is for educational purposes and demonstrates logistic regression
+  without using high-level libraries.
+- The training is done with gradient descent; for large datasets, optimization techniques
+  like stochastic gradient descent may be preferable.
+- Visualization is based on the Iris dataset, focusing only on two features for simplicity.
+
 Helpful resources:
-Coursera ML course
-https://medium.com/@martinpella/logistic-regression-from-scratch-in-python-124c5636b8ac
+- Coursera Machine Learning course
+- https://medium.com/@martinpella/logistic-regression-from-scratch-in-python-124c5636b8ac
 """
 
 import numpy as np
@@ -21,7 +39,6 @@
 
 # get_ipython().run_line_magic('matplotlib', 'inline')
 
-
 # In[67]:
 
 # sigmoid function or logistic function is used as a hypothesis function in
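The hunk ends at the file's comment introducing the sigmoid as the hypothesis function, and the new docstring notes that "training is done with gradient descent". A minimal sketch of both pieces together (illustrative names and a toy dataset, not the file's actual code):

```python
import numpy as np

def sigmoid(z: np.ndarray) -> np.ndarray:
    # Logistic function: squashes any real input into (0, 1),
    # so the output can be read as a class-1 probability.
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic(x: np.ndarray, y: np.ndarray,
                   alpha: float = 0.5, iterations: int = 2000) -> np.ndarray:
    # Batch gradient descent on the log-loss; x includes a bias column.
    theta = np.zeros(x.shape[1])
    for _ in range(iterations):
        h = sigmoid(x @ theta)                    # predicted probabilities
        theta -= alpha * x.T @ (h - y) / len(y)   # log-loss gradient step
    return theta

# Tiny linearly separable example: negative inputs -> class 0, positive -> class 1.
features = np.column_stack([np.ones(4), np.array([-2.0, -1.0, 1.0, 2.0])])
labels = np.array([0.0, 0.0, 1.0, 1.0])
theta = train_logistic(features, labels)
```

Note the gradient step has the same form as in linear regression; only the hypothesis changes from `x @ theta` to `sigmoid(x @ theta)`. This is the batch variant; as the docstring says, stochastic gradient descent may be preferable for large datasets.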
