Commit f85bf28
Add MLP activation function comparison script
Added a new script (mlp_activation_comparison.py) that demonstrates the effect of different activation functions
('relu', 'tanh', 'logistic') on a simple dataset using scikit-learn's MLPClassifier.
This helps visualize and understand how activation choices influence model performance.
1 parent: beb3cfd
1 file changed
Lines changed: 23 additions & 0 deletions
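The diff body for `mlp_activation_comparison.py` was not captured in this page extract. Based on the commit message, a minimal sketch of what such a script might look like follows; the dataset (`make_moons`), layer sizes, and hyperparameters are assumptions for illustration, not the commit's actual code.

```python
# Hypothetical reconstruction of mlp_activation_comparison.py.
# The real diff was lost; dataset and hyperparameters here are assumed.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# A simple two-class dataset (assumption: the commit's "simple dataset").
X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train one MLP per activation function and report held-out accuracy.
for activation in ("relu", "tanh", "logistic"):
    clf = MLPClassifier(
        hidden_layer_sizes=(16,),  # assumed architecture
        activation=activation,
        max_iter=2000,
        random_state=0,
    )
    clf.fit(X_train, y_train)
    print(f"{activation}: test accuracy = {clf.score(X_test, y_test):.3f}")
```

Comparing the three printed accuracies shows how the activation choice alone shifts performance when everything else (data, architecture, seed) is held fixed, which matches the stated purpose of the commit.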