
Commit 8ec861f

[DOC] Update guide.rst
1 parent a180128 commit 8ec861f

1 file changed

docs/guide.rst
Lines changed: 30 additions & 1 deletion
@@ -1,14 +1,43 @@
 Guidance
 ========
 
-This page provides useful instructions on choosing the appropriate ensemble method.
+This page provides useful instructions on how to choose the appropriate ensemble method for your deep learning model.
 
 Check Your Model
 ----------------
 
+A good rule of thumb is to check the performance of your deep learning model first. Below are two important aspects that you should pay attention to:
+
+* What is the final performance of your model?
+* Does your model suffer from the over-fitting problem?
+
+To check these two aspects, it is recommended to evaluate the performance of your model on the :obj:`test_loader` after each training epoch.
+
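For reference, such a per-epoch check for a plain PyTorch model might look like the following sketch (``train_one_epoch`` is a hypothetical placeholder for your own training step; ``model``, ``train_loader``, ``test_loader``, and ``num_epochs`` are assumed to be defined):

.. code-block:: python

    import torch

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    def evaluate(model, test_loader, device):
        """Return the classification accuracy on the test set."""
        model.eval()
        correct = total = 0
        with torch.no_grad():
            for data, target in test_loader:
                data, target = data.to(device), target.to(device)
                pred = model(data).argmax(dim=1)
                correct += (pred == target).sum().item()
                total += target.size(0)
        return correct / total

    for epoch in range(num_epochs):
        train_one_epoch(model, train_loader)   # your own training step
        acc = evaluate(model, test_loader, device)
        print(f"epoch {epoch}: test accuracy = {acc:.2%}")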
+.. tip::
+    Using Ensemble-PyTorch, you can pass your model to :class:`Fusion` or :class:`Voting` with the argument ``n_estimators`` set to ``1``. The behavior of the ensemble should then be the same as that of a single model.
+
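As a concrete illustration of the tip above, here is a minimal sketch assuming the classifier-style API of the ``torchensemble`` package (which may differ from the exact version documented in this commit; ``LeNet5`` and the data loaders are placeholders):

.. code-block:: python

    from torchensemble import VotingClassifier

    # Wrap a single copy of the network in an ensemble; with
    # ``n_estimators=1`` the ensemble should behave like the plain model.
    model = VotingClassifier(estimator=LeNet5, n_estimators=1, cuda=True)
    model.set_optimizer("Adam", lr=1e-3)

    # Passing ``test_loader`` makes ``fit`` evaluate after each epoch.
    model.fit(train_loader, epochs=50, test_loader=test_loader)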
+If the performance of your model is relatively good, for example, if the test accuracy of your LeNet-5 CNN on MNIST is over 99%, then it is unlikely that your model suffers from the under-fitting problem, and you can skip the section :ref:`under_fit`.
+
+.. _under_fit:
+
 Under-fit
 ---------
 
+If the performance of your model is unsatisfactory, you can try the :class:`Gradient Boosting` related ensemble methods. From the perspective of the `Bias-Variance Decomposition <https://en.wikipedia.org/wiki/Bias%E2%80%93variance_tradeoff>`__, :class:`Gradient Boosting` focuses on reducing the bias term, so it usually works well when your deep learning model is a weak learner on the dataset.
+
+Below are the pros and cons of using :class:`Gradient Boosting`:
+
+* Pros:
+    - You can get much larger improvements than with other ensemble methods
+* Cons:
+    - Relatively longer training time
+    - It may suffer from the over-fitting problem if the value of ``n_estimators`` is large
+
+.. tip::
+    :class:`Gradient Boosting` in Ensemble-PyTorch supports early stopping to alleviate over-fitting. To use early stopping, set the arguments ``test_loader`` and ``early_stopping_rounds`` when calling the :meth:`fit` function of :class:`Gradient Boosting`. In addition, using a small ``shrinkage_rate`` when declaring the model also helps to alleviate the over-fitting problem.
+
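Putting the tip together, a minimal sketch of the early-stopping setup, again assuming the ``torchensemble`` gradient boosting API (which may differ from the exact version documented in this commit; ``LeNet5`` and the data loaders are placeholders):

.. code-block:: python

    from torchensemble import GradientBoostingClassifier

    # A small shrinkage rate damps the contribution of each new base
    # estimator, which helps to alleviate over-fitting.
    model = GradientBoostingClassifier(
        estimator=LeNet5,
        n_estimators=10,
        shrinkage_rate=0.1,
        cuda=True,
    )
    model.set_optimizer("Adam", lr=1e-3)

    # With ``test_loader`` and ``early_stopping_rounds`` set, training
    # stops once test performance has not improved for the given number
    # of rounds.
    model.fit(train_loader, epochs=50, test_loader=test_loader,
              early_stopping_rounds=2)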
+.. _over_fit:
+
 Over-fit
 --------
 