
Commit 9574c0c

doc: update README.rst
1 parent 4341216 commit 9574c0c

1 file changed

Lines changed: 14 additions & 28 deletions

File tree

README.rst

@@ -20,10 +20,10 @@
 Ensemble PyTorch
 ================

-A unified ensemble framework for pytorch_ to easily improve the performance and robustness of your deep learning model.
+A unified ensemble framework for pytorch_ to easily improve the performance and robustness of your deep learning model. Ensemble-PyTorch is part of the `pytorch ecosystem <https://pytorch.org/ecosystem/>`__, which requires the project to be well maintained.

 * `Document <https://ensemble-pytorch.readthedocs.io/>`__
-* `Source Code <https://github.com/xuyxu/Ensemble-Pytorch>`__
+* `Source Code <https://github.com/TorchEnsemble-Community/Ensemble-Pytorch>`__
 * `Experiment <https://ensemble-pytorch.readthedocs.io/en/stable/experiment.html>`__

Installation
@@ -39,7 +39,7 @@ Latest version (under development):

 .. code:: bash

-    pip install git+https://github.com/xuyxu/Ensemble-Pytorch
+    pip install git+https://github.com/TorchEnsemble-Community/Ensemble-Pytorch.git

 Example
 -------
@@ -52,43 +52,33 @@ Example
     train_loader = DataLoader(...)
     test_loader = DataLoader(...)

-    '''
-    [Step-1] Define the ensemble
-    '''
-    model = VotingClassifier(
+    # Define the ensemble
+    ensemble = VotingClassifier(
         estimator=base_estimator,   # here is your deep learning model
         n_estimators=10,            # number of base estimators
     )

-    '''
-    [Step-2] Set the parameter optimizer
-    '''
-    model.set_optimizer(
+    # Set the optimizer
+    ensemble.set_optimizer(
         "Adam",                     # type of parameter optimizer
         lr=learning_rate,           # learning rate of parameter optimizer
         weight_decay=weight_decay,  # weight decay of parameter optimizer
     )

-    '''
-    [Step-3] Set the learning rate scheduler
-    '''
-    model.set_scheduler(
+    # Set the learning rate scheduler
+    ensemble.set_scheduler(
         "CosineAnnealingLR",        # type of learning rate scheduler
         T_max=epochs,               # additional arguments on the scheduler
     )

-    '''
-    [Step-4] Train the ensemble
-    '''
-    model.fit(
+    # Train the ensemble
+    ensemble.fit(
         train_loader,
         epochs=epochs,              # number of training epochs
     )

-    '''
-    [Step-5] Evaluate the ensemble
-    '''
-    acc = model.predict(test_loader)     # testing accuracy
+    # Evaluate the ensemble
+    acc = ensemble.predict(test_loader)  # testing accuracy

 Supported Ensemble
 ------------------
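The ``VotingClassifier`` used in the diff above averages the predictions of independently trained base estimators. As a rough illustration of that idea (a hypothetical dependency-free sketch, not the library's actual tensor-based implementation), soft voting averages per-estimator class probabilities and takes the argmax:

```python
# Minimal sketch of soft voting: average each base estimator's class
# probabilities, then predict the argmax of the averaged distribution.
# torchensemble's VotingClassifier follows the same principle, but
# operates on torch tensors and trains its estimators jointly.
def soft_vote(probabilities):
    """probabilities: one probability vector per base estimator."""
    n_estimators = len(probabilities)
    n_classes = len(probabilities[0])
    # Element-wise average of the estimators' probability vectors.
    avg = [
        sum(p[c] for p in probabilities) / n_estimators
        for c in range(n_classes)
    ]
    # The ensemble's prediction is the class with the highest average.
    return max(range(n_classes), key=lambda c: avg[c]), avg

pred, avg = soft_vote([[0.6, 0.4], [0.2, 0.8], [0.4, 0.6]])
```

Here two of the three estimators favor class 1, so the averaged distribution does too, even though one estimator disagrees.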
@@ -104,8 +94,6 @@ Supported Ensemble
 +------------------------------+------------+---------------------------+
 | Gradient Boosting [3]_       | Sequential | gradient_boosting.py      |
 +------------------------------+------------+---------------------------+
-| Soft Gradient Boosting [7]_  | Parallel   | soft_gradient_boosting.py |
-+------------------------------+------------+---------------------------+
 | Snapshot Ensemble [4]_       | Sequential | snapshot_ensemble.py      |
 +------------------------------+------------+---------------------------+
 | Adversarial Training [5]_    | Parallel   | adversarial_training.py   |
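The table distinguishes "Parallel" ensembles, whose estimators are trained independently, from "Sequential" ones such as gradient boosting, where each new estimator fits the residual left by the current ensemble. A toy numeric sketch of that sequential structure (hypothetical, with a constant-valued "weak learner" instead of a neural network):

```python
# Toy sequential gradient boosting on a 1-D regression target.
# Each "base learner" here simply predicts the mean of the current
# residuals, and is added to the ensemble with a shrinkage factor.
# Real gradient boosting uses trees or (in torchensemble) neural
# networks as learners, but the fit-residual-then-add loop is the same.
def boost(y, n_estimators=10, shrinkage=0.5):
    prediction = [0.0] * len(y)
    for _ in range(n_estimators):
        # Residuals of the ensemble built so far.
        residuals = [t - p for t, p in zip(y, prediction)]
        # Weak learner: predict the mean residual everywhere.
        learner = sum(residuals) / len(residuals)
        # Shrink and add the new learner to the ensemble.
        prediction = [p + shrinkage * learner for p in prediction]
    return prediction

pred = boost([1.0, 2.0, 3.0])
```

With a constant learner the ensemble converges geometrically toward the target mean (2.0 here), which is why the estimators cannot be trained in parallel: each one depends on the predictions of all its predecessors.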
@@ -135,8 +123,6 @@ Reference

 .. [6] Garipov, Timur, et al. Loss Surfaces, Mode Connectivity, and Fast Ensembling of DNNs. NeurIPS, 2018.

-.. [7] Feng, Ji, et al. Soft Gradient Boosting Machine. arXiv, 2020.
-
 .. _pytorch: https://pytorch.org/

 .. _pypi: https://pypi.org/project/torchensemble/
