Commit 77a10c2

awaelchli authored and lexierule committed
Fix type check for non-standard schedulers in horovod (#14215)
1 parent 52dec63 commit 77a10c2

File tree

2 files changed: +3 −3 lines


src/pytorch_lightning/CHANGELOG.md

Lines changed: 1 addition & 0 deletions
@@ -8,6 +8,7 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 ### Fixed
 
+- Fixed an assertion error when using a `ReduceOnPlateau` scheduler with the Horovod strategy ([#14215](https://github.com/Lightning-AI/lightning/pull/14215))
 
 
 

src/pytorch_lightning/strategies/horovod.py

Lines changed: 2 additions & 3 deletions
@@ -31,7 +31,6 @@
 from pytorch_lightning.utilities.exceptions import MisconfigurationException
 from pytorch_lightning.utilities.imports import _HOROVOD_AVAILABLE
 from pytorch_lightning.utilities.rank_zero import rank_zero_only
-from pytorch_lightning.utilities.types import _LRScheduler
 
 if _HOROVOD_AVAILABLE:
     import horovod.torch as hvd
@@ -114,8 +113,8 @@ def _unpack_lightning_optimizer(opt: Optimizer) -> Optimizer:
         lr_scheduler_configs = self.lr_scheduler_configs
         for config in lr_scheduler_configs:
             scheduler = config.scheduler
-            assert isinstance(scheduler, _LRScheduler)
-            scheduler.base_lrs = [lr * self.world_size for lr in scheduler.base_lrs]
+            if hasattr(scheduler, "base_lrs"):
+                scheduler.base_lrs = [lr * self.world_size for lr in scheduler.base_lrs]  # type: ignore[union-attr]
 
         assert self.lightning_module is not None
         # Horovod: broadcast parameters & optimizer state to ensure consistent initialization
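The change above swaps a type assertion for duck typing: `torch.optim.lr_scheduler.ReduceLROnPlateau` does not subclass the standard `_LRScheduler` type and has no `base_lrs` attribute, so the old `assert isinstance(...)` raised for it. A minimal sketch of the fixed logic, using simplified stand-in scheduler classes (the class and function names here are illustrative, not Lightning's API):

```python
class FakeStepLR:
    """Stand-in for a standard scheduler that exposes `base_lrs`."""
    def __init__(self, base_lrs):
        self.base_lrs = base_lrs


class FakeReduceLROnPlateau:
    """Stand-in for ReduceLROnPlateau, which has no `base_lrs` attribute."""
    pass


def scale_base_lrs(schedulers, world_size):
    # Mirrors the fixed strategy logic: scale the base learning rates only
    # when the attribute exists, instead of asserting the scheduler's type.
    for scheduler in schedulers:
        if hasattr(scheduler, "base_lrs"):
            scheduler.base_lrs = [lr * world_size for lr in scheduler.base_lrs]


schedulers = [FakeStepLR([0.1, 0.01]), FakeReduceLROnPlateau()]
scale_base_lrs(schedulers, world_size=4)  # plateau scheduler is skipped, no error
```

With the previous assertion-based check, the second scheduler would have aborted the run; with the `hasattr` check it is simply left untouched.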

0 commit comments