
Commit 5880e28

Merge branch 'master' into bugfix/logger-attributeerror
2 parents: 7d5016d + 047f0aa

File tree: 8 files changed, +24 -150 lines


.github/workflows/ci-pytorch-test-conda.yml

Lines changed: 1 addition & 1 deletion
@@ -27,7 +27,7 @@ jobs:
           - {python-version: "3.8", pytorch-version: "1.10"}
           - {python-version: "3.9", pytorch-version: "1.11"}
           - {python-version: "3.9", pytorch-version: "1.12"}
-    timeout-minutes: 30
+    timeout-minutes: 40
 
     steps:
       - name: Workaround for https://github.com/actions/checkout/issues/760

.github/workflows/docs-deploy.yml

Lines changed: 1 addition & 1 deletion
@@ -1,7 +1,7 @@
 name: "Deploy Docs"
 on:
   push:
-    branches: [master]
+    branches: ["release/app"]
 
 jobs:
   # https://github.com/marketplace/actions/deploy-to-github-pages

docs/source-app/workflows/build_lightning_app/from_scratch_content.rst

Lines changed: 0 additions & 61 deletions
@@ -58,64 +58,3 @@ Run the Lightning App on the cloud:
 .. code:: bash
 
     lightning run app app.py --cloud
-
-----
-
-*************************************
-Build a Lightning App from a template
-*************************************
-If you didn't find an Lightning App similar to the one you need (in the `Lightning App gallery <https://lightning.ai/apps>`_), another option is to start from a template.
-The Lightning CLI can generate a template with built-in testing that can be easily published to the
-Lightning App Gallery.
-
-Generate a Lightning App with our template generator:
-
-.. code:: bash
-
-    lightning init app your-app-name
-
-You'll see a print-out like this:
-
-.. code:: bash
-
-    ➜ lightning init app your-app-name
-
-    /Users/Your/Current/dir/your-app-name
-    INFO: laying out app template at /Users/Your/Current/dir/your-app-name
-    INFO:
-    Lightning app template created!
-    /Users/Your/Current/dir/your-app-name
-
-    run your app with:
-        lightning run app your-app-name/app.py
-
-    run it on the cloud to share with your collaborators:
-        lightning run app your-app-name/app.py --cloud
-
-----
-
-Modify the Lightning App template
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-The command above generates a Lightning App file like this:
-
-.. code:: python
-
-    from your_app_name import ComponentA, ComponentB
-
-    import lightning as L
-
-
-    class LitApp(L.LightningFlow):
-        def __init__(self) -> None:
-            super().__init__()
-            self.component_a = ComponentA()
-            self.component_b = ComponentB()
-
-        def run(self):
-            self.component_a.run()
-            self.component_b.run()
-
-
-    app = L.LightningApp(LitApp())
-
-Now you can add your own components as you wish!

docs/source-app/workflows/build_lightning_component/from_scratch_component_content.rst

Lines changed: 0 additions & 47 deletions
@@ -151,50 +151,3 @@ run the app
 .. code:: bash
 
     lightning run app app.py
-
-----
-
-*******************************************
-Build a Lightning component from a template
-*******************************************
-If you'd prefer a component template with built-in testing that can be easily published to the
-Lightning component gallery, generate it with our template generator:
-
-.. code:: bash
-
-    lightning init component your-component-name
-
-You'll see a print-out like this:
-
-.. code:: bash
-
-    ➜ lightning init component your-component-name
-    INFO: laying out component template at /Users/williamfalcon/Developer/opensource/_/lightning/scratch/hello-world
-    INFO:
-    ⚡ Lightning component template created!
-    /Users/williamfalcon/Developer/opensource/_/lightning/scratch/hello-world
-
-    ...
-
-----
-
-Modify the component template
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-The command above generates a component file like this:
-
-.. code:: python
-
-    import lightning as L
-
-
-    class TemplateComponent(L.LightningWork):
-        def __init__(self) -> None:
-            super().__init__()
-            self.value = 0
-
-        def run(self):
-            self.value += 1
-            print("welcome to your work component")
-            print("this is running inside a work")
-
-Now you can modify the component as you wish!

requirements/pytorch/base.txt

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@ torch>=1.9.*, <=1.12.0
 tqdm>=4.57.0, <4.65.0
 PyYAML>=5.4, <=6.0
 fsspec[http]>=2021.05.0, !=2021.06.0, <2022.6.0
-tensorboard>=2.9.1, <2.10.0
+tensorboard>=2.9.1, <2.11.0
 torchmetrics>=0.7.0, <0.9.3  # needed for using fixed compare_version
 pyDeprecate>=0.3.1, <=0.3.2
 packaging>=17.0, <=21.3

src/pytorch_lightning/CHANGELOG.md

Lines changed: 16 additions & 34 deletions
@@ -8,15 +8,10 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 ### Added
 
-- Added `FullyShardedNativeNativeMixedPrecisionPlugin` to handle precision for `DDPFullyShardedNativeStrategy` ([#14092](https://github.com/Lightning-AI/lightning/pull/14092))
-
 
 - Added prefix to log message in `seed_everything` with rank info ([#13290](https://github.com/Lightning-AI/lightning/issues/13290))
 
 
-- Added profiling to these hooks: `on_before_batch_transfer`, `transfer_batch_to_device`, `on_after_batch_transfer`, `configure_gradient_clipping`, `clip_gradients` ([#14069](https://github.com/Lightning-AI/lightning/pull/14069))
-
-
 -
 
 
@@ -28,17 +23,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Raised a `MisconfigurationException` if batch transfer hooks are overriden with `IPUAccelerator` ([#13961](https://github.com/Lightning-AI/lightning/pull/13961))
 
 
-- Updated compatibility for LightningLite to run with the latest DeepSpeed 0.7.0 ([13967](https://github.com/Lightning-AI/lightning/pull/13967))
-
-
 - Replaced the unwrapping logic in strategies with direct access to unwrapped `LightningModule` ([#13738](https://github.com/Lightning-AI/lightning/pull/13738))
 
 
-- The `WandbLogger.name` property no longer returns the name of the experiment, and instead returns the project's name ([#14145](https://github.com/Lightning-AI/lightning/pull/14145))
-
-
-- The default project name in `WandbLogger` is now "lightning_logs" ([#14145](https://github.com/Lightning-AI/lightning/pull/14145))
-
 
 ### Deprecated
 
@@ -77,46 +64,41 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 ### Fixed
 
-- Fixed a bug that caused spurious `AttributeError` when multiple `DataLoader` classes are imported ([#14117](https://github.com/Lightning-AI/lightning/pull/14117))
+- Fixed an assertion error when using a `ReduceOnPlateau` scheduler with the Horovod strategy ([#14215](https://github.com/Lightning-AI/lightning/pull/14215))
 
 
-- Fixed epoch-end logging results not being reset after the end of the epoch ([#14061](https://github.com/Lightning-AI/lightning/pull/14061))
+- Fixed an `AttributeError` when accessing `LightningModule.logger` and the Trainer has multiple loggers ([#14234](https://github.com/Lightning-AI/lightning/pull/14234))
 
 
-- Fixed resuming from a checkpoint when using Stochastic Weight Averaging (SWA) ([#9938](https://github.com/Lightning-AI/lightning/pull/9938))
+## [1.7.2] - 2022-08-17
 
+### Added
 
-- Fixed the device placement when `LightningModule.cuda()` gets called without specifying a device index and the current cuda device was not 0 ([#14128](https://github.com/Lightning-AI/lightning/pull/14128))
+- Added `FullyShardedNativeNativeMixedPrecisionPlugin` to handle precision for `DDPFullyShardedNativeStrategy` ([#14092](https://github.com/Lightning-AI/lightning/pull/14092))
+- Added profiling to these hooks: `on_before_batch_transfer`, `transfer_batch_to_device`, `on_after_batch_transfer`, `configure_gradient_clipping`, `clip_gradients` ([#14069](https://github.com/Lightning-AI/lightning/pull/14069))
 
+### Changed
 
-- Avoided false positive warning about using `sync_dist` when using torchmetrics ([#14143](https://github.com/Lightning-AI/lightning/pull/14143))
+- The `WandbLogger.name` property no longer returns the name of the experiment, and instead returns the project's name ([#14145](https://github.com/Lightning-AI/lightning/pull/14145))
+- The default project name in `WandbLogger` is now "lightning_logs" ([#14145](https://github.com/Lightning-AI/lightning/pull/14145))
+- Updated compatibility for LightningLite to run with the latest DeepSpeed 0.7.0 ([13967](https://github.com/Lightning-AI/lightning/pull/13967))
 
+### Fixed
 
+- Fixed a bug that caused spurious `AttributeError` when multiple `DataLoader` classes are imported ([#14117](https://github.com/Lightning-AI/lightning/pull/14117))
+- Fixed epoch-end logging results not being reset after the end of the epoch ([#14061](https://github.com/Lightning-AI/lightning/pull/14061))
+- Fixed resuming from a checkpoint when using Stochastic Weight Averaging (SWA) ([#9938](https://github.com/Lightning-AI/lightning/pull/9938))
+- Fixed the device placement when `LightningModule.cuda()` gets called without specifying a device index and the current cuda device was not 0 ([#14128](https://github.com/Lightning-AI/lightning/pull/14128))
+- Avoided false positive warning about using `sync_dist` when using torchmetrics ([#14143](https://github.com/Lightning-AI/lightning/pull/14143))
 - Avoid `metadata.entry_points` deprecation warning on Python 3.10 ([#14052](https://github.com/Lightning-AI/lightning/pull/14052))
-
-
 - Fixed epoch-end logging results not being reset after the end of the epoch ([#14061](https://github.com/Lightning-AI/lightning/pull/14061))
-
-
 - Avoid raising the sampler warning if num_replicas=1 ([#14097](https://github.com/Lightning-AI/lightning/pull/14097))
-
-
 - Fixed saving hyperparameters in a composition where the parent class is not a `LightningModule` or `LightningDataModule` ([#14151](https://github.com/Lightning-AI/lightning/pull/14151))
-
-
 - Avoided requiring the FairScale package to use precision with the fsdp native strategy ([#14092](https://github.com/Lightning-AI/lightning/pull/14092))
-
-
-- Fixed an `AttributeError` when accessing `LightningModule.logger` and the Trainer has multiple loggers ([#14234](https://github.com/Lightning-AI/lightning/pull/14234))
-
-
 - Fixed an issue in which the default name for a run in `WandbLogger` would be set to the project name instead of a randomly generated string ([#14145](https://github.com/Lightning-AI/lightning/pull/14145))
-
-
 - Fixed not preserving set attributes on `DataLoader` and `BatchSampler` when instantiated inside `*_dataloader` hooks ([#14212](https://github.com/Lightning-AI/lightning/pull/14212))
 
 
-
 ## [1.7.1] - 2022-08-09
 
 ### Fixed

src/pytorch_lightning/serve/servable_module_validator.py

Lines changed: 3 additions & 2 deletions
@@ -47,7 +47,7 @@ def __init__(
         server: Literal["fastapi", "ml_server", "torchserve", "sagemaker"] = "fastapi",
         host: str = "127.0.0.1",
         port: int = 8080,
-        timeout: int = 10,
+        timeout: int = 20,
         exit_on_failure: bool = True,
     ):
         super().__init__()
@@ -109,7 +109,8 @@ def on_train_start(self, trainer: "pl.Trainer", servable_module: "pl.LightningMo
            except requests.exceptions.ConnectionError:
                pass
            if time.time() - t0 > self.timeout:
-                raise Exception(f"The Server didn't start in {self.timeout}")
+                process.kill()
+                raise Exception(f"The server didn't start within {self.timeout} seconds.")
            time.sleep(0.1)
 
        payload = servable_module.configure_payload()
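The second hunk does two things: it kills the spawned server process before raising, so a failed startup no longer leaks a child process, and it states the unit of the timeout in the error message. A minimal standalone sketch of the same poll-until-ready pattern (the `wait_until_ready` helper and the health-check URL are illustrative, not the Lightning API):

```python
import subprocess
import time

import requests


def wait_until_ready(process: subprocess.Popen, url: str, timeout: float = 20.0) -> None:
    """Poll ``url`` until it answers, killing ``process`` if the deadline passes."""
    t0 = time.time()
    while True:
        try:
            if requests.get(url, timeout=1).status_code == 200:
                return  # server is up
        except requests.exceptions.ConnectionError:
            pass  # not listening yet, keep polling
        if time.time() - t0 > timeout:
            process.kill()  # avoid leaking the child process on failure
            raise RuntimeError(f"The server didn't start within {timeout} seconds.")
        time.sleep(0.1)
```

The 0.1-second sleep mirrors the loop in the validator and keeps the health check from hammering the endpoint while the server boots.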

src/pytorch_lightning/strategies/horovod.py

Lines changed: 2 additions & 3 deletions
@@ -31,7 +31,6 @@
 from pytorch_lightning.utilities.exceptions import MisconfigurationException
 from pytorch_lightning.utilities.imports import _HOROVOD_AVAILABLE
 from pytorch_lightning.utilities.rank_zero import rank_zero_only
-from pytorch_lightning.utilities.types import _LRScheduler
 
 if _HOROVOD_AVAILABLE:
     import horovod.torch as hvd
@@ -114,8 +113,8 @@ def _unpack_lightning_optimizer(opt: Optimizer) -> Optimizer:
        lr_scheduler_configs = self.lr_scheduler_configs
        for config in lr_scheduler_configs:
            scheduler = config.scheduler
-            assert isinstance(scheduler, _LRScheduler)
-            scheduler.base_lrs = [lr * self.world_size for lr in scheduler.base_lrs]
+            if hasattr(scheduler, "base_lrs"):
+                scheduler.base_lrs = [lr * self.world_size for lr in scheduler.base_lrs]  # type: ignore[union-attr]
 
        assert self.lightning_module is not None
        # Horovod: broadcast parameters & optimizer state to ensure consistent initialization
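This is the change behind the #14215 changelog entry above: `ReduceLROnPlateau` does not subclass `_LRScheduler` and carries no `base_lrs` attribute, so the old `assert isinstance(...)` failed as soon as such a scheduler was configured with the Horovod strategy. Guarding on `hasattr(scheduler, "base_lrs")` scales the base learning rates only for schedulers that actually have them. A minimal sketch of the guarded scaling outside of Lightning (plain torch, with an assumed world size of 4):

```python
import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau, StepLR

world_size = 4  # assumed number of Horovod workers for this sketch
params = [torch.nn.Parameter(torch.zeros(1))]
optimizer = torch.optim.SGD(params, lr=0.1)

schedulers = [
    StepLR(optimizer, step_size=10),  # has `base_lrs`
    ReduceLROnPlateau(optimizer),     # no `base_lrs`; the old assert tripped here
]

for scheduler in schedulers:
    if hasattr(scheduler, "base_lrs"):
        # Horovod scales the effective batch size by world_size, so the base LRs are scaled too.
        scheduler.base_lrs = [lr * world_size for lr in scheduler.base_lrs]
```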
