Merged. Changes from 3 commits.
13 changes: 6 additions & 7 deletions src/lightning/pytorch/CHANGELOG.md

```diff
@@ -226,6 +226,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 - Removed the unused `lightning.pytorch.utilities.finite_checks.print_nan_gradients` function ([#16682](https://github.com/Lightning-AI/lightning/pull/16682))
 - Removed the unused `lightning.pytorch.utilities.finite_checks.detect_nan_parameters` function ([#16682](https://github.com/Lightning-AI/lightning/pull/16682))
+- Removed the unused `lightning.pytorch.utilities.parsing.flatten_dict` function ([#16744](https://github.com/Lightning-AI/lightning/pull/16744))
+- Removed the unused `lightning.pytorch.utilities.metrics.metrics_to_scalars` function ([#16681](https://github.com/Lightning-AI/lightning/pull/16681))
+- Removed the unused `lightning.pytorch.utilities.supporters.{SharedCycleIteratorState,CombinedLoaderIterator}` classes ([#16714](https://github.com/Lightning-AI/lightning/pull/16714))
 
 
 - Tuner removal
@@ -241,8 +244,10 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 - Removed the `using_lbfgs` argument from `LightningModule.optimizer_step` hook ([#16538](https://github.com/Lightning-AI/lightning/pull/16538))
 
+
 - Removed the `Trainer.data_parallel` property. Use `isinstance(trainer.strategy, ParallelStrategy)` instead ([#16703](https://github.com/Lightning-AI/lightning/pull/16703))
 
+
 - Removed support for multiple optimizers in automatic optimization mode ([#16539](https://github.com/Lightning-AI/lightning/pull/16539))
   * Removed `opt_idx` argument from `BaseFinetuning.finetune_function` callback method
   * Removed `opt_idx` argument from `Callback.on_before_optimizer_step` callback method
@@ -265,13 +270,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 - Removed `PrecisionPlugin.dispatch` ([#16618](https://github.com/Lightning-AI/lightning/pull/16618))
 
-- Removed the unused `lightning.pytorch.utilities.metrics.metrics_to_scalars` function ([#16681](https://github.com/Lightning-AI/lightning/pull/16681))
-
-- Removed the unused `lightning.pytorch.utilities.supporters.{SharedCycleIteratorState,CombinedLoaderIterator}` classes ([#16714](https://github.com/Lightning-AI/lightning/pull/16714))
-
 ### Fixed
 
--
+- Fixed an attribute error and improved input validation for invalid strategy types being passed to Trainer ([#16693](https://github.com/Lightning-AI/lightning/pull/16693))
 
 
 - Fixed early stopping triggering extra validation runs after reaching `min_epochs` or `min_steps` ([#16719](https://github.com/Lightning-AI/lightning/pull/16719))
@@ -289,8 +290,6 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Fixed the batch_sampler reference for DataLoaders wrapped with XLA's MpDeviceLoader ([#16571](https://github.com/Lightning-AI/lightning/pull/16571))
 - Fixed an import error when `torch.distributed` is not available ([#16658](https://github.com/Lightning-AI/lightning/pull/16658))
 
-- Fixed an attribute error and improved input validation for invalid strategy types being passed to Trainer ([#16693](https://github.com/Lightning-AI/lightning/pull/16693))
-
 
 ## [1.9.0] - 2023-01-17
 
```
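The `Trainer.data_parallel` removal in the changelog above spells out its replacement. A minimal migration sketch, assuming the `ParallelStrategy` import path `lightning.pytorch.strategies` used in lightning 2.0:

```python
# Hedged migration sketch for the removed `Trainer.data_parallel` property
# (#16703). Assumes `ParallelStrategy` is importable from
# `lightning.pytorch.strategies`, as in lightning 2.0.
from lightning.pytorch import Trainer
from lightning.pytorch.strategies import ParallelStrategy

trainer = Trainer(accelerator="cpu", devices=1)

# Before: `if trainer.data_parallel: ...`
# After: check the strategy type directly.
if isinstance(trainer.strategy, ParallelStrategy):
    print("trainer is running a parallel strategy")
else:
    print("trainer is running a single-device strategy")
```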
2 changes: 1 addition & 1 deletion src/lightning/pytorch/utilities/__init__.py

```diff
@@ -26,7 +26,7 @@
     _TORCHVISION_AVAILABLE,
 )
 from lightning.pytorch.utilities.parameter_tying import find_shared_parameters, set_shared_parameters  # noqa: F401
-from lightning.pytorch.utilities.parsing import AttributeDict, flatten_dict, is_picklable  # noqa: F401
+from lightning.pytorch.utilities.parsing import AttributeDict, is_picklable  # noqa: F401
 from lightning.pytorch.utilities.rank_zero import (  # noqa: F401
     rank_zero_deprecation,
     rank_zero_info,
```
13 changes: 0 additions & 13 deletions src/lightning/pytorch/utilities/parsing.py

```diff
@@ -189,19 +189,6 @@ def collect_init_args(
     return path_args
 
 
-def flatten_dict(source: Dict[str, Any], result: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
-    if result is None:
-        result = {}
-
-    for k, v in source.items():
-        if isinstance(v, dict):
-            _ = flatten_dict(v, result)
-        else:
-            result[k] = v
-
-    return result
-
-
 def save_hyperparameters(
     obj: Any, *args: Any, ignore: Optional[Union[Sequence[str], str]] = None, frame: Optional[types.FrameType] = None
 ) -> None:
```
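For downstream code that still needs the deleted helper, a standalone copy matching the implementation removed above; the only assumption is that callers accept its overwrite semantics (parent keys are discarded, and duplicate leaf keys keep the last value seen):

```python
# Standalone copy of the helper removed in #16744, matching the deleted
# implementation above, for code that imported it from lightning.
from typing import Any, Dict, Optional


def flatten_dict(source: Dict[str, Any], result: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
    if result is None:
        result = {}
    for k, v in source.items():
        if isinstance(v, dict):
            # Recurse into nested dicts, accumulating into the shared result;
            # the nesting key itself is dropped.
            flatten_dict(v, result)
        else:
            # Leaf value: stored under its own, unqualified key, so duplicate
            # leaf keys across levels overwrite earlier values.
            result[k] = v
    return result
```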
9 changes: 0 additions & 9 deletions tests/tests_pytorch/utilities/test_parsing.py

```diff
@@ -21,7 +21,6 @@
     AttributeDict,
     clean_namespace,
     collect_init_args,
-    flatten_dict,
     get_init_args,
     is_picklable,
     lightning_getattr,
@@ -296,11 +295,3 @@ def test_attribute_dict(tmpdir):
     ad = AttributeDict({"key1": 1})
     ad.key1 = 123
     assert ad.key1 == 123
-
-
-def test_flatten_dict(tmpdir):
-    d = {"1": 1, "_": {"2": 2, "_": {"3": 3, "4": 4}}}
-
-    expected = {"1": 1, "2": 2, "3": 3, "4": 4}
-
-    assert flatten_dict(d) == expected
```
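The removed test pins down the behavior worth remembering if you reimplement the helper; a quick check, assuming the standalone `flatten_dict` sketch after the parsing.py diff above is in scope:

```python
# Behavior from the removed test: nesting keys (here "_") vanish and only
# leaf keys survive in the flattened result.
d = {"1": 1, "_": {"2": 2, "_": {"3": 3, "4": 4}}}
assert flatten_dict(d) == {"1": 1, "2": 2, "3": 3, "4": 4}

# Caveat the removed test did not cover: duplicate leaf keys collide, and the
# last occurrence wins because every recursion level shares one result dict.
assert flatten_dict({"a": 1, "nested": {"a": 2}}) == {"a": 2}
```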