In a quest to use the latest versions of the different dependencies (mainly for torch2), I have encountered an issue with pytorch-lightning.
The API has changed in some slight ways; in particular, this reset call has been deprecated:
torchmd-net/torchmdnet/module.py, lines 160 to 163 (at e02c076):

```python
if should_reset:
    # reset validation dataloaders before and after testing epoch, which is faster
    # than skipping test validation steps by returning None
    self.trainer.reset_val_dataloader(self)
```
There is a succinct message referring to this in the docs:

> used ``trainer.reset_*_dataloader()`` methods → use ``Loop.setup_data()`` for the top-level loops (`PR16726`_)
Pointing to this PR: Lightning-AI/pytorch-lightning#16726
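From what I can tell, the migration would be something like the sketch below (untested; the nested attribute path `fit_loop.epoch_loop.val_loop` and calling `setup_data()` on it are my assumptions based on the PR above, since the docs only mention the top-level loops):

```python
import pytorch_lightning as pl


def reset_val_dataloaders(trainer: pl.Trainer) -> None:
    """Rebuild the validation dataloaders across Lightning versions.

    Hypothetical compatibility helper: the pre-2.0 branch uses the removed
    ``Trainer.reset_val_dataloader()`` method; the >= 2.0 branch assumes the
    evaluation loop nested inside fit exposes ``setup_data()`` (attribute path
    guessed from PR 16726, not verified).
    """
    major = int(pl.__version__.split(".")[0])
    if major < 2:
        # old API (removed in Lightning 2.x)
        trainer.reset_val_dataloader()
    else:
        # new loop API: dataloader setup now lives on the loops themselves
        trainer.fit_loop.epoch_loop.val_loop.setup_data()
```

If that is roughly right, module.py could call such a helper instead of `self.trainer.reset_val_dataloader(self)` wherever `should_reset` is true.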
I do not know, however, exactly what the original reset line is doing, and I do not understand the comment above it.
Do you have some pointers?
cc @PhilippThoelke