4/08/2022

PyTorch Lightning prints the warning below when DDP runs with find_unused_parameters=True but every parameter actually receives a gradient, so the extra autograd-graph traversal is wasted work:

Warning: find_unused_parameters=True was specified in DDP constructor, but did not find any unused parameters. This flag results in an extra traversal of the autograd graph every iteration, which can adversely affect performance
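For context, the flag being toggled here belongs to the underlying torch.nn.parallel.DistributedDataParallel constructor, which Lightning wraps. A minimal single-process CPU sketch (gloo backend, hypothetical toy model) shows the flag in use:

```python
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

# Single-process process group on CPU so the example runs anywhere.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29501")
dist.init_process_group(backend="gloo", rank=0, world_size=1)

model = torch.nn.Linear(4, 2)  # hypothetical toy model
ddp_model = DDP(model, find_unused_parameters=False)

out = ddp_model(torch.randn(3, 4))
out.sum().backward()  # every parameter gets a gradient, so False is safe here

dist.destroy_process_group()
```

If part of the model genuinely does not participate in the loss on some iterations, find_unused_parameters=True is required and the warning does not apply.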

 

Try #1

import pytorch_lightning as pl
from pytorch_lightning.strategies.ddp import DDPStrategy

trainer = pl.Trainer(
    strategy=DDPStrategy(find_unused_parameters=False),
    accelerator="gpu",
    devices=3,
)


Try #2

(This uses the older pre-1.6 Lightning API, where plugins=DDPPlugin(...) and accelerator="ddp" played the role that strategy=DDPStrategy(...) plays now.)

import pytorch_lightning as pl
from pytorch_lightning.plugins import DDPPlugin

trainer = pl.Trainer(
    val_check_interval=0.1,
    gpus=-1,
    accelerator="ddp",
    callbacks=[checkpoint_callback, early_stop_callback],
    plugins=DDPPlugin(find_unused_parameters=False),
    precision=16,
)



Thank you.

