Torch-Lightning library (draft)

How to visualize gradients with PyTorch Lightning and TensorBoard: in your model class, override `optimizer_step` and log gradient histograms there.

```python
import torch
import pytorch_lightning as pl

# Optional TPU support, as assumed by the original snippet.
try:
    import torch_xla.core.xla_model as xm
    XLA_AVAILABLE = True
except ImportError:
    XLA_AVAILABLE = False


class Model(pl.LightningModule):
    # ...

    def optimizer_step(
        self,
        epoch: int,
        batch_idx: int,
        optimizer,
        optimizer_idx: int,
        second_order_closure=None,
    ) -> None:
        # Run the usual optimizer step (TPU / LBFGS / default).
        if self.trainer.use_tpu and XLA_AVAILABLE:
            xm.optimizer_step(optimizer)
        elif isinstance(optimizer, torch.optim.LBFGS):
            optimizer.step(second_order_closure)
        else:
            optimizer.step()

        #### Gradient reporting start ###
        # Every 500 batches, log a histogram of each parameter's gradient.
        if batch_idx % 500 == 0:
            for tag, param in self.model.named_parameters():
                self.logger.experiment.add_histogram(
                    '{}_grad'.format(tag), param.grad.cpu().detach()
                )
        #### Gradient reporting end ###

        # clear gradients
        optimizer.zero_grad()
```
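To actually see these histograms, the model has to be trained with a TensorBoard logger attached to the trainer. Here is a minimal wiring sketch, assuming the `Model` class above also implements the usual `training_step`, `train_dataloader`, and `configure_optimizers` hooks; the directory name, run name, and epoch count below are illustrative, not part of the original snippet.

```python
import pytorch_lightning as pl
from pytorch_lightning.loggers import TensorBoardLogger

# Attach a TensorBoard logger; inside optimizer_step, self.logger.experiment
# is then the underlying SummaryWriter that add_histogram() writes to.
logger = TensorBoardLogger(save_dir="lightning_logs", name="grad_histograms")

model = Model()  # the LightningModule defined above
trainer = pl.Trainer(logger=logger, max_epochs=5)
trainer.fit(model)
```

After training starts, run `tensorboard --logdir lightning_logs` and the gradient histograms appear under the HISTOGRAMS tab.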

April 29, 2000 · SergeM