Fix typo in BasePredictionWriter documentation (#18381)
Borodin authored Aug 24, 2023
1 parent 9496d9a commit 9d7a284
Showing 2 changed files with 2 additions and 2 deletions.
docs/source-pytorch/deploy/production_basic.rst (1 addition, 1 deletion)

@@ -94,7 +94,7 @@ By using the predict step in Lightning you get free distributed inference using
         torch.save(batch_indices, os.path.join(self.output_dir, f"batch_indices_{trainer.global_rank}.pt"))
-    # or you can set `writer_interval="batch"` and override `write_on_batch_end` to save
+    # or you can set `write_interval="batch"` and override `write_on_batch_end` to save
     # predictions at batch level
 pred_writer = CustomWriter(output_dir="pred_path", write_interval="epoch")
 trainer = Trainer(accelerator="gpu", strategy="ddp", devices=8, callbacks=[pred_writer])
src/lightning/pytorch/callbacks/prediction_writer.py (1 addition, 1 deletion)

@@ -93,7 +93,7 @@ def write_on_epoch_end(self, trainer, pl_module, predictions, batch_indices):
         torch.save(batch_indices, os.path.join(self.output_dir, f"batch_indices_{trainer.global_rank}.pt"))
-    # or you can set `writer_interval="batch"` and override `write_on_batch_end` to save
+    # or you can set `write_interval="batch"` and override `write_on_batch_end` to save
     # predictions at batch level
 pred_writer = CustomWriter(output_dir="pred_path", write_interval="epoch")
 trainer = Trainer(accelerator="gpu", strategy="ddp", devices=8, callbacks=[pred_writer])
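The corrected parameter is `write_interval`, which controls whether a prediction writer flushes per batch, per epoch, or both. Below is a minimal, hypothetical stand-in that mimics this dispatch without importing Lightning, so it runs anywhere; real code would subclass `lightning.pytorch.callbacks.BasePredictionWriter` instead, and the class/file names here are illustrative assumptions, not part of the library.

```python
import json
import os
import tempfile

class MiniPredictionWriter:
    """Toy stand-in for BasePredictionWriter: dispatches on write_interval."""

    def __init__(self, output_dir, write_interval="epoch"):
        # Same accepted values as the real callback's write_interval argument.
        assert write_interval in ("batch", "epoch", "batch_and_epoch")
        self.output_dir = output_dir
        self.write_interval = write_interval

    def write_on_batch_end(self, prediction, batch_idx):
        # Called per batch when write_interval includes "batch".
        with open(os.path.join(self.output_dir, f"pred_{batch_idx}.json"), "w") as f:
            json.dump(prediction, f)

    def write_on_epoch_end(self, predictions):
        # Called once when write_interval includes "epoch".
        with open(os.path.join(self.output_dir, "predictions.json"), "w") as f:
            json.dump(predictions, f)

    def on_predict(self, batches):
        preds = []
        for i, batch in enumerate(batches):
            pred = [x * 2 for x in batch]  # dummy "model"
            preds.append(pred)
            if self.write_interval in ("batch", "batch_and_epoch"):
                self.write_on_batch_end(pred, i)
        if self.write_interval in ("epoch", "batch_and_epoch"):
            self.write_on_epoch_end(preds)
        return preds

out = tempfile.mkdtemp()
writer = MiniPredictionWriter(out, write_interval="batch")
writer.on_predict([[1, 2], [3, 4]])
files = sorted(os.listdir(out))
print(files)  # per-batch files only; no epoch-level predictions.json
```

Passing `write_interval="epoch"` instead would produce a single `predictions.json`, matching the epoch-level pattern shown in the diff above.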
