lightning_trainer module

Full documentation for hippynn.experiment.lightning_trainer module.

PyTorch Lightning training interface.

This module is somewhat experimental. Using PyTorch Lightning successfully in a distributed context may require understanding and adjusting the various settings related to parallelism, e.g. the multiprocessing context, the torch DDP backend, and how they interact with your HPC environment.

Some features of hippynn experiments may not be implemented yet.
  • The plotmaker is currently not supported.
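A typical end-to-end use of this module might look like the following sketch. It assumes `training_modules`, `database`, and `setup_params` have been prepared as for an ordinary hippynn experiment; the `Trainer` arguments shown are illustrative placeholders, not recommendations.

```python
# Sketch: drive a hippynn experiment through PyTorch Lightning.
# The import guard lets this sketch load even where the libraries
# are not installed; it is not part of a real training script.
try:
    import pytorch_lightning as pl
    from hippynn.experiment.lightning_trainer import HippynnLightningModule
    HAVE_LIGHTNING = True
except ImportError:
    HAVE_LIGHTNING = False

def run_lightning_experiment(training_modules, database, setup_params):
    """Build a lightning module and datamodule from hippynn setup
    arguments, then fit with a lightning Trainer."""
    lightning_mod, datamodule = HippynnLightningModule.from_experiment_setup(
        training_modules, database, setup_params
    )
    # Trainer settings (accelerator, devices, strategy) depend on your
    # HPC environment; "auto" is only a placeholder here.
    trainer = pl.Trainer(accelerator="auto")
    trainer.fit(model=lightning_mod, datamodule=datamodule)
    return trainer
```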

class HippynnDataModule(*args: Any, **kwargs: Any)[source]

Bases: LightningDataModule

test_dataloader()[source]
Returns:

train_dataloader()[source]
Returns:

val_dataloader()[source]
Returns:

class HippynnLightningModule(*args: Any, **kwargs: Any)[source]

Bases: LightningModule

A PyTorch Lightning module for running a hippynn experiment.

configure_optimizers()[source]
Returns:

classmethod from_experiment_setup(training_modules: TrainingModules, database: Database | None, setup_params: SetupParams, **kwargs)[source]

Create a lightning module using the same arguments as for hippynn.experiment.setup_and_train().

Parameters:
  • training_modules

  • database

  • setup_params

  • kwargs

Returns:

lightning_module, database

classmethod from_train_setup(training_modules: TrainingModules, database: Database | None, controller: Controller, metric_tracker: MetricTracker, callbacks=None, batch_callbacks=None, **kwargs)[source]

Create a lightning module from the same arguments as for hippynn.experiment.train_model().

Parameters:
  • training_modules

  • database

  • controller

  • metric_tracker

  • callbacks

  • batch_callbacks

  • kwargs

Returns:

lightning_module, database

classmethod load_from_checkpoint(checkpoint_path, map_location=None, structure_file=None, hparams_file=None, strict=True, **kwargs)[source]
Parameters:
  • checkpoint_path

  • map_location

  • structure_file

  • hparams_file

  • strict

  • kwargs

Returns:

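Reloading from a checkpoint can be sketched as follows. The paths and the role of `structure_file` here are assumptions for illustration: we take it to supply the experiment structure saved alongside the checkpoint, which the lightning checkpoint itself does not contain.

```python
# Sketch: restore a HippynnLightningModule from a saved checkpoint.
# File paths passed in are illustrative, not produced by this snippet.
def load_module(checkpoint_path, structure_file=None, map_location=None):
    """Reload a trained module from a lightning checkpoint; the import
    is deferred so this sketch loads without hippynn installed."""
    from hippynn.experiment.lightning_trainer import HippynnLightningModule
    return HippynnLightningModule.load_from_checkpoint(
        checkpoint_path,
        map_location=map_location,
        structure_file=structure_file,
    )
```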
on_load_checkpoint(checkpoint) None[source]
Parameters:

checkpoint

Returns:

on_save_checkpoint(checkpoint) None[source]
Parameters:

checkpoint

Returns:

on_test_end()[source]
Returns:

on_test_epoch_end()[source]
Returns:

on_train_epoch_start()[source]
Returns:

on_validation_end()[source]
Returns:

on_validation_epoch_end()[source]
Returns:

test_step(batch, batch_idx)[source]
Parameters:
  • batch

  • batch_idx

Returns:

training_step(batch, batch_idx)[source]
Parameters:
  • batch

  • batch_idx

Returns:

validation_step(batch, batch_idx)[source]
Parameters:
  • batch

  • batch_idx

Returns:

class LightingPrintStagesCallback(*args: Any, **kwargs: Any)[source]

Bases: Callback

This callback is for debugging only. It prints whenever a callback stage is entered in PyTorch Lightning.
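To trace which lightning hooks fire during a run, the callback can be attached to a Trainer like any other. A minimal sketch (Trainer arguments are illustrative):

```python
# Sketch: attach the stage-printing debug callback to a Trainer.
try:
    import pytorch_lightning as pl
    from hippynn.experiment.lightning_trainer import LightingPrintStagesCallback
    HAVE_LIGHTNING = True
except ImportError:
    HAVE_LIGHTNING = False

def make_debug_trainer(**trainer_kwargs):
    """Build a Trainer that prints every callback stage as it is entered."""
    return pl.Trainer(
        callbacks=[LightingPrintStagesCallback()], **trainer_kwargs
    )
```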
