ad_optim_lbfgs_mod

optim.ad_optim_lbfgs_mod.optimize_state(state, ctm_env_init, loss_fn, obs_fn=None, post_proc=None, main_args=<config.MAINARGS object>, opt_args=<config.OPTARGS object>, ctm_args=<config.CTMARGS object>, global_args=<config.GLOBALARGS object>)[source]
Parameters
  • state (IPEPS) – initial wavefunction

  • ctm_env_init (ENV) – initial environment corresponding to state

  • loss_fn (function(IPEPS,ENV,dict)->torch.Tensor) – loss function

  • obs_fn (function(IPEPS,ENV,dict), optional) – optional function evaluating observables of state

  • post_proc (function(IPEPS,ENV,dict), optional) – optional post-processing function

  • main_args (MAINARGS) – main configuration

  • opt_args (OPTARGS) – optimization configuration

  • ctm_args (CTMARGS) – CTM algorithm configuration

  • global_args (GLOBALARGS) – global configuration

Optimizes the initial wavefunction state with respect to loss_fn using the optim.lbfgs_modified.LBFGS_MOD optimizer. The main parameters influencing the optimization process are given in config.OPTARGS. Calls to the functions loss_fn, obs_fn, and post_proc pass the current configuration as a dictionary {"ctm_args": ctm_args, "opt_args": opt_args}.
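As an illustration of the calling convention above, the following minimal sketch shows the shape of a loss_fn that receives the wavefunction, the environment, and the configuration dictionary. The IPEPS and ENV types are replaced by hypothetical plain-dictionary stand-ins, and the energy evaluation is a placeholder; only the interface (arguments and the {"ctm_args": ..., "opt_args": ...} context) reflects the documentation above.

```python
# Sketch of the loss_fn calling convention; State/Env stand-ins are
# hypothetical, not the real IPEPS/ENV classes from peps-torch.

def loss_fn(state, ctm_env, opt_context):
    # optimize_state passes the current configuration as a dictionary
    ctm_args = opt_context["ctm_args"]
    opt_args = opt_context["opt_args"]
    # ... here the real loss_fn would converge ctm_env via CTMRG
    # (controlled by ctm_args) and evaluate the energy of state ...
    energy = sum(state["tensors"])  # placeholder for the energy evaluation
    return energy, ctm_env

# hypothetical stand-ins for the IPEPS wavefunction and CTM environment
state = {"tensors": [0.5, -1.25]}
ctm_env = {"chi": 16}
opt_context = {"ctm_args": {"ctm_max_iter": 50}, "opt_args": {"lr": 1.0}}

loss, env = loss_fn(state, ctm_env, opt_context)
```

obs_fn and post_proc, when supplied, receive the same context dictionary, so any option in ctm_args or opt_args is visible to them at every step.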

optim.ad_optim_lbfgs_mod.store_checkpoint(checkpoint_file, state, optimizer, current_epoch, current_loss, verbosity=0)[source]
Parameters
  • checkpoint_file (str or Path) – target file

  • state (IPEPS) – ipeps wavefunction

  • optimizer (torch.optim.Optimizer) – Optimizer

  • current_epoch (int) – current epoch

  • current_loss (float) – current value of a loss function

  • verbosity (int) – verbosity

Stores the current state of the optimization in checkpoint_file.
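The checkpoint logic can be sketched as follows. This is an assumption-laden stand-in: the real store_checkpoint serializes the IPEPS tensors and the torch.optim.Optimizer state (typically via torch.save); here pickle and plain dictionaries are used only so the sketch stays self-contained. The dictionary keys are illustrative, not the library's actual checkpoint schema.

```python
import pickle
from pathlib import Path

def store_checkpoint(checkpoint_file, state, optimizer_state,
                     current_epoch, current_loss, verbosity=0):
    # Bundle everything needed to resume the optimization into one object:
    # epoch counter, current loss, wavefunction data, and optimizer state.
    checkpoint = {
        "epoch": current_epoch,
        "loss": current_loss,
        "parameters": state,
        "optimizer_state_dict": optimizer_state,
    }
    with open(checkpoint_file, "wb") as f:
        pickle.dump(checkpoint, f)
    if verbosity > 0:
        print(f"checkpoint written: epoch={current_epoch}, loss={current_loss}")

# usage: write a checkpoint, then read it back as one would when resuming
path = Path("checkpoint_sketch.p")
store_checkpoint(path, state={"a": [1.0, 2.0]}, optimizer_state={"n_iter": 3},
                 current_epoch=7, current_loss=-0.5)
with open(path, "rb") as f:
    restored = pickle.load(f)
path.unlink()  # remove the temporary file
```

Storing the optimizer state alongside the wavefunction matters for L-BFGS in particular, since its curvature history would otherwise be lost on restart.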