ad_optim_sgd_mod¶
- optim.ad_optim_sgd_mod.optimize_state(state, ctm_env_init, loss_fn, obs_fn=None, post_proc=None, main_args=<config.MAINARGS object>, opt_args=<config.OPTARGS object>, ctm_args=<config.CTMARGS object>, global_args=<config.GLOBALARGS object>)[source]¶
- Parameters:
state (IPEPS) – initial wavefunction
ctm_env_init (ENV) – initial environment corresponding to state
loss_fn (function(IPEPS,ENV,CTMARGS,OPTARGS,GLOBALARGS)->torch.tensor) – loss function
obs_fn (function, optional) – function evaluating observables
post_proc (function, optional) – function for post-processing after each optimization step
main_args (MAINARGS) – main configuration, parsed from command line arguments
opt_args (OPTARGS) – optimization configuration
ctm_args (CTMARGS) – CTM algorithm configuration
global_args (GLOBALARGS) – global configuration
Optimizes the initial wavefunction state with respect to loss_fn using the optim.sgd_modified.SGD_MOD optimizer. The main parameters influencing the optimization process are given in config.OPTARGS. Calls to the functions loss_fn, obs_fn, and post_proc pass the current configuration as the dictionary {"ctm_args": ctm_args, "opt_args": opt_args}.
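The calling convention above can be sketched with plain-Python stand-ins. Everything below (`toy_optimize_state`, the dictionary-valued `state`, the finite-difference gradient loop) is an illustrative stub, not part of optim.ad_optim_sgd_mod; only the `loss_fn(state, ctm_env, opt_context)` callback shape and the `{"ctm_args": ..., "opt_args": ...}` context dictionary mirror the documented interface.

```python
# Hypothetical sketch of the optimize_state callback contract.
# Real peps-torch passes IPEPS/ENV objects and torch tensors; here
# plain floats stand in so the sketch is self-contained.

def loss_fn(state, ctm_env, opt_context):
    # The driver passes the current configuration as a dictionary,
    # as documented: {"ctm_args": ctm_args, "opt_args": opt_args}.
    ctm_args = opt_context["ctm_args"]
    opt_args = opt_context["opt_args"]
    # Toy "energy": quadratic in the single variational parameter,
    # with its minimum at param = 2.0.
    return (state["param"] - 2.0) ** 2

def toy_optimize_state(state, ctm_env, loss_fn, obs_fn=None,
                       ctm_args=None, opt_args=None, steps=100, lr=0.1):
    """Gradient-descent stand-in for the SGD_MOD-based driver."""
    opt_context = {"ctm_args": ctm_args, "opt_args": opt_args}
    for epoch in range(steps):
        # Finite-difference gradient of the toy loss (the real driver
        # uses automatic differentiation instead).
        eps = 1e-6
        l0 = loss_fn(state, ctm_env, opt_context)
        state["param"] += eps
        l1 = loss_fn(state, ctm_env, opt_context)
        state["param"] -= eps
        grad = (l1 - l0) / eps
        state["param"] -= lr * grad
        if obs_fn is not None:
            obs_fn(state, ctm_env, opt_context)  # same context dictionary
    return state

state = {"param": 0.0}
state = toy_optimize_state(state, ctm_env=None, loss_fn=loss_fn,
                           ctm_args={}, opt_args={})
print(round(state["param"], 3))  # converges toward the minimum at 2.0
```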
- optim.ad_optim_sgd_mod.store_checkpoint(checkpoint_file, state, optimizer, current_epoch, current_loss, verbosity=0)[source]¶
- Parameters:
checkpoint_file (str or Path) – target file
state (IPEPS) – wavefunction
optimizer (torch.optim.Optimizer) – optimizer
current_epoch (int) – current epoch
current_loss (float) – current value of the loss function
verbosity (int) – verbosity
Store the current state of the optimization in checkpoint_file.
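The shape of such a checkpoint can be sketched with the standard library. This is a pickle-based stand-in, not the actual implementation (which serializes PyTorch objects, typically via torch.save), and the payload keys below are assumptions for illustration, not the real checkpoint schema.

```python
import pickle
import tempfile
from pathlib import Path

def toy_store_checkpoint(checkpoint_file, state, optimizer_state,
                         current_epoch, current_loss, verbosity=0):
    """Pickle-based stand-in for store_checkpoint.

    The real function serializes torch objects; the key names here
    ("epoch", "loss", ...) are illustrative, not the actual schema.
    """
    payload = {
        "epoch": current_epoch,
        "loss": current_loss,
        "parameters": state,            # wavefunction parameters
        "optimizer": optimizer_state,   # optimizer internal state
    }
    with open(checkpoint_file, "wb") as f:
        pickle.dump(payload, f)
    if verbosity > 0:
        print(f"checkpoint written: epoch {current_epoch}, loss {current_loss}")

# Round-trip check in a temporary directory.
with tempfile.TemporaryDirectory() as tmp:
    path = Path(tmp) / "state.ckpt"
    toy_store_checkpoint(path, state=[1.0, 2.0], optimizer_state={"lr": 0.1},
                         current_epoch=5, current_loss=-0.25)
    with open(path, "rb") as f:
        restored = pickle.load(f)
    print(restored["epoch"], restored["loss"])  # → 5 -0.25
```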