ad_optim¶

optim.ad_optim.optimize_state(state, ctm_env_init, loss_fn, obs_fn=None, post_proc=None, main_args=<config.MAINARGS object>, opt_args=<config.OPTARGS object>, ctm_args=<config.CTMARGS object>, global_args=<config.GLOBALARGS object>)[source]¶

Parameters
state (IPEPS) – initial wavefunction
ctm_env_init (ENV) – initial environment of state
loss_fn (function(IPEPS,ENV,dict)->torch.tensor) – loss function
obs_fn (function(IPEPS,ENV,dict)->None) – optional function to evaluate observables
post_proc (function(IPEPS,ENV,dict)->None) – optional function for post-processing the state and environment
main_args (MAINARGS) – main configuration
opt_args (OPTARGS) – optimization configuration
ctm_args (CTMARGS) – CTM algorithm configuration
global_args (GLOBALARGS) – global configuration
Optimizes the initial wavefunction state with respect to loss_fn using the L-BFGS optimizer. The main parameters influencing the optimization process are given in config.OPTARGS. Calls to the functions loss_fn, obs_fn, and post_proc pass the current configuration as a dictionary {"ctm_args": ctm_args, "opt_args": opt_args}.
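The callback contract above can be illustrated with a small, self-contained sketch. The stand-in classes and the toy driver loop below are hypothetical (the real optimize_state operates on peps-torch IPEPS/ENV objects and torch's L-BFGS optimizer); only the callback signatures and the {"ctm_args": ..., "opt_args": ...} configuration dictionary follow the documented interface.

```python
# Hypothetical stand-ins for IPEPS and ENV, used only to
# demonstrate the documented callback signatures.
class FakeState:
    def __init__(self):
        self.params = [1.0]

class FakeEnv:
    pass

def loss_fn(state, env, opt_context):
    # Receives the current configuration as a dict, exactly as
    # documented: {"ctm_args": ..., "opt_args": ...}
    assert set(opt_context) == {"ctm_args", "opt_args"}
    return sum(p ** 2 for p in state.params)  # scalar "loss"

def obs_fn(state, env, opt_context):
    # Optional observables hook; same signature as loss_fn,
    # but returns nothing.
    print(f"loss={loss_fn(state, env, opt_context):.4f}")

def toy_optimize_state(state, env, loss_fn, obs_fn=None,
                       ctm_args="CTMARGS", opt_args="OPTARGS", epochs=3):
    # A real driver would backpropagate through loss_fn and step
    # an LBFGS optimizer; this toy version just shrinks the
    # parameters to show where each callback is invoked.
    opt_context = {"ctm_args": ctm_args, "opt_args": opt_args}
    for _ in range(epochs):
        loss = loss_fn(state, env, opt_context)
        state.params = [0.5 * p for p in state.params]
        if obs_fn is not None:
            obs_fn(state, env, opt_context)
    return loss

final_loss = toy_optimize_state(FakeState(), FakeEnv(), loss_fn, obs_fn)
```

The key point is that all three user-supplied callables share the (state, env, context-dict) signature, so a single closure can read the active CTM and optimizer settings without global state.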
optim.ad_optim.store_checkpoint(checkpoint_file, state, optimizer, current_epoch, current_loss)[source]¶

Parameters
checkpoint_file (str or Path) – target file
state (IPEPS) – iPEPS wavefunction
optimizer (torch.optim.Optimizer) – optimizer
current_epoch (int) – current epoch
current_loss (float) – current value of the loss function
Store the current state of the optimization in checkpoint_file.
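A checkpoint of this kind typically bundles the epoch counter, the loss value, the wavefunction tensors, and the optimizer state into one file. The sketch below illustrates that pattern with the standard library only; the real peps-torch function serializes torch tensors (typically via torch.save and optimizer.state_dict()), and the field names here are illustrative assumptions, not the actual on-disk format.

```python
import pickle
from pathlib import Path

def store_checkpoint_sketch(checkpoint_file, state, optimizer_state,
                            current_epoch, current_loss):
    """Illustrative sketch: bundle everything needed to resume an
    optimization into a single file. Plain pickle stands in for
    torch.save; `state` stands in for the iPEPS tensors."""
    checkpoint = {
        "epoch": current_epoch,
        "loss": current_loss,
        "parameters": state,                     # iPEPS tensors in the real code
        "optimizer_state_dict": optimizer_state, # e.g. LBFGS history
    }
    with open(checkpoint_file, "wb") as f:
        pickle.dump(checkpoint, f)

def load_checkpoint_sketch(checkpoint_file):
    # Counterpart used when resuming: restore state and optimizer,
    # then continue from checkpoint["epoch"].
    with open(checkpoint_file, "rb") as f:
        return pickle.load(f)

# Round-trip example with toy data
path = Path("checkpoint_demo.p")
store_checkpoint_sketch(path, {"A": [0.1, 0.2]}, {"lr": 1.0}, 5, 0.042)
ckpt = load_checkpoint_sketch(path)
```

Storing the optimizer state alongside the wavefunction matters for L-BFGS in particular, since its curvature history would otherwise be lost on restart.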