Helper Functions
Hamster.get_eigenvalues — Function
get_eigenvalues(ham::EffectiveHamiltonian, prof, local_inds, comm, conf=get_empty_config();
                Nbatch=get_nbatch(conf), rank=0, nranks=1, verbosity=get_verbosity(conf))

Computes eigenvalues and eigenvectors of the Hamiltonian for a set of structures, distributing the computation across MPI ranks.
Arguments
- ham::EffectiveHamiltonian: The effective Hamiltonian object.
- prof: A profiling object that stores timing information for each step.
- local_inds: Indices of the local structures assigned to the current MPI rank.
- comm: The MPI communicator used for parallel execution.
- conf: Configuration object (default: get_empty_config()) containing parameters for diagonalization.
- Nbatch: The batch size for processing structures (default: get_nbatch(conf)).
- rank: The rank of the current MPI process (default: 0).
- nranks: Total number of MPI ranks (default: 1).
- verbosity: Level of verbosity for printed output (default: get_verbosity(conf)).
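A minimal usage sketch under MPI, assuming MPI.jl is available. The construction of the Hamiltonian (`ham`), the profiling object (`Profiler()`), the total structure count (`Nstruct`), and the two-tuple return value are assumptions for illustration, not part of the documented API.

```julia
using MPI

MPI.Init()
comm   = MPI.COMM_WORLD
rank   = MPI.Comm_rank(comm)
nranks = MPI.Comm_size(comm)

conf = get_empty_config()
prof = Profiler()                  # hypothetical profiling object
ham  = ...                         # EffectiveHamiltonian, built elsewhere

# Round-robin split of structure indices across ranks;
# `Nstruct` (total number of structures) is assumed here.
local_inds = (rank + 1):nranks:Nstruct

# Diagonalize the local batch of structures on this rank.
evals, evecs = get_eigenvalues(ham, prof, local_inds, comm, conf;
                               rank=rank, nranks=nranks)
```

Each rank diagonalizes only the structures in its `local_inds`; the communicator `comm` is used internally to coordinate the distributed computation.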
Hamster.hyper_optimize — Function
hyper_optimize(param_values, params, comm, conf; rank=0, nranks=1, verbosity=get_verbosity(conf)) -> Float64

Evaluates a given set of hyperparameters by updating a configuration and running an optimization calculation.
Arguments
- param_values::Vector{Float64}: Numerical values for each parameter to be optimized.
- params::Vector{String}: List of parameter keys. Keys can be flat (e.g., "alpha") or hierarchical (e.g., "Ga_alpha").
- comm: MPI communicator.
- conf: Configuration object.
- rank::Int: MPI rank (default = 0).
- nranks::Int: Total number of MPI processes (default = 1).
- verbosity::Int: Verbosity level (default = pulled from configuration).
Returns
Float64: The minimum training loss obtained from the optimization calculation.
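A sketch of driving hyper_optimize from a simple grid search, assuming MPI.jl is available. The config loader (`get_config`) and its argument, as well as the specific parameter keys and candidate values, are hypothetical placeholders.

```julia
using MPI

MPI.Init()
comm   = MPI.COMM_WORLD
rank   = MPI.Comm_rank(comm)
nranks = MPI.Comm_size(comm)

conf   = get_config(...)             # hypothetical: load a base configuration
params = ["Ga_alpha", "As_alpha"]    # hierarchical keys, one value each

# Exhaustive grid search: each call updates the configuration with the
# candidate values and returns the minimum training loss for that setting.
best_loss = Inf
best_vals = Float64[]
for a in (0.5, 1.0), b in (0.5, 1.0)
    loss = hyper_optimize([a, b], params, comm, conf;
                          rank=rank, nranks=nranks)
    if loss < best_loss
        best_loss = loss
        best_vals = [a, b]
    end
end
```

Because hyper_optimize returns a scalar loss, it can equally serve as the objective for any black-box optimizer in place of the grid loop shown here.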