Angle Finding
JuliQAOA uses Enzyme.jl to enable automatic differentiation (also referred to as "AD" or "autodiff"). In short, AD allows for calculating the gradient of a function f(x1, x2,...) in the same time as a single evaluation of f with a (roughly) constant overhead, regardless of how many parameters f depends on.
The first time you run any gradient-based function (e.g. grad) there will likely be a delay of a few seconds while Enzyme does some precomputation and caching.
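As a quick sanity check (which will also trigger that first-run compilation), one can compare `grad` against a finite-difference estimate of `exp_value`. This is a sketch only: the `mixer_x(n)` call and the random `obj_vals` are placeholder assumptions about the problem setup, not part of this page's API description.

```julia
# Sketch: sanity-check grad against a finite-difference derivative of
# exp_value. Problem setup (mixer_x(n), random obj_vals) is a placeholder.
using JuliQAOA

n = 4
mixer = mixer_x(n)              # transverse-field mixer (signature assumed)
obj_vals = rand(2^n)            # one objective value per basis state
angles = rand(2)                # p = 1: one beta, one gamma

g = grad(angles, mixer, obj_vals)

# central finite difference on the first angle
eps = 1e-5
up = copy(angles); up[1] += eps
dn = copy(angles); dn[1] -= eps
fd = (exp_value(up, mixer, obj_vals) - exp_value(dn, mixer, obj_vals)) / (2eps)
@assert isapprox(g[1], fd; atol=1e-4)
```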
The first time you run grad with a General mixer (including mixer_clique and mixer_ring) you will likely get the warning:
Warning: Using fallback BLAS replacements, performance may be degraded

This is a known issue with Enzyme, and should hopefully be addressed soon. It does not indicate any incorrect results, just that things are a bit slower than they need to be until some kinks between BLAS and Enzyme are worked out.
Functions documented on this page:

- JuliQAOA.find_angles_bh
- JuliQAOA.find_local_maximum
- JuliQAOA.find_local_minimum
- JuliQAOA.grad
- JuliQAOA.grad!
- JuliQAOA.grover_th
JuliQAOA.grad — Function

grad(angles, mixer, obj_vals)

Calculate the gradient of exp_value(angles, mixer, obj_vals).
Can be extended to incorporate non-standard initial state $|\psi_0\rangle$ and observables in the same way as exp_value.
grad does not currently work with Grover mixers. For unconstrained problems you can use the equivalent mixer_x(n, 0:n)/2^n, and for constrained problems you can use N = binomial(n,k); mixer_general(dicke_states(n,k), ones(N,N)/N). Both will give you correct results, but are slower than the Grover implementation.
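As a sketch of the constrained-problem workaround just described, one can build the equivalent General mixer over the Dicke states and differentiate through it (the objective values below are placeholders):

```julia
# Sketch: replace a Grover mixer with the equivalent General mixer over the
# Dicke states, as described above, so that grad can be used.
using JuliQAOA

n, k = 6, 3
N = binomial(n, k)
mixer = mixer_general(dicke_states(n, k), ones(N, N) / N)
obj_vals = rand(N)          # placeholder objective values on the feasible subspace
angles = rand(2)

g = grad(angles, mixer, obj_vals)   # works, though slower than a native Grover mixer
```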
JuliQAOA.grad! — Function

grad!(G, angles, mixer, obj_vals; flip_sign=false)

Calculate the gradient of exp_value(angles, mixer, obj_vals), storing the result in the vector G.
The optional argument flip_sign adds an overall minus sign to G, which can be necessary to switch between using grad! to identify angles which minimize or maximize exp_value.
Can be extended to incorporate non-standard initial state $|\psi_0\rangle$ and observables in the same way as exp_value.
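A common use of flip_sign is wiring grad! into an external minimizer to perform maximization. The sketch below assumes Optim.jl and a placeholder problem setup; note it negates the scalar output of exp_value, not obj_vals, so the phase issue discussed later on this page does not arise.

```julia
# Sketch: using grad! as the in-place gradient for maximization via BFGS.
# Optim minimizes, so the scalar objective is negated and flip_sign=true
# negates the gradient to match. Problem setup is a placeholder.
using JuliQAOA, Optim

n = 4
mixer = mixer_x(n)          # signature assumed
obj_vals = rand(2^n)

f(angles) = -exp_value(angles, mixer, obj_vals)
g!(G, angles) = grad!(G, angles, mixer, obj_vals; flip_sign=true)

res = optimize(f, g!, rand(2), BFGS())
```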
JuliQAOA.find_local_minimum — Function

find_local_minimum(angles, mixer, obj_vals)

Find $\{\beta_i,\gamma_i\}$ which minimize the $H_C$ defined by obj_vals, beginning at angles and doing local search with BFGS.
Angles are returned in the ranges $\beta_i \in [0,T_{H_M}],~\gamma_i \in [0,T_{H_C}]$, where the operator periods $T$ are calculated with get_operator_period.
Can be extended to incorporate non-standard initial state $|\psi_0\rangle$ and observables in the same way as exp_value.
JuliQAOA.find_local_maximum — Function

find_local_maximum(angles, mixer, obj_vals)

Find $\{\beta_i,\gamma_i\}$ which maximize the $H_C$ defined by obj_vals, beginning at angles and doing local search with BFGS.
Angles are returned in the ranges $\beta_i \in [0,T_{H_M}],~\gamma_i \in [0,T_{H_C}]$, where the operator periods $T$ are calculated with get_operator_period.
Can be extended to incorporate non-standard initial state $|\psi_0\rangle$ and observables in the same way as exp_value.
While many optimization packages (e.g. scipy.optimize, Optim.jl) only include minimization, we explicitly include both minimization and maximization. This is because the standard trick of flipping the sign of the objective function to switch between the two does not work as smoothly with QAOA: adding a minus sign in front of $H_C$ messes up the phases. Specifically, the fact that
exp_value(angles, mixer, obj_vals) != -exp_value(angles, mixer, -obj_vals)

means that $\{\beta_i, \gamma_i\}$ which minimize a given QAOA do not maximize that same QAOA under the replacement $H_C \to -H_C$. In order to avoid having to clarify whether some angles are good for e.g. "maximizing MaxCut" or "minimizing -MaxCut", we provide separate implementations for maximization and minimization which avoid this sign confusion.
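The inequality above is easy to check numerically. This sketch uses a placeholder problem setup; negating obj_vals changes the phases applied by $H_C$, so the expectation value is not simply negated.

```julia
# Sketch: verify numerically that negating obj_vals (H_C -> -H_C) does not
# simply negate the expectation value. Setup values are placeholders.
using JuliQAOA

n = 4
mixer = mixer_x(n)          # signature assumed
obj_vals = rand(2^n)
angles = rand(2)

lhs = exp_value(angles, mixer, obj_vals)
rhs = -exp_value(angles, mixer, -obj_vals)
println(lhs ≈ rhs)          # generally false
```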
JuliQAOA.find_angles_bh — Method

find_angles_bh(p, mixer, obj_vals; max=true, niter=100, file=nothing, verbose=true)

Find good angles for the QAOA defined by mixer and obj_vals, up to p rounds.
Uses an iterative, round-by-round angle finding algorithm that combines angle extrapolation and basinhopping to find high quality angles up to p rounds. See this paper, section "Angle & Threshold Finding" for more details.
Optional arguments:
- max=true: determines whether the goal is to maximize (true) or minimize (false) exp_value
- niter=100: determines the number of basinhopping iterations
- file=nothing: save the resulting angles and expectation values in a plain text file
- verbose=true: print a running log of the angle finding results
Additional options:
find_angles_bh(sv, p, mixer, obj_vals)

Specify the custom initial state $|\psi_0\rangle$ = sv.

find_angles_bh(p, mixer, obj_vals, measure)

Specify a cost function measure other than the default $\langle H_C \rangle$ to maximize/minimize.

find_angles_bh(sv, p, mixer, obj_vals, measure)

Specify both a custom initial state and observable to optimize.
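A typical call using the signature documented above might look like the following sketch (the problem setup is a placeholder):

```julia
# Sketch: find_angles_bh on a small unconstrained problem, using the
# keyword arguments documented above. Problem setup is a placeholder.
using JuliQAOA

n = 4
mixer = mixer_x(n)          # signature assumed
obj_vals = rand(2^n)

# search for maximizing angles up to p = 3 rounds, saving results to disk
find_angles_bh(3, mixer, obj_vals; niter=50, file="angles.txt")
```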
JuliQAOA.grover_th — Function

grover_th_ev(p, obj_vals; max=true)

Returns the optimal expectation value for the Grover-Th QAOA variant introduced here.
Grover-Th represents a direct port of Grover's search algorithm to the QAOA context. In other words, the output of this function is the expectation value at p rounds that one could recover by simply doing unstructured search for optimal states (technically it is a bit more complicated than this; see here for a more precise characterization). This serves as a useful benchmark for QAOA performance: QAOA should at least be able to beat unstructured search.
Set max=false for minimization problems.
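One way to use this as a benchmark is to compare the Grover-Th value against the expectation value of a QAOA at the same depth. The sketch below uses unoptimized placeholder angles and a placeholder problem; with properly optimized angles one would hope to exceed the benchmark.

```julia
# Sketch: grover_th_ev as an unstructured-search baseline at depth p.
# Angles and problem setup are placeholders.
using JuliQAOA

p = 2
n = 4
mixer = mixer_x(n)          # signature assumed
obj_vals = rand(2^n)

benchmark = grover_th_ev(p, obj_vals)            # best unstructured-search value
qaoa_val = exp_value(rand(2p), mixer, obj_vals)  # QAOA value at these angles
println(qaoa_val, " vs benchmark ", benchmark)
```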