Optimisers Submodule: Main use case
recursiveRouteChoice.optimisers.optimisers_file
Main user access point to the optimisers module; note that the important pieces are already imported directly into the optimisers module namespace. The main class of interest is ScipyOptimiser,
which provides an interface to the SciPy optimisation algorithms for the recursive logit.
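A minimal sketch of the intended entry point, using only the constructor signature documented below (how the optimiser is subsequently attached to a recursive logit model is outside the scope of this section):

    from recursiveRouteChoice.optimisers import ScipyOptimiser

    # 'l-bfgs-b' is one of the documented method flags; options and fd_options
    # are left at their defaults and are forwarded to the SciPy machinery.
    optimiser = ScipyOptimiser(method='l-bfgs-b')

    # The instance is then typically handed to the recursive logit estimation
    # model defined elsewhere in the package, which invokes
    # optimiser.solve(...) on an OptimFunctionState internally.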
- class recursiveRouteChoice.optimisers.optimisers_file.CustomOptimiserBase(hessian_type=OptimHessianType.BFGS, max_iter=4)
Bases:
OptimiserBase
Global wrapper object around all custom optimisation algorithms; individual algorithms are delegated to subclasses. It should be made clear which properties are effectively read only and which store state.
In hindsight, after implementing SciPy support, this class should be allowed to know the details of the recursive logit, and it may itself wrap a generic algorithm. A sketch of the intended step-by-step usage follows the method list below.
- __init__(hessian_type=OptimHessianType.BFGS, max_iter=4)
- check_stopping_criteria()
- compute_relative_gradient_non_static(typf=1.0)
Compute the norm of the relative gradient. This exists because the information is needed in the stopping criteria, where fetching the fields through explicit references would be messy.
- iterate_step(model, verbose=False, output_file=None, debug_counter=None)
- set_beta_vec(beta_vec)
- set_current_value(value)
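A hypothetical sketch of the step-by-step driver loop follows; the return values of iterate_step and check_stopping_criteria are not documented here, so this only illustrates the intended shape of manual iteration, in contrast to ScipyOptimiser.solve which runs to completion in one call.

    from recursiveRouteChoice.optimisers.optimisers_file import LineSearchOptimiser

    optimiser = LineSearchOptimiser(max_iter=4)
    # optim_vals is assumed to be an OptimFunctionState constructed elsewhere
    # (see the extra_optim module below).
    while not optimiser.check_stopping_criteria():  # assumed truthy once converged or max_iter is hit
        optimiser.iterate_step(optim_vals, verbose=False)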
- class recursiveRouteChoice.optimisers.optimisers_file.LineSearchOptimiser(hessian_type=OptimHessianType.BFGS, max_iter=4)
Bases:
CustomOptimiserBase
- CURVATURE_CONDITION_PARAMETER = 0.9
- INITIAL_STEP_LENGTH = 1.0
- MAXIMUM_STEP_LENGTH = 1000
- METHOD_FLAG = 1
- MINIMUM_STEP_LENGTH = 0
- NEGATIVE_CURVATURE_PARAMETER = 0.0
- SUFFICIENT_DECREASE_PARAMETER = 0.0001
- X_TOLERANCE = 2.2e-16
- __init__(hessian_type=OptimHessianType.BFGS, max_iter=4)
- get_iteration_log(optim_vals: OptimFunctionState)
- iterate_step(optim_vals: OptimFunctionState, verbose=True, output_file=None, debug_counter=None)
Performs a single step of the line search iteration, evaluating the value function and taking a step based upon the gradient
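For reference, SUFFICIENT_DECREASE_PARAMETER and CURVATURE_CONDITION_PARAMETER play the role of the usual constants c1 and c2 in the More and Thuente conditions. In the standard straight-line formulation these are the strong Wolfe conditions below; the arc-search variant ported here may differ in detail.

    % sufficient decrease (Armijo) condition, c_1 = 10^{-4}
    f(x_k + \alpha p_k) \le f(x_k) + c_1\, \alpha\, \nabla f(x_k)^{\top} p_k
    % strong curvature condition, c_2 = 0.9
    \lvert \nabla f(x_k + \alpha p_k)^{\top} p_k \rvert \le c_2\, \lvert \nabla f(x_k)^{\top} p_k \rvert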
- class recursiveRouteChoice.optimisers.optimisers_file.OptimType(value)
Bases:
Enum
Enumeration of the custom optimisation algorithm types (line search or trust region).
- LINE_SEARCH = 1
- TRUST_REGION = 2
- class recursiveRouteChoice.optimisers.optimisers_file.OptimiserBase
Bases:
ABC
Base class from which both the custom optimisers and the SciPy wrapper inherit.
- LL_ERROR_VALUE = 99999
- METHOD_FLAG = None
- NUMERICAL_ERROR_THRESH = -0.001
- RESIDUAL = 0.001
- __init__()
- get_iteration_log(optim_vals: OptimFunctionState)
- class recursiveRouteChoice.optimisers.optimisers_file.ScipyOptimiser(method: str, options=None, fd_options=None)
Bases:
OptimiserBase
Wrapper around scipy.optimize.minimize to conform to the format we require.
- METHOD_FLAG = 'scipy-master'
- __init__(method: str, options=None, fd_options=None)
- Parameters:
method ({'cg', 'bfgs', 'newton-cg', 'l-bfgs-b', 'tnc', 'dogleg', 'trust-ncg', 'trust-krylov'}) – or other scipy optimize valid flags
options –
fd_options –
- solve(optim_function_state: OptimFunctionState, verbose=True, output_file=None)
Solve for the optimal beta which minimises the negative log likelihood. Analogous to the iterate_step method on the custom algorithms, except that these methods do not require step-by-step intervention.
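Conceptually, solve hands the value-and-gradient callback held by the OptimFunctionState to SciPy. A rough sketch of the equivalent direct call is shown below; it assumes the state object exposes the documented val_grad_function and beta_vec members, and it is an illustration rather than the actual implementation.

    import scipy.optimize

    def solve_with_scipy(optim_function_state, method='l-bfgs-b', options=None):
        def objective(beta):
            # returns (value, gradient) at beta
            return optim_function_state.val_grad_function(beta)

        return scipy.optimize.minimize(
            objective,
            optim_function_state.beta_vec,  # initial beta estimate
            method=method,                  # flag supplied to ScipyOptimiser.__init__
            jac=True,                       # objective returns value and gradient together
            options=options,
        )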
- class recursiveRouteChoice.optimisers.optimisers_file.TrustRegionOptimiser(hessian_type=OptimHessianType.BFGS, max_iter=4)
Bases:
CustomOptimiserBase
- METHOD_FLAG = 2
- __init__(hessian_type=OptimHessianType.BFGS, max_iter=4)
- method = 2
Optimisers Submodule: Other files with minimal public API
recursiveRouteChoice.optimisers.extra_optim module
- class recursiveRouteChoice.optimisers.extra_optim.OptimFunctionState(value: float, grad: array, hessian: array, hessian_approx_type, val_and_grad_evaluation_function, beta_vec, function_evals_stat=None)
Bases:
object
Data class storing the elements of the log likelihood state and gradient which are relevant both in the computation of the log likelihood and in the global optimisation algorithm.
A “bridge” between the recursive logit and the more general optimisation code.
- __init__(value: float, grad: array, hessian: array, hessian_approx_type, val_and_grad_evaluation_function, beta_vec, function_evals_stat=None)
- Parameters:
value –
grad –
hessian –
val_and_grad_evaluation_function – function which takes in x_vec and returns the value and gradient at that particular x
beta_vec –
- val_grad_function(beta_vec=None)
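A minimal sketch of constructing the state object for a toy quadratic objective, using only the constructor arguments documented above. In practice the recursive logit negative log likelihood plays the role of the value/gradient function; the assumption that val_grad_function returns a (value, gradient) pair follows from the parameter description above.

    import numpy as np

    from recursiveRouteChoice.optimisers.extra_optim import OptimFunctionState
    from recursiveRouteChoice.optimisers.hessian_approx import OptimHessianType

    def value_and_grad(beta):
        # toy stand-in for the negative log likelihood: f(beta) = 0.5 * ||beta||^2
        return 0.5 * float(beta @ beta), beta

    beta0 = np.array([-1.0, -1.0])
    value0, grad0 = value_and_grad(beta0)

    state = OptimFunctionState(
        value0,                    # value
        grad0,                     # grad
        np.identity(len(beta0)),   # hessian (initial approximation)
        OptimHessianType.BFGS,     # hessian_approx_type
        value_and_grad,            # val_and_grad_evaluation_function
        beta0,                     # beta_vec
    )

    # re-evaluate value and gradient at a new beta via the stored callback
    value_new, grad_new = state.val_grad_function(np.array([0.5, 0.5]))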
recursiveRouteChoice.optimisers.hessian_approx module
- class recursiveRouteChoice.optimisers.hessian_approx.OptimHessianType(value)
Bases:
Enum
Enumeration of the supported Hessian approximation methods.
- BFGS = 1
- recursiveRouteChoice.optimisers.hessian_approx.update_hessian_approx(approx_method, step, delta_grad, hessian)
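With approx_method = OptimHessianType.BFGS, step s_k and delta_grad y_k, this presumably corresponds to the textbook BFGS update of the Hessian approximation; the reference formula is sketched below, without claiming it matches this implementation's exact form.

    B_{k+1} = B_k - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k}
              + \frac{y_k y_k^{\top}}{y_k^{\top} s_k},
    \qquad s_k = x_{k+1} - x_k, \quad y_k = \nabla f(x_{k+1}) - \nabla f(x_k).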
recursiveRouteChoice.optimisers.line_search module
More and Thuente Arc Search. This is a port of Tien Mai's MATLAB code, which is a modified implementation of the algorithm documented in:
J. J. More and D. J. Thuente. Line Search Algorithms with Guaranteed Sufficient Decrease. ACM TOMS 20(3), September 1994, pp. 286-307.
- class recursiveRouteChoice.optimisers.line_search.LineSearchFlags(value)
Bases:
Enum
Enumeration of line search termination status flags.
- TERMINATION_INTERVAL_TOO_SMALL = 4
- TERMINATION_MAX_FUNC_EVALS = 5
- TERMINATION_ROUNDING_ERROR = 1
- TERMINATION_STPMAX = 2
- TERMINATION_STPMIN = 3
- TERMINATION_STRONG_WOLFE_AND_STPMAX = 7
- TERMINATION_STRONG_WOLFE_MET = 6
- recursiveRouteChoice.optimisers.line_search.line_search_asrch(fcn, x, f, g, arc, stp, maxfev, ncur=0.0, ftol=0.0001, gtol=0.9, xtol=2.2e-16, stpmin=0, stpmax=1000, print_flag=False, fname=None, bisect=0.0)
Returns [x, f, g, stp, info_out_flag, nfev].
List of variables and parameters:
- extrap = parameter for extrapolations
- is_bracketed = true once a bracket containing the solution is found
- info_out_flag = status flag for output
- nfev = number of function evaluations
- s = displacement vector from arc
- ds = derivative of displacement vector from arc
- stx = step size at “l” point
- fx = function value at “l” point
- dx = derivative of search function at “l” point
- sty = step size at “u” point
- fy = function value at “u” point
- dy = derivative of search function at “u” point
- stp = trial step size
- fp = function value at trial step size
- dp = derivative of search function at trial step size
- mfx = modified function value at “l” point
- mdx = modified derivative value at “l” point
- mfp = modified function value at trial point
- mdp = modified derivative value at trial point
- finit = initial function value
- ginit = initial gradient
- amin = minimum step size
- amax = maximum step size
- ucase = arc search update case
Note: al and au define the bounds of the bracket if one is found; they are the endpoints of the bracket, but are not ordered.
- recursiveRouteChoice.optimisers.line_search.line_search_astep(stx, fx, dx, sty, fy, dy, stp, fp, dp, brackt, stpmin, stpmax)
This function computes a safeguarded step for a search procedure and updates an interval that contains a step that satisfies a sufficient decrease and a curvature condition.
The parameter stx contains the step with the least function value. If brackt is set to true (1) then a minimizer has been bracketed in an interval with endpoints stx and sty. The parameter stp contains the current step. The subroutine assumes that if brackt is set to true then min(stx, sty) < stp < max(stx, sty), and that the derivative at stx is negative in the direction of the step.
The subroutine statement is stf = line_search_astep(stx, fx, dx, sty, fy, dy, stp, fp, dp, brackt, stpmin, stpmax), where:
- stx (double) – on entry, the best step obtained so far; an endpoint of the interval that contains the minimizer.
- fx (double) – on entry, the function value at stx.
- dx (double) – on entry, the derivative of the function at stx. The derivative must be negative in the direction of the step, that is, dx and stp - stx must have opposite signs.
- sty (double) – on entry, the second endpoint of the interval that contains the minimizer.
- fy (double) – on entry, the function value at sty.
- dy (double) – on entry, the derivative of the function at sty.
- stp (double) – on entry, the current step. If brackt is set to true then on input stp must be between stx and sty.
- fp (double) – on entry, the function value at stp.
- dp (double) – on entry, the derivative of the function at stp.
- brackt (logical) – on entry, specifies whether a minimizer has been bracketed. Initially brackt must be set to false.
- stpmin (double) – on entry, a lower bound for the step.
- stpmax (double) – on entry, an upper bound for the step.
MINPACK-1 Project, June 1983. Argonne National Laboratory. Jorge J. More' and David J. Thuente.
MINPACK-2 Project, November 1993. Argonne National Laboratory and University of Minnesota. Brett M. Averick and Jorge J. More'.