Optimisers Submodule: Main use case

recursiveRouteChoice.optimisers.optimisers_file

Main user access point to the optimisers module. Note that the important pieces are imported directly into the optimisers module namespace. The main class of interest is ScipyOptimiser, which provides an interface to the SciPy optimisation algorithms for the recursive logit model.
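
A minimal usage sketch (the model class and method names in the comments are assumptions based on the package's intended workflow, not confirmed by this page):

    from recursiveRouteChoice.optimisers import ScipyOptimiser

    # 'l-bfgs-b' is one of the accepted SciPy method flags (see ScipyOptimiser below).
    optimiser = ScipyOptimiser(method='l-bfgs-b')

    # The optimiser is then handed to a recursive logit model object, which
    # drives the estimation internally via solve() (hypothetical model API):
    # model = RecursiveLogitModelEstimation(network_attrs, optimiser=optimiser)
    # beta_hat = model.solve_for_optimal_beta()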

class recursiveRouteChoice.optimisers.optimisers_file.CustomOptimiserBase(hessian_type=OptimHessianType.BFGS, max_iter=4)

Bases: OptimiserBase

Global wrapper object around all of the custom optimisation algorithms, delegating to subclasses for the individual methods. Note that it should be kept clear which properties are effectively read only and which store state.

The revised perspective after implementing SciPy support is that this class should be allowed to know the details of the recursive logit model, and may itself wrap a generic algorithm. A driving sketch follows the method list below.

__init__(hessian_type=OptimHessianType.BFGS, max_iter=4)
check_stopping_criteria()
compute_relative_gradient_non_static(typf=1.0)

Compute the norm of the relative gradient. This exists because the stopping criteria need this information, and fetching the fields with explicit references there would be messy.

iterate_step(model, verbose=False, output_file=None, debug_counter=None)
set_beta_vec(beta_vec)
set_current_value(value)
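
A hedged sketch of the step-by-step driving pattern the methods above suggest; the model object and the loop details are assumptions (contrast ScipyOptimiser.solve below, which runs to convergence in a single call):

    from recursiveRouteChoice.optimisers.optimisers_file import LineSearchOptimiser

    optimiser = LineSearchOptimiser(max_iter=50)

    # Hypothetical driver loop; 'model' is assumed to supply the value and
    # gradient evaluations consumed by iterate_step.
    # while not optimiser.check_stopping_criteria():
    #     optimiser.iterate_step(model, verbose=False)
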
class recursiveRouteChoice.optimisers.optimisers_file.LineSearchOptimiser(hessian_type=OptimHessianType.BFGS, max_iter=4)

Bases: CustomOptimiserBase

CURVATURE_CONDITION_PARAMETER = 0.9
INITIAL_STEP_LENGTH = 1.0
MAXIMUM_STEP_LENGTH = 1000
METHOD_FLAG = 1
MINIMUM_STEP_LENGTH = 0
NEGATIVE_CURVATURE_PARAMETER = 0.0
SUFFICIENT_DECREASE_PARAMETER = 0.0001
X_TOLERANCE = 2.2e-16
__init__(hessian_type=OptimHessianType.BFGS, max_iter=4)
get_iteration_log(optim_vals: OptimFunctionState)
iterate_step(optim_vals: OptimFunctionState, verbose=True, output_file=None, debug_counter=None)

Perform a single step of the line search iteration, evaluating the value function and taking a step based on the gradient.
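
The class constants above are the standard Wolfe line-search parameters (c1 = SUFFICIENT_DECREASE_PARAMETER, c2 = CURVATURE_CONDITION_PARAMETER). For reference, a textbook sketch of the acceptance test they parametrise, not code taken from the module:

    import numpy as np

    C1 = 1e-4  # SUFFICIENT_DECREASE_PARAMETER
    C2 = 0.9   # CURVATURE_CONDITION_PARAMETER

    def strong_wolfe_holds(f, grad_f, x, p, alpha):
        """Check the strong Wolfe conditions for step length alpha along
        the search direction p from the point x."""
        gx = grad_f(x)
        x_new = x + alpha * p
        sufficient_decrease = f(x_new) <= f(x) + C1 * alpha * (gx @ p)
        curvature = abs(grad_f(x_new) @ p) <= C2 * abs(gx @ p)
        return sufficient_decrease and curvature

    # Toy check on a one-dimensional quadratic:
    f = lambda x: float(x @ x)
    grad_f = lambda x: 2 * x
    assert strong_wolfe_holds(f, grad_f, np.array([1.0]), np.array([-1.0]), alpha=0.5)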

class recursiveRouteChoice.optimisers.optimisers_file.OptimType(value)

Bases: Enum

An enumeration.

TRUST_REGION = 2
class recursiveRouteChoice.optimisers.optimisers_file.OptimiserBase

Bases: ABC

Base class for both the custom and SciPy optimisers to inherit from.

LL_ERROR_VALUE = 99999
METHOD_FLAG = None
NUMERICAL_ERROR_THRESH = -0.001
RESIDUAL = 0.001
__init__()
get_iteration_log(optim_vals: OptimFunctionState)
class recursiveRouteChoice.optimisers.optimisers_file.ScipyOptimiser(method: str, options=None, fd_options=None)

Bases: OptimiserBase

Wrapper around scipy.optimize.minimize to conform to the format we require.

METHOD_FLAG = 'scipy-master'
__init__(method: str, options=None, fd_options=None)
Parameters:
  • method ({'cg', 'bfgs', 'newton-cg', 'l-bfgs-b', 'tnc', 'dogleg', 'trust-ncg', 'trust-krylov'}) – or other valid scipy.optimize method flags

  • options

  • fd_options

solve(optim_function_state: OptimFunctionState, verbose=True, output_file=None)

Solve for the optimal beta which minimises the negative log likelihood. Analogous to the iterate_step method on the custom algorithms, except that the SciPy methods do not require step-by-step intervention.
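
A hedged sketch of calling solve directly; the options dict is presumed to be forwarded to scipy.optimize.minimize, given that the class is described as a wrapper around it:

    from recursiveRouteChoice.optimisers import ScipyOptimiser

    optimiser = ScipyOptimiser('l-bfgs-b', options={'maxiter': 200})
    # optim_state = OptimFunctionState(...)  # see extra_optim below
    # beta_hat = optimiser.solve(optim_state, verbose=True)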

class recursiveRouteChoice.optimisers.optimisers_file.TrustRegionOptimiser(hessian_type=OptimHessianType.BFGS, max_iter=4)

Bases: CustomOptimiserBase

METHOD_FLAG = 2
__init__(hessian_type=OptimHessianType.BFGS, max_iter=4)
method = 2

Optimisers Submodule: Other files with minimal public API

recursiveRouteChoice.optimisers.extra_optim module

class recursiveRouteChoice.optimisers.extra_optim.OptimFunctionState(value: float, grad: array, hessian: array, hessian_approx_type, val_and_grad_evaluation_function, beta_vec, function_evals_stat=None)

Bases: object

Data class storing the elements of the log likelihood state and gradient which are relevant both in the computation of the log likelihood and in the global optimisation algorithm.

A "bridge" between the recursive logit and the more general optimisation code.

__init__(value: float, grad: array, hessian: array, hessian_approx_type, val_and_grad_evaluation_function, beta_vec, function_evals_stat=None)
Parameters:
  • value

  • grad

  • hessian

  • val_and_grad_evaluation_function – a function which takes an x_vec and returns the value and gradient at that particular x

  • beta_vec

val_grad_function(beta_vec=None)
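
A minimal construction sketch using a toy quadratic in place of the recursive logit negative log likelihood; the (value, gradient) return convention follows the parameter description above, while the tuple return of val_grad_function is an assumption:

    import numpy as np
    from recursiveRouteChoice.optimisers.extra_optim import OptimFunctionState
    from recursiveRouteChoice.optimisers.hessian_approx import OptimHessianType

    def value_and_grad(beta):
        # Toy stand-in for the negative log likelihood evaluation.
        return float(beta @ beta), 2 * beta

    beta0 = np.array([1.0, -2.0])
    value0, grad0 = value_and_grad(beta0)
    state = OptimFunctionState(
        value=value0,
        grad=grad0,
        hessian=np.identity(2),
        hessian_approx_type=OptimHessianType.BFGS,
        val_and_grad_evaluation_function=value_and_grad,
        beta_vec=beta0,
    )
    # Re-evaluate through the bridge method (assumed to return (value, grad)):
    # value, grad = state.val_grad_function()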

recursiveRouteChoice.optimisers.hessian_approx module

class recursiveRouteChoice.optimisers.hessian_approx.OptimHessianType(value)

Bases: Enum

An enumeration.

BFGS = 1
recursiveRouteChoice.optimisers.hessian_approx.update_hessian_approx(approx_method, step, delta_grad, hessian)
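
For reference, a textbook sketch of the BFGS update that OptimHessianType.BFGS conventionally denotes; update_hessian_approx is presumed to apply an equivalent update, though its exact return value is not documented here:

    import numpy as np

    def bfgs_update(hessian, step, delta_grad):
        """Textbook BFGS update of the Hessian approximation B, given
        step s = x_new - x_old and delta_grad y = grad_new - grad_old:
        B+ = B - (B s s^T B) / (s^T B s) + (y y^T) / (y^T s)."""
        s, y, B = step, delta_grad, hessian
        Bs = B @ s
        return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)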