Data Loading (recursiveRouteChoice.data_loading)

Module containing IO data-loading functions and standard preprocessing steps to construct data matrices from input files.

recursiveRouteChoice.data_loading.load_csv_to_sparse(fname, dtype=None, delim=None, square_matrix=True, shape=None) → coo_matrix

IO function to load row, col, val CSV and return a sparse scipy matrix.

Parameters:
  • fname (str) – path of the row, col, val CSV file to load

  • dtype – data type for the matrix values

  • delim (str, optional) – CSV delimiter

  • square_matrix (bool) – indicates that the input should be square; if it is not, it is squared by adding a row (this is commonly required in the data)

  • shape – shape of the output matrix

Return type:

COO matrix
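
A minimal usage sketch follows; the file name is hypothetical and assumes a three-column row, col, val CSV:

    from recursiveRouteChoice.data_loading import load_csv_to_sparse

    # "travel_times.csv" is a hypothetical row, col, val file
    travel_times = load_csv_to_sparse(
        "travel_times.csv",
        delim=",",            # CSV delimiter
        square_matrix=True,   # square the matrix by adding a row if needed
    )
    print(travel_times.shape, travel_times.nnz)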

recursiveRouteChoice.data_loading.load_obs_from_json(filename)

Loads observations from the specified JSON file (the counterpart of write_obs_to_json).

recursiveRouteChoice.data_loading.load_standard_path_format_csv(directory_path, delim=None, match_tt_shape=False, angles_included=True)

Returns the observations and the list of matrices loaded from the specified file directory.

Parameters:
  • directory_path (str or os.PathLike[Any]) – folder which contains the files

  • delim (str) – csv separator (i.e. ",", " ", …)

  • match_tt_shape (bool) – trim all matrices to the same shape as the travel time matrix

  • angles_included (bool) – flag controlling whether an angle file is expected or not

Return type:

tuple[dok_matrix, list[dok_matrix]]
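
A short sketch of typical use; the directory name is hypothetical and assumed to contain the standard set of csv files (travel times, incidence, angles, observations):

    from recursiveRouteChoice.data_loading import load_standard_path_format_csv

    obs_mat, attribute_matrices = load_standard_path_format_csv(
        "ExampleNetwork",      # hypothetical data directory
        delim=",",
        match_tt_shape=True,   # trim every matrix to the travel time matrix shape
        angles_included=True,  # an angle file is expected in the directory
    )
    # attribute_matrices is a list of dok_matrix instances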

recursiveRouteChoice.data_loading.load_tntp_node_formulation(net_fpath, columns_to_extract=None, sparse_format=True)

Parameters:
  • net_fpath – path to the network file

  • columns_to_extract – list of columns to keep. init_node and term_node are always kept and form the basis of the arc-arc matrix. Currently only length is supported, since the conversion from node to arc is not clear for the other columns. Legal columns to extract are: {capacity, length, free_flow_time, b, power, speed, critical_speed, toll, link_type, lanes}. Note that some of these will be constant across arcs and are redundant to include.

Return type:

[scipy.sparse.coo_matrix, list of str]
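
A sketch of a call under these assumptions; the file name is hypothetical and the variable names are only illustrative of the documented return types:

    from recursiveRouteChoice.data_loading import load_tntp_node_formulation

    length_matrix, extracted_columns = load_tntp_node_formulation(
        "SiouxFalls_net.tntp",          # hypothetical TNTP network file
        columns_to_extract=["length"],  # init_node and term_node are always kept
        sparse_format=True,
    )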

recursiveRouteChoice.data_loading.load_tntp_to_sparse_arc_formulation(net_fpath, columns_to_extract=None, use_file_order_for_arc_numbers=True, standardise=None)

Parameters:
  • net_fpath (str) – file path to read from

  • columns_to_extract (list of str) – names of network file attributes to extract

  • use_file_order_for_arc_numbers (bool) – number arcs in the order in which they appear in the file

  • standardise (str, optional) –

Return type:

[dict, scipy.sparse.dok_matrix]
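
A hedged usage sketch; the TNTP file name is hypothetical and the variable names only reflect the documented [dict, scipy.sparse.dok_matrix] return:

    from recursiveRouteChoice.data_loading import load_tntp_to_sparse_arc_formulation

    arc_index_map, arc_arc_matrix = load_tntp_to_sparse_arc_formulation(
        "SiouxFalls_net.tntp",                # hypothetical TNTP network file
        use_file_order_for_arc_numbers=True,  # number arcs in file order
    )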

recursiveRouteChoice.data_loading.resize_to_dims(matrix: dok_matrix, expected_max_shape, matrix_name_debug='(Name not provided)')

Resizes the matrix to the specified dimensions, issuing a warning if this drops data from the matrix. The application is more general than the current warning message suggests. Note that the matrix being sparse is essential: numpy's resize behaves differently from scipy's. Note also that dimensions are upcast if they are too small.

Note this is used because it is easier to read in a matrix that is too large or too small and correct it afterwards than to limit the size at the IO stage, where exceptions get thrown.

Parameters:
  • matrix (dok_matrix) – matrix to resize

  • expected_max_shape – dimensions the matrix should have

  • matrix_name_debug (str) – name of the matrix to include in the warning message
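
A small sketch, assuming the resize is applied in place (as with scipy's own sparse resize); the matrix here is illustrative:

    from scipy.sparse import dok_matrix

    from recursiveRouteChoice.data_loading import resize_to_dims

    incidence = dok_matrix((5, 4))  # e.g. read in with one dimension too small
    incidence[2, 3] = 1.0
    # second dimension is upcast from 4 to 5; a warning would only be issued if
    # shrinking dropped entries from the matrix
    resize_to_dims(incidence, (5, 5), matrix_name_debug="incidence matrix")
    print(incidence.shape)  # (5, 5), assuming in-place resize
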
recursiveRouteChoice.data_loading.write_obs_to_json(filename, obs, allow_rewrite=False)

Writes observations to the specified JSON file; allow_rewrite controls whether an existing file may be overwritten.
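
A round-trip sketch for the two JSON helpers; the file name and the observation format (a list of observed node/arc sequences) are assumptions:

    from recursiveRouteChoice.data_loading import load_obs_from_json, write_obs_to_json

    obs = [[1, 4, 7, 9], [2, 4, 9]]  # assumed format: list of observed paths
    write_obs_to_json("obs.json", obs, allow_rewrite=True)
    obs_loaded = load_obs_from_json("obs.json")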